The case where a) you don't care about the in-memory representation of your struct and b) you care a lot about packing into the absolute minimum memory space, yet not enough to make sure the compiler actually packs the fields (depending on architecture and optimization settings, it might not!), is vanishingly small.
The more frequent perceived use for bit-fields (in the situation where they actually work) is to pack into a serialized data format, such that memory or a data stream can be accessed elsewhere. In that case, "the compiler can do whatever it wants with your data packing" is pretty useless, since your "elsewhere" might have a different compiler that does a totally different thing.
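As a minimal sketch (the header struct and its field widths are made up here), even the size a compiler gives a bit-field struct is implementation-defined, let alone the bit order or whether fields straddle allocation units:

    #include <stdio.h>

    /* Hypothetical "wire format" packed with bit-fields.  The allocation
       unit, bit order, and straddling rules are all implementation-defined,
       so the size printed here -- and the byte layout behind it -- can
       differ between compilers and ABIs. */
    struct hdr {
        unsigned version : 3;
        unsigned type    : 5;
        unsigned length  : 11;
    };

    int main(void) {
        printf("sizeof(struct hdr) = %zu\n", sizeof(struct hdr));
        return 0;
    }

memcpy-ing that struct into a buffer and reading it back under a different compiler is exactly the "totally different thing" problem.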
Optimization settings should not affect memory layout, as that is specified by the ABI (and a large part of the “art of structure packing” is about manually reordering struct fields, because the compiler cannot do that no matter how obvious the optimization would be).
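To illustrate the reordering point (exact sizes are ABI-specific; these are the typical numbers on an LP64 target):

    #include <stdint.h>
    #include <stdio.h>

    /* Fields must stay in declaration order, so padding depends entirely
       on how the programmer orders them; the compiler cannot fix this. */
    struct careless {              /* typically 24 bytes on LP64 */
        char     a;                /* 1 byte + 7 bytes padding   */
        uint64_t b;                /* 8 bytes                    */
        char     c;                /* 1 byte + 7 bytes padding   */
    };

    struct reordered {             /* typically 16 bytes on LP64 */
        uint64_t b;                /* 8 bytes                    */
        char     a;                /* 1 byte                     */
        char     c;                /* 1 byte + 6 bytes padding   */
    };

    int main(void) {
        printf("%zu vs %zu\n", sizeof(struct careless), sizeof(struct reordered));
        return 0;
    }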
And as for the second part: anything that writes sizeof(struct foo) bytes of struct foo is inherently non-portable. If you want to portably (de)serialize something, write it out explicitly; very often the compiler will optimize that into a more direct implementation. (And well, this is only portable to platforms where CHAR_BIT == 8.)
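Something like this, as a sketch (struct foo and the 6-byte little-endian wire layout are just assumptions for the example):

    #include <stdint.h>

    struct foo {
        uint32_t id;
        uint16_t flags;
    };

    /* Write foo as exactly 6 bytes, little-endian, independent of the
       host's struct layout, padding, or endianness. */
    static void serialize_foo(const struct foo *f, unsigned char out[6]) {
        out[0] = (unsigned char)( f->id        & 0xff);
        out[1] = (unsigned char)((f->id >> 8)  & 0xff);
        out[2] = (unsigned char)((f->id >> 16) & 0xff);
        out[3] = (unsigned char)((f->id >> 24) & 0xff);
        out[4] = (unsigned char)( f->flags       & 0xff);
        out[5] = (unsigned char)((f->flags >> 8) & 0xff);
    }

On a little-endian target a decent compiler tends to collapse this into a couple of plain stores, so the explicitness usually costs nothing.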
Anything that affects the actual instructions executed on the actual chip they're executed on may make what works here not work there.
Optimization that does not affect instructions is no optimization at all. Bitfields are an extremely fragile part of implementations. Trust them at your own risk.
What I'm saying is that the case where you want to use less RAM for a bit field, but you don't actually care whether the compiler allocates less than an addressable line of RAM for that bit field (because it actually just might not), is pretty empty.
Edit: I know it's hard to read a whole sentence at once, but I made that same point directly up there too.
If "foo" is defined as part of an API/ABI that's used in multiple compile units you will always care, since otherwise a random change in "implementation defined" bitfield encodings on some obscure architecture might break your build. Bitfields are a misfeature in most real-world cases.