> Perpetuates the idea that there's "binary" and "text" [...]
Well, there is binary, and there is text. Sure, all text - like "strawman" ;) - is binary somehow, but not all binary data is text, nor can it even be interpreted as such, no matter how hard you try ... just ask all those poor hex editors.
Text is text. Text is encodable as binary. If text were binary, that encoding would be unique, but it isn't: Latin-1 encodes "Ü" differently than UTF-8 does, and even the humble "A" could be 0x41 in ASCII and UTF-8 or 0xC1 in EBCDIC.
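For the concrete bytes, a quick Python sketch (cp500 is one of Python's built-in EBCDIC code pages, used here just as an example of an EBCDIC variant):

```python
text = "Ü"
print(text.encode("utf-8"))    # b'\xc3\x9c' - two bytes in UTF-8
print(text.encode("latin-1"))  # b'\xdc'     - one byte in Latin-1

print("A".encode("ascii"))     # b'A'    (0x41)
print("A".encode("cp500"))     # b'\xc1' (0xC1 in EBCDIC)
```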
Everything is representable as binary, but not everything is binary. The abstract concept of 'A' has no inherent binary representation, and 0x41 is just one of the options. Representing Pi or e calls for even more abstract encoding, even though each is a very specific concept. Text is not binary, but text has to be encoded to binary (one way or another) before a computer can process it at all. Yet we tend to think of text in abstract terms rather than as "UTF-8-encoded bytes", and that is exactly why the abstraction is useful.
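A small Python illustration of that point: the same two bytes are different "text" depending on which decoding you assume, so the bytes alone aren't the text.

```python
raw = "Ü".encode("utf-8")     # b'\xc3\x9c'
print(raw.decode("utf-8"))    # Ü
print(raw.decode("latin-1"))  # 'Ã' plus an invisible control character - mojibake
```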
What if the computer isn't binary, but it needs to talk to a binary computer? Then you definitely can't go "oh, this text is binary anyway, I can just push it on the wire as-is and let the other end figure it out".
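Python 3 makes that explicit: a socket will only accept bytes, so text has to be encoded before it goes on the wire. A minimal sketch using a local socket pair as a stand-in for the wire:

```python
import socket

a, b = socket.socketpair()          # local pair, just for illustration
message = "héllo"
# a.sendall(message)                # TypeError: str is not bytes
a.sendall(message.encode("utf-8"))  # the sender must pick an encoding...
print(b.recv(16).decode("utf-8"))   # ...and the receiver must agree on it
```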
What? Is a ZIP file not binary because it’s not a valid tar.gz file? Text just means “something that can be printed”, and by that definition not even all valid ASCII sequences are text.
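That last claim is easy to check in Python: control characters decode as perfectly valid ASCII but aren't printable.

```python
data = b"\x07\x1b\x41"       # BEL, ESC, 'A' - all valid 7-bit ASCII
text = data.decode("ascii")  # decodes without error...
print(text.isprintable())    # False - BEL and ESC are control characters
```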