Aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.
Aaclltuy, taht is not ture. It smees lkie it oghut to be, but taht is in lagre prat due to olny vrey sghilt mginnug dnoe to the txet. If you try haderr, and alppy a mroe aeggirssve taafimnoorrstn to the txet, you wlil fnid taht it is no legnor esay to raed the txet. Eaceillpsy ntoe how the legnor wdors in tihs salmpe are far haderr to raed tahn tsohe taht are sehortr; it is the oceeimnnprse of the salml, facinnotul wdors taht ldens ielstf to esay iaeeinoprrtttn, and legnor wdors (mroe haeilvy maeglnd) bcemoe ecdeegilnxy dcffiilut caeeghllns to pluzze out waht tehy wree oagiillnry ideenntd to reeenprst.
(I opted not to choose the middle letters randomly, but instead sorted them, using reverse alphabetical order when the word came out the same. This makes some words insanely difficult: "functional" and "interpretation" are impossible to pick out once so mangled. I also wrote the entire paragraph before mangling and pasting it.)
If you thought that was too hard to read, here's the original:
Actually, that is not true. It seems like it ought to be, but that is in large part due to only very slight munging done to the text. If you try harder, and apply a more aggressive transformation to the text, you will find that it is no longer easy to read the text. Especially note how the longer words in this sample are far harder to read than those that are shorter; it is the omnipresence of the small, functional words that lends itself to easy interpretation, and longer words (more heavily mangled) become exceedingly difficult challenges to puzzle out what they were originally intended to represent.
FWIW I still found the heavily mangled version fairly easy to read, maybe a 20-30% slowdown max and it got easier as I went.
I speed-read, though, so maybe it's because I'm already used to not actually focusing on letters and taking in the gestalt instead. It's kind of like how it's easy to tell whether two sets written in different orders are equal up to about size 4, and past that it gets tricky. This is why the classic tally-mark system uses four ticks and then a slash for the fifth, too.
These tools are so important, yet I would bet that less than 1% of developers can actually use them and even less would have an idea on how to reimplement or improve them. Sometimes I feel like we are standing on a house of cards about to collapse.
I get that it’s kind of hip right now to talk about how developers these days are less capable than in the mythical prior age when magical rings were forged and the elves still flourished in Middle-earth. Can you back up your 1%-or-less estimate with anything at all? Binutils is certainly a solid baseline set of tools, but given how frequently $insert_core_tool_but_in_rust/golang/etc shows up on HN, the risk that we forget how to rub sticks together to make fire seems low.
My line of work is compilers, but I could name two dozen people who could reimplement binutils from scratch, including me.
It would take some time, years even, to get it all (they are massive, especially when you get to obscure targets and non-Unixy operating systems), but it would not be that difficult from a know-how perspective.
Most of the problems they solve are well understood. They are terrific tools, and very mature, so getting the long tail of features right might take some time.
But overall, not that hard for many people.
For example, LLVM has alternatives that aren’t quite drop-in compatible, but are close.
The house of cards analogy doesn't hold up well in real life, because most things in life aren't that simple or fragile. It's of course possible to find single points of failure in complex systems, but we tend to identify them and work around them before they become a big problem.
In this situation, there are plenty of C programmers out there, so most developers don't need to know anything about these tools.
That applies to a lot of things. Microprocessors, keyboards, screens, hardware in general, compilers, OS, databases, CLI tools, text editors, browsers, drivers, etc. I like to think of it as a great achievement, that I don't have to think about these tools most of the time. Still, enough people are thinking about them to make actual progress on them. We may be standing on a house of cards, but lots of people make sure everyday that it won't fall.
What that means is that you are not in full control of your entire stack...do you have any idea how to reimplement, improve, or even just fix your keyboard's controller? Your link to the internet, from the way "call before drilling" protection is laid out, to HTTP/2? The way your food is kept trustworthy; heck, the way your food even exists?
Indeed: nobody is in full control in their own life. Scary. Yet also powerful.
I think the food analogy is a very good one: I know how wheat is grown, harvested, and turned into flour, and how to make bread from that. I also know the same about several vegetables, and even eggs and chicken. I know that I have a weak spot when it comes to dairy. I know enough about slaughtering cows and pigs that I would not attempt it myself. And I know enough about health and the risk of various parasites, etc., to understand what I would need to read up on when it comes to handling meat safely.
I also understand very well that I would have a very hard time feeding myself and my family alone. But I would know where to start.
If we apply this metaphor to binutils, how many developers even know what ELF or DWARF stand for, or how they would have to use them? How many know what an assembler and a linker do, and how one would go about getting one if needed?
IMNSHO, knowledge of assemblers and linkers is already quite specialized - continuing the bread metaphor, "how many people trying to make bread from scratch have a clue about sourdough or yeast?" Yup, I have a nagging feeling that those exist and are fairly important, but as for how I would actually have to use them, not much knowledge there.
(Also, why stop at assembly - a macro language, even though it corresponds closely to a specific architecture?)
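For what it's worth, ELF (Executable and Linkable Format) is the object/executable file format and DWARF is the companion debug-info format. As a small, hedged illustration of the kind of knowledge being discussed, here is a minimal Python sketch that decodes a few ELF header fields from raw bytes; it assumes a 64-bit file, and the `fake` bytes below are handcrafted for the example rather than taken from a real binary:

```python
import struct

def read_elf_header(data: bytes) -> dict:
    """Decode a few identifying fields from the start of an ELF file."""
    assert data[:4] == b"\x7fELF", "not an ELF file"
    ei_class = data[4]   # 1 = 32-bit, 2 = 64-bit
    ei_data = data[5]    # 1 = little-endian, 2 = big-endian
    # e_type and e_machine sit right after the 16-byte identification block
    e_type, e_machine = struct.unpack_from("<HH", data, 16)
    return {
        "class": {1: "ELF32", 2: "ELF64"}.get(ei_class, "?"),
        "endian": {1: "LE", 2: "BE"}.get(ei_data, "?"),
        "type": {1: "REL", 2: "EXEC", 3: "DYN", 4: "CORE"}.get(e_type, hex(e_type)),
        "machine": {0x3E: "x86-64", 0xB7: "AArch64"}.get(e_machine, hex(e_machine)),
    }

# A handcrafted header for a 64-bit little-endian shared object on x86-64:
fake = b"\x7fELF\x02\x01\x01" + b"\x00" * 9 + struct.pack("<HH", 3, 0x3E)
print(read_elf_header(fake))
# → {'class': 'ELF64', 'endian': 'LE', 'type': 'DYN', 'machine': 'x86-64'}
```

This is roughly the first thing `readelf -h` reports; the long tail of sections, relocations, and DWARF data is where the real work in binutils lives.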
Fortunately, there are plenty of developers who can reimplement or improve them. If there were a big economic incentive to do so, you would see everything rewritten from scratch, redesigned, optimized, and polished in two months.
For anyone who knows how to use them, their house will still be standing. Thinking about this could lead one to question what "developers", as the word is used today, actually means. Developers of what? A house of cards.
I see a reference to GOLD in the previous message. Does anybody know how that project is doing?
I recall that it was a promising, higher-performance alternative to LD, but I heard that it had stagnated, and Fedora stopped using it. Is it getting any better? A better-performing linker would be a nice benefit for large C++ projects.
https://sourceware.org/bugzilla/show_bug.cgi?id=28058
Many Linux distributions use strip as part of their package builds, so it can have a large impact in some cases.