
I would just add that sometimes ideas exceed the technology of their time. So revisiting a design that had deficiencies (weak points, high production costs, bad emissions, etc.) with new tools, materials, etc. can lead to breakthroughs. Not that that's what is happening here, just why some ideas that previously didn't work seem to circle back around.


I feel like software is the poster child for this. My hunch is that a lot of techniques have been dropped over performance penalties that may be on the order of 20-30% (totally guessing here), even though Moore's Law has often covered that gap in mere months.

I read a comment on HN a while back that you can look like a genius in software by going back about 10 years and finding something forgotten. The whole web development shift from server-side to client-side, and now drifting back to server-side as if it's something new, seems to validate this. Though in this case it just seems like an extension of the decades-long back and forth from mainframe with a super minimal client, to PC only, to networked, to client/server, to server-side web, etc., etc.


>20-30%

Your total guess is orders of magnitude off.

Most software today is in the ballpark of 10,000x slower than it should be.


That's not what they're saying. They're talking about things that couldn't be done due to performance issues but are now possible thanks to improved hardware.


Software developers rarely question the hardware they are developing on/for. Unless you were developing in the Pentium era or earlier, it has almost definitely not been true for you that "the hardware is not capable of running this idea" (ignoring certain obvious high-performance sectors).


Dunno, there are some pretty large exceptions to that.

The AI boom is almost entirely a thing due to advancements in hardware. You simply couldn't build ChatGPT with 2014 hardware.


Ten years ago we designed a system to approximate a chunk of maths we did not have time to do every second; the exact calculation was too slow.

If you started from scratch today you'd never even consider doing anything other than the original obvious "correct" maths. Approximation be damned.
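
The comment doesn't say what the maths actually was, so purely as a hypothetical sketch of the pattern (in Python): a coarse lookup table standing in for the exact calculation, the kind of approximation that only makes sense when the hardware can't afford the "correct" version on every tick.

    # Hypothetical sketch: a coarse lookup-table approximation of sin(),
    # standing in for whatever maths the original system couldn't afford
    # to do exactly every second. On modern hardware you'd just call math.sin().
    import math

    TABLE_SIZE = 256
    SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

    def sin_approx(x):
        # Nearest-entry lookup: cheap, but only accurate to the table's resolution.
        idx = int((x % (2 * math.pi)) / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
        return SIN_TABLE[idx]

    def sin_exact(x):
        # The "obvious correct maths" you'd reach for if starting from scratch today.
        return math.sin(x)

Nothing here is specific to the system described above; it's just the general shape of trading accuracy for speed when the hardware forces your hand.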


This factoid about web dev misses the fact that today's JS developers are often the same people who were writing CGI, Perl and PHP, some of us also ASP and later ASP.NET, or some Java. We always knew there were good parts to the old ways, and merging the two was always the goal - it just takes some time to get the client side right and then to get the merging right.

This especially applies to React core devs. Remember XHP?


It’s hard to tell if you can improve technology until it’s good enough (jet engines) or if it’s a futile dead end.

Kelly Johnson of the Skunk Works had a keen mind for differentiating between them.


It is indeed, and sometimes it can go either way depending on the point in time. A good example is the PW8000, which eventually became the PW GTF, discussed here:

https://cornponepapers.blogspot.com/2006/04/short-life-and-u...

Just a little bit too early, but when they came back to the idea they ended up with a technology they now use on all their new (big) commercial engines.

For anyone who doesn't know the name Kelly Johnson, I recommend "Kelly: More Than My Share of It All". A rare person who combined technical genius with an ability to get large-scale things done.


Adding to the list of deficiencies, Apex seals!!



