> There's lots of psychological and anthropological studies behind the fact that most experts in various fields excel due to pattern recognition not reasoning.
Pattern recognition in experts comes from a combination of theoretical understanding and a lot of practical problem-solving experience (which translates into patterns forming as neural pathways), not the other way around. If you don't understand the problem you are solving, then yes, maybe you'll be able to throw a pattern at it and, with a bit of luck, solve it (kinda like how LLMs operate), but this will not lead to understanding. Memorising patterns in isolation from their theoretical background is not something that will create an expert in a field.
There are tons of devs in the same bracket, just not the most vocal ones. I could be described as one of them. In most sufficiently large corporations, this is the only way to keep doing development instead of management, unless they have a grow-or-get-fired mentality.
If I stepped up one more level, I would often be responsible for team deliveries. Another step up and the team may not get bigger, but various political pressures grow immensely and it's much easier to get fired there, while compensation isn't that much higher. And most of the working time would be spent in meetings and MS Office products: not much development, hardly any creative work.
In the end it's just an empty label that is up to you to consider for its worth, to join the rat race or not. Even at my lower position I've managed teams (rather successfully) when needed. I get roughly the same compensation as people two levels above with less tenure at the company, way more than any of my peers, and in the highest-paid region in Europe. I get 10 weeks of paid leave from the company due to working on a 90% contract. So what is there to strive for? Much higher daily stress? After-work or weekend calls? The unpaid overtime and weekend work that come with higher positions, even if rarely required? Work turning into boring endless calls and discussions, with zero creativity, unless you consider churning out Excel spreadsheets or PowerPoints a creative endeavor? Hardly achievements; rather, destructive failures.
No thank you, if I can make the choice. Quality of life, happiness and all that.
Yepper. The Trail of Tears and German/Japanese internment are all primary-education topics. Interestingly, I don't think Bush has made it into the history books yet, but I don't have kids, so I can't verify current-day education materials.
What I find interesting is the bits we leave out. We touch on the banana republics, but the annexation of Hawaii, and how skulduggerously it was done, is completely skimmed over.
Thanks, that's a reasonable argument. Some critique: based on this argument, it is very surprising that LLMs work so well, or at all. The fact that even small LLMs do something suggests that the human substrate is quite inefficient for thinking. Compared to LLMs, it seems to me that 1. some humans are more aware of what they know; 2. humans have very tight feedback loops to regulate and correct themselves. So I imagine we do not need much more scaling, just slightly better AI architectures. I guess we will see how it goes.
Libre/OpenOffice is a C++ application which uses a homegrown cross-platform GUI toolkit. There were only some minor components written in Java (like mail merge), and I believe LibreOffice has replaced some of them with native code.
LibreOffice is derived from OpenOffice.org, which is derived from StarOffice, which predates Java. When Sun acquired and open-sourced it, they added some optional components implemented in Java, but the core application is not a Java application. The GUI is not Swing but their own custom GUI framework, not based on Java.
I see this attitude the same way as "Microsoft has taken over the world, no other software will ever succeed" back in the day. Turns out Microsoft didn't take over the world; it was just part of a cycle.
Those icons were well-designed for the newly computerized office employee of the day. The new school of icons are made by graphic designers for other graphic designers.
Don't get me wrong, I have been surprised to witness a thrown punch even in a very nerdy, upper-middle-to-just-posh Cambridge pub (exactly one, seen over the course of about nine years), but even that wasn't at the level alleged (IDK if he was found guilty) in the referenced case.
These two suggestions are fine, but I don't think they make fixtures that much better; they're still a morass of technical debt and should be avoided at all costs.
The article doesn't mention what I hate most about fixtures: the noise of all the other crap in the fixture that doesn't matter to the current test scenario.
E.g. I want to test "merge these two books". Great, but now when stepping through the code, I have 30, 40, 100 other books floating around the code/database because "they were added by the fixture", which I need to ignore or step through. Gah.
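To make the complaint concrete, here is a minimal sketch of the two styles. Every name in it (`Book`, `merge_books`, `LIBRARY_FIXTURE`) is hypothetical, invented for illustration; nothing here comes from the article under discussion.

```python
# Hypothetical model and function under test.
class Book:
    def __init__(self, title):
        self.title = title

def merge_books(a, b):
    # Stand-in for whatever "merge these two books" actually does.
    return Book(f"{a.title} + {b.title}")

# Fixture style: dozens of unrelated records come along for the ride,
# and every debugging session has to step around them.
LIBRARY_FIXTURE = [Book(f"noise-{i}") for i in range(100)] + [Book("A"), Book("B")]

def test_merge_with_fixture():
    a = next(b for b in LIBRARY_FIXTURE if b.title == "A")
    b = next(b for b in LIBRARY_FIXTURE if b.title == "B")
    assert merge_books(a, b).title == "A + B"

# Inline style: the test constructs exactly the two books it needs,
# so nothing irrelevant shows up while stepping through the code.
def test_merge_inline():
    assert merge_books(Book("A"), Book("B")).title == "A + B"
```

Both tests pass, but only the second one tells the reader at a glance which data actually matters to the scenario.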
Actually, the history of real people is my main area of interest :-). I stand by what I said, but I do understand you have to sort of blur your vision and take the bigger 60%; this is not the 99%. Also, the article was specifically about aesthetics, which is inherently a more rose-colored-glasses approach. I'm not sure there's any era I'd rather live in than today (though this is a nuanced question, since you wouldn't know better, and I do think we're in sort of a local minimum, so for sure I'd rather live in, say, the early 2000s, and maybe before, but probably no earlier than auto bill pay, digital banking, and modern dentistry lol). But there are many eras I would like to travel to for the aesthetic.
It started modern esports. There were gaming competitions in the 80s, but there weren't team houses, coaches, analysts, big-money sponsors, regular huge events, dedicated TV channels, players in prime-time commercials and dating actresses and pop stars, etc. Brood War hit Korea like nothing before or after it. At its peak there were literally three full-time, 24/7 TV channels showing StarCraft content. No other game has ever done that.
> > Nobody has developed a real alternative. It seems like most companies are more than willing to leave this entire market to Microsoft.
> The number of humans that are literate enough in business, marketing, communications, and software development to pull this off are extremely few and far between right now.
I think the problem is different: those who are capable of pulling off such a task commonly lack the "business credibility" that is necessary so that C-level executives would buy a product from them.
Getting the skills in all these disciplines is a much-more-than-full-time job. If you spend all your time cramming, you simply don't have time to build this "business credibility".
You'll keep a smaller base of users who don't want AI slop but will keep using your product anyway even if the AI is there.
But what you lose is the large paying corporate customers that demand the "soupe du jour" and end up going to VS Code or whatever, and you may never get them back.
Building software is hard, being profitable at it is even harder.
I'm in the UK, but not in London. Local shops do not have everything I want, and it takes time to drive into town and shop at a supermarket, so when I am busy I order online.
Like a lot of people, I work from home, and there is a big difference between the time required to go shopping and taking a few minutes away from my desk to carry some stuff from the door to the fridge.
Ah, got it. This sounds like more of a "repugnant conclusion" sort of problem: if you care about the well-being of people who already exist, then it is possible to have too large a population.
I really appreciate the ability at Costco to scan items with my phone as we pick them up; checkout becomes a breeze. But I absolutely hate self-checkout at grocery stores unless I just have a few items. The idea that I'll run a cart full of groceries through self-checkout is insane. Items routinely have barcodes that don't scan, requiring some sort of lookup from an attendant, and I'll have things which require a human clerk to approve anyway, like wine. In addition, my self-checkout lines don't have the full conveyors the human checkout lines do, so everything has to be moved from the cart directly into bags, and there isn't enough bag space, so you have to start putting bags into the cart which still has groceries in it. The whole thing is a mess and I hate it.
I think they keep coming back to this because a good command of math underlies a vast domain of applications, and without a way to do math as part of the reasoning process, the reasoning process itself becomes susceptible to corruption.
> LLMs are not calculators. If you want a calculator use a calculator. Hell, have your LLM use a calculator.
If only it were that simple.
> I mean, no not really, digital computers are far easier to build and far more multi-purpose (and technically the underlying signals are analog).
Try building a practical analog computer for a non-trivial problem.
> Again, if you have a deterministic solution that is 100% correct all the time, use it, it will be cheaper than an LLM. People use LLMs because there are problems that are either not deterministic or the deterministic solution uses more energy than will ever be available in the local part of our universe. Furthermore a lot of AI (not even LLMs) use random noise at particular steps as a means to escape local maxima.
No, people use LLMs for anything, and one of the weak points is that as soon as a task requires slightly more complex computation, there is a fair chance that the output is nonsense. I've seen this myself in a bunch of non-trivial trials involving aerodynamic calculations, specifically the rotation of airfoils relative to the direction of travel. It tends to go completely off the rails if the problem is non-trivial and the user does not break it down into roughly the same steps as you would if working the problem out by hand (and even then it may subtly mess up).
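For reference, the deterministic core of the calculation mentioned, rotating a 2D chord vector by an angle relative to the direction of travel, is only a few lines. The function name and conventions here (counterclockwise rotation, angle in degrees) are my own assumptions, not taken from the original trials:

```python
import math

def rotate2d(vec, angle_deg):
    """Rotate a 2D vector counterclockwise by angle_deg degrees."""
    a = math.radians(angle_deg)
    x, y = vec
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

# Chord initially aligned with the direction of travel (the unit x axis),
# pitched up by a 10-degree angle of attack:
chord = rotate2d((1.0, 0.0), 10.0)
```

A small, checked helper like this is exactly the kind of thing an LLM could be made to call instead of free-handing the trigonometry.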