
There are two ways this could work out:

- LLM-assistance helps solve 80% of programming tasks, so 80% of programmers lose their jobs

- LLM-assistance provides that exact same productivity boost, and as a result individual programmers become FAR more valuable to companies - for the same salary you get a lot more useful work out of them. Companies that never considered hiring programmers - because they would need a team of 5 over a 6 month period to deliver a solution to their specific problem - now start hiring programmers. The market for custom software expands like never before.

I expect what will actually happen will be somewhere between those two extremes, but my current hope is that it will still work out as an overall increase in demand for software talent.

We should know for sure in 2-3 years time!




I like your optimism, but in programming, at least in the US, unemployment has already risen above the overall average.

ML supercharges all disparity: business owners and superstars who have already made a nice career and name will earn more by commanding fleets of cheap (except for energy) LLMs, while their former employees and reports get laid off by the tens of thousands. (Ironically, they do it to themselves by welcoming LLMs and assuming the next guy will be the unlucky one; the same reason unions don't work there, I guess...)

And as for small businesses that never hired programmers before: companies like ClosedAI monetize our work so that their bosses can get full products out of chatbots (buggy for now, but give it a year). Those businesses will grow, but when they hire they will get cheap minimum-wage assistants who talk to LLMs. That's at best where most programmers are headed. The main winners will be whoever gets to provide the ML that monetizes stolen work (unless we stop them through collective outrage and copyright defense), so Microsoft.


I'm not sure how much of the weakness in US programming employment we can blame on LLMs. I think it's more due to a lot of companies going through a "correction" after over-hiring during Covid.

As for "their bosses to get full products out of chatbots": my current thinking on that is that an experienced software engineer will be able to work faster with and get much higher quality results from working with LLMS than someone without any software experience. As such, it makes more sense economically for a company to employ a software engineer rather than try to get the same thing done worse and slower with cheaper existing staff.

I hope I'm right about this!


> my current thinking is that an experienced software engineer will be able to work faster with LLMs, and get much higher quality results from them

> than someone without any software experience

- So you are betting against ML becoming good enough soon enough. I wouldn't be so sure, considering the large amount of money and computing energy being thrown into it and the small amount of resistance from programmers.

- Actually, someone doesn't have to have zero experience. But if someone is mostly an LLM whisperer saving the boss some yacht time, rather than an engineer, they will be paid accordingly: minimum wage.


No matter how good ML gets I would still expect a subject matter expert working with that ML to produce better results than an amateur working with that same ML.

When that's no longer true, we will have built AGI/ASI. Then we are into science-fiction territory, Star Trek utopia or Matrix dystopia, and all bets are off.


> would still expect a subject matter expert working with that ML to produce better results than an amateur working with that same ML.

Subject matter expert, yes. But the subject matter isn't programming; it's whatever the thing being built is about. (So if we're talking about non-tech companies that never considered hiring programmers before, I think they still won't.)



