
Just today I had GPT-4 implement a SwiftUI-based UI for a prototype I’m working on. I was able to get it working with minimal tweaks within 15 minutes, even though I know next to nothing about SwiftUI (I’m mainly a systems person these days). I pay for this, and would, without hesitation, pay 10x for a larger model that doesn’t require “minimal tweaks” for the bullshit tasks I have to do. Easily 80% of all programming consists of bullshit tasks that LLMs of 2024 can solve within seconds to minutes, whereas some of them would take me half a day of RTFM. Worse, knowing that I’d have to RTFM, I’d probably avoid those tasks like the plague, limiting what can be accomplished. I’m also somewhat relieved that GPT-4 cannot (yet?) help me with the non-bullshit parts of my work.
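For illustration, the kind of SwiftUI boilerplate I mean looks roughly like this; a minimal sketch with invented names (a list with add/delete), not my actual prototype:

    import SwiftUI

    // Minimal sketch of the kind of boilerplate an LLM generates in one shot.
    // All names here are invented for illustration.
    struct PrototypeView: View {
        @State private var items = ["Alpha", "Beta", "Gamma"]
        @State private var newItem = ""

        var body: some View {
            NavigationStack {
                List {
                    ForEach(items, id: \.self) { item in
                        Text(item)
                    }
                    // Swipe-to-delete, wired straight to the state array
                    .onDelete { items.remove(atOffsets: $0) }
                }
                .navigationTitle("Prototype")
                .toolbar {
                    ToolbarItem(placement: .bottomBar) {
                        HStack {
                            TextField("New item", text: $newItem)
                                .textFieldStyle(.roundedBorder)
                            Button("Add") {
                                guard !newItem.isEmpty else { return }
                                items.append(newItem)
                                newItem = ""
                            }
                        }
                    }
                }
            }
        }
    }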



If it handles 99% of your tasks (letting a smart boss fire you), know that you helped train it for that by using it, paying for it, and allowing it to be trained on code in violation of license.

Even if 80% of programmer tasks in an org (or in the worldwide gig market) can be handled by ML, 80% of programmers can already be laid off.

Maybe you have enough savings that you just don't need to work but some of us do!


There are two ways this could work out:

- LLM-assistance helps solve 80% of programming tasks, so 80% of programmers lose their jobs

- LLM-assistance provides that exact same productivity boost, and as a result individual programmers become FAR more valuable to companies - for the same salary you get a lot more useful work out of them. Companies that never considered hiring programmers - because they would need a team of 5 over a 6-month period to deliver a solution to their specific problem - now start hiring programmers. The market for custom software expands like never before.

I expect what will actually happen will be somewhere between those two extremes, but my current hope is that it will still work out as an overall increase in demand for software talent.

We should know for sure in 2-3 years time!


I like your optimism, but in programming, at least in the US, unemployment has so far risen higher than the overall average.

ML supercharges all disparity: business owners and superstars who have already made a nice career and name will earn more by commanding fleets of cheap (except energy) LLMs, while their previous employees/reports get laid off by the tens of thousands (ironically, they do it to themselves by welcoming LLMs and assuming the next guy will be the unlucky one, same reason unions don't work there, I guess...)

As for small businesses who never hired programmers before: companies like ClosedAI monetize our work for their bosses to get full products out of chatbots (buggy for now, but give it a year). Those businesses will grow, but when they hire, they will get cheap minimum-wage assistants who talk to LLMs. That's, at best, where most programmers are headed. The main winners will be whoever gets to provide ML that monetizes stolen work (unless we stop them through collective outrage and copyright defense), so Microsoft.


I'm not sure how much of the blame for US programming unemployment we can assign to LLMs. I think that's more due to a lot of companies going through a "correction" after over-hiring during Covid.

As for "their bosses to get full products out of chatbots": my current thinking on that is that an experienced software engineer will be able to work faster with LLMs and get much higher quality results from them than someone without any software experience. As such, it makes more sense economically for a company to employ a software engineer than to try to get the same thing done worse and slower with cheaper existing staff.

I hope I'm right about this!


> my current thinking on that is that an experienced software engineer will be able to work faster with LLMs and get much higher quality results from them

> than someone without any software experience

- So you are betting against ML becoming good enough soon enough. I wouldn't be so sure, considering the large amount of money and computing energy being thrown at it and the small amount of resistance from programmers.

- Actually, they don't have to have zero experience. But someone who is mostly an LLM whisperer saving the boss some yacht time, rather than an engineer, gets paid accordingly: minimum wage.


No matter how good ML gets, I would still expect a subject matter expert working with that ML to produce better results than an amateur working with that same ML.

When that’s no longer true, we will have built AGI/ASI. Then we’re into science-fiction Star Trek utopia / Matrix dystopia territory, and all bets are off.


> would still expect a subject matter expert working with that ML to produce better results than an amateur working with that same ML.

Subject matter expert, yes. The subject matter is not programming, though; it's whatever the thing being built is about. (So if we're talking about non-tech companies that never considered hiring programmers before, I think they still won't.)


Thing is, though, I work in this field. I do not see it handling the non-bullshit part of my job in my lifetime, the various crazy claims notwithstanding. For that it’d need cognition, and nobody has the foggiest clue how to do that.


    1. Fire 80% of programmers
    2. Spread the 20% non-bullshit parts among the remaining 20%
    3. Use LLMs for the 80% bullshit parts
For now, big companies are afraid to lay off too many, so they try to "reskill", but eventually most are redundant. No cognition needed :)


Truth be told, most big tech teams could benefit from significant thinning. I work in one (at a FANG) where half the people don't seem to be doing much at all, and the remaining half shoulders all the load. The same iron law has held in every big tech team I've worked in over the last 25 years, except one. If the useless half were fired, the remaining half would be a lot more productive. This is not a new phenomenon, so IDK if "firing 80%" is going to happen. My bet: nope. The only number that matters to a manager is the headcount under them, and they're going to hold onto it even if their people do nothing. They're already doing that.


You're switching topics. There are useless people, but I'm not talking about them. Ignore the useless people.

You and your good, useful programmer coworkers do 80% LLM-able bullshit and 20% good stuff. So if your boss is smart, he'll fire 80% of you and spread the 20% non-LLM-able work across the remaining people. You hope your coworker gets fired, your coworker hopes it's you, and you both help make it happen.


Fire everyone and make themselves redundant? Please. You're also assuming the amount of non-bullshit work would stay constant, which it won't. I'm doing a ton more non-bullshit work today thanks to LLMs than I did 2 years ago.


> Easily 80% of all programming consists of bullshit tasks that LLMs of 2024 are able to solve within seconds to minutes, whereas for me some of them would take half a day of RTFM

> I'm doing a ton more non-bullshit work today thanks to LLMs than I did 2 years ago.

Logically, this means either there are more non-bullshit tasks in total, or some of your coworkers were fired so your workload stays the same...

Are you paid more for doing more difficult work, adjusted for inflation?


I enjoy difficult work in my area of expertise a lot more, and I dread boilerplate work and work in unfamiliar domains that takes time for RTFM and trial and error. As to my pay, let’s just say I’m not complaining, especially when I get to do more of the stuff I enjoy. Also: work expands.


> I enjoy difficult work in my area of expertise a lot more

Real question: is it difficult work if that's exactly the part you like, and you're not paid more when you do more of it? What makes it difficult? Just the fact that an LLM can't do it this year yet?

I wouldn't call my work "difficult". Boring parts can be hard, but with the right stack there are very few. Stuff like the back-and-forth to understand customer requirements is difficult, but that's not even my job.

> let's just say

I didn't ask how much you get paid exactly, I asked if you get paid more (adjusted for inflation) for effectively doing more work now thanks to LLMs.

> work expands

And if your pay doesn't, you may ask yourself whether LLMs are eating into your pay :)



