In a leaked recording, AWS CEO says most developers could stop coding soon (businessinsider.com)
56 points by rainhacker 32 days ago | 60 comments



So many bad takes in that article. It will be interesting to look back in 5+ years to see how things play out vs this unlimited optimism.

> "everyone is a programmer now"

I’ve heard this about so many things: various tools that will make everyone a programmer, or everyone a DBA. Nice dreams that never seem to play out.

Being a programmer isn’t about the syntax; it’s about breaking problems down so they can logically be built back up in code. I have yet to see anyone without an extensive background in programming write a good spec for what they want code to do. How many assumptions are we comfortable having AI make?

On my last project I was given 1 sentence of direction, and the people giving the direction truly thought that’s all they needed to say… or it was the extent of their understanding of the topic. It took thousands of lines of code, backed by a bunch of testing and design decisions, informed by 15+ years with the company and the various personalities involved, to make that 1 sentence a reality in a way that would make sense for the organization. Call me a cynic, but I don’t see AI doing a good job with something like that in a world where “everyone is a programmer.”

I did try putting it in Copilot at the start, just to see what it dumped out. It gave me maybe 40 lines of broken code. It was the blog post version of how to do it, not an enterprise solution.


I think the enterprise angle is that programmers become more tool-assisted, so fewer programmers are needed. A team of two seniors and six juniors can just be two seniors. As you said, it's mostly breaking down a problem and knowing what's needed, then telling the tool which components to build: 'Make me a function that does this and that.' I use it all the time to make things I know I could write myself; if it does something in a weird way, I modify the prompt to tell it not to do it that way. Good for skeleton code, simple sysadmin-level hacking, and tinkering with things. Not sure about full-fledged program projects though.

Helps devs get 'unstuck' if they hit writer's block. It's absolutely changing the game for marketing, bizdev, and programmers now. Intel layoffs ~20,000, IBM ~24,000. Kind of scary.

Smart people with better tools can be a dominant force. So, yeah, programmers might be looking for new skill sets.

Operations and sysadmins I don't foresee ever changing, especially with AI.

As you said, will be interesting to look back in five years and see.


I did see a take from someone (it may have been Alan Kay, but I could be misremembering) who said no one should lay off programmers due to AI. If AI makes programmers more productive, and everyone is getting it, they will need everyone to keep up with the competition.

2 seniors + AI may be able to have the same output as the 2 seniors + 6 juniors you mention. But if their competitors keep all those people and add AI, will they accelerate past the company that did the layoffs by moving faster? Cost savings aren’t so great if they come at the expense of remaining competitive in the market and retaining customers.

This is the perspective I hope takes hold. I think the layoffs you mentioned from Intel and IBM were very premature, if AI was the only basis for them.


AI was absolutely not the basis for the layoffs at Intel; they are doing badly, full stop. I don't know enough about IBM, but I strongly suspect that one is more influenced by macroeconomic conditions and a general slowing of the economy than anything else, and AI is the convenient excuse to point to.

> If AI makes programmers more productive, and everyone is getting it, they will need everyone to keep up with the competition.

This assumes that there's competition. When money is expensive to borrow, companies stop throwing shit at the wall and seeing what sticks, they start being more conservative with where they expend their capital. Efficiency will absolutely be the name of the game for 2-3 of the next 5 years. I don't really see AI being a huge part of it. Writing code doesn't take up the majority of my time as a senior engineer.


Jevons paradox


Try https://github.com/OpenInterpreter/open-interpreter. Sysadmin stuff just got a whole lot easier. I've been admining Linux systems for over two decades, but I give LLMs a shot at dealing with some stupid Unix shit so I don't have to. It won't get 100% of the problems, but I can ask it to fix the mess made of, say, my system SSL certificate authorities, and it won't judge me for saying SSL and not TLS, and it'll go off and try the first couple of fixes basically the same way I would, saving me time while I work on something else.


I've had very mixed experience with LLMs (Perplexity) and sysadmin stuff. More often than not I ask questions for specific CLI invocations and it will literally just make stuff up. This is even after (supposedly) Googling the documentation and analyzing it.

Even today I got terrible advice from Claude about calling Close() on a database in Golang. This kind of stuff would screw over a junior dev who didn't know better:

"You're right to question this, and I apologize for the oversight in my previous response. Let me clarify: If you're using a connection pool (which is typically the case with sql.Open), you generally don't need to (and shouldn't) call Close() on the *sql.DB object after each operation."

I feel like all AI is doing atm is giving me extreme paranoia from being gaslit so much lol


My work is trialing Microsoft’s Copilot for 365. So I have a built in chatbot in teams to ask various questions among other features.

Asking about specific Microsoft documentation, it will just immediately bail and tell me to look it up on my own because it was built on training data up to 2022 and may not be fully up to date on the latest documentation. It wouldn't even link to the page last time I tried. It makes up PowerShell commands and suggests completely out-of-date options.

So Microsoft’s own AI assistant cannot even provide accurate information about the Microsoft products it is integrated with let alone anything else.

It can be useful for a short script, but for anything beyond that it is slower and less reliable than doing things myself.


I don't disagree. There's still value in having, e.g., Linux sysadmin skills, because the LLM still doesn't always get to the solution, but I kick off the job and come back a bit later and don't need to try the basic stuff myself, which means I spend more time on the harder problems.

For some reason I don't run into hallucinations as much as other people seem to, presumably because I'm on well-trodden paths, but being lied to like that is always a fear. I asked it about argocd and it told me there is a command-line program for it, and I didn't believe it and had to Google it for myself, which didn't save me time or energy, but I got to ask it how to do things instead of hoping the documentation had the right example.


Except AI was correct this time.

"It is rare to Close a DB, as the DB handle is meant to be long-lived and shared between many goroutines."

source: https://pkg.go.dev/database/sql#DB.Close


No. That was the reply after I corrected it; it originally told me to call Close() on the *sql.DB handle after every operation.
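
For anyone following along, the long-lived handle pattern the Go docs describe looks roughly like this (a minimal sketch; the Postgres driver and connection string are just placeholders for illustration):

    package main

    import (
        "database/sql"
        "log"

        _ "github.com/lib/pq" // driver choice is just an assumption for this sketch
    )

    func main() {
        // Open once at startup: sql.Open returns a long-lived, pooled handle.
        db, err := sql.Open("postgres", "postgres://localhost/app?sslmode=disable")
        if err != nil {
            log.Fatal(err)
        }
        // Close once at shutdown, not after every query.
        defer db.Close()

        // Reuse the same *sql.DB for every query; the pool manages connections.
        var one int
        if err := db.QueryRow("SELECT 1").Scan(&one); err != nil {
            log.Fatal(err)
        }
        log.Println(one)
    }

Closing the handle after each operation tears down the connection pool every time, which is exactly the kind of subtle mistake that would bite a junior under load.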


If we're replacing 2 seniors and 6 juniors with just 2 seniors, where are you gonna get more seniors when the 2 retire?


The same way that currently most HR departments only care about people with at least 10 years' experience in exactly the same stack the company is using, and more often than not, training is not even part of the picture.


That problem isn't in scope. That's a long term issue, we care about returns we can deliver in the next 2 quarters.


> If we're replacing 2 seniors and 6 juniors with just 2 seniors, where are you gonna get more seniors when the 2 retire?

Happily drive off a cliff?


With a golden parachute, no less.


> Operations and sysadmins I don't foresee ever changing, especially with AI.

Why not?


I've worked operations most of my working life. It's 'expect the unexpected' territory, always; there's always stuff you can never be prepared for, and you have to play the cards you're handed and hope for miracles. Sysadmins find job security in knowing they can fix things right then and there, but they let things break so they can go fix them. I say that half jokingly, keyword on half... I love sysadmins. But sysadminning, and trusting AI deployers with a focus on security and stability, is a long way out, especially when you need people to ensure those systems stay operational.

When I say operations I don't mean devops, there's that too, but I mean the actual operations of an organization. The team leaders, go-getters, etc. Everybody has their niche.


I'll be afraid when AI takes a bug report with that typical one-sentence problem description (like 'PayPal payments are not being processed, we lose 30k a day') and fixes the problematic one line of code that the AI wrote before.


They despise dependencies so much. Half of their life is spent getting rid of them, declared as "risks"; "depending on some guy down the hall who codes really well" just makes them lose sleep at night.

And they put so much effort into this side quest. No code, low code, everyone codes, simple languages, UML programming, graph-based programming, so easy my intern can do it, outsourced programming, code by specs, all just to get rid of the dependency that cannot be removed, and it never works out. The complexity was inside the company and its product all along.


I guess dependency is powerlessness, and so depending on programmers, without the ability to control your own destiny, is the equivalent of torture for the power-thirsty sociopath. Makes one's life that one notch more beautiful, going into work...


I’m so glad I left. The execs there are comically incompetent.


Agreed. It's this kind of delusion that gets promoted to the top. Meanwhile, talented people capable of building SOTA solutions are let go in droves.

Amazon ain't it. "Democratizing AI" is a cover for the fact that Amazon has no models worthy of contention, so they have to save face by funding and serving up other companies' models.

One could argue, "oh they have that guy from AllenAI." But where are AllenAI's code generation models? Where are AllenAI's LLMs? Nowhere.

Amazon leadership is horribly incompetent.


Am I missing something? What is being said that signals incompetence?


> Am I missing something? What is being said that signals incompetence?

You could start at the first quoted remark: "If you go forward 24 months from now, or some amount of time — I can't exactly predict where it is — it's possible that most developers are not coding."

1. Doesn't seem to demonstrate understanding of what developers actually do.

2. Poor understanding of the limitations of the technology he's talking about.

3. Totally unrealistic timeline.

4. The kind of claim that has been made countless times, about countless technologies, and has never panned out yet. That doesn't mean it won't ever pan out, but it means it warrants a huge amount of skepticism.


OK, fair enough. There was some salesmanship and exaggeration going on for sure, but I thought the follow-up quote clarified what he meant, or at least the way I read it: code assistants will significantly reduce the amount of boilerplate coding that users have to write by hand, which will give engineers more time to spend on the design part of the job. I didn't read this as saying your job is not safe because AI is coming, but maybe I am looking at this through tinted lenses.


At a very basic level, the only way the CEO could say most developers will be replaced is if he has not used state-of-the-art code assistants. It’s either a lack of understanding of the role of a developer, or of machine learning, or both.

And last time I checked AWS’ AI assistant won’t even handle IAM policies. So at least those jobs should be safe.


To be fair I also really struggle with AWS IAM policies and I'm a pretty senior engineer with a lot of experience.


I have yet to see a significant increase in productivity from AI. I wonder where they are extrapolating from?


Executives are the type of workers who have seen the increased productivity - it helps them write emails faster, and summarize and digest long, mostly LLM-written, emails faster than before. Therefore, they think, it translates to everything else; writing code is just writing in a foreign language to them, so, they figure, LLMs will do that, too.


I assumed he would have been a programmer risen through the ranks but your comment made me question that - and it turns out he's a graduated manager type who's just done exec stuff.

https://www.linkedin.com/in/mattgarman/


They're extrapolating from their beliefs which they must hold given the investments they've already made in a fairly immature technology.

They have to believe that AI will lower their staffing costs and start generating serious revenues because to admit otherwise means they have failed to present an honest picture to their investors.


> They're extrapolating from their beliefs which they must hold given the investments they've already made...

I swear that covers an embarrassingly large fraction of tech industry reasoning.

And everybody thinks they're so smart.


They're extrapolating from their PowerPoint slides, the single source of truth from which all else is manifested.


Depends on what you mean by significant. I’d say that at the very least I save 5-10 minutes per hour using Copilot when coding, and several times I’ve had tasks like “convert this untyped, schema-less JSON solution to a typed solution” which were basically solved immediately and entirely by Copilot with one or two minor corrections. Also, unit tests pretty much write themselves these days.

Overall I would say that it’s not a revolution, but it is definitely significant at this point. With and without Copilot is as big a difference as coding in Notepad vs. an IDE when it comes to productivity gains for me.


I became a staff engineer a year and a half ago and almost stopped developing; from time to time I have to, especially to take over tasks from people on summer vacation. Having ChatGPT to help with small Python scripts and Go apps was very helpful and time-saving. I would not use it for full development tasks, but for small things it's easy to understand what's wrong (and so far, there was always something wrong).


There are two sides to the story. Look at the unemployment rates or recent layoffs: a 2% difference is enough to disrupt a lot of things. You don't need a lot.

AI tools have made a significant impact already. But that significant number isn't '100%'.

On the main topic, of course you can't replace a team of 8 composed of seniors, mids, and juniors with just 2 seniors. Nowhere near that.


Never underestimate the urge to be seen as a /thought leader/


Maybe they're extrapolating from people who did find a way to turn this new technology into a useful tool for themselves. That's the hacker way, after all.


Yeah, all I've seen so far is a very expensive but quite limited QoL upgrade that works for a very limited subset of the enterprise workforce.

(mostly copilot).


I’m waiting for the story where a developer says that AI is equally able to replace senior managers; I can only imagine the blustering and outrage that would result!


"If debugging is the process of removing software bugs, then programming must be the process of putting them in"

I am happy to change my title from Software Engineer to AI Software Debugger if it means more money and prestige.


The AWS CEO is just catching up now. Most developers stopped coding several years ago and have been dancing around with JSON, YAML, Helm charts, Terraform, and Dockerfiles instead.


They managed to code in text files instead of text files, the devils!


In the last 100 years, software adoption went from nonexistent to being used everywhere, so even though programmer productivity was increasing, we still always needed more programmers. But maybe the growth of programmer productivity will outpace the growth of software demand at some point?


If "programmer productivity" increases then I'll spend 90% of my time thinking and talking to domain experts instead of 80%. Being able to write executable software is not and never has been the problem. But, sure, if it ever becomes easy to write working software then you can expect programmer salaries to fall precipitously, at least to the level of nurses or any other occupation that requires training but essentially anyone can do.


Interesting, for me, it has been the opposite. You spend some time identifying the problem, meet with a few customers (if it's your own business) or have a chat with your employer. Once you get an idea of what is needed you get to work. I'd say 90% of the time is spent implementing for me.


What do you mean by "implementation", though? Notice I said 80% of my time includes thinking. In other words, only around 20% is typing, which is the only thing LLMs help with. For me, most of my time "implementing" is thinking.


The thing is that the new crop of LLM AI systems is far better suited to replacing the vague, loosely interpretable, and broadly generic output of positions like sales and marketing, and yes, management, than to replacing people in engineering environments, where output and performance are highly specific, knowledge is fractal and volatile, and mistakes are unforgiving in the slightest.

I'm productively using LLM coding assistants daily, but if I had to choose between shipping the unmodified LLM output for a codebase or for a marketing plan, it would not even be a question.


I've got a tinfoil hat that says this AI hype cycle is ~20% no-kidding tech, and ~80% fig leaf for offloading expensive employees.


I actually have a different hypothesis. The number of professional developers has been increasing exponentially since the invention of computers, at a pretty steady rate. That's only possible because of improved tooling. I'm hoping that AI is powerful enough to keep this trend going.


This is why 'leaders' should spend some time in the trenches, doing the job of those they 'lead', so that they have at least a basic understanding of the actual difficulties, limitations, possibilities, etc.


In other news, Amazon investors just found out the AWS CEO could be replaced by a chatbot and nothing of value would be lost.


I agree, although we should note that the constant here is that I and this CEO both think jobs we aren’t doing could be replaced by bots.

But, really, I do think I’m right and they are wrong…


Worrisome. If I were heavily dependent on AWS I’d start thinking about plan B.



They have to say that to make the number go up. All these companies have heavily invested in AI.

AI is just another tool whose output depends on the skill of the user. You can't put people without domain knowledge in front of an LLM and get good results. It's very good at producing output that looks good enough to convince non-experts that it can do the job.


CEOs could stop managing sooner.


In the meantime, CEOs will increase their pay by 10x-100x, because, obviously, they do the most important work out there. /s


I'm sorry, please don't downvote me for a low-quality reply - but my only reaction after reading that is ... "sure, Jan."




