"AI is amazing. I’ve trained an AI to detect model railroad locomotives and their types. I’m a daily user of AI to let it write code for me."
But AI will take away all coding. Sure, some people will write code, like some people today use a mechanical typewriter. Like some artists use clay. But most of what happens in computers in the future will not be code, but executing, self-modifying, self-optimizing AI models.
"more and more time to think of what I want build"
This is not how AI will work. It's not that I decide what to build and the AI builds it; the AI will just do things. There will no longer be things to "build". We will no longer think of code as something that exists, but as things that just happen.
You will say "But I could tell the AI what film to create, and some scenes, and a rough story, and it will create that film" - but what if AI creates much better, more powerful, more exciting films than any you could imagine? Films no human ever thought about?
Again, like the typewriter, some people will tell the AI fragments of a story to create a film. But most media content will be created by AI, for consumption, on the initiative of AIs, not on the initiative of humans.
In the mid 90s I wrote a philosophy paper at university about an AI that generates random images (triggered by my first digital camera, an Olympus C-800L) and then interprets them (with some estimates of how fast it could do that generation and interpretation). That AI has basically seen it all: an alien killing JFK, me on the moon, you and me drinking a beer, and things we could never imagine.
[Edit] Like the people writing new games for 8-bit computers today. They exist, but it's a niche.
You're making a lot of assumptions and predictions there. Not to say that you're wrong, but we don't yet know if you're right. My opinion is that we're seeing the shape of the limitations of these LLMs. I use them a lot for coding, it's a sea change. But you still have to understand what you're doing, they don't produce accurate code all the time, and I'm not convinced they'll be able to just insert working code into existing complex systems.
> but what if AI creates much better, more powerful, more exciting films than any you could imagine?
Well, that's something to deal with when it can. Right now it can't.
> But you still have to understand what you're doing, they don't produce accurate code all the time, and I'm not convinced they'll be able to just insert working code into existing complex systems.
I fully agree. I have been using Copilot for a while and the errors it makes in more complex domains (algorithms and data structures more complex than 'write me a quicksort') are just stupendous. Even at a more basic level, it regularly makes Rust syntax errors that my 10yo daughter has no issues spotting.
It has been really awesome for writing boilerplate code though. Especially with in-context learning, writing one example and then letting the model extrapolate it to other methods, functions, etc. works great and saves me a lot of time.
It's hard to predict the future. But the current best models feel more like a much better IntelliSense (though API hallucination is a serious issue) than something that is going to replace good programmers anytime soon (unless your task is writing boilerplate code).
Yes we do, but there are things we can talk about with certainty and things we can talk about with probabilities. I'll certainly get older and die. But AI won't certainly take all coding jobs. It's highly likely to change all coding jobs. We can discuss the changes with a higher degree of interest and thought. The author seems very certain though that AI will take all coding jobs.
A lot of things seem to have great potential and then peak a lot lower. AI producing very impressive but very short videos gives us a sense that it has the potential to produce whole, coherent films that tell a story.
There will always be public phones. Sure, mobile phones will grow. But they are expensive. We estimate the total demand for mobile phones to peak at 900,000 [0]. Replace all public phones? That is a big assumption we can't make with certainty.
The drivers have always been convenience and cost. AI is the ultimate in convenience and cost.
I guess there will be programmers needed for legacy systems in Fortran and Java. But new companies five years ago didn't use Fortran or Java for new projects, and the coming companies will not hire coders for their products.
> We will no longer think of code as something that exists, but as things that just happen.
Which is very scary: we are modelling things after nature and ourselves with our own information, and this makes LLMs sometimes eerily human and thus fallible. So when you say there will be no more software, I think you are saying that some future AI/AGI will perform the work without coding that work. So instead of writing code xy it will just ‘do’ xy ‘in its head’ and use the answer for the rest of ‘the problem’ it is solving. That is going to be just as terrible as humans doing the same thing: they will, every so often, make mistakes, and if you see a ‘program’ as being hundreds of thousands of these decisions every run, this will be terrible.
Now it is possible we will come up with some AI that does not have this issue, but we don’t even have a plan of attack for how to create that; I am your age, and I am going to say we are both dead before it happens. However, because a lot of companies like terrible ideas if they save money (support chat bots are an example), surely this domain will be tested with the short-term generations of AIs.
We've had tools for generating code for a long time. They're mostly used for initial project setup, XML-configured Java applications, and macro stuff at the fringes of software development.
If "AI" built an application, who is liable when there's damage?
> This is not how AI will work. It's not that I decide what to build and the AI builds it; the AI will just do things. There will no longer be things to "build". We will no longer think of code as something that exists, but as things that just happen.
That’s a _huge_ assumption. There’s really, at this point, very little reason to think it will come to pass; if it does, it likely won’t be due to anything particularly similar to the current crop of generative AI stuff.
There's no downhill slope from where we are now to "AI does all the work." It's an uphill battle, just like automation at large, and it will never truly end.
Software has been eating the world, and AI will eat the world, but humans will shape how that happens, and there will always be new opportunities, new challenges, and new levels of abstraction.
AI will enable us to solve bigger problems. Picture beautiful cities integrated with nature: flower gardens, fruit trees, self-maintaining ecosystems. Picture spaceships capable of interstellar travel and asteroid mining. Picture advanced personalized AI doctors who can detect disease just by scanning your face and can cure you using AI-discovered techniques.
There's a huge amount of coding (AI assisted) to bring us to that world. I understand the nostalgia and fondness for memorizing syntax, but there are more exciting problems to solve, and we're on the verge of solving them. If programmers are problem solvers, then this is shaping up to be our most exciting era yet.
You aren't too old. I've been out of school for ~10 years and AI makes me feel exactly the way you've described it. I'm trying to navigate it and find an exciting space to carve out for myself, but the future feels unclear in a way it never has before. I'm confident in my own ability to build great things, but to your many points, AI has the unique potential to do everything better than me.
I don't want to say this makes me feel hopeless, because it genuinely doesn't. I'm working on a new startup, I'm starting a family, and I'm doing well.
I just want to build things that matter, that people find useful, and I've been having a hard time figuring out how I'll do that in a world with sufficiently advanced AI. Hobbies have never been enough for me, and never will be. It needs to be more than fun to energize me - I need to feel like I'm moving a needle.
For now, all I know is that I don't know. It feels nearly impossible to guess what the future holds.
The current generation of AI can't do the truly difficult things better than you: finding a good niche, making the decisions about how you want to differentiate, building relationships with your customers, choosing which innovations to pursue and when to build on the features you already have.
I run my own startup too, and I wish it could help me with those things. So far the most I can get from it is things like generating some HTML, and even then it gets it wrong half the time.
That will change in the future, but it will be gradual, and it will help you execute on ideas faster and better.
IF (a big if!) your whole forecast comes true, I don't think what people will mourn most at the time is losing the fun part of coding/developing SW.
Like those laborers who went jobless after the waves of the industrial revolution, they should have been planning earlier for other jobs and skills, rather than focusing solely on the fulfillment from making goods.
But yes, like with the Waymo destruction, people booing AI at SXSW, and the recent study saying the UK will lose at least 8 million jobs to AI, the changes are hard to imagine.
Like watching Terminator where Sarah is running around searching for a public telephone.
Yep. I don't know what's left for coders to do when AI is doing the coding, but I wouldn't be interested.
A painter gets into it for the painting, not for painting management, or painting business strategy, or technical directorship, or touching up photographs taken by the new machine.
PS: I feel this way and I'm way younger than you. ;p I don't think it's an age thing as much as a "we were sold the idea that we could code" thing.
I feel like the time frame for this is 20+ years out. It feels premature to mourn the end of coding, although we can sense it coming in the medium future.
It’s not too soon to start planning how society should handle the radical changes in employment due to AI. It seems like that is where our focus should be.
> But AI will take away all coding
If AI takes away all coding, who will program new AIs? And when you want to tell an AI what to make, you have to tell a computer in very precise language what you want it to do, otherwise you might have unintended effects, which sounds a lot like coding to me
Before computers, there was no programming and you didn’t worry. After computers there may be something totally new and special, and you’ll never think about “coding” again.
Maybe you’ll use AI to create entire new universes, simulated only perhaps but who knows how wild and interesting it can get.
Will people still derive pleasure from creating things if there is an AI that is much better at creating things than you?
Take game development for example, I enjoy writing games but would I enjoy it as much if any random person could create a game much better than anything I could make myself? I’m not sure. It would be like performing to an empty room. Without any audience, is there pleasure in creation?
Kurt Vonnegut, in a letter response to a high schooler asking for advice:
Here’s an assignment for tonight,
and I hope Ms. Lockwood will flunk you if you don’t do it:
Write a six line poem, about anything, but rhymed.
Make it as good as you possibly can.
But don’t tell anybody what you’re doing.
Don’t show it or recite it to anybody, not even your girlfriend or parents or whatever, or Ms. Lockwood. OK?
Tear it up into teeny-weeny pieces, and discard them...
You will find that you have already been gloriously rewarded for your poem.
You have experienced becoming, learned a lot more about what’s inside you, and you have made your soul grow.
Interestingly that sounds like something that would work better for introverts, but either way you've got to find some work to pay the bills, and that means providing something that other people perceive as valuable.
Humans can't even agree on what it's best to create.
Try asking ChatGPT if you should become vegan, for example.
AI is a mediocre mashup of millions of ideas. It needs a driving force telling it what to be and how to react.
"Then there will be AIs trained to create certain things" -- and that's the point. People's ideals go deep, and achieving them is time-consuming and labor intensive, even with the help of AI. Someone, many someones, will need to be there guiding it every step of the way, through robotics, through politics, through religion. And that's a kind of programming.
As I wrote above.
"AI is amazing. I’ve trained an AI to detect model railroad locomotives and their types. I’m a daily user of AI to let it write code for me."
But AI will take away all coding. Sure, some people will write code, like some people today use a mechanical typewriter. Like some artists use clay. But most of what happens in computers in the future will not be code but executed, self modifying, self optimizing, AI models.
"more and more time to think of what I want build"
This is not how AI will work. It's not I want something to build, the AI will just do the things. There will no longer be things to "build". We will no longer think of code as something that exists, but things that just happen.
You will say "But I could tell the AI what film to create, and some scenes, and a rough story, and it will create that film" - but what if AI creates much better, more powerful, exciting, better films than those you could imagine? Films no human ever thought about?
Again, like the typewriter, some people will tell the AI fragments of a story to create a film. But most media content will be created by AI, for consumptions, on the initiative of AIs, not on the initiative of humans.
In the mid 90s I wrote a philosophy paper at university about an AI that generates random images (trigger by my first digital camera, a Olympus C-800L) and then interprets them (making some estimations on the speed it could do that, generation and interpretation). That AI has basically seen it all, an alien killing JFK, me on the moon, you and me drinking a beer, and things we could never imagine.
[Edit] Like there people writing new games for 8bit computers today. They exist, but it's a niche.