It feels ironic if the only thing the current wave of AI enables (other than novelty cases) is a cutdown of software/coding jobs. I don't see it replacing math professionals any time soon for a variety of reasons. From an outsider's perspective on the software industry, it looks like its practitioners voted to make themselves redundant - that seems to be the main takeaway of AI for the normal, non-tech people I've chatted with.
Anecdotally, when I tell people what I do for a living, many have told me that any other profession would have the common sense/street smarts not to make its scarce skill redundant. It goes further than that; many professions have licensing requirements, unions, professional bodies, etc. to enforce this scarcity on behalf of their members. After all, a scarce career in most economies brings not just wealth but higher social standing.
If all it does is allow us to churn out more high-level software, which let's be honest is demand inelastic due to the mostly large margins on software products (i.e. they would have paid a person anyway due to ROI), it doesn't seem it will add much to society other than shifting profit in tech from labor to capital/owners. It may replace call centre jobs too, I guess, and some low-level writing/marketing jobs. I haven't seen any real new use cases that positively change my life yet other than the odd picture/AI app, fake social posts, annoying AI assistants in apps, and maybe some teaching resources that would have been made/easy to acquire anyway by other means. I could easily live without these things.
If this is all AI will do, or mostly do, it seems like a bit of a disappointment - especially given the massive amount of money going into it.
> many professions have licensing requirements, unions, professional bodies, etc. to enforce this scarcity on behalf of their members. After all, a scarce career in most economies brings not just wealth but higher social standing.
Well, that's good for them, but bad for humanity in general.
If we had a choice between a system where doctors get a high salary and a lot of social status, or a system where everyone can get perfect health by using a cheap device, and someone chose the former, it would make perfect sense to me to call such a person evil. The financial needs of doctors should not outweigh the health needs of humanity.
On a smarter planet we would have a nice system to compensate people for losing their privilege, so that they wouldn't oppose progress. For example, every doctor would get a generous unconditional basic income for the rest of their life, and then they would all be replaced by cheap devices that give us perfect health. Everyone would benefit; no reason to complain.
That's a moral argument, one with a certain ideology that isn't shared by most people, rightly or wrongly - especially if AI only replaces certain industries, which looks like the more likely outcome. Even then, I don't think it is shared by the people investing in AI unless someone else (i.e. taxpayers) will pay for it. Socialise the losses (loss of income), privatise the profits (efficiency gains). Makes me think the AI proponents are a little hypocritical. Taxpayers may not be able to afford that in many countries; that's reality. For software workers we should note that mostly only the US has paid them well; many more software workers worldwide don't have the luxury/pay to afford that altruism. I don't think it's wrong for people who had to skill up to want some compensation for that; there are other moral imperatives that require making a living.
On a nicer planet, sure, we would have a system like that. But most of the planet is not like that - the great advantage of the status quo is that even people who are not naturally altruistic somewhat cooperate with each other due to mutual need. Besides, there are ways to mitigate that and still provide the required services, especially if they are commonly required. Take the doctors example - certain countries have worked it out without resorting to AI risks. Ironically, I'm not against AI in this case either; there is a massive shortage of doctors' services that can absorb the increased abundance, in my view - most people don't put software in the same category. There are bad sides for humanity in losing our mutual dependence on each other as well (community, valuing the life of others, etc.) - I think sadly AI allows for many more negatives than simply withholding skills for money if not managed right, and even that doesn't happen everywhere today and is an easier problem to solve. One of them is the loss of any safe, intelligent jobs for climbing and evening out social mobility through the mutual dependence of skills (even the rich can't learn everything and so need to outsource).
> If all it does is allow us to churn out more high-level software, which let's be honest is demand inelastic due to the mostly large margins on software products (i.e. they would have paid a person anyway due to ROI), it doesn't seem it will add much to society other than shifting profit in tech from labor to capital/owners.
If creating software becomes cheaper then that means I can transform all the ideas I’ve had into software cheaply. Currently I simply don’t have enough hours in the day, a couple hours per weekend is not enough to roll out a tech startup.
Imagine all the open source projects that don’t have enough people to work on them. With LLM code generation we could have a huge jump in the quality of our software.
With abundance comes diminishing relative value in the product. In the end, that skill and its products would be seen as worth less by the market. The value of pursuing those ideas would drop in the long term, to the point where it still isn't worth doing most of them, at least not for profit.
It may seem this way from an outsider's perspective, but I think the intersection between the people who work on developing state-of-the-art LLMs and the people who get replaced is practically zero. Nobody is making themselves redundant; some people are making others redundant (assuming LLMs are even good enough for that, not that I know if they are) for their own gain.
Somewhat true, but again, from an outsider's perspective that just shows your industry is divided and will therefore be conquered. I.e. if AI gets good enough to do software and math, I don't see even AI engineers, for example, as anything special.
Many tech people are making themselves redundant - so far mostly not because LLMs are putting them out of jobs, but because everyone decided to jump on the same bandwagon. When yet another AI YC startup surveys its peers about the most pressing AI-related problem to solve, it screams "we have no idea what to do, we just want to ride this hype wave somehow".