The rise of website builders like Wix, Squarespace, and Shopify has drastically changed the web design and development industry. As a co-founder and lead developer of a small design/development team, I loved my job and was passionate about creating beautiful and functional websites. However, as more and more clients turned to these DIY platforms, I found it increasingly difficult to justify our higher prices. While our work was more sophisticated and better optimized for SEO, clients often didn't see the value in paying £10,000 or more for it.
Ultimately, I realized that our trade had become commoditized, and I made the difficult decision to sell the business and move on. I transitioned to a new role as a Product Manager, and over the years, I've climbed the ranks to become a CTO at a scaling startup. Although I miss the thrill of being a developer and creating websites from scratch, I've found new challenges and fulfillment in my current position.
If you're facing a similar situation, I'd suggest exploring other avenues to keep your passion for development alive. Consider taking on side projects as a hobby, collaborating with industry friends to start an indie project, or even teaching others about web development. Just be sure to carefully review any non-compete agreements with your current employer before pursuing any new ventures. Remember, although your job may have changed, your passion for creating great websites can still thrive in new ways.
It's not just that the budgets are harder to justify; it's also that prospective clients consider themselves to be experts. Having your judgement and expertise consistently overruled is disheartening, and you often end up just following instructions, which saps any joy from the (diminishing and cheaper) role.
I started my web business 25 years ago. Five-ish years ago I started photographing and filming travel content, primarily with a drone. More fun, more interesting, better feedback, etc. I still love building my own web projects, though.
In many cases, the client is right. I remember a college buddy who designed websites complaining about a problem client. Apparently he wrote their site in Silverlight, and the restaurant wasn’t thrilled someone had to download a plug-in to see their hours.
LOL my first "production" website was made around 1997/8 and it required people to install FLASH — not a hit, and no traffic either — because I just wanted it to have this certain featureset =P
Some have very skewed notions about UI/UX that don’t conform to how most users (and pertinently, their customers) use web apps; but they will insist on it.
That is what the web has become: a platform for selling products.
But why did it have to be so? Why couldn't it also have been a medium for expressing beauty?
It still HOSTS a lot of art - art in the form of images.
But truly beautiful artistic WEBSITES - built with HTML, CSS, etc. - are few and far between.
Hell, I remember the heyday of Flash - before Apple smothered it. There were some INCREDIBLY gorgeous websites with absolutely stunning animations and visuals.
Of course they were dogshit for accessibility, discoverability, information presentation, etc. Not to mention the appalling security/performance of Flash. So I don't wish the whole web was like that.
Some may know better for very specific parts of their business, but in the majority of cases, I would say they do not. More often than not, they're the ones pushing for some overwrought site with scrolljacking and needless animation or side-scrolling and the like.
The gripes about web sites you see on HN - they're the things clients insist on.
I think the effect on the job market of (things like) Midjourney is much more drastic and immediate than that historical example (not sure if it's a true story or hypothetical? not sure if it was written by ChatGPT?).
I mean, I'm sure many developers over the years have found their jobs disappear, and many contracting companies have gone out of business due to a changed market. But at least up to now, even with the layoffs, jobs developing websites are still plentiful. (Perhaps that won't be true in the future, sure.)
I feel like OP is probably right that they are going to have a lot of trouble getting a job creating 3D art, that the job market has _drastically_ shrunken almost overnight, beyond the effect that wix/squarespace/etc had. (Although of course even in your example, it's a shrunken market)
All I was really getting at was that it sapped the love (and money) out of the job, like the OP was suggesting for them and Midjourney. Once your art/talent has become something that either the common person or a machine can do, your only option as a career is to move on and diversify. It’s unfortunate, but has been happening for decades in all industries. Anyway, just sharing my experience of it happening to me.
> the words have a warmth that generated text lacks
Reminds me of my philosophy teacher in high school who gave me the best grade of the year and praised how "personal" it felt, for an essay I had bought online for 3 euros.
The irony of it still makes me chuckle 15 years later.
Using this tool on the comment yields a 77.6% fake score. To give (pretty limited) contrast, a response from ChatGPT gives 99.9% fake, and another comment from this thread gives 0.1% fake.
I assume that means the comment text was GPT-assisted, with some moderate editing from a human.
I checked this tool on some of my comments. Mostly it reported around "100% real". However, this comment https://news.ycombinator.com/item?id=35259575 was generated by ChatGPT and it was reported as 97.6% real. And this comment: https://news.ycombinator.com/item?id=35286822 was reported as 88.79% fake, but I wrote it myself. So I wouldn't trust this tool too much; even in my very limited testing it wasn't very reliable.
I asked ChatGPT to "write a hacker news like comment on the rise of website builders effect on web design and development industry and how to found joy in other roles and keeping programming in side hobby" and it gave me a comment that scored 97% real on this site. Not convincing.
I think it is likely the complete opposite of trouble. It helped them write their comment faster and in such a way that it was easy to understand. This is one of the best use cases for ChatGPT. Rather than spend minutes trying to get the right wording, tell ChatGPT what you want to say in a few short notes and get well-formed, coherent text.
Now if this is a good thing, that’s up for debate - although overall I personally would say yes.
I think the point here is that for a group of people (I would argue most) the primary goal of a comment is to broadcast your ideas, with the language used to convey them being a secondary concern, and that part can be delegated to AI.
The GPT-detector is fascinating (and seemed to work pretty well on some inputs I tested).
On the other hand, this next paragraph is scored as 84% fake, and I'm quite sure Churchill didn't have ChatGPT to help:
> We shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender, and even if, which I do not for a moment believe, this island or a large part of it were subjugated and starving, then our Empire beyond the seas, armed and guarded by the British fleet, would carry on the struggle, until, in God's good time, the new world, with all its power and might, steps forth to the rescue and the liberation of the old.
IMO, we must not be too quick to conclude that any given text was created/assisted by an LLM based on a scoring algorithm alone. (A low GPT likelihood is probably reliable [for now, until that starts being gamed].)
If you offered me even odds, I'd wager that the subject comment was 100% hand-written.
We are the hollow men
We are the stuffed men
Leaning together
Headpiece filled with straw. Alas!
Our dried voices, when
We whisper together
Are quiet and meaningless
As wind in dry grass
Or rats' feet over broken glass
In our dry cellar
Or Reddit. Let's not forget what LLMs are trained on. It's not just Wikipedia and some official text corpora, it's Reddit dumps and other regular Internet conversations. If you learned how to write English on-line, there's a chance you've internalized a style similar to how LLMs often respond.
(In fact, I often feel like I sound too much like ChatGPT myself.)
100% truthfully, I wrote it all myself, but tweaked the opening paragraph with ChatGPT to improve my grammar slightly. It's a great tool, and one that should not be dismissed, just like spell checkers and the grammar checkers of old in the likes of MS Word. Maybe it does come across a bit ChatGPT-like, which is unfortunate I suppose.
As for the suggestions that I asked it to write the comment from prompts, I'm afraid that's 100% wrong.
I've noticed that a large number of responses from ChatGPT start the second/third paragraph with "however," or a similar adverb. It will always strive to provide a "balanced" view, even on heavily one-sided subjects.
When DALL-E 2 was released, I remember reading lots of people here on HN saying it would never take the jobs of artists. Well, it seems like this argument is aging badly. This is exactly the type of conversation we need to have as a society. Instead of deluding ourselves that our skills are impossible for a machine to reproduce, we should strive to build a system with values in which we can find a way to build a good life, knowing that everything we do will eventually be better reproduced by AI.
AI won't take a thought worker's job. It just sucks the life out of it. Like the person who wrote this post, I make bespoke software. I tinker, I experiment, and I produce high quality results in reasonable timelines. Executives could not give a shit about that, except that those qualities in me are a good signal that I am dependable. Now, if most companies allow AI, bottom performers will speed up and their quality will likely increase slightly. As with most things, those on the top end of the spectrum will need to regress to the mean to compete. The result will be that mediocrity takes over because the AI will be in charge of the boilerplate, which is often the foundation for the rest of your code. You can either find enjoyment in the boilerplate being rote and find places to spend your time at a higher level of abstraction, or you can let what used to be your passion sail away like a little red balloon on a windy day.
I don't understand how you can hold this position considering not only where we are but also the rate of progress in AI. It has been taking thought workers' jobs, and it will increasingly do so in the future, until the surplus that cognitive workers offer is so minimal that it's not worth automating.
First of all, why does it have to be "totally replaced"? It's enough for positions to be dramatically reduced, with most such "thought worker" tasks getting done automatically, e.g. under the final supervision of far fewer people.
Second, why does it have to be "thus far"? If it happens in 1 or 5 or 10 years, does that disqualify it?
Because above you claimed it will "never take thought workers' jobs". Suddenly the goalposts moved so that it has to have happened "totally" and "thus far" to qualify as taking their jobs?
I'm witnessing engineers on my team and myself saving a lot of time in development and in operations (scripting).
I have no doubt in my mind it is reducing the need for more engineers and has probably already contributed to a reduction in their numbers, and I think it's just getting started.
Compilers and high-level languages assuredly reduced the number of engineers needed, right? No: we just started making much more complicated things, and the basic level of "this isn't even a serviceable solution" has risen so high that if you sat around and wrote a trivial program designed to be used over a teletype--which got much easier once you could do it in C instead of assembly--you'd lose in the market to your competitors who are willing to put in the effort to build a modern fancy reactive single-page whatever's-hot-now web app.
The reality is that companies are going to continue allocating a similar percentage of their budget to tech that they were before, and what we DO might change a ton; but, frankly, I've been doing this now for decades, and the stuff I used to do 25 years ago (omg: a full quarter of a century of professional software development ;P) already often just isn't a thing anyone does anymore. Every time my job gets easier, the expectations for the products I architect also increase, because we're all in a rat race attempting to compete for the same customers to defend our margins.
Hell: I remember when the first step to deploying a new service--even if you were a company that only had a handful of people--was to attempt to predict how much capacity you needed and then buy a ton of parts so you'd be able to sit around and build machines for your server room to run your new product. I was involved in a project back in like, 1998/1999, for a company with all of maybe five people, and someone's job was just sitting around building the server room... that specific task is now a few lines of YAML configuration and a couple of accounts (AWS and Twilio <- back in the day, we had to have custom T1 lines brought in not for data but for bulk parallel phone service to hook up to the Dialogic boards we needed to simulate phone lines ;P).
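To make "a few lines" concrete, here's a rough sketch of what provisioning capacity can look like today, in Python with boto3 (AWS's SDK) rather than the YAML I mentioned; the region, AMI ID, and instance size below are placeholders, not a real deployment.

    import boto3  # AWS SDK for Python

    # Placeholder values; a real setup would choose its own region, image, and size.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical machine image
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    print(response["Instances"][0]["InstanceId"])  # the "server" nobody had to screw together

That's the whole server room now: an API call and a credit card on file.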
The concept of cloud-computing entirely changed that part of the business--having a small handful of companies manage all the servers at scale and run numerous jobs on shared hardware, in some sense reducing the cost to the point where you can both outsource this and pay less--to the point where doing that almost sounds insane; and yet, I'm sure there are more people getting paid more to deploy and manage ever more virtualized servers in the cloud than there ever were people (including me!) literally screwing around with screwdrivers in the ubiquitous server rooms. You could argue that my job got "replaced", with all of the skills I had to deal with IRQ balancing and automated power backup systems being for naught... but, somehow, I'm apparently still relevant.
I'll also admit that I honestly kind of enjoyed doing that work?... and yet, I also was quite happy to never have to do it ever again. I'll never forget the day I took the first real-world server I was in charge of and just like, uploaded it to 1&1 where I had an early software-based hypervisor-like stack I was experimenting with (UML, aka User Mode Linux, which is no longer something people bother with due to the advent of Xen/KVM/etc.); it just worked and I was like "omg I'm never building another server again, am I? that was it: it's over". (It then turned out that I did build another complex hardware solution in 2016 to start doing work with GPUs, but there was only a small window of time there before cloud services caught up.)
So like, sure: at some point the AI is going to entirely replace what we do--not the specific frustrating coding tasks we do today, but the entire concept of our jobs--but, the issue at that point isn't going to be me figuring out what I'm going to do next to make ends meet... it is going to be that I'm not a serviceable warrior in the war against the machines as the world devolves into some kind of Terminator-inspired hellscape; because, if you can manage to do my job in its totality, then you frankly can also do the job of everyone who has ever hired me and you can do the job of every investor who has ever given us money to do those things... we are all obsolete and, if AI actually gets as good as so many people here fear it will, at best we all become pets :(.
> companies are going to continue allocating a similar percentage of their budget to tech
I think your argument requires a paradigm shift like teletype -> web to soak up the new productivity. I don't think we have that today. For how many purposes have we reached peak consumer software capability? i.e. global reach to publish text, images, video and make purchases is the entire use case of most media and commerce businesses. If you gave every company double the amount of dev resources tomorrow, do they even have anything to build? Even for those with deep feature roadmaps, how much of that stuff is just nice-to-have things that won't move the revenue needle anyway?
Maybe we will get that new platform from AR or direct brain interfaces... but without it, the situation looks more like how the human role of "computer" just vanished... sure we then got programmers, but they weren't necessarily the same people.
Would being the pet of an AI-run civilisation be so bad though? Iain M Banks's Culture novels explore basically this concept, and his vision of utopia is quite appealing.
Is Jernau Morat Gurgeh the Player of Games, or a mere pawn moved by Minds beyond his comprehension?
The Culture appeals to me, but there are literary critics who find even The Culture, let alone the grander environment beyond it, to be dystopian. And I definitely don't want to wake up dead with my mind in one of Joiler Veppers' simulations.
I honestly don't like how we treat our pets (and especially so here in the United States). I keep imagining that the situation will end up feeling--at best--like this episode of (The New) Outer Limits. :(
> Suddenly the goalposts moved so that it has to have happened "totally" and "thus far" to qualify as taking their jobs?
I was asking a question. I don't think anyone can confidently quantify the effects of AI at this point, so I was seeing if I was missing something. What I said is just some word vomit about what I'm seeing on the ground as an engineer. I'm happy to accept a refutation as we gather more concrete data. The reason I asked the question the way I did is because, as I stated, there is no doubt AI will impact our jobs, but I'm understandably more concerned about fewer jobs because of AI. I also have the experience of writing a lot of automation, and hearing this same style of rhetoric when we started doing that heavily. For all the scare around automation, it did in fact end up creating more jobs and typically made lives easier.
Jumping to "shifting goalposts" over that is pretty trash conversation.
I have seen at least one outsourced programming job slow down to the point where it’s effectively no longer outsourced. Previously there were two engineers working, and now there is one as the work isn’t needed anymore.
That in a very real way is a replaced job.
The remaining job is running large scale data loading for a startup and really now is just eyeballs to make sure errors are caught. No longer requires much skill either.
I think some degree of low end engineering work in the outsourcing world is at risk.
Companies are slow to move on things, but I think "video game concept artist" is done. An art director can now sit down in front of Midjourney and pump out years and years of a concept artist's work in just an afternoon.
When electric sound amplification came along, it put most professional musicians out of business. Instead of an orchestra, all you need is a band of 3 or 4.
Making music expanded though, in the sense that more people played, there was more variety, and indeed there are more professional musicians today than 100 years ago. (The bar for entry is also lower, and most are permanently broke - take that as you will.)
Some orchestras still exist. But they're pretty rare.
So yes, a lot of job descriptions and job titles will change. Making indie games just got a lot easier. But where you see loss, many others see opportunity.
AI is the opposite of human creation. It's the definition of anti-creation. It's a spell that locks visual culture to the year 2025. And it can only do this by illegally stealing all the human work.
You don’t think we can make it come up with variations, and use people’s feedback on what’s interesting to figure out what new avenues are promising? It’s not that hard to do this sort of exploration, it’s not limited to just mimicking what it’s seen. If anything, I think we’ll see an increase in the pace of evolution of art as it becomes economical for normal people to get personalized art.
> It's a spell that locks visual culture to the year 2025.
What a fantastic way of putting it.
I'm constantly reminded of _The Matrix_, with Agent Smith telling Neo that 1999 (the apparent year inside the Matrix) was the peak of human civilization.
It's nice that the barrier became lower, but people still need jobs to afford food and rent. I doubt that UBI will come anytime soon, so fewer jobs that pay enough is a serious issue.
Agreed. It feels like a lot of the people pushing hard on AI seem to think that when the jobs are gone, some benevolent benefactors (who, exactly?) will decide to institute UBI for all because... why exactly?
It seems much more likely to me that all that will happen is there will be a lot more poor people...
It's not like "all jobs" will be taken by robots and AI. That will be of little consolation, though, to modern-day buggy-whip makers.
Jobs are a way of "adding value" to society. But over the last 120 years we've figured out how to add a Lot of value with minimal people. 95% of people used to grow food. Now it's like 2%.
People moved to factories, where today robots and automation rule. People moved to hi-tech, and leveraged that tech. Now AI is moving into that space.
In truth, we now create way, way, more value than we need. We need to figure out how to distribute the value created. Somehow I doubt that the US will be the leader here; things like UBI are too far against the "American Way".
When we see UBI emerge, in whatever form that takes (basically basic human needs met) it'll likely come from places that value community over individuality. Places that rate their success by their poor, not their rich. Places that celebrate achievements which don't require the exploitation of others.
I was just pointing out one job that I could see in my industry. I for one welcome the wealth of interesting new indie games with great-looking graphics we'll soon see.
People will be able to take more risks and do more interesting projects. It will be awesome.
You are wrong; I have worked for many years as an artist and level builder in video games. BioShock is the most famous game, and Void Bastards the most recent.
Could you instead use AI to reduce drudgery in your work, or to iterate faster between ideas until you settle on a final piece? What if you have something in mind that is hard to prompt for, or the tool just refuses to generate what you had in mind? I would try to lean into it as a tool to speed up some of the tedium, instead of fearing it, if possible.
I'm not in the camp that fears it. I think it will be an awesome tool and I can't wait to use it in my next game. I've been playing with Midjourney a lot already, but have been watching Stable Diffusion and the ControlNet stuff closely as well.
I could see drawing a couple of characters and getting an AI to extend the style to more. Especially if you can give it feedback. But the idea that you could get years of concepts in an afternoon is silly. That's like saying I can generate hundreds of business plans using ChatGPT.
There are still professional, high quality hand knitters in the world.
Of course, having people hand knit garments used to be the only way to get a knitted garment at all.
Then we invented industrial knitting machines, and those hand knitters found their roles had changed. Instead of knitting a whole garment, they would be closing up the toe on the socks, or doing finishing work on a sweater. Of course, companies didn't need anywhere near as many knitters under this system, so a huge proportion of them lost their jobs.
Then the knitting machines got better. They could close the toes on the socks themselves, could do most of the finishing work automatically. Some of the remaining knitters became industrial knitting support workers, but most of the actual knitting jobs dried up.
But there are still professional, commercial hand knitters, even today! They test hand knitting patterns for the hobby market. Make the samples up for photographing, and make sure that all the sizes come out right.
They number... Dozens? Maybe? And most of them treat it as a side gig, despite being the absolute pinnacle of hand knitting talent, since it pays terribly.
A job doesn't have to have been totally replaced to be effectively replaced. As we find ways to hand over larger and larger pieces of the work to an automated system, the number of real roles in that field diminishes, until it eventually becomes infeasible as a career choice.
This is where a lot of these digital content creation jobs are heading. Gradual obsolescence.
You could say this about every efficiency improvement in any field ever. If everything still had to be written in C and we didn't have fancy IDEs and access to a vast ocean of libraries, cheap services doing the work for you and returning results over an API, etc., we would need many, many times more software devs to have everything software-based in the world that we do now.
But do you think if none of these things had changed, and everyone was still doing nearly everything themselves, with a limited subset of libraries in use, writing everything in C, we would have all of these applications, services, pieces of software, etc? That all of these viable businesses employing people would still exist?
I doubt it. And like the poster from the reddit link, a lot of software devs truly did enjoy a lot of that work that they did. I've talked to plenty of greybeards who will wax nostalgic over spending months writing things that a pip install and import solve in a few seconds now.
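For a throwaway illustration of that gap (the endpoint is just a public test service, nothing specific to anyone's project): fetching and parsing a JSON document over HTTPS used to mean hand-rolling sockets, HTTP, and retry logic; now it's roughly this after a pip install.

    import requests  # pip install requests

    # Fetch a JSON document over HTTPS and parse it in a few lines.
    response = requests.get("https://httpbin.org/json", timeout=10)
    response.raise_for_status()  # surface HTTP errors instead of silently continuing
    data = response.json()
    print(data)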
If we double the efficiency of a team, do we fire half of them? Or do we double the amount of products we're releasing? Or double the amount of features for one product?
I think we have far more software devs today because of the advent of higher level languages, large library selections available with package managers, feature-rich IDEs, etc.
Is this always the case, for every industry? Obviously not. There are a lot fewer people making furniture by hand today than there were before automated manufacturing, and any individual only needs so many chairs. But there is, of course, still demand for hand-built furniture using traditional joinery techniques.
"Art" is a big space, and I don't think that the impact will be universal across every part of it. For games in specific, I think we'll see something more like the software development side - people will make more and more unique assets vs. reusing them, we'll see the artwork be more detailed and "populated" in general. We'll see more and more smaller studios putting out work with higher quality and more art assets. Same thing for other media-related positions. And the people who are reliably selling prints for people to hang on walls, collect, etc., I think are also likely quite safe. I think people relying on twitter/instagram/etc. commissions to draw people's D&D characters or anime waifu are likely in a lot of trouble.
Do you think it takes as many people to build whatever software you want today vs in the 80s? I agree more software can be built with the improvements, but per unit of functionality, far fewer people are needed, or they can do it much faster, which in both cases means you spend less of your revenue on labor, which means "fewer jobs".
>which in both cases means you spend less of your revenue on labor, which means "fewer jobs".
You can't view things in a vacuum like this, though. Reducing the overall cost of doing something makes it viable for more businesses, which means more jobs. We have more jobs in software development overall now because of how much more productive it is today, which opens up so many more viable use cases for it. We have more CG artists for movies, games, etc. now in large part because more efficient workflows and more powerful software have increased productivity.
Will it be the same thing with generative AI? I think so, at least for the foreseeable future.
Their point is simple: during the last 20-30 years we had efficiency improvements of orders of magnitude. Still, developer jobs weren't reduced in any way. We evolved into doing more and more high-level and high-scale jobs. So no, -10% of the time needed to build the same unit of functionality won't lead to a -10% reduction in the workforce. It will change the scope of said workforce.
They did, and so did high-level languages, and not to mention the entire profession is first and foremost about automating other people's jobs away. Thing is, nobody is really tracking how improvements for software developers are replacing people, because this only manifests as a slowing of the absurdly high growth rate in this industry.
Yes they did. What do you think would have happened if all we had was assembly? To build any meaningful software you'd need armies of people, and it'd take way longer and it wouldn't be as good.
At some point, you saturate the market for whatever’s being produced and then prices drop and you have diminishing marginal returns for production so the economics of making more doesn’t make sense.
I think Google has a weird pathological case maybe. So, I'm not sure I would use them as a forecast.
I think R&D is good in any company. Sure, the nature of R&D is that most of it is failure; that's just the way it is.
AI makes the R&D iteration shorter and cheaper, IMO. The more you try, the more likely you are to succeed (aside: I should listen to my own advice sometimes lol).
I hope they are right. But one thing that seems to be conflated in this debate is the comparison of AI with previous disruptions. The tools we have built before were taking some specific parts of the process and making it more productive. They were not trying to mimic human cognition in a way that can be applied to so many different processes at the same time.
Amen. AI will make us all "managers" of some sort. Goodbye to getting in flow state and trying to build something yourself, and say hello to a gazillion meetings from morning until evening figuring out what to build and what prompt to feed to the AI.
AI is going to make our bullshit jobs even shittier.
Instead of grinding away over a piece of code, I will get to talk to people more?
Doesn't sound like such a bad deal to me.
I get your point, of course. I just think communication can be like a craft, challenging and rewarding. I try to frame things positively, it helps me stay motivated at work.
I appreciate your attitude, and it is something we all have to pick up eventually. When you pose it as "talk to more people", that does sound better. But I see it as a job with more pressure and tension, more politics, and less freedom to experiment.
Think about the average day of a manager/management consultant/VC (whose job it is to talk to people, make deals etc). I always looked at their careers and walked away thankful that a programmer career exists.
The person who wrote the post says that they can produce the same quality content in "2-3 days" instead of "several weeks", and that there are two people working in essentially the same position.
If the amount of content to be produced remains constant, then from a purely financial point of view the company should be looking at cutting one of them. AI would have then taken their job.
For the amount of content to be produced to not remain constant either the studio would have to go for increased art quality, or scale up the rest of the business to keep up with the new art productivity. It's not clear they'd make that decision over cutting their art department spending in two. At the very least their job is at risk.
Oof. Firing isn't free, comrade. Reducing redundancy to a single point of failure has two major effects: the latency penalty becomes quadratic as workload scales, and if any person quits, you lose the whole team.
In general, creative roles like artists are the sum value of their education and experience, not just their measured output. If you don't possess art history vocabulary, you're going to be very limited in prompting or designing more advanced embeddings and LoRA or training models. I'm seeing this a lot right now - people don't want to share their prompts because they don't want everyone to know how much the model carried them and how little any actual "prompt engineering" went into it.
The smart thing for their boss to do would be to scale projects and have one of them focus on managing and filtering the new volume of assets. Since it sounds like they like working from scratch, even adding external time to "research tooling and methods" would probably keep them happy until they find the challenge in their new duties.
> people don't want to share their prompts because they don't want everyone to know how much the model carried them and how little any actual "prompt engineering" went into it.
Right. Because this whole "prompt engineering" is just another job title, not an applied science, and then again, prompts can and will be automated away.
The majority of managers would jump at the opportunity to reduce their number of direct reports? Owners might want to cut salaries but managers want to increase direct reports.
>> The result will be that mediocrity takes over because the AI will be in charge of the boilerplate, which is often the foundation for the rest of your code.
Your take is interesting, but I'll provide an alternate point of view from my experience.
I have spent my career working in a low-code environment. There's a boilerplate framework into which people add code.
Since most programs are similar (data in, data stored, data processed, data out, etc.) this allows for high productivity, because almost all time is spent on things that are unique to the domain space, not on programming the common stuff. It means individual programmers, and very small teams, have written and maintain huge systems.
But what's interesting to me is the boilerplate code. My career has been spent improving the underlying boilerplate and extending it to new feature areas. This turns out to be a great lever. Improve the boilerplate and _existing_ programs get better, with minimal effort. Sometimes no effort.
It's kinda like the way I get TLS 1.3 support simply by upgrading my OpenSSL libraries. But at a source level. Which is then retro-applied to existing source code.
Instead of mediocrity, the goal is to always keep improving the boilerplate, because that in turn improves a lot of programs.
I say this not to tell you nothing will change. It will change. But change in programming is fun. It stretches the mind, and opens up new horizons. The AI can write code, but that just becomes the next tool, like IDEs were, like code completion, online help or Stack Overflow. I didn't have any of those things growing up, but their introduction allowed me to dream bigger, be better, make more.
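A minimal sketch of the idea (hypothetical names, not the actual framework I work on): every program fills in only its domain-specific step, so improving the shared base - logging, retries, TLS, whatever - improves all of them on the next upgrade.

    # framework.py - the shared boilerplate layer (hypothetical)
    class Pipeline:
        """Data in, data stored, data processed, data out."""

        def run(self):
            raw = self.load()              # common concerns (auth, retries, logging)
            stored = self.store(raw)       # live in these framework methods, so
            result = self.process(stored)  # improving them improves every program
            self.publish(result)

        def load(self):
            return []        # placeholder; a real framework would fetch data here

        def store(self, raw):
            return raw       # placeholder; a real framework would persist it here

        def publish(self, result):
            print(result)    # placeholder; a real framework would send it downstream

        def process(self, stored):
            raise NotImplementedError  # the only part each program writes itself


    # invoices.py - one of many small domain programs built on the framework
    class InvoicePipeline(Pipeline):
        def process(self, stored):
            # domain-specific logic only; everything else comes from the framework
            return [row for row in stored if row.get("status") == "unpaid"]

    InvoicePipeline().run()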
If you look at it from the bottom, then yes. SAP/Salesforce/IBM DB2/Oracle/OpenERP are ways for company departments to all work on one common set of data.
>AI won't take a thought worker's job. It just sucks the life out of it.
It already has started taking them.
Also note that it sucking the life out of them doesn't preclude this, since nobody cares for quality anyway. Neither the companies, who prefer profit, nor most consumers, who will eat up whatever mediocre shit.
Starting in the early '00s, it was getting annoying to have to sit in tech meetings with workers who barely had any actual experience designing systems and debate technical decisions, empowered by Wikipedia and some rando's blog on the hype du jour. Before the internet they would keep quiet, since they actually knew zip. It is going to be absolutely awful now.
How all this will affect the youth in the context of education, higher learning, students' drive, teachers' domain authority, etc. is a more critical issue than labor transitions.
It is interesting and important to note that AI (as it is today) disempowers humans in every dimension except capital.
Question is: would a 2x or 3x developer paired with an AI be more productive than a 2x or 3x developer paired with a team of several 1x and 0.5x developers? Cutting out coordination costs while still getting the grunt work done by someone else seems like it would lead to much higher and more coherent output.
“Reasonable timelines” are not absolute quantities. They’ve always been an equilibrium of whatever the market can bear. It was simply easier to ignore this before, when things changed much more slowly.
> This is exactly the type of conversation we need to have as a society.
Jobs have been disappearing at a high rate for at least a couple of generations now, and have steadily been replaced with new ones, often high paying ones.
We can't control which jobs are in high, low, and no demand, so we're best off swimming with the current, whichever way it flows. 1. Accept that professions come and go. 2. Expect jobs to disappear every decade or so and plan accordingly. 3. Look forward to an awesome new job after your current one is made obsolete. That goes for all of us.
This is what I would call toxic positivity. Maybe with a side of survivor bias. Lots of people don’t get awesome new jobs. They struggle, they fall out of the workforce.
People don't want to "accept" that something they have trained to do and gotten good at is going to just go away. It's a reality, but it's also pretty fucking awful.
My father used to make a very good living doing sign painting until the late 80s when vinyl signs became cheap and fast to make. They weren’t as good, they didn’t last as long, but they were cheap. And so his good living went away, and he spent the rest of his working years doing jobs that were not well paid. (The full story is more complicated, but the larger point stands.)
When faced with the reality of capitalism, maybe toxic positivity is all there is. I’d like to think that there’s a way that we could plan our society better, and evaluate tech against social costs, but the reality is that in the US we have never done that.
I disagree with your point. The problem is that people will gladly accept changes brought by technology up until the point that it affects them personally. The people that bought a cheap vinyl sign were not forced to purchase them, and yet they did.
This is not a problem with tech, it’s a problem with us. If you like liberty, then you know the choice that people will make, and that’s what is terrifying.
> Jobs have been disappearing at a high rate for at least a couple of generations now, and have steadily been replaced with new ones, often high paying ones.
In the past jobs took a while to disappear. People had time to retrain, to change career, and to come to terms with things ending. The difference now is that it seems like you could be in a high demand role today, and the whole industry you work in might not exist any more in a few years time.
The suggestion that this is just like it was in the past ignores that massive difference.
> 1. Accept that professions come and go. 2. Expect jobs to disappear every decade or so and plan accordingly. 3. Look forward to an awesome new job after your current one is made obsolete. That goes for all of us.
What can we do? Hope to save enough to not go into more debt for a second degree?
Hope that second degree isn’t obsoleted either?
As AI becomes more advanced the minimum education and likely IQ required to even get a job that pays above minimum wage will increase.
What are we gonna do about the ever increasing N+1% of people who won’t be cost effective versus an AI?
The solution is the same one we should have had from the start: take some of the responsibility for people's careers off individuals and place it on society. As mentioned above, this has always been a problem; it's just that now it's a problem for intellectual careers and not just laborers.
Education (and re-education) should be at least affordable if not completely free. Unemployment benefits should be enough to live on (if not well) and be freely available to anyone getting an education. If that is "too expensive" then limit it to specific educations which are currently in short supply.
And lastly, tax income from investments exactly the same as income from labour. This part is actually necessary because if automation is a future we actually want (and we should!) then this will increasingly become the _only_ form of income.
How about striving for one of the core societal values that make many advanced democracies (including the European Nordics) so much better/happier than the US? It's called Equality.
Insert strong progressive taxes, especially on wealth, and use them to fund a strong public sector, so _everyone_ has access to basic goods and services for free, like education/health/transport/…
I think you vastly underestimate how much money the rich have siphoned from society. What do you think that printed money goes to? The impact of the change would be _massive_: suddenly people would have disposable money instead of crushing debts, and would actually stimulate the economy!
Yup. One of the things that's going to become starkly obvious is this: being in ownership of the money machine doesn't entitle you to power and compensation matching the output of that machine.
Feedback loops set in. If you need X amount of power and Y amount of resources to get it going, but it pays you enough for X times 2 power and Y times 5 resources, the first person to get the machine cranking becomes a human version of that AI paperclip maximizer.
And this has obviously already happened…
So now it's just a matter of, how obvious does this need to get before all the world is paperclips with one idiot sitting there, in his limited human perception, thinking he's won.
It really is Star Trek Future or bust. We don't have the luxury of steampunk attitudes towards power and societal structure anymore. Machine assistance and its multiplier effect are too big, and the runaway feedback effects are too obvious.
Do we have to wait until it's no longer Elon Musk or Jeff Bezos or Mark Zuckerberg, who are at least very aggressive ambitious idiots paying high prices for their goals, and instead it's some feckless Sam Bankman-Fried who ends up holding the bag? What's it going to take to illustrate the scale of the problem?
Comparing orders of magnitude morally, I don't really see the difference between any of your examples. They're all structurally lacking in empathy, and every one of them would sell their mother if it got them ahead in the game.
I don't disagree, but there's a profound difference in effort (most notably seen in Bezos vs. Bankman-Fried). This changes the result, and I think it also significantly changes the framing.
Bezos started in the days of Windows 3.1, and already was positioning himself near Microsoft as a force multiplier even though, in those days, he and other humans had to do vast amounts of work and thinking.
If the only requirement for dominating the world is adjacency to AI, the gameplan is still exactly the same but now the work and thinking is what you turn over to machines as a force multiplier. Without that, where is the justification for producing individuals who dominate the world? It becomes, not arbitrary, but purely a factor of who is adjacent to the AI at the right time.
It's how France historically tried to deal with too much inequality, except it was a complete failure and after staggering levels of evil, bloodshed and oppression they simply ended up with a new elite that had absolute power. No inequality was solved.
Other countries didn't go down that road and none regretted it.
Yet it's amazing how quickly some will still exploit any new technology or change to justify reaching for violence. Some never learn.
You'd best have good implementation or they'll be using ChatGPT to aim your guillotines towards their business rivals, further consolidating their power.
This is the challenge. Anything we've got, people like this have tenfold, or a thousandfold, or a millionfold. As long as we're still running on 18th and 19th century societies.
It's not as if you aren't already literally being steered by the billionaires to accomplish their ends; that has been the history of the last decade or two. The only thing AI brings to the equation is, perhaps, making the process more obvious and providing a toy version of it that anyone can play with.
Before that, we played with the zeitgeist through marketing, big business… and politics. The only difference is that now we can use it to draw pictures or have it talk back to us like it was a person. The billionaires have been 'prompt engineers' for as long as I've been alive.
The "aiming problem" for me isn't really an issue. The goal must be that no one can have too much power over others, no matter why/how. So yeah, the first victims might be tricked targets, but it doesn't end there.
This might sound totally absurd when you first hear it, but I am fully in favor of randomly chosen rulers serving short/limited terms! Lottery style, tweaked a bit to have good entropy.
>> so _everyone_ has access to basic goods and services for free, like education/health/transport/…
Access? The UK tries this. High taxes and high borrowing to fund a large public sector. The result:
- People can't reliably see dentists or get doctor's appointments because the nationalized health system can't manage capacity properly.
- People can't reliably move around the country because the public sector transport workers are constantly going on strike, despite Tube drivers earning £56k/yr, only a bit less than a software engineer.
Putting the public sector in charge doesn't guarantee access to anything, it can easily lead to the opposite, which is why the USSR was constantly being wracked by bizarre shortages especially for anything that mattered to the general public like consumer goods. And now the UK suffers massive healthcare shortages. Same problem.
In the UK dentists are mostly private; it hasn't increased the quality or availability in the least.
Public transport is actually a fairly small part of the nation's transport, the UK is very car-centric so the strikes largely inconvenience commuters and drive even more people onto the roads, unless they can WFH.
Strikes are also a very recent phenomenon, just in the last 6 months or so, so this is quite a new thing driven by inflation.
Problems with the NHS are very much down to poor management by the last run of governments, where there is a growing suspicion they are deliberately trying to run down the NHS in order to make a case for privatising it, following e.g. America's disastrous model.
This is a manufactured image problem, an attempt to make privatization look better than state-run services by defunding the public sector and claiming all kinds of weird stuff, up to straight lies that some people want to believe.
Don't buy into that propaganda; look at the US to see how privatization ends up: still lots of tax money, people additionally can't afford stuff, and a few assholes getting insanely rich in the process by siphoning money out of society.
Try to ensure appropriate money is used to make the NHS effective again, including wages that allow people to live where they work (<1h commute).
Most communist shortages came from being cut off from international trade with wealthy countries, and from having to start from ruins or nothing at all, btw. These are different problems, and you might learn in the process why capitalism is going to end soon. :-)
I can't describe how much better Europe, specifically the Netherlands, felt due to the social safety nets provided, and do not tell me that the Dutch aren't hard-working people, because they are.
People are so much more relaxed there. 99% of the concern about AI is because we're living in a cut-throat capitalistic society; if we knew that AI didn't mean starving or losing our homes, we'd be way more relaxed about it, and IMO we'd take our time a bit more and think about safety.
It’s the American race to the bottom attitude which is IMO no longer compatible with a highly automated world.
To start with, if the economy breaks, AI breaks.
I saw Microsoft is whinging because other search engines are stealing ChatGPT responses; get used to it, M$, you own nothing.
I cannot imagine being rich and surrounded by homeless people, crime, drugs and the like.
The only "solution" that still avoids facing the problem is to try to escape into gated communities and shoot the plebs who come too close, ideally with robots so that, conveniently, no human is at fault. Or what use is having that Mercedes when most others want to steal or at least damage it?
I, for one, prefer to treat other humans as, well, humans. And by making everyone around me better off, my quality of life also improves.
For a simple, striking example, how about giving homeless people housing for free, so they can get their stuff together again? You can argue about how "they" don't deserve this all day long, or simply do it and eliminate a problem https://world-habitat.org/news/our-blog/helsinki-is-still-le... - and even save a lot of money in the process as a society!
> The only "solution" that still avoids facing the problem is to try to escape into gated communities and shoot the plebs who come too close, ideally with robots so that, conveniently, no human is at fault. Or what use is having that Mercedes when most others want to steal or at least damage it?
You say this like it shows that it's not a plausible outcome, or shows the error of this path, but unfortunately that's exactly the American plan. Yes, that is what is literally already happening in the USA. It is pretty nightmarish. The rich don't seem to think it's a problem, or don't seem to think there's any available option more desirable to them.
The great thing about fascism (the extreme form of conservatism, as you go further right-wing) is that it's always a matter of in-group vs out-group, THE stereotype of right-wing thinking.
Like, we have the true values and they are immoral. This thinking naturally leads to sub-selections within the in-group that are even more "true" than the others… and it gets recursively smaller going towards fascism.
So ultimately you end up with a tiny in-group (the führers are now some billionaires), who still want to secretly backstab the others, who become paranoid over time, isolate themselves to minimize contact with these lesser human beings left behind… and lose touch with reality.
This is where it ends, usually, one way or another.
But the important part is that it actually DOESN'T MATTER what they think - at some point people are frustrated enough and simply take it all away from them… the tipping point is when even public security staff (like police) are willing to let it happen - mostly to reduce the suffering of family/friends/…
Self-righteousness, violence, or tendencies to authoritarianism can be found on both left and right. But fascism means a certain kind of far-right populist politics, not just these attributes in isolation.
Words do have definitions. Every political direction can have authoritarian traits; still, fascism is the extreme form of conservatism, just as communism is the extreme form of socialism, to compare with left-wing terms. The third political direction is liberalism, which ultimately leads to anarchy (no state at all). Both American parties are in the liberal-conservative area… to a European it simply looks like slightly right-wing (Dems) and borderline fascist (Reps).
Political classification is a complex topic though, it needs multiple dimensions to somehow get to something useful as a mental model.
I am "green" first and foremost, and while the solarpunk or eco-socialist movements surely have socialist aspects, the "green" main focus isn't even covered in the previous models. Green is neither left nor right nor liberal, it's… future-oriented? Dunno.
When you think about it,
All wealth, every single bit of it, has come from the same Earth we're all born on. This is the one truth which binds us all, including the AI; it too is a child of this Earth. Which means all people should have fairly equal access to at least the basic protections the Earth provides us with. That's all there really is to say about it.
Yes! The world you should strive to create is the one you would pick before you were born, knowing you'd end up in a random position with zero influence over the outcome. That one has true equality.
I like that idea, but I don't even think it has to be "perfect equality": just that if you're not inclined to strive for more, or you lose your job to a computer, you don't have to go home and tell your family that their life is about to get much worse.
Um...Python, C++, Linux, Erlang, mobile phones (Nokia), Spotify, HTML...
Having worked for both European and US companies, I would say that a) innovation is mutual (we learn from each other) and b) talent/mediocrity are pretty much evenly distributed.
The US has a more "dynamic" economy in some sense - for example the VC ecosystem is far better funded - but is detrimental in others (Byzantine health care system deters would-be entrepreneurs).
And yet if we take a snapshot literally right now:
Guido van Rossum - Microsoft Distinguished Engineer
Bjarne Stroustrup - professor of Computer Science at Columbia University (City of New York)
Linus Torvalds - Nationality American/US since 2010 (should I write GNU/American/US? Someone educate me).
Erlang - I don't know anything about Erlang so I've got nothing to say.
Nokia - brutally ejected from the market by the Americans, bought out and seppuku-ed under Microsoft leadership. Now phones may as well be an Asian technology. It oversaw the collapse of European leadership in the mobile space, to the point where it is going to be a case study in how to fail at innovation.
Spotify - Leading EU tech success story. Makes up 50% of the counterexamples, along with ASML. Not detectable as a serious tech company when compared to Google, Microsoft, Amazon, etc, etc.
HTML - I mean, we'd probably have managed without the Europeans on this one. Although useful, HTML isn't a very technical accomplishment. Anyone could invent this, including most CS undergrads.
Most stuff you cheer here is basically a few tech giants out of Silicon Valley. The scientific kickstart for all that came from a bunch of scientists (like von Neumann) you got from Europe after winning WW2, at a time when many potential competitors were still rebuilding from ruins.
This is over now; Europeans tend to no longer go to the US but still produce excellent scientists, Asian countries have caught up, and both continents are starting to leave the US behind in most respects as a desirable place to live, except for two things: military power and billionaires.
Enjoy your 5 tech giants while they last, but also keep in mind that these are international now and draw on non-American intellectual power as well.
That's the point though. Van Rossum, Stroustrup and Torvalds made their name early on in their careers or even at college, while still in Europe. Sure, later on they emigrated to the US for the money, but that was on the back of their accomplishments.
Of course Nokia was eclipsed by foreign competition (mostly due to terrible management), but it was still innovative in its time. Just as US companies today face competition from Asia.
> Although useful, HTML isn't a very technical accomplishment. Anyone could invent this, including most CS undergrads.
Yes, but they didn't, did they? It's easy to look at something with 20/20 hindsight and say "anyone could have done this". Back in 1989 Berners-Lee had that insight.
> HTML ... Anyone could invent this, including most CS undergrads.
Yes, for a Linux user, you can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP account could be accessed through built-in software.
"Europe is irrelevant" is such a weird take for a country built by Europeans, populated by Europeans, using a European legal system, a European language, etc etc.
Ever played some Civilization (the game)? A mostly military-focussed society can force others into unfavorable agreements, but while doing so other areas of society are neglected, amounting to huge problems later on when
a) blackmailing for some reason stops working
b) non-military related businesses are vanishing
c) increasing %share of your population has a degrading life quality
d) military costs become unsustainable for questionable ROI
… doubling down on military is a death spiral, especially when the funding money comes from slashing public services.
That's the total opposite of what most people around me (many of them immigrants from the US!) have experienced over roughly the last two decades. Where have you been that you draw such a funny picture?
Europe has a higher percentage of entrepreneurs/self-employed people than the US, and lots of great companies even if it isn't leading the richest-top-5 ranking, and the work ethic is focused on the actual working hours instead of chitchatting and looking busy for 12+ hours a day.
Maybe you are also being fooled by one critical thing: US companies are much better at marketing than European ones, outshining them, so you tend to see exclusively US blogvertising and not where real innovation actually happens: all around the world now, with the US actually falling behind.
It could also be that the advantage of a unified market helps US startups scale faster; maybe someone else can speak to that better. It's been influential where I've worked, at least.
I also think US tech at least has significantly larger firms, but I don't think that's directly in line with societal good. Seems pretty fucking cancerous to me. Can't speak to industries outside of software though.
Europe (mostly, at least) does have a unified market on paper, but not culturally, which makes marketing, localization etc. harder - a commercial that sells well to Germans might go down like a lead balloon in Spain.
The biggest advantage I've seen in the US has been the VC ecosystem. For all we deride VCs - and sometimes, particularly given events over the past few weeks, with very good reason - the money spigot is just much more generous in the US. Of course that will change now that the low-interest-rate economy is over, but that to me seems more the deciding factor than silly claims that Americans are somehow more innately "innovative" than Europeans.
Money as a forcing function seems to be coming to an end. Also, I'm really laughing right now at how the US tries to suppress TikTok so it doesn't overtake all these social media companies that are seemingly unable to compete :-)
There is also a language factor here. 100 years ago, English was not the ‘lingua franca’ of the world, traveling in Europe you were better off with French, German or even Latin.
US marketing currently happens to be understood almost everywhere, and native speakers of English have an advantage. A company does not necessarily need to execute better; through effective communication it may be able to pretend it does.
There are a lot of examples of ideas being picked up and reimplemented in the US, where the original was better but the adaptation got better traction. The film industry is full of it.
Why should ingenuity not be spread evenly around the globe, along with hopes for a better future? The US represents one particular policy for wealth distribution that happens to work well for VCs in particular.
This is the key point. The single unified market with 300+ million people speaking the same language is the advantage that will make all the other ones more or less irrelevant.
Sorry? You think the American economy is so broken that people 'can't afford stuff' anymore?
That there is an absurdly increasing wealth gap is a fact, but that doesn't change the fact that - for the time being - US households as consumers are still spending lots of money. And if you're on the B2B side of the market then it's even easier.
I said diminishing, not gone already. A shrinking middle class is a real problem, leading to all kinds of issues, like reduced spending power - something that until recently was hidden by people going massively into debt, which will soon turn into even more problems.
All the while you can now do stuff in English all over Europe increasingly well, and legislation is getting harmonized between countries, the odd Brexit aside.
Ultimately what I am saying is: the factor exists and is significant, but it is getting smaller for the US and other parts of the world are catching up rather quickly.
This comment can be read as you being happy about the demise of the US. It's wonderful to see how many actual allies we have who, the minute we trip up, are immediately on the scene saying "I told you so" and ready to pick the bones.
You are reading it wrong, and it even sounds like you want to read it that way.
For sure Europeans are not keen to have to deal with China or Russia as the new world powers instead of the US. Really! But sometimes friends have to call each other out when things go in a bad direction.
But most Americans probably have to deal with the fact that they really aren't the greatest country on earth any longer, except for military spending, gun deaths and prisoners. This realization is needed before one can really take steps toward improving things again.
If the US were half as competent at city planning as Europe, it would be a no-brainer for me. Unfortunately the US cities that are remotely walkable are also the most expensive. Phoenix / LA / Texas car dependency? HELL NO
Yes, a walkable city is a massive quality-of-life factor, but one can only "get it" after experiencing it. Add good public transport on top of that and you can see why European/Asian cities are so popular nowadays.
> What are we gonna do about the ever increasing N+1% of people who won’t be cost effective versus an AI?
Fairly tax companies' profits.
And then you can afford to just feed 10% of society even if they don't contribute a thing. Also, free world-class education seems to do wonders in Europe ;)
Low corporate tax is fine. Fairly tax the recipients of those profits. Warren Buffet should pay more on his share of Apple's profit than some retiree with Apple in their portfolio. A large corporate tax ends up being regressive.
That magical Nordic system is called "social democracy", it's of course rooted in socialism, and it's so old that one of the first dark memes among left-wing groups ("You killed Rosa Luxemburg!") is about them.
It's strange how those who say the Nordics aren't really socialist will be the first to dismiss their policies as socialist when they're suggested for the US.
Neither does communism. Communism implies ownership and control of the economy by the people at large, but the presence of a state is, supposedly, a transitional phase.
> What are we gonna do about the ever increasing N+1% of people who won’t be cost effective versus an AI?
Their quality of life will increase because of all the cheap stuff we can make with ai and robots.
They will coexist and compete with robots.
Upward mobility should be better, given the access to advanced technology.
Worst case scenario, they can all band together outside of the advanced civilisation, grow their food and live a separate life.
There is no need to make high-IQ people pay for low-IQ people to live in a high-IQ society.
We're already seeing a divide of cities for rich people and cities for poor people. It will just be more pronounced.
> Their quality of life will increase because of all the cheap stuff we can make with ai and robots.
Ahh, the old "modern poor people are better off than medieval kings since we have microwaves and supercomputers in our pockets". Never mind that they are all stressed, living paycheck to paycheck, and a single (treatable) disease away from crippling debt.
Don't forget all the people who laugh in your face, tell you you don't actually know your newly learned skills, and reduce you to a bartender fraudulently entering their field.
It's been like that forever. Factory workers had to adapt when the jobs went to Asia. I don't see why comfortable office guys should get any other treatment.
It used to be "learn to code"; now maybe it's "learn to weld or plumb" or something else that is harder to automate away.
While fundamentally true, this comment is astonishingly tone deaf in light of the entirely unprecedented amplitude and timescale at which the shift is happening this time.
"Just prepare" does not work when years become days.
Naively positive. People whose jobs are outpaced by technology end up working "dead end" jobs that nobody else wants or fall into poverty if they are not lucky enough to be able to retire. Turning around in your 40s, 50s, or 60s to find new employment in a new field? This is exceptionally challenging.
> Jobs have been disappearing at a high rate for at least a couple of generations now, and have steadily been replaced with new ones, often high paying ones.
You don’t feel like you should quote some data for this claim?
Not really, have a closer look at anything outside. The Sun and rain and weather constantly erode the condition of human creations.
Try fixing some roads and bridges to start with. They're in terrible shape in most of the US. Any city could probably employ 10 times as many gardeners to take care of the plants and build nicer things.
The elderly need taking care of, and young people need teaching. Houses need cleaning. Shoes need shining. It's an endless list, really.
Stop the traffic on a road to fix it without local administration permission? Do you think getting such permission is easy? How are you going to pay for the materials to do the repairs? Do you think it's easy to get a job as a gardener?
I think you need to read others' thoughts more charitably. I doubt he was suggesting bored artists and call center workers grab a hard hat and pick a road. The point is that AI makes it easier for humans to do work, and there's an endless amount of work we as a society want done. Abandon capitalist frameworks of wages/capital/"deserved poverty" for a minute, and just consider a group of humans and a pile of tasks that need to be done.
>> This is exactly the type of conversation we need to have as a society.
Comments like this always make me wonder: How would that happen? Here in the U.S., there's relatively little appetite for job-saving labor protections and, once you leave the regulatory sphere, everyone -- employers, employees, consumers of products and services -- is consistently self-interested.
Not saying that sort of societal conversation is impossible; just saying I don't have the imagination to see how it could happen with actionable results in the 2023 version of the U.S.
It doesn't even really matter what the US regulations are. If a firm in the Philippines can run Midjourney for pennies to generate content that would take hundreds or thousands of dollars in illustrator costs, there's simply no way to mandate labor protections in the US that would keep customers from outsourcing the work.
The gradient is just too strong, and you can't keep a provenance record proving that every image you publish originated from a design firm.
We’re creating a new global average standard of living with disastrous consequences.
The poor areas rapidly gain wealth but at great cost to the environment and the whole thing is dependent on a never-ending fire hose of capital and tech transfers from the “wealthy” areas.
The wealthy areas, already unable to offer many citizens the ability to raise a family, are desperately engaging in extreme financialization, ludicrous political distractions, and “make work” shell games just to present an increasingly unconvincing veneer that society is still functioning and fully worthy of participation.
Western civilization is the metaphorical Biblical statue from Daniel: After spending decades replacing the support structures with ever cheaper materials, we’re now finally in the “feet of iron and clay” stage, and it’s likely AI will complete the metaphor by representing the boulder that smashes into the weakened base and topples the entire statue.
I realize this all sounds hilariously alarmist, but I have yet to see a positive spin on the impact of AI that isn’t ultimately the same old decoupled Pete Peterson style nonsense of “it’s gud because worker productivity number goes up”: as if we don’t already have two generations of human beings that grew up in an age of massively increased productivity yet cannot afford a family nor a home to put them in - a state of existence that even some medieval serfs would pity.
My weirdly accelerationist hope is that AI advances quickly enough (and management stays short-sighted enough) to cause a meaningful political coalition to form between the already-marginalized blue collar workers and the newly disrupted paper-pusher/cubicle class.
>> The poor areas rapidly gain wealth but at great cost to the environment and the whole thing is dependent on a never-ending fire hose of capital and tech transfers from the “wealthy” areas.
While I sympathize with your angst at the coming future, a future where the US isn't on top, I see that for most of the world your prediction is both highly desirable and inaccurate.
Firstly, let's dispel the great-cost-to-the-environment myth. The US has done, and continues to do, great harm to the environment. Indeed, many developing nations are skipping the harmful phase and embracing new tech like solar, better urban planning, and so on.
Also, while it's fun to posit that the globe depends on US capital, there's a bigger picture in play. TikTok is developed without Western capital, and suddenly it's a surveillance risk (because Facebook and Google wouldn't spy on me).
Sure, there's an environmental cost to development, but complaining about the environment of others is a bit rich for the US (as it approves drilling in Alaska). The US accounts for roughly 25% of cumulative global emissions, and about 5% of the world's population.
So yeah - you want your job to be remote? Be careful what you wish for.
I agree with most of your comment but comparing Facebook or Google with Tiktok remains a classic "both sides" false equivalency that takes away from other valid points that you make. Yes, US social media companies too have to comply with government requests, which have at times been abused. No, the scale and impact that this has is in no way comparable to Chinese tech companies and their relation to the government.
There is an "easy" solution - a regulation that would remove copyright protection from AI-generated art, or perhaps even art that has an AI-generated output somewhere in the pipeline. Art could still be AI-generated for efficiency, but large studios would be forced to keep humans on a payroll for accountability.
I don't think the conversation we need to have is about how to stop it; it's about what we do with all the unemployed artists, accountants, journalists, truck and taxi drivers.
Do we pay half of them to dig holes, then pay the other half to fill them?
We just need to keep the economy wheel turning while everybody finds new meaningful work.
I agree with that sentiment, just a small correction:
It's not the rich who will decide; it's the powerful. In the classical sense of "capacity to impose will" – up to and including the capacity to turn off {machines, humans, institutions}.
There's a strong correlation between wealth and power, but those groups are not identical. And neither are their staying prospects in the coming years.
On second thought, interestingly, that difference between "wealth" and "power" also becomes relevant when quantifying the "most" in "They can do without MOST of the 'we'".
Like, how many people do the wealthy/powerful actually need? What level of automation will allow maintaining their lifestyle, including:
1. Biological needs: Food & a sufficiently diverse pool of mates for themselves and their children.
2. A social hierarchy large enough to flex their power beyond mere biological survival (very important to primates). I'd expect this number to be strictly smaller than for 1), so likely not a concern: as long as there are enough people to biologically sustain a population, there are enough people to subjugate.
3. Technological scaffolding to produce, maintain and bootstrap (in case of calamities) said automation. Currently humans are a necessary physical substrate for the continued inflow of energy that automation needs. I don't have the numbers but the footprint to run even a single power plant must be tremendous: a large and complex society, once you include 2nd order effects. Without energy, the machines don't go "VRRROOM" and neither does AI.
And if the human population drops too low to interfere with any of these points, the automation was self-defeating. It doesn't matter how rich or powerful you were.
I mean, I'm sure someone somewhere ran the numbers and has a plan ready, while the rest of us wax lyrical about "just find a new job LOL" and UBI. Although to be fair, there are also preppers, who've taken the above analysis to its logical conclusion.
The true limit is item 2 on your list. Eventually AI can take care of the food production portion of 1 and most of item 3. The fact is that energy and secure storage are much less of a problem for AI and robots than people.
For population replacement, you can get by with a couple thousand people. That's really only about a dozen large tribes. If the tribes are kept suitably insular and antagonistic (with either ritualized external marriage or slave-taking to ensure proper genetic circulation) that society might even be stable. The trick would be keeping the population stable.
E.g. currently it's mostly the middle class that pays for home improvements, which creates jobs for construction workers. Remove the middle class and there's a lot less demand for construction workers - so what are the IT people turned into construction workers going to do?
The low wages cannot go any lower. As it stands, poor workers in the US already don't make nearly enough to survive. They just keep increasing their debt until they either climb the ladder or die of an overdose or some other tragedy.
It didn't take OP's job, though. It gave them another tool that produces better output, but which they don't happen to like (in contrast to their colleague who doesn't have issues with it).
It didn't take the job in the sense that the OP is still employed. It changed the job into something that barely resembles what the OP was hired to do. It's not quite constructive dismissal, but many people choose fields for reasons beyond maximizing income. It's hard to stay motivated doing something you hate, especially when a week ago your job had you doing something you love. And that says nothing about the OP's ethical quandary about IP laundering.
I might be too cynical, but given the reported productivity gains, it's quite possible the company decides they can do with a single artist. Or if less skilled artists can generate similar results with suitable AI prompts, it'll probably drive wages down, making the OP expendable. I'd imagine at the very least it sets a ceiling on advancement.
The reality is that an artist in a mobile games company is hired for their technical skills, with very little creative freedom. The real artist is usually the art director or a similar position responsible for stylistic/creative/conceptual decisions.
If your job is mostly technical, be ready to re-learn and reshape your skills when your tool goes the way of cel animation. I personally had to do this at least three times in 25 years, not entirely from scratch of course, but I had to relearn the vast majority of my skills. And I think I'll need to do more of it to stay relevant. So to me it's pretty normal.
The work of a creative professional is to deliver meaning. If you can't separate this from your tools and are only able to enjoy one specific tool, you have to question yourself what you're doing in the field. (yes, I'm also being too cynical)
I've had to do something similar in a much shorter career as well (about 10 years or so). I think the rate of change is much quicker these days too. It was still always within the tech field, but for better or worse my career has been incredibly diverse. Not sure if I'll be able to keep up when I'm older (entering my 30s now) or not, but so far I've always seen it as a personal growth opportunity.
> I personally had to do this at least three times in 25 years [...] So to me it's pretty normal
Very relatable, but it seems unwise to just consider it normal. If you find yourself having to retool 3 times within 25 months, you'll be well on the way to burning out.
25 years, not months; it wasn't that hard. It wasn't just retooling though - aerospace engineering to SE in gamedev, to the management in the unrelated creative area, to ML.
Yes, I know you said 25 years. I was trying to point out that with the accelerating technological capabilities, you could find that need to retool happening faster and faster.
You're not being cynical. You're being overly optimistic here about creative professionals.
The creative professional IS a target for replacement. What LLMs do best is specifically what the "creative professional" does. Better than coding or mathematics, LLMs excel at creative English text. This is because creative output is trivial when compared with technical skills.
I don't think the "creative professional" will get replaced though. Because usually the task is so trivial that positions like these largely only exist because of politics. You convince people around you and yourself that your "creativity" has no peer.
It amounts to this: I want a photo-realistic animation of a cowboy with blue skin with a laser cannon riding a unicycle through the Grand Canyon with a dwarf chasing him. <<This sentence is trivial to come up with; ANYONE can come up with thousands of variations and tweak the sentence in various ways. No special talent needed here.
A creative director simply comes up with high level instructions (like the one above) THEN the ultra hard part is making that high level instruction a reality. That is HARD and as of now it looks like both skills are being replaced by AI.
But like I said, the directors' position is in actuality largely political. Thus his position is safe even though the actuality of what he does is EASY and therefore a prime target for replacement.
> A creative director simply comes up with high level instructions (like the one above) THEN the ultra hard part is making that high level instruction a reality. That is HARD and as of now it looks like both skills are being replaced by AI.
This is the opposite of what I see in reality right now. The hard part is to make the artists do exactly what you want, down to the subtle but meaningful details. That happens because the high-level instructions don't have enough capacity to deliver the full meaning. This is the same with generative models: extremely hard to control with the "prompt engineering" gimmick; it's only fine when you are OK with random output. Besides, they are purely functional and lack the feedback mechanisms a creative director usually employs with artists.
That's why people are trying to make techniques more complex - large animation houses experiment with training their own model architectures and software. This is how 3D CGI was born - simple at first, and yes, it got plenty of doomposting early on (we are getting replaced by computers!), until it became clear that the field was turning extremely technical and complex, to the point where it now has dozens of specializations inside it.
All entertainment is ultimately based on novelty, as the human brain is really good at distilling meaning from the ocean of information. If you start with little meaning (your example - "a cowboy with blue skin with a laser cannon riding a unicycle through the grand canyon with a dwarf chasing him"), people get bored - no matter how much randomized artsy-looking stuff you add around it.
If you think that AI can generate all meaning that is relevant to people, I don't buy it, as it doesn't have the same training material. It's trained on the result and is forced to reverse engineer what moves people; reverse engineering is a fundamentally more costly and opaque task.
Train the AI exclusively on well lauded texts. Then you can get a generative AI that moves people.
The problem with "reverse engineering people" is that you don't need to, the things they like in any era follows predictable and generic patterns. These patterns can be encoded into AI provided we find enough curated training data.
> The hard part is to make the artists do exactly what you want, down to the subtle but meaningful details.
This is because the artist can't read your mind. It has nothing to do with your skill or the artist's skill level. The feedback loop you are alluding to is more a reflection of the lack of clarity in your instructions or your imagination.
You thought what you imagined looked good, but the artist, in following your directions, created something that showed you how flawed your imagination was.
> The problem with "reverse engineering people" is that you don't need to, the things they like in any era follows predictable and generic patterns. These patterns can be encoded into AI provided we find enough curated training data.
That only works to a certain degree. For an infamous example, try making GPT output a specific number of asterisks; you will get mixed results. You don't have any problem typing exactly 1589 asterisks because you run a stateful counting algorithm in your head. GPT has no idea about the algorithm - it has to reverse engineer it from text, and can only extract a vague correspondence between a number and a string of about this or that length. You don't give humans examples to reverse engineer; you teach them to count.
This is the simplest example, and it might even learn to count eventually, as it's far more capable in certain aspects. But as the dimensionality of the task grows, the amount of resources and training data required to reverse engineer it grows much faster.
Sure, it can spot some patterns and that can look good, but some things are just plain invisible in the result - you will have a hard time making it learn higher level concepts because they highly depend on hardwired things like the dumber part of neural circuitry and biochemistry in humans, which the model doesn't have.
It's like trying to make a photo in a dark room - no matter how you improve the sensitivity of your camera, you might not have a single photon in it.
> This is because the artist can't read your mind.
Yes, this is what I mean by the limited capacity of a simple textual description. It's a fundamental limitation - natural language is just poorly suited for the detailed conceptualization. A sketch, or a conceptual diagram, or other higher order control methods have far more capacity to explain your intent, and that's the direction those models move to. At which point their usage is nothing like "type something simple and receive the result".
The asterisks thing is another issue. LLMs don't need to do this to replace directors.
>Yes, this is what I mean by the limited capacity of a simple textual description. It's a fundamental limitation - natural language is just poorly suited for the detailed conceptualization.
Except LLMs can accept sketches as input. The higher order methods of communication are covered by encoders.
I think you're conflating imagination and creativity with taste. Your example might be cute and funny, but as you yourself alluded: it's completely bland and tasteless. It's maybe good enough for a "meme dump" but that's all.
Maybe I'm not sure what point you're trying to make either?
All fiction writers suck? I don't know how much fiction you read, but it's incredibly varied. There's a lot writers need to think about and control for in the reader (and account for multiple reader profiles) beyond just an idea. And the ideas themselves need to be coherent.
Creative directors suck? It's incredibly difficult being a creative director, having to organise multiple people to perform coherently, and it's even harder with whole teams.
I think it's natural for humans to reduce and simplify things we don't engage with every day, or our brains would be overloaded. So we just handwave it off. We do it to other people too unfortunately... Remember how complex your life (internal and external) is; other's lives are equally complex and nuanced.
>Creative directors suck? It's incredibly difficult being a creative director, having to organise multiple people to perform coherently, and it's even harder with whole teams.
Difficult in terms of effort, not difficult in terms of skill. Make no mistake, the quality of a movie is more the sum of the quality of its parts than it is the creative director. Who wrote the script? Who did the digital effects? Who did the lighting? Who did the editing?
The director did the hard work of picking the people and issuing orders.
>I think it's natural for humans to reduce and simplify things we don't engage with every day, or our brains would be overloaded. So we just handwave it off. We do it to other people too unfortunately... Remember how complex your life (internal and external) is; other's lives are equally complex and nuanced.
Except I am more or less a director. Not one for movies but for a company.
There is a difference between handwaving something off versus being delusional about your own role within the world. Directing is hard work, management is hard work. But none of these things are skilled work.
>All fiction writers suck? I don't know how much fiction you read, but it's incredibly varied. There's a lot writers need to think about and control for in the reader (and account for multiple reader profiles) beyond just an idea. And the ideas themselves need to be coherent.
It is certainly easier to create an entire space opera in writing than it is to do it via a movie. Writing is skilled work in terms of one skill only: your ability to write. Every other aspect of it is hard work but, unfortunately, unskilled work.
I realize there are complex plots, paradoxical stories and imaginative settings, and the pacing of a story is important as well. But all of this doesn't really require skill, just time and deep thought to come up with. Plenty of the most popular authors never had a writing background or talent.
I would say again that these director positions, while in principle easily replaceable, are not in practice, due to politics. A director or CEO is where he is mainly due to politics. Politics is unfortunately a skill with aspects that need to be imitated not only by AI but by robotics as well, and it's simply a gateway into the role with no relation to the actual job requirements.
It's hard to imagine a computer issuing the same exact orders as a director. But with LLMs, computers are really close to doing that in principle. The issue is, as I mentioned, the social and political aspect of directing that cannot be replaced yet.
Jobs change and hopefully peoples’ skillset changes and gets better when they do.
This job was different 5 years ago and this entire animation process would be unrecognizable 50 years ago. Tweening didn’t ruin animation it just lowered the barrier of entry. This too will lower the barrier of entry so our future will be more high quality animation. Some of it will be lasting and some of it will be appreciated for mere seconds for throw away content such as ads.
Maybe when everyone gets home from their boring prompt engineering jobs they can go back to hand rolling animation stacks or whatever other workflow is currently getting destroyed by AI for the “fun” of it.
It turned OP from a craftsperson into a machine operator. Their less-skilled colleague (whether due to lack of talent, experience, or application) is now cruising past them because they don't care as much as OP does about the work itself. OP is still employed, but now some of the necessary grunt work has become the main task. And that, too, will be easier done by machine before long.
I'm hoping, perhaps unreasonably, that the work in the short to medium term actually gets better. I'm currently working on something very interesting after having worked on complete shit for a few months last year: shit where a designer draws up the exact thing and then it is redundantly implemented on mobile by overpaid developers. Let that be done by the AI. Good riddance.
We never needed to be this productive anyway, we are completely swarmed with content, to the point that if we stopped producing today we would still be consuming new media after 100 years.
This process has been happening for at least a couple of decades (arguably more) without any AI, though. The entertainment industry almost ran itself into the ground that way, long before these things emerged. Or rather, the demand made it do so.
I have to agree. If I get hired somewhere and one day I'm suddenly forced to use Mac or Windows to run the Adobe suite, I would hate the situation, but most people would argue that's just the industry standard.
I just wish I was part of the first generation to benefit from post-scarcity, and not the generation that is going to have to go through the turmoil of getting to post-scarcity.
Oh don't worry my guy, the odds that we make it to post-scarcity at all are not super high. Just vibe with the slide down the slippery slope, it's all you can do.
Any time this argument gets trotted out it's using a baseline of European post-agrarian feudalism. There were definitely peoples in the past who enjoyed a high quality of life for many generations, but they tended to be small groups concentrated in areas of abundance, and were usually among the first to be exterminated by outsiders.
The fact that the industrial revolution created better life for most people doesn't prove an AI revolution will do the same.
The classical counter-example is horses - the industrial revolution increased the demand for horses, but once cars became cheap, horse numbers and horse jobs drastically decreased.
Agency is only half of the equation. The other half is opportunity. With Industrial Revolution, horses had neither. With AI revolution, people may retain agency, but I doubt there will be many opportunities available.
I guess we can get AGI and that will lead to great and numerous inventions. Of course, how power works in that world is extremely uncertain. Most likely power would flow to a corporation or an authoritarian government, unfortunately. Not to the people.
>> I remember reading lots of people here in HN saying it would never take the jobs of artists. Well, seems like this argument is aging badly.
And yet nobody lost his job. He wasn't fired. He is unhappy because automation equalized the playing field and mediocre artists became good enough with the right tools. The technology enabled him and other artists to be more productive. It feels similar to the best ditch digger getting angry at the invention of excavator, because he doesn't stand out anymore.
He's also angry because now his job has completely changed. He enjoyed spending his time making 3D models. Now that expectations have shifted because of these tools he has to use them. He went from making things from scratch to writing prompts instead.
Okay so? Back in the day people trained up for 2D art, and then suddenly not long after 3D graphics came around. Imagine working on Final Fantasy 6, and then being told you now have to do 3D art for Final Fantasy 7.
Amazingly, despite 3D graphics, in the year 2023 there are now more employed 2D pixel artists than there were in the 90s. They are still making new games for the NES!
So while I’m sad for this person’s loss of love for their current job, I’m sure they will be able to find work.
I understand where you're coming from. I don't think this person really fears losing their job. It's more that their current job, and probably all future ones, will be done in a way that they dislike. That is a valid complaint.
There are probably some employed pixel artists that liked creating things pixel by pixel and are disliking the fact that their job is radically changing since these tools create an expectation for faster output.
I apologize if I came across arguing with anyone.
I just wanted to call out that to me - it doesn't seem like this person thinks they are going to lose their job. They did their job in a way that they really enjoyed. That's changed.
Does that happen all the time in other industries? Yes. Is it valid to point out that this has happened before? Yes. Will this person be able to find work? Yes.
This person is just expressing their frustration. With the way all these tools are progressing, I imagine we are going to be seeing posts like these more in the future.
They could fire him and only keep the other artist, though. It's likely they have more capacity now than needed; if they halve it, they can spend more money on a good developer.
Interesting development: these machines have little value beyond economic gain. No one employed means no one will pay to keep the machines alive, which means we go back to where we were before the machines? AI art is just as "worthless" as human art. So at the moment these machines depend on an economy.
Art was actually never worth anything until we put a price tag on it? People were never paid for cave paintings. Maybe what we're actually going to witness is the end of the idea of money, which I know sounds absolutely absurd, but what else happens?
If machines become sentient, that's cool, but at that stage they're no longer "our property" to force art generation upon, so then what do we do?
Players don't care whether the graphics or code of a game were created by a person or an AI. What matters to them is how much fun they're having, and to companies, how low the costs are.
I highly doubt no one will be employed even in the event of a true AI singularity. Just differently employed - there will be reasons why someone would pay for your labour.
Yeah, my theory is there will just be more shit to do, more startups, more tech, more than ever before. More ways to scale your business, spend your free time, yada yada yada.
Also, most Western countries have severely aging populations; there are big problems there which automation can help with, if it's applied right.
I still would argue against AI being able to do art better.
There’s a line of thought that art is specifically meaningful because it’s humans thinking about things, but let’s leave that aside for now.
Generative models are revealing how many jobs in e.g. entertainment are essentially mechanical, from management’s POV. So far, they’ve just happened to require an artist’s education to do well. Now, the craft part is going out the window, and artistic creativity, if it was ever required, is being pushed out of production roles.
This is going to change the type of person who works in entertainment. I expect, while by technical metrics we will probably see quality stay the same or even improve while output increases, we’re also going to see a kind of McDonaldsification of entertainment, even more than we already see, because we’re pushing creative people out of the industry.
TL;DR: Process matters. Art is notoriously difficult to manage and predict what will or will not be successful , in large part because it’s a classic greater-than-the-sum-of-its-parts thing. Devalue the process, devalue the end product.
> When DALL-E 2 was released, I remember reading lots of people here in HN saying it would never take the jobs of artists. Well, seems like this argument is aging badly.
And yet, the more I play with these image models, the less I worry they might replace all artists.
Granted, it will commoditize some skills, such as photo bashing, mockups, and visual exploration.
It is a "creative" tool in a way, in the sense that a prompt can give unexpected results, such as associations that were uncalled for. But as much as you can replace higher-paid video game concept artists, you'll still need people to operate these text-to-image tools. We are past the initial discovery days, where all the results look incredible, and even the public will grow more discerning. Artists can't ignore these tools. As tools for the individual, they can be a lever. As a full-time, corporate job? Not so much. That is to say, as a job, this is not different from many other bullshit jobs. Prompting all day long is as fun as filling boxes in Excel. That will probably attract a new crowd of less specialized workers, and artists will have no choice but to adapt.
Digital artists could find new ways of expression using traditional art (a luxury), diversify, and engage in content creation. There are many avenues for confronting pessimism, despair, and the fear of becoming obsolete.
Artists thrive in adversity and uncertain situations because they are problem-solvers first. Imagination is their job.
Photography made portrait artists redundant, but it didn't eliminate artists. It will change the nature of an artist's job and kill some of the joy in some cases, but it will trigger creative responses, too.
Yes, it changed his job into something a 6th grader could do with little to no training. In other words, it's now a meaningless, low-value job that won't pay well.
There are too many people in this thread claiming this. This artist did not lose their job, correct. This artist is currently underemployed. Most of us can see the rate of AI improvement and realize his job will soon be replaced entirely once the remaining gap is bridged.
My wife and I were recently talking about hiring some artists to do some work for us. However, with generative AI, we decided to have her do the work using the AI tools instead. The result: fewer artists hired.
This is how it’s going to be from now on. The only hope is if humans find more opportunity with this power.
I think the ones you are referring to are still capitalist, as the companies are owned by an elite of "capitalists" and the workers have no say in their workplace.
This is a nice way of putting it. It just shows how people have a tendency to deny a reality that may replace everything they stand for.
What you will see as AI continues to get better along the obvious trendline is these "denial" arguments become more and more specific. First the AI will "never" replace jobs, then it will "never" replace specific jobs involving "programming skills" or "skills related to what I do" until eventually when the reality of it all is too all encompassing the arguments will evolve into attacks at AI for being "low quality" or something along those lines.
But if AI progresses to the point where the quality of the output becomes undeniably superior to human output, the "denial" arguments will inevitably shift to the REAL argument. The heart of it all. What is the purpose of being alive if AI is doing everything? Should we ban it, for the sake of the economy and for the sake of purpose?
These arguments of course only occur on the assumption that the technology will progress so quickly that the repercussions will hit everyone like a freight train. If the progression slows down enough then there won't be much opposition, only acceptance as it slowly assimilates into our society without people noticing the change.
If you in your heart are one of these people who has no worry about AI taking over jobs because AI is simply too "stupid" perhaps consider the fact that my description above fits you. Are your arguments evolving along a similar trendline? If so, consider shifting your perspective a bit.
Comments like this always read as "we can already extrapolate that everything we do will be done better by a machine soon". Pushing back against this argument isn't just ignorance or avoidance of change. It just asks the relevant question of whether we can be so sure that AI does everything "better". But how dare we challenge the hubris of tech bros, right?
Naw there are many possible futures. There is nothing saying the trendline is absolute. However...
The future predicted via extrapolation of a trendline is unfortunately more probable and more realistic than a future predicted via mistrust of AI.
Artists have already filed lawsuits against companies that own LLMs, and the one in the article already involves an artist complaining about his job being more or less replaced.
You have to be next level delusional not to consider the extrapolation to programming.
Oh, I do believe programming as a profession is at risk and will change a lot, if not rendered obsolete. What I'm talking about is this idea of "just get used to the fact that there is no human skill that won't be replicable by AI in 2-10 years". It's a very bleak view of the future and our own biological complexity. We need to remember that we are the ones inventing the AI in the first place. We are limited by our imperfect ability to understand ourselves. It will get better, sure, there will be emergent properties, but there's no need to reject the inherent value of humanity even if it happens to produce less economically viable output.
Not everything will be replaced, but you can extrapolate that much of what we do will be replaced.
The thing that is harder to replace is the versatility of the human form. Manual labor can't fully be replaced because robotics have yet to catch up.
>there's no need to reject the inherent value of humanity
There's no fundamental rejection here. Capitalism simply selects the most efficient methodology. If humans aren't the most efficient methodology for a given task, then capitalism eliminates that methodology. That's the logical extrapolation. Your subjective opinions on humanity's worth are irrelevant to the most likely outcome.
Capitalism and its value system is subjective, too. It's not set in stone. I believe we can still steer away from profit as the sole driver of, well, everything, if we want to.
Historically speaking, from crypto to AI, the market constantly evolves towards the next most profitable thing.
Only regulatory systems like the government have a tendency to temper such things (see the Fed and rising interest rates). However, do note that capitalist entities have infiltrated the government and have huge sway over its regulatory policies, meaning that anti-business regulatory policies are unlikely to occur.
All of this just means that my conclusions are most likely going to play out. Barring some event that will cause intense negative public reaction.
> However, do note that capitalist entities have infiltrated the government and have huge sway over its regulatory policies, meaning that anti-business regulatory policies are unlikely to occur.
Maybe this is the case in the US, the EU is known for being much more strict in its regulations, which is often ridiculed by the rest of the world. Those same people are going to hope for regulations once we see the effects of the current Wild West that is AI.
Agreed. The EU is ridiculed but it's also one of the happiest places to live. The relationship between the US and EU is almost parasitic with the EU simply feeding off most of the business innovation coming from the US.
I don't think the world needs this much constant innovation. Additionally, the innovation itself can be disruptive. The US will doggedly pursue profits even if those profits involve technology that can cause the US to eat itself. If AI replaces all jobs and nobody has any money to buy stuff, who will the companies sell shit to?
People with a good understanding of different styles of art and framing/angles/etc should be able to create the "right" prompts for a customer much much faster and get better results than people who don't know art. Generative AI like Midjourney will completely upend the art industry (it already is). But I don't think it's the death knell for the study of art.
Here's a solid example of that in the context of creating game assets for a retro computer game: https://hpjansson.org/blag/2022/08/16/adventure-game-graphic... The author clearly has actually studied art and is able to come up with prompts and concepts that I'd never be able to dream up. They're able to get the AI to generate what they want immediately with specific prompts like:
> "mexican hacienda on a sunny day, surrounded by plains, color painting by Charles Sheeler"
I have no idea who Charles Sheeler is. But if someone knows 1,500 different artists/photographers/genres/styles and the nuances of them, they can immediately select the right look for the client. It's not a panacea that will allow artists to keep doing what they love, but it is a bright spot and something to focus on for what the future skillset looks like for those who generate art.
I think the OP wasn’t disappointed about the end result, which is now higher quality by their own admission. They can certainly use their skills and experience to engineer prompts that will return the highest quality work, no doubt.
What made them sad is they no longer have the joy and satisfaction of developing those models in a 3D rendering software. This is what they trained for.
In many ways, this is like the initial trauma of becoming a manager: learning to be content with the end result and the success of the team rather than with your direct individual contribution. The difference here is they didn't ask for it.
I fully empathize with them. It makes me sad to think what they're going through. My advice to them would be to look for other ways to get that satisfaction - either through a hobby or by considering a different role/profession. For better or for worse, the world is changing rapidly and I can't see it going back to what it was.
Some people really love drawing 2d sprites pixel by pixel. Now people make 3d models to render down to 2d sprites.
Some people really love drawing textures. Now people use Substance to make them procedurally.
Some people really love doing polygon modeling for making characters for media. Now people do it with sculpting.
Some people really love and took pride in creating high quality meshes that animated well with clean topology. Now remesh and retopo tools can fix all of that after the fact.
Some people really love hand animating things like explosions and smoke. Now Houdini simulates all of it for you.
Every major efficiency advancement in any field is going to eliminate work that some people loved and trained for. Someone trying to do CG work for a game or movie based on the skillsets they loved and practiced 20 years ago is going to be doing something completely different if they want to be competitive today, and be buried if they don't. Does the OP lament the fact that the workflow he is a part of eliminated work that other artists would have done before them, and very likely spent as much time and love mastering as they did the 3d modeling?
I empathize with them - I've had former careers peter out like web design - but they should also recognize that they are beneficiaries of the same sort of change that is currently upsetting them. It sucks, but it is how progress has happened for a long time now.
I'm getting whiplash from the pace of change: I think of pixel art rapidly rendered from 3D models as the disruptive new technology that's devaluing traditional drawing skills!
> What made them sad is they no longer have the joy and satisfaction of developing those models in a 3D rendering software. This is what they trained for.
I think the point is that this artist can still experience that joy, but they just can’t get paid for it anymore. That is indeed sad, but that’s life.
On the other hand, maybe in a post scarcity society, this artist won’t have to work to survive, and they can do all the 3D modeling they want without having to worry about rent and food.
> On the other hand, maybe in a post scarcity society, this artist won’t have to work to survive, and they can do all the 3D modeling they want without having to worry about rent and food.
The scary part to me is that there’s scant evidence that a post-scarcity society will be allowed to develop. It seems more likely that those at the top will reap the entirety of the fruits of productivity gains enabled by employing AI, much as they’ve been doing for the more incremental (but still massive) productivity gains of the past several decades. Very little of the pie will be shared with the working class.
I think the point is missed here.
The post was made by an artist who really likes his craft, and now, with generated models coming into the picture, he feels like his work has been diminished to simply polishing their output.
I am a coder who really likes coding, getting results quicker will not make me happier.
It is different in a work environment, but still, a lot of people simply enjoy the path and not only the outcome.
It's the same with every person who shows up with AI music composition. "Finally, you can be free from the tedium of composing music and get to the fun parts!"
As a hobby music producer, this thing makes me sick :)
Actually, I think this problem started earlier, with many _musicians_ using pre-made beats and auto-tune. Right now it just takes it to a whole other level.
music should be a tool of expression, not one to gain fame & social acceptance
> music should be a tool of expression, not one to gain fame & social acceptance
Then I don't get why you care, let alone care enough to make you feel sick. Nobody is stopping you from using music as a tool of expression. You do you; jam in your basement or wherever you want. I bet you will find like-minded people who appreciate what you do, but even if not, you yourself are saying "music should not be a tool to gain fame and social acceptance".
I guess I exaggerated due to fast typing.
It does not _make me sick_; I just don't understand the reason.
a bit of context:
I've been producing music for ~15 years now, purely as a hobby (due to my own incompetence at stepping up a level).
I have friends whom I've helped and collaborated with for years, on and off,
and some of them seem to me like they are stuck in a loop of wanting to succeed without actually expressing themselves. They seem stuck in the mode of _this is how it's done, and this is what I want, and anything else is incorrect_,
even when there are many other options to make it sound better and richer.
So, correction: what makes me _sick_ is the fact that some people try to copy other successful musicians without even considering doing something original, purely because they want the same level of _fame and success_.
There are still masses of people making music without "pre made beats and auto tune"; they just don't get played in places like the radio, because the radio is mostly for sanitized stuff that appeals to mainstream trends. I don't expect most of the artists I listen to to go away just because an AI can produce a generic trap beat or something. Of course, there may be more garbage to sift through.
This seems congruent with a thought I've had, which is that generative image models might enable people commissioning art to be more coherent about what they want. I only have FOAF-tier knowledge of commissioning artwork, but my understanding is that a major source of frustration on all sides is that a lot of work is often wasted because someone commissioning an artist has only the faintest spark of a concept or doesn't know how to communicate it effectively. I imagine a fair number of professional artists would be receptive to something along the lines of "here are my prompts/models, here's the set of the outputs I have opinions about and here's what I do and don't like about them; please use your expertise to compose this into something that actually works".
The market of people who care enough about those aspects to pay an expert is, I'm guessing, much smaller than the market for people who will be fine with whatever they can come up with themselves.
If we can have an infinite variation of personalized stock photos, I'm betting that the "good enough" side will win.
I'm still not scared about my job since I provide specific solutions (C++ architectures) that an AI cannot produce (yet), but it's worrying for everyone and every kind of job at the same time.
If you don’t like the taste of the people who want to pay for art I suggest doing what is traditional and either (a) paying for it yourself, or (b) taking over or setting up the government funding for artists so that people who share your taste are in control.
They do if they lose their jobs. Art isn't something you must buy. I like to buy art, but if I lost my job, it would be one of the first things I'd cut back on. And the fewer people are buying, the more the remaining artists have to charge to keep being able to afford to do art, which just spirals more people out of the buyers' market.
I’ve been fiddling with midjourney recently. There’s definitely a learning curve to it, but ironically gpt has been helpful for generating prompts. I expect the edge that artists have in this regard to erode further with time
> People with a good understanding of different styles of art and framing/angles/etc should be able to create the "right" prompts for a customer much much faster and get better results than people who don't know art.
When the Bing image creator released a couple of days ago, I used ChatGPT to craft a prompt for it.
> In one sentence describe the art style of impressionism
< The art style of impressionism is characterized by visible brush strokes, emphasis on the changing effects of light, and an emphasis on capturing the fleeting moment.
Then used that in my prompt for the creator:
> okinawa street on a rainy night, visible brush strokes, emphasis on the changing effects of light, and an emphasis on capturing the fleeting moment
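In code, that two-step flow is roughly this (a minimal sketch, assuming the 2023-era `openai` Python client with an API key in the environment; the model name and wording are illustrative, and the composed prompt still gets pasted into the image creator by hand):

```python
# Minimal sketch: ask a chat model for a plain-language description of a style,
# then splice that description into the image prompt. Assumes the 2023-era
# `openai` Python client; model choice and phrasing are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def describe_style(style: str) -> str:
    """One-sentence, visual-terms description of an art style."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": f"In one sentence, describe the art style of {style}.",
        }],
    )
    return response["choices"][0]["message"]["content"].strip().rstrip(".")

def build_image_prompt(subject: str, style: str) -> str:
    """Compose the prompt you'd paste into Bing's image creator (or similar)."""
    return f"{subject}, {describe_style(style)}"

if __name__ == "__main__":
    # e.g. "okinawa street on a rainy night, visible brush strokes, ..."
    print(build_image_prompt("okinawa street on a rainy night", "impressionism"))
```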
The point is you knew to use "impressionist" as a concept. ChatGPT probably wouldn't have helped you with that kernel. It's not clear whether GPT added any value at all in your use case as you just replaced "impressionist" with "{synonym of impressionist}". How would you repeat this for:
> okinawa street on a rainy night, in the style of fauvism
> okinawa street on a rainy night, in the style of constructivism
> okinawa street on a rainy night, in the style of De Stijl (Neoplasticism)
> okinawa street on a rainy night, in the style of later Expressionism, e.g. George Grosz and Otto Dix
If you weren't aware of these stylistic options at the time?
> The point is you knew to use "impressionist" as a concept.
I could have surfaced it, or learned about it, 20 seconds earlier from a different ChatGPT conversation (throwing vaguely related questions at GPT[1]), or from something as old-fashioned as a search engine. Maybe turning up the "novel" kernels of information that you won't get from LLMs without knowing to ask will become its own thing in the future. A few Pinterests for prompts are already out there, if I'm not mistaken.
If I just wanted results, I'm sure I could have. I just thought it was interesting to see if this more abstract description would map to results that would be considered impressionism.
>I have no idea who Charles Sheeler is. But if someone knows 1,500 different artists/photographers/genres/styles and the nuances of them, they can immediately select the right look for the client
Don't overthink it. The SD GUI I've seen people use has a button that appends a random artist's name to the end of the prompt. You mash that button until you are satisfied with the style.
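A toy version of that button is barely more than a one-liner; something like this (the artist list is purely illustrative, not from any particular GUI):

```python
# Toy "random artist" button: tack a randomly chosen artist name onto the prompt
# and re-roll until something looks right.
import random

ARTISTS = ["Charles Sheeler", "Alphonse Mucha", "Katsushika Hokusai", "Edward Hopper"]

def reroll_style(base_prompt: str) -> str:
    return f"{base_prompt}, in the style of {random.choice(ARTISTS)}"

print(reroll_style("okinawa street on a rainy night"))
```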
>People with a good understanding of different styles of art and framing/angles/etc should be able to create the "right" prompts for a customer much much faster and get better results than people who don't know art.
Strong disagree. It's a completely different discipline. One doesn't "know art", they know a specific flavor of it, and 3D modelling is not... sentence construction.
As someone with no artistic knowledge or skill who has been playing around with this stuff, I totally agree. I have found a large part of the learning curve to be getting a sense of how to describe the things in the first place.
It does lower the barrier to entry significantly for anyone wanting to get into it, but it's still going to be the case where some people are demonstrably better at it than others.
Of course developers will just solve it by making the UX use some genetic algorithm to help you iterate toward the exact thing you want, even if you don't know how to describe it.
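As a sketch of what that loop could look like, here's a stripped-down, string-level version where random descriptor swaps stand in for a real image model and the "fitness function" is just the user picking a favourite each round:

```python
# Toy sketch of iterate-by-selection: keep the user's current favourite prompt,
# generate a few mutated variants, let the user pick, repeat. A real tool would
# render each variant with an image model; here mutation is purely string-level.
import random

DESCRIPTORS = [
    "visible brush strokes", "soft lighting", "high contrast",
    "muted palette", "wide angle", "close-up", "rainy atmosphere",
]

def mutate(prompt: str) -> str:
    """Make a variant by adding or dropping one descriptor."""
    parts = [p.strip() for p in prompt.split(",")]
    if len(parts) > 2 and random.random() < 0.3:
        parts.pop(random.randrange(1, len(parts)))  # drop a later descriptor
    else:
        parts.append(random.choice(DESCRIPTORS))    # add a new descriptor
    return ", ".join(dict.fromkeys(parts))          # de-duplicate, keep order

def evolve(seed: str, generations: int = 3, population: int = 4) -> str:
    best = seed
    for _ in range(generations):
        candidates = [best] + [mutate(best) for _ in range(population - 1)]
        for i, candidate in enumerate(candidates):
            print(f"[{i}] {candidate}")
        best = candidates[int(input("Pick the one you like best: "))]
    return best

if __name__ == "__main__":
    print("Final prompt:", evolve("okinawa street on a rainy night"))
```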
The exponential part of this ride really feels like it's coming in to full view.
Some people say there isn't enough empathy to go around for those creatives who are seeing their jobs either endangered or automated to the degree that they've become sweatshop janitors with little prospect of being anything else.
I wonder if that sort of empathy was so plentiful when the automation was shredding the prospects of farmers, miners, truckers and other blue collar workers?
Make no mistake, generative AI is going to affect the creative work market massively over the next few years.
People will need to keep adapting and perhaps some humility would be in order when somebody else is going through hardship as the market for his or her work profile is in dire straits.
No, there was no empathy for them. And I feel there's no real empathy for artists, either.
Before the eeeevil robot came to be, no one gave much of a damn about artists or their opinion. Suddenly we are instrumental to humanity's culture, but people are too busy spreading hatred to listen to any artists who aren't in agreement, which makes me feel it's less about being an artist and more about being emotionally charged "useful idiots" for anti-AI agendas.
Have you looked at AI outputs carefully? I have for the last seven months, and if an artist can be replaced by it, maybe that artist deserves it. I say this as an artist, 36 years of experience.
The worst part is that this will be considered a hot take and mean-spirited by the immensely biased HN crowd, but lifeless art can be replaced by lifeless art, and that's the artist's fault for just coasting from check to check without putting in much effort, therefore producing empty, emotionless art. We all know this, we all call it out when it happens, but for some reason this entire AI topic is making people forget it's a thing.
This is no different from that lazy guy at the office that just does the bare minimum to not get fired. Getting rid of them may give a more talented and unique but less popular artist a chance to get a job.
> if an artist can be replaced by it, maybe that artist deserves it. I say this as an artist, 36 years of experience.
This is such an insane take to me at this point. I've used a few of the AI art generators and there is close to zero chance I will ever hire an artist or a graphic designer again (yes, I've hired artists).
I've seen the shortcomings, and sure a very expensive artist can fulfill my needs at an A+ level 100% of the time. But why would I pay so much money to a human artist when after a few minutes and a couple prompts I have a B+ or A- result? As a person with finite money, that last teeny bit of quality is not worth hundreds or thousands of dollars when compared with a very good result for essentially nothing.
People are saying the same thing about programming. That if you're set to be replaced by this you're not "very good". I cost a LOT of money. I am good at what I do. AI is essentially free at that too and learning at an insane rate.
I am not talking about GPT and LLM models. I have no opinion on those. I have an opinion on art because it's something I know intimately.
Art is a form of expression available to all, it's not just art for business or commission. That only the business side of art is being discussed is ignoring what art actually means. Not everything is business, and I loathe "time is money" kinda worldviews, honestly.
If I'm already able to create excellent models of any subject I desire in 15-45 minutes, with remarkable quality aside from bad hands, and usable forever once trained, then how come I'm still drawing? (More than usual, in fact.)
Understand that and you'll understand my argument.
> Before the eeeevil robot came to be, no one gave much of a damn about artists or their opinion. Suddenly we are instrumental to humanity's culture, but people are too busy spreading hatred to listen to any artists who aren't in agreement, which makes me feel it's less about being an artist and more about being emotionally charged "useful idiots" for anti-AI agendas.
Artists spread disinformation and hate too; in fact I'd count them as one of the primary sources of those things. The mere existence of political cartoonists is the simplest example if we go by the literal term "artist". Artists are rarely if ever divorced from their work either, so I don't think they can claim that the art really takes on its own life.
I do agree with you on everything else you have said.
> "We're going to make it clear that we don't want to forget those people," Clinton said. "Those people labored in those mines for generations, losing their health, often losing their lives to turn on our lights and power our factories. Now we've got to move away from coal and all the other fossil fuels, but I don't want to move away from the people who did the best they could to produce the energy that we relied on."
Famously what got quoted was this part:
> "we're going to put a lot of coal miners and coal companies out of business."
>I wonder if that sort of empathy was so plentiful when the automation was shredding the prospects of farmers, miners, truckers and other blue collar workers?
Twitter was full of snarky "just learn to code" comments.
> I wonder if that sort of empathy was so plentiful when the automation was shredding the prospects of farmers, miners, truckers and other blue collar workers
Yeah I remember the whole "learn to code" movement that mocked people who were losing their jobs.
And when it was about journalists losing jobs, it quickly got banned as "harassment". By the same people who a few years earlier made fun of blue collar people who lost their jobs.
Can you give some sources for the "learn to code" thing? I only remember that it was a reaction to the people who wanted to change open-source (perceived as "destroying from the inside") without giving concrete solutions.
The discussion is creating 3 groups:
First group is sure AI will be an enabler and it won't replace them;
The second group is saying AI won't replace them at least in the near future (1~10 years);
Then we have the third group, which is basically the sentiment the op is living rn: AI is taking away their hopes, dreams, and livelihood.
Now, I think we'll see the strongest counterculture in human history against AI and the biggest social unrest to date. I'm talking about massive protests, massive regulation, etc... just because no one, especially the government, is prepared for what's coming.
And then the industrial revolution didn't take away jobs; rather, it created more.
So did computers in the 80s and 90s.
I remember typesetters in our locality protesting in the 80s that computers were taking away their jobs. They started something like a "burn the computers" movement.
More people were later employed as composers than there were typesetters.
Those typesetters were likely right, tho. Sure, the Industrial Revolution and the computer revolution ended up creating more jobs than they took away - but what everyone conveniently doesn't mention is that those new jobs went to different people.
These kinds of transitions are large scale wealth redistribution. Your random typesetter 20 years into their career, with experience, good pay, and whole life built around economic conditions they spent those decades working hard to achieve, is not going to jump into a role of senior offset printer specialist, or DTP team manager. They're going to become a junior designer at best, most likely fall out of industry and into a junior whatever role. With commensurate drop in salary.
If you're over 30, imagine dropping down to whatever you earned when you were 20, and tell me that people protesting automation don't have an argument. And remember: it's only your career that resets; your health and obligations and remaining years to live do not.
> And then the industrial revolution didn't take away jobs
It absolutely did take jobs. It's just that the job loss in one area was compensated by the growth in other areas.
However, hoping that it'll be the same with AI is incredibly naive. The job shift in the industrial revolution was possible because there were still lots of tasks left that the machines couldn't do. The areas that can't be done by a machine are now dramatically shrinking with advanced AI. There is not much area left where jobs can shift to.
The creative jobs can be automated just the same as everything else dealing with digital data. Skilled trades like carpenters, plumbers, and electricians might still be safe for some years, as robots still struggle with climbing a ladder or even just stairs, but those are areas that don't seem destined for rapid growth. Whatever new job you can think of will probably be automated away before a human ever gets a chance to touch it.
The most impressive part of ChatGPT and friends, after all, isn't just that they are reasonably good at what they are doing, but how universal they are. ChatGPT didn't need to be meticulously programmed to do the things it does; it learned most of that by itself just from the data fed into it. Meaning there really isn't any domain that you can't automate with AI in the long run.
Exactly. The ramifications of this tech will break the current system for sure. Why would someone spend 15 years of their life studying medicine if by that time AI will have replaced them?
The further we go down the rabbit hole, the clearer it is that there's no utopia at the end of the tunnel. This is no hyperbole: we don't know what to expect if you automate everything.
Just look for some answers here:
- "Blue collar jobs are safe". Really? What happens when everyone is an electrician, plumber, taxi driver, etc...?
- "I use it to amplify my productivity". What happens when AI is so good that it can literally swap you
- "Just create a business around it". There's no competitive edge anymore. Everyone can do "everything" and won't need any expertise at all when AI gets "there".
I'm a SWE, but I'm trying to purchase land and invest in things that AI can't replace, like food production, hehe. Agriculture will make a strong comeback this decade. Forget Airbnb, a farm is where the $$$$$ is going to be; if not, ask Gates why he is buying land and becoming the biggest "farmer" in the US.
I'm not a conspiracy person, but I can see the "You'll own nothing and you'll be happy" line getting dangerously close to reality.
> I'm a SWE, but I'm trying to purchase land and invest in things that AI can't replace, like food production, hehe. Agriculture will make a strong comeback this decade. Forget Airbnb, a farm is where the $$$$$ is going to be; if not, ask Gates why he is buying land and becoming the biggest "farmer" in the US.
Farms in recent years have consolidated. There are exceptionally few farmers feeding all of us today. This will likely only get worse once all their farm equipment is operated by AI as one farmer will be able to manage far more acreage than they were previously able to.
Nothing is immune.
Of course, it's unlikely we replace plumbers anytime soon, but between the incredible DIY-friendly tools in that space (Shark-bite, PEX) and the fact that no one will have any money to pay for their services, they're probably screwed too.
Robotics is far from replacing carpenters, plumbers, etc. Robots may be showing off acrobatics, and may even soon replace soldiers in some cases, but repairing something requires effort that will be orders of magnitude harder to automate.
Sure, as a whole the industrial revolution might have created more jobs. But what about individuals? Are we expecting some dude in his fifties whose job is going to be taken away by AI to switch careers just like that? That stuff is hard enough when someone is in their thirties. And in my country the unemployment rate is already 13%, there’s hundreds of candidates for any decent job opening.
To those of us in our 20s and 30s they just answer "be a capitalist, become an entrepreneur, use GPT to build a business".
I've never seen what the answer is for people in their 40s and 50s with families who can't do a career reset. I guess they'll just have to take a huge pay cut and deal with it, I don't know.
Industrial revolution cities were famously a meat grinder, though. Disease, poor-quality accommodation, and reckless disregard for safety in early factories meant a lot of those displaced from rural artisan jobs to work in the cities literally died as a result.
Many of the new jobs were filled by population growth: people who, thanks to the growth in food production, survived in the countryside when they otherwise wouldn't have and then moved to the cities. As sanitation improved there was a general fall in the death rate before the fall in the birth rate (though later than the initial migration to the cities).
Nor was the city standard of living equivalent, as people moved from rural households (admittedly often multi-generational ones, so more crowded than modern ones) to overcrowded tenements.
So the story people gloss over is that all the rural cobblers went to the city and became factory workers and everything is fine. But while it might be a reassuring story that _society_ survives and evolves, it's not at all clear that individuals did as well.
Ludditism has been around since at least the 18th century, probably much longer than that, and that’s exactly what this is. It’s not even the hyperbolic “grandpa doesn’t like iPads” type of ludditism, it’s the literal raging against technological innovation because you’re scared it’s going to take your job type of ludditism.
The guy in the OP didn’t even have his job taken, he just had to use a new tool that made his work far more efficient.
The thing is that if AI matches the hype around it, there won't be any jobs left. The economy depends on the middle class working, earning a living, buying things. The government collects taxes from businesses and its citizens. At-risk and low-income people depend on the government to assist them with basic things.
If AI destroys the middle class, then the whole system collapses. No middle class = no spending = less businesses = more low-income and at risk people, and the cycle continues.
We'll basically create a neofeudalist society where the rich will control everything and the people will work for them (food, housing, etc...).
It seems crazy, etc... but not improbable. I'm not a conspiracy person, but I can't stop thinking about this: these advancements align with the "You'll own nothing, and you'll be happy" line from the WEF.
Yes I’m sure AI will be the very first technology to match its own hype. The technology that destroys the middle class has finally arrived. The days of falsely prophesying that a technology is about to destroy the middle class are over.
C'mon man. It's not that far-fetched. I'm not saying that this will happen tomorrow, but if AI gets "there", we don't want to see desperation on a global scale.
The idea that technological innovation is going to destroy the middle class isn’t exactly far fetched, it’s more just blatantly stupid. No economy has ever destroyed its middle class through technological innovation. This will never happen. Technological innovation is one of the primary factors that led to the rise of the middle class to begin with…
AI destroying the middle class will always be a political choice (i.e. the powerful have chosen some sort of hyper-extractive techno-feudalism). You could always tax the AI and redistribute the money back to the people affected.
What does have to be changed with the rise of AI is the tax laws. We basically should treat it the same as if an adversary was dumping product under cost (i.e. tax AI "work") and subsidize human workers; basically run a version of protectionism against AI - if we want human work in this post-AGI world to be viable.
There weren't exactly fewer jobs in textiles after the industrialization the original luddites were against. The jobs were just much, much crappier. (And more of them were filled by children).
They weren't "scared it's going to take their jobs" -- they were _seeing_ the jobs they had _actually disappear_ (not hypothetical future fear), replaced by worse-paying jobs with much worse conditions. They were exactly right that their actual lives were going to get a lot harder.
> Despite what you may have heard, the Luddites weren’t technophobes. They were skilled workers, expert high tech machine operators who supplied the world with fine textiles. Thanks to a high degree of labor organization through craft guilds, the workers received a fair share of the profit from their labors. They worked hard, but they earned enough through their labors to enjoy lives of dignity and comfort.
> Nineteenth century textile workers enjoyed a high degree of personal autonomy. Their machines were in their homes and they worked surrounded by family and friends, away from the oversight of the rich merchants who brought their goods to market. This was the original “cottage industry.”
> The factory owners who built their “dark, Satanic mills” weren’t interested in making life easier for textile workers by automating their labor. They wanted to make workers’ lives harder.
> Textile machines were valued because they were easier to operate than the hand-looms that preceded them, and that meant that workers who wanted a fair wage for a fair day’s work could be fired and replaced with new workers, without the logistical hassle of the multi-year apprenticeship demanded by the hand-loom and its brethren.
> As Brian Merchant documents in Blood in the Machine, his stunning, forthcoming history of the Luddites, the factory owners of the industrial revolution wanted machines so simple that children could work them, because that would let them pick over England’s orphanages, tricking young kids to come work in their factories for ten and twelve hour days.
> These children were indentured for a period of ten years, starved and mercilessly beaten when they missed quota. The machines routinely maimed or killed them. One of these children, Robert Blincoe, survived to write a bestselling memoir detailing the horrifying life of the factory owners’ child slaves, inspiring Dickens to write Oliver Twist.
Tech people so obviously loathe creatives, it's so sad.
> he just had to use a new tool that made his work far more efficient.
He had all satisfaction, creativity, and joy sapped out of his job. He was made alienated, made to feel pointless, in order to commodify a creative pursuit.
Technology is doing what it has always done: robbing skilled people of purpose. From the Luddites to potters, shoemakers, etc. They had a skill that was important to other people. Automation made this cheaper, so these skilled laborers were no longer of use. But now everything we have is made out of cheap materials; nothing lasts. Sure it's cheap, but it has to be cheap for the out-of-a-job skilled worker, now working a service job, to be able to afford this automated commodity.
This degrades our society, piece by piece, stone by stone. What is the end goal of this automation? The folks who are reaping the benefits of this efficiency will continue to hoard. They'll fight any effort to give back some of those profits robbed from the laborers. We're turning into a society where we just continuously exchange cheap pointless pieces of plastic just to survive. Zero meaning left except for the psychopaths at the top, they're the only ones allowed to derive meaning from their work.
> Tech people so obviously loathe creatives, it's so sad.
This isn’t what I’m saying at all. His job changed. You know who else has their job change sometimes? Everybody. I remember going to work one day and being told that I was going to be using Angular from now on. On that day all satisfaction, creativity, and joy was sapped out of my job. But I didn’t go and cry about it on Reddit.
> Zero meaning left except for the psychopaths at the top, they're the only ones allowed to derive meaning from their work.
Luddites say this about every single technological innovation that they whinge about.
Perhaps you should have. I think people deserve to feel loss. Alienation is a real issue, and automation breeds alienation. I'd much rather feel meaning at work than have a cheaper TV or yet another data-mining website that requires yet another account for little benefit.
> Luddites say this about every single technological innovation that they whinge about.
Possibly because it should be concerning that any meaning found in work is being optimized away for someone else's benefit. The weird lack of empathy in admitting it happened to you and then deciding someone else should feel it too sorta worries me.
Especially in a creative field. Who does the cookie cutter AI models benefit? It doesn't benefit the consumer who wasted money on a lazy product, it doesn't benefit the people working on the project, as now their working hours are cut due to the soulless optimization. It benefits the boss, the guy extracting this value and selling a lesser product.
Ah, the alienation of labor, I wonder where I’ve heard about that silly idea before…
Having a job isn’t about finding the ultimate fulfilment in life. It’s about two parties exchanging value, that’s it. If you’d prefer to go back to a time before technological innovation made the necessities of life so cheap to produce that they’re accessible to basically everybody, then I’m sure you could find a country that hasn’t yet gone through that stage of development to live in. I’m almost certain that you don’t actually want that though…
> Ah you can afford TVs and corn based foods. Why don't you go back in time so you don't have to do pointless activities for most of your waking hours until you retire and die 10 years later
>And then the industrial revolution didn't take away jobs; rather, it created more.
This is your thinking: "Because all previous technological revolutions created more jobs, it follows that the AI technological revolution will create more jobs."
However there is an obvious difference between all previous revolutions and this one: previous revolutions replaced labor. The AI revolution replaces thought. Even the computer revolution replaced labor, such as "typing things onto paper" or "updating a ledger".
For me, the whole ChatGPT debacle just showed me that 99% (or 100%) of humans are, themselves, just autocompletion generators, with the training model being a religious text, a political manifesto or propaganda masquerading as a school system.
Created more jobs, sure, but what kind of jobs? More complicated, mind-draining jobs that many, many people are simply not capable of. General human intelligence cannot keep up with the rate the market is asking for.
If you take one look at the demographics of developed countries, the reality is that we absolutely need to replace the majority of human labor in order to maintain any sustainable material standard of living. Old age entitlements are a ticking time bomb, with fewer workers in each generation to support the growing share of longer and longer lived retirees. You want social unrest, take a look at France, which is on fire at the moment because the government had the temerity to try and raise the legal retirement age to 64.
Firstly, we are talking in this thread about intellectual, not physical labour. There's no AI or robot forthcoming that will replace (already brutally underpaid) personal support workers or nurses in nursing homes. Instead it's the higher paying or more satisfying intellectual, artisan, and creative work done by those senior's children that are on the chopping block.
Secondly, even if jobs are automated to support the aging population, the economic model we have doesn't facilitate this leading to an improved life. The incentives to automate in our market economy are strong, and produce amazing efficiencies. Unfortunately that incentive is to reduce labour costs, and there's always a period of unemployment and reskilling afterwards. And in places like the US, unemployment = no health care, among other insecurities.
And in the past a lucky few reskilled towards more intellectual, creative, or informational-managerial jobs. Turns out those will be easier to automate than we thought.
Building a robot is expensive. Automating a factory is a huge capital investment.
But once the R&D phase is done, reproducing and distributing software is dirt cheap. Machine automation of intellectual tasks will happen far more rapidly and with more brutal results than factory automation ever has.
The Alvin Toffler fantasy of information age prosperity looks more preposterous every day.
> Firstly, we are talking in this thread about intellectual, not physical labour. There's no AI or robot forthcoming that will replace (already brutally underpaid) personal support workers or nurses in nursing homes. Instead it's the higher paying or more satisfying intellectual, artisan, and creative work done by those senior's children that are on the chopping block.
Cool, then we can free up a lot of those workers to do other work.
And it’s not just a matter of nurses and the like. Retirees need food, energy, goods and services just like everyone else; it’s just that they no longer participate in the process of producing those goods and services. Plus, a ton of health care is knowledge work too.
> And in the past a lucky few reskilled towards more intellectual, creative, or informational-managerial jobs. Turns out those will be easier to automate than we thought.
Which is good. Half of the population has an IQ under 100, and there are just as many people with an IQ below 80 as there are above 120. Which means most people can’t reskill to knowledge work. However, the use of AI will allow these people to reskill to productive work just as POS systems allowed people to work as cashiers without the ability to do arithmetic.
The only actual problem you’ve identified is a threat to the egos of knowledge workers whose sense of self-worth lies in their self-perception of having above average intelligence, or perhaps in not having to do any sort of physical labor. Just like the myth of John Henry, we’re going to feel compelled to eulogize the email jobs. Our grandchildren probably won’t pine for the days of open offices and daily standups though.
My concern is around the class/income composition of the groups specifically in your group 3, and what kind of political forces their disillusionment and anger unleashes.
There are maybe parallels here to the 20s and early 30s in Europe, an era when there was a similar mass disenfranchisement of professional middle classes ("petit bourgeois"), artisans, and specialists... a corresponding mass anger and disillusionment.
And it was in large part those people who formed the base of the rising authoritarian right-wing / fascist movements in Europe.
It's not the most working class people who will lose the most jobs. There's no AIs coming to take away people's jobs cleaning cafeteria trays and mopping floors. It's paradoxically more expensive to automate manual labour or jobs with a high physical component. Competent meatspace robotics are hard and expensive. For now.
Instead it's people like us, used to a higher-privilege lifestyle, who have withstood the wave of previous automation and de-skilling. We're very expensive. And the people signing our paycheques I'm sure are salivating at the opportunity to de-skill and automate us away.
I hope I'm wrong, but I fear what could happen politically.
Well, I can tell you that the first time I see some sufficiently advanced AI walking in the streets between the public, I will grab my baseball bat and smash it to pieces.
I find it curious that even here there are people who conflate a robot body with an artificial mind.
Sure, you can put the computer running the mind that controls the body into the body itself, and there might even be good reasons to given spotty wifi or mobile data, but you don't have to.
> Well, I can tell you that the first time I see some sufficiently advanced AI walking in the streets between the public, I will grab my baseball bat and smash it to pieces.
Describing an AI walking the streets, or smashing it to pieces, necessarily carries with it the implication that the AI is located within a physical body that moves on legs.
And those words you used are very explicitly about destroying an actual machine.
But if you wish to claim that was merely a linguistic turn of phrase not meant to be taken as an implied literal belief about the location of thought? Ok, but it remains the case that what you propose against a "sufficiently advanced" AI is somewhere between petty vandalism and a pogrom, depending on whether "sufficiently advanced" comes as a package deal with "sentience".
Which is a totally different, and in the latter case much more severe, problem.
Huh? Of course I understand that they have a backup somewhere. Whether I destroy the CPU or the actuators or some other hardware is not relevant as long as my statement is clear.
Yeah, but what if it's just a little canister scrubbing public toilets, trash cans, or something similar? Do you really want to destroy that? Will you even realize what you're seeing?
You know what's interesting: Your take is somewhat similar to a part of animatrix and even crazier is that I'll vouch for you just because of the sense of tribalism within me, hehe.
Maybe, but a pre-requisite to doing that right, is that we must first understand what this mysterious "consciousness" thing even is.
I can play word games like saying "consciousness" is the opposite of "unconsciousness" (or alternatively, and IMO not entirely compatibly, that it is the opposite of "subconsciousness"), but every time I've looked at this, it's either ended with (1) circular definitions, (2) definitions that accidentally include a VHS player connected to a TV set, or (3) things that humans regularly fail at.
None of that actually helps figure out whether or not a machine does or doesn't have that which makes us have the experience of being.
Getting it wrong, in either direction, is a Bad Ending.
That's how I see it too. Big neo-luddite movement, anti-AI unions, "AI Free" labeling with corresponding certification, registries of AI users (and witch hunts of course), AIs being weaponized to destroy or corrupt other AIs.
AI providers need to be at the forefront of lobbying for reasonable AI control laws, or they are up for a rude awakening.
We are going to be seeing so much of this. It is sad. Start up culture at times seems to insist that all progress must be good, as though progress itself will be thwarted by reflection or honesty.
Something is lost and something is gained, but that doesn't make the losses painless or irrelevant. We can be pragmatic and have empathy at the same time.
What I don't think we should do is act like every change is inherently good. Perhaps something valuable will be lost to efficiency and the bottom line. Hopefully something will also be gained.
Let's not be in a hurry to declare implications. And let's try to be helpful.
You can still do the old ways. I'm still gonna keep drawing and stuff, myself. I don't really see how the AI actually helps me accomplish my goals on an art or coding front as a gamedev. Where the goal is "make something meaningful and beautiful."
I don't like to shout "philistine" or "luddite" but it feels like all the people excited for these tools were tasteless to begin with and already weren't producing works that meet my bar of consumption. I guess we already knew that about the AAA game industry, for instance.
I guess it always pays to have taste - doubly in an AI-oriented world. The old ways aren't going anywhere imo.
I'm a developer and a hobby artist, so I could be a bit off on the art front.
For development: it's helped me adopt new libraries a bit quicker.
It also helped with generating a first draft of documentation/comments, and with iterating on the phrasing of stuff I'd over-explained or not explained well enough. It's kinda made me a better writer as a result, since I learned phrasing techniques that I previously didn't.
It's helped me with boring stuff like converting data type syntax between languages.
It's good at doing autocompletion of lines or short snippets. This is a godsend in systems programming, where you type a lot.
It also helped me go from "blank page" projects or task by generating the boilerplate or scaffolding.
It's oftentimes wrong (and sometimes in funny ways), but it's allowed me to focus on the problem domain rather than the supporting and often boring grunt work.
I think an artist probably has similar situations on the job. It wouldn't be a job if everything about it were rainbows and unicorns. Sometimes artists do boring stuff. Why not get that out of the way?
Or if they're working with really poor requirements on something that's not super interesting to them, maybe they can do rapid reference generation or brainstorming through these AI tools.
Or say you realised you want to change the colour palette for a part of the scene. Instead of manually fiddling with the selection tools and curves/HSL/filters, just ask the AI to retouch it, then correct a bit, since it probably won't get it right. So just quicker iterations. I don't think it necessarily hinders the creative process.
In games I think it’s quite clear how it will help. We already do this to a large part with foliage and trees. There is a ton of content that isn’t emotionally important but it needs to simply adhere to the style and sit there in the environment
You can replace "Startup culture" with "Modernity" and it reads like many conservatives' feelings about progressive culture. I don't think anyone has ever cared about that, and I don't see why that should be changed because it now affects different people.
Not a solid read on my comment, but it's an interesting point. Sensitivity to those left behind as conservative defense of tradition.
I want every coal miner to lose their job. Every last one. That would be a fantastic thing and a necessary thing. That doesn't mean I want those people to suffer. We can help them out. We can give them assistance and job training.
In other words we can embrace the future without being cruel. That's not a conservative sentiment at all. And yes, many people would like to move forward in a thoughtful way that doesn't leave people in the dust.
As far as affecting different people, LLMs are about to squish the middle class. You can be quite sure that the impact will be very different than when similar changes affect the working class.
> In other words we can embrace the future without being cruel.
That's not "progress is necessary good". We're forming the future, our choices define the future, we're not damned to be powerless and either embrace what cannot be changed or suffer from resisting that which cannot be stopped.
Right now, lots of people are having their first experience that not all progress is necessarily good. It's fun to watch how they suddenly change their tune, much like banks when they need another bailout, and they start talking about "what's good for society" a lot.
> As far as affecting different people, LLMs are about to squish the middle class.
I'll believe it when I see it. We were promised widespread unemployment thanks to robotics, too.
I know people like to politicise things, since it usually creates a simple sense of unity and gives people a sense of moral superiority. Simple is good. We humans like simple. If we didn't go and simplify the things we don't deal with every day, our already complicated lives would become too difficult to juggle. Their political compass is generally irrelevant. And there are always people who don't like SOME type of change. Conservatives think trans people are crimes against God; progressive people think more process and regulation in trans conversations is a sin against humanity. Everyone has it hard.
I'm only mentioning this because it helps to sympathise with every person, not just the ones that agree with you. Life has more flavour when you're not focusing on "us or them"
> Conservatives think trans people are crimes against God
You're mistaking christian fundamentalists for conservatives. Conservatives are saying that changing your name doesn't mean you now get to compete in women's sports.
My point was: "Progress is always good, conservatives need to go" has been touted by the same people for decades, and suddenly they want to change the tune when it affects them. It's predictable and should be ignored.
This very well sums up the anxiety I have about how this is going to affect programming.
That any creativity and understanding will be sucked out and replaced with prompt engineering.
I still think you will have to be a good developer to understand if the solution you get is good, but is that even going to matter when you can just ask again and try again? I think there will be lots of code bases that no-one really understands, but that it won't matter much.
All of this is powered by work done by real programmers who are not getting a dime for their effort. That just seems completely unfair to me, and even more so when already insanely rich corporations like MS only stand to gain from it.
I'm not sure I want to be part of it to be honest. It feels dirty and wrong as it works now. Laws might be able to change that, but with a whole industry of giant corporations ready to fight any such laws, I'm not optimistic.
Even prompt engineering is just a short-term problem that'll get solved and largely disappear within a couple of years, if not months. Making sense of natural language is, after all, what LLMs are already really good at; they just haven't made their way into image generators yet.
Another big issue that gets overlooked is that a lot of the problems that programming is used to solve today will disappear and no longer be considered problems to begin with. A whole lot of data conversion is no longer necessary when you have an AI that can directly interface with the raw data and manipulate it in whatever format you like.
People today are thinking of ChatGPT as making it easier to build websites, software and such, but the real power of AI is that it makes all of that obsolete. Why bother browsing the Web when ChatGPT can just give you what you are looking for in whatever format you desire, without ever touching a website?
AI is not just going to write software; AI is going to be the only software you'll ever need. BingChat today is already Photoshop, shell, and web browser all in one. Won't be long until it learns to do most of the rest as well.
The interesting question is: what will be left after all this? If we automate away all the boring parts, which bits will be the ones that provide value? What will the website of the future look like when AI is the only one reading it? What's the value of a classic book when AI can write new books in real time? Will anybody bother with video games when they have a Holodeck at home?
> People today are thinking of ChatGPT as making it easier to build websites, software and such, but the real power of AI is that it makes all of that obsolete. Why bother browsing the Web when ChatGPT can just give you what you are looking for in whatever format you desire, without ever touching a website?
I don't think it will play out like this. Companies currently build systems to make the data robust/structured, to enforce a business process, etc. Since they don't trust the answers any individual would give based on the unstructured data, they would also not trust an AI, and rightfully so. ChatGPT is not trustworthy at all, and there is currently no path to making it so.
I found that companies are less interested in quality data than one might expect.
I worked in ontologies, focusing on making rigorous schemas for better data quality and easier integration. Our lunch was eaten by schemaless databases and big data. We protested that they wouldn't like what happened years down the line, but it was cheaper to start.
I don't mean to shake my tiny fist or overgeneralize my experience. But I think if AI gives good enough answers enough of the time, with a tiny fraction of the effort, people might just adapt around that model. People would rather have good now than great later. Or even mediocre now.
There is nothing stopping companies from offering AI oriented apis. If they no longer have to develop a frontend but just have a "here is our logo" etc endpoint, then... goodbye react! Goodbye FE developers! And at that point AI will probably be developing the very endpoints that it consumes.
For frontend - yeah, maybe. Or maybe the frontend becomes much fancier. I've built complex frontends that took a year or two, but I could imagine it all being done in weeks. I would also like to be able to build it in weeks.
I think that’s a simplistic view. Sure, AI will make a bunch of problems disappear… but do you really think that with all the power of AI we are going to keep publishing React apps? Golang web servers? REST APIs? No. We’ll be publishing more sophisticated stuff. And no, AI simply cannot magically do everything (AI today is not the AI we will have in 100 years… unless you think the AI of today is the most we can get from AI)… so in between, we (humans) will be working to use AI tooling to fulfill user requirements.
A lot of what we do as programmers, writers or artists IS 'prompt engineering'. All that AI is replacing (generalizing? automating?) is execution. Prompt engineering is the 'what'; execution is the 'how'.
"They should have sent a poet"- not NASA, but a screenplay written by some sort of poet.
If there's a singularity it will be in this: you speak of real programmers doing the work to make this AI, but it seems like a hell of a lot of this 'real programmer work' is not poetry. Someone put together the ideas that make code resonate with how people think (for good or ill) and then a lot of real programmers groveled over tedious Python code or whatever, to make it happen at scale.
That tedious groveling and scale is what took the effort, but it's that which can be replaced by machine learning.
Art is prompt engineering. 'Always has been'. We don't master penmanship in the age of the keyboard. People still buy very expensive pens and write in longhand for the beauty of it, but it ceased to be a requisite chore many years ago.
Value the correct side of the balance. If you can't beat the guy with poorer execution by having a better eye and better prompt-imagining, you're not the better artist, you just have better penmanship.
Same for programmers. Best put your effort into trying to understand… because that's going to be a seller's market. You're right that most people won't keep up with understanding, and that's going to directly set their value.
Good points. Understanding what it does and how it really works will be valuable.
Explaining things to the AI requires knowing what can be done. For example, you cannot just tell the AI "build systems that process the data"; you need to explain what kind of processing should be done. That by itself requires knowing what kinds of processing exist and when to use which. Also, explaining to the AI what should be done takes time and effort, which shouldn't be underestimated. Sometimes when I'm explaining to someone how data should be cleaned, I think it might be faster to just do it myself, and explaining it to an AI will most likely be even harder. So "is it easier to implement this myself than to explain what I need to the AI?" is a valid and important question.
Also, with data, an important part even without AI is verifying how well the system works. Does it correctly handle different types of data and different use cases? A system that processes data when you're not sure what the output might be is, in most cases, a useless data system.
So in any serious data system, understanding what can be done and how the code and data systems work is important, and I think it will become even more important than it is now.
Also, someone needs to create the AI and the systems around it.
My experience with Copilot is the other way around. I spend time writing my types/data structures and my functions, then Copilot tends to easily spit out several variations of the test just from my giving it the name of the test.
Won't the prompt itself be the test? TDD is literally just providing an expected output for a given input, before implementing the functionality. The prompt we give AIs follows the same pattern: "I have data that looks like ___, visualize it in a graph that ____"
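As a toy illustration of that: the same input/expected-output pair you'd put in the prompt can be written directly as an executable check against whatever implementation the AI (or a human) hands back. The function name and data below are made up for the example:

```python
# The "prompt" restated as a test: given this input, I expect this output.
# `summarize_sales` is a hypothetical function; its body could come from a human
# or from an AI asked: "I have data that looks like [('apples', 3), ('apples', 2),
# ('pears', 1)]; total the quantity per product."
def summarize_sales(rows):
    totals = {}
    for product, qty in rows:
        totals[product] = totals.get(product, 0) + qty
    return totals

def test_summarize_sales_matches_the_prompt():
    rows = [("apples", 3), ("apples", 2), ("pears", 1)]
    assert summarize_sales(rows) == {"apples": 5, "pears": 1}
```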
I agree with the loss of understanding being a bad thing, but I cannot mourn the loss of creativity.
Programming is a core part of our society and infrastructure, yet it's still practiced like a craft rather than engineering. Creativity should not be part of infrastructure efforts on a level beyond embellishments and experiments.
I do say the same about architecture, but with added nuance: the Sagrada Familia is creative in form, but not in function. It functions as a regular cathedral (or at least an art museum). The creativity is there, but it is subservient to the basic functional goals of structural integrity and fitness for humans to visit it.
Buckminster Fuller is creativity. Geodesic domes, tensegrity.
Failing to understand that is evidence you can be replaced by an AI that encompasses all of what's ALREADY known and understood.
The reality is, we're looking at an arms race between imagination and execution. And just implementing that which we already know is profoundly devalued in the age of machine-assisted intelligence.
People thought the same about OOP when it was new. The promise was that programming would turn more into factory work. You’d get a spec for a component, implement it, then move onto the next. How soul crushing!
I'm intending to try it out a lot more — perhaps turn my old Java/OpenGL desktop shareware games into browser based JavaScript freeware — but so far, it's like messaging a junior dev and having them write code for you: 98% of the code works fine, 2% is total rubbish that you spend a normal amount of time (i.e. ages) debugging.
> I still think you will have to be a good developer to understand if the solution you get is good, but is that even going to matter when you can just ask again and try again? I think there will be lots of code bases that no-one really understands, but that it won't matter much.
Really? If you have a million-dollar bug because of one edge case, are you going to go to your AI and ask it to try again?
Then you are in the wrong space of programming if you want to continue working. AI will improve over time and the scenario I described is going to be a thing of the past, but it won't happen as fast as people expect, because it is a question of trust and of saving that hypothetical money by actually doing the work, or at least understanding the output of the AI.
I've experimented with both Copilot and GPT and I've gotten some nasty UB and even known security issues. Just look at some of the GPT output lacking exception and error checking, clearly trained on tutorials. It will improve over time, but currently it is not as good as people say it is.
I've had experience lately of getting GPT-4 to output some code, noticing there's a bug, pointing out the bug (not even 'it's at this line with this variable', but just that it exists) and, yes, it does successfully fix it. So, yes, I think that's likely.
Well, that required you to understand the code. My reply is disbelief at the idea of codebases that no one understands appearing in the near term, given the current output. People don't trust the output, so they do what you did: review it and catch the potential bug before going further with it.
The issue I take with it is whether we are able to trust the output of AI models the way we trust the output of optimizing compilers. We are nowhere near being able to just fire and forget the prompt the way I can code something up and trust that 99.999% of the time the optimizer produces the desired binary that works as intended.
I saw AI as a tool like Photoshop, and for that reason wasn’t buying all this anxiety. I still see it as a tool, but this post has changed my entire perception of the issue:
It’s not that a new tool will replace you. It’s that a new tool might be too compelling to avoid, but be boring and miserable to work with and suck the joy out of your craft.
> I am now able to create, rig and animate a character that's spit out from MJ in 2-3 days. Before, it took us several weeks in 3D
Sounds like one artist in that studio is being let go, considering the studio is now getting work done dramatically faster. It'll probably be the guy with the (IMO understandable) "poor attitude" about what the job has become.
Even if you think the studio will simply output more art, they either need to find financially-viable uses for six times as much art, or let go of an artist - because if they just start using three times as much, that's still only enough work for one of the two artists.
> be boring and miserable to work with and suck the joy out of your craft
You don't have to use the tool - just don't complain that you're not as commercially competitive as someone else who does.
If the craft is what you enjoy, then do it apart from the craft's commercial utility. Make it a separate part of your life - use the AI in your job, and don't use it in your spare time doing art for joy.
Humans are allowed to be upset when their lives and livelihoods are upended. "Suck it up" doesn't really help anyone. We shouldn't accept maximizing profitability as the only worthwhile outcome.
I'm a developer. I really hope my day doesn't end up being code review of shitty Copilot-generated code because the person accepting the prompt didn't have the knowledge or care to check it themselves. Sure, I'd still be valuable and I'd still get paid, but I'd be miserable. At least until that skill is no longer needed. Or someone can do it for cheaper.
Setting aside I have other responsibilities outside of work that really don't leave much time for side projects, about the last thing I want to do is work on personal code projects just so they can be sucked up by Copilot. Not to mention I'm also supposed to allocate time to learn a new skill so I can be competitive in a new, rapidly shrinking job market.
One could argue that their art is also not theirs, but is based on the art they have seen throughout their life. It makes you reflect on what's really 100% original; as the saying goes, if you want to make an apple pie from scratch...
You know, I was finding this whole thread really depressing and starting to feel somewhat concerned about the impact AI was going to have on my own projects / future.
But actually this is a great point. You are right. Even in software development, the joy is building something excellent. The joy isn't usually in defining random util functions to help you get there.
With these advances we could be about to reach a point where extremely small groups of people could build phenomenal things. How nice it would be to be capable of building AAA games with a team of 2-3 people.
There were no doubt quite a few nail makers pre-industrial revolution who took pride in making consistent, quality nails. That work was utterly and completely obliterated by the industrial revolution. There may be a few remaining nail makers, producing nails with an extremely niche design, but not many. So go those jobs in our lifetime as well.
To complete this idea, old nails are better at holding than modern nails in at least some applications and are required for some restoration work. I remember reading (but can't find) a church-restoration project report mentioning it, but more can be read here: https://news.ycombinator.com/item?id=30421682
What's the solution when so much knowledge work is upended at once? When manufacturing jobs were lost, we callously told people to go to college to learn IT. Whether it's art, code, marketing, or other applied knowledge work, there's a mad dash to reshape all of those. And the sales pitch isn't that it's going to let us work less and equitably reap the rewards of productivity (well, I suppose if you lose your job you get to work less). It's that as a business you can do more with less to increase profitability. Quality might take a hit, but there's an inflection point for that. If sales drop but you reduce your labor costs enough, you still come out ahead.
What field are all of these nail makers supposed to switch to this time?
This is what I wonder as well. We are told to "prepare" for the reality of AI taking our jobs soon, but how? What is the alternative field that will be a) safe from automation for a while and b) pay similarly to a CS/IT career? I haven't found the answer to this. We were told to get a higher education and go into "white collar" fields, and now we're told we were stupid for believing that these jobs would have a future. Seems very cynical.
Universal basic income is always the answer. Since progress brings such unpredictability to the labor market, we need to detach survival from labor so people can survive while they figure things out.
Universal basic income is not sustainable in the same way the minimum wage was not sustainable, looking at how even $15/hr isn't enough to cover rent basically anywhere they actually pay $15/hr.
> looking at how even $15/hr isn't enough to cover rent basically anywhere they actually pay $15/hr
That's a bit of a different problem. What UBI could do is let people migrate away from very expensive places, because they would no longer fear becoming destitute if they risk not being able to get a job in a cheaper place.
Minimum wage is sustainable because it is usually set based on economic reality of each country to a value that affects just a small percentage of laborers.
In theory, yes; in practice that would require some way to channel the profits from the tech back to society, which has proven a tough nut to crack. Notoriously, Keynes predicted 15-hour workweeks due to the productivity improvements of the 20th century. Well, the improvements happened, but work weeks didn't get shorter.
They got shorter in practice. Especially post-lockdowns lots of people are phoning it in to well paid jobs whilst working from home with Netflix on, taking long lunches, playing with the kids and then doing unchallenging work like meetings with other people who are also phoning it in. Yes they "work" 40 hours a week in theory but this sort of work would have seemed completely alien to someone in Keynes' era.
Another thing to note is the length of retirement has increased significantly since Keynes' time. So if you look at a person's whole life, the amount of time at work certainly has decreased.
Which scenario are you referring to? The one where everything is cheap because of AI, or because of the acidic oceans?
Various sea organisms rely on calcium carbonate for the formation of their shells and bones. If the water is too acidic, there is less calcium carbonate in the water…
Oh, and what I'm getting at is that things like food don't necessarily have to be cheap just because "AI" produces goods. Ecological collapse can make essentials expensive regardless of who produces the goods.
I think a better analogy is that pre-photography there were many portrait drawers who took pride in drawing good portraits for their clients. Then photography came, and the entire industry died a sudden and miserable death.
Portraits still get drawn, but not many, and certainly not for the sake of preserving what somebody looks like.
There are still companies which provide you consistent, quality nails. Just ordinary nails without a niche design. They still have to use manual labour to check and investigate each nail one by one. Of course, you won't get those in your local hardware shop, and they are terribly expensive. The difference compared to the old times is that they still sell the cheap Chinese nails; they just add a layer of very expensive quality control.
The Unabomber was notable for manufacturing his own screws, so I'm not sure how many artisanal nail makers remain. In those days nails were re-used; it was such a pain to make them.
This is a tragic disappointment a lot of (if not most (if not an overwhelming majority (if not literally all))) artists experience at some point early or late in their professional careers.
I taught art for a long time, and I would always try to frame this inevitable experience as an important moment of growth: a moment when you have to form a deeper relationship with your creative work and think strategically about how you want to support yourself long term.
There's no getting around it, it's really hard seeing what this is doing to visual artists. The few of us that could find good work doing what we love...
What's an analogy in the software world to this sort of tragic disappointment? Going from building fun videogame side projects to having to crank out dozens of over-engineered B2B SaaS microservices, for the sake of job security, for yet another sales workflow optimization tool?
Instead of spending weeks thinking about and designing some sweet algorithm Knuth himself would be proud of, you write the same CRUD app with the same design patterns.
That's the change in perspective. They are inefficient from a CPU point of view. They are efficient from the perspective of "churning out more shit for less". So it's the same, soul destroying effect as for artistic professions.
There's been a bunch of similar transitions, perhaps less massive. You see the echoes of them occasionally when you read people getting upset about the poor performance/resource usage of modern software. A hacker was once someone who could write assembly and packed an entire graphical operating system into 4 MB of RAM. Nowadays a hacker is someone who knows what useEffect does in a ReactJS UI. The people who loved the metal, who loved knowing what every byte of their program did, are mostly relegated now to hobby programming. Or maybe some embedded work, but that's not particularly well paying. SerenityOS is a love letter to this era of development.
Or even just going from writing video games by yourself, getting to do all of it (graphics, "AI", design), to being one of a hundred programmers churning out the latest version of some "AAA" title. I now work at a FAANG, and in many ways, it is more rewarding. I'd love to be part of a five person indie studio making great games but it is a fucking lottery and my kid is heading off to college.
I used to have a job I liked. That job went away in the name of efficiency. I can't even disagree with the analysis - the old way was less efficient. It still sucks to lose something that you enjoy doing and that helps the team.
I'm doing fine, I went from specialized tech support to software development; but when we talk about efficiency, we are talking about people's lives.
As a traditionally trained artist, my advantage over the years always was the knowledge of composition, color and my drawing skills when producing the visual part of the products.
I focused early in my company (2005) on UX and UCD processes, and over time realized that people don't care about craftsmanship or maximizing quality.
We are approaching a time of generative design: personalization of UX through data-driven generative interfaces. So even the flow and journey mapping will be automated soon (UX researchers, hello).
For me, this is the end of an era of innovation driven by human creativity. You will have a data scientist and a secretary who prompt the "boss's" vision all day.
A.I. art is not art.
I will give you a simple example: Who will be the creator of the famous David sculpture? Michelangelo or the rich patron who gave the description of the task? According to the definition of A.I. "art" the patron is the artist.
So in this scenario, art becomes a commodity generated without human touch or personal expression: just a synthetic membrane of averaged results trained on stolen data, a genius intellectual-property laundering machine.
Personally, my decision is to capitalize on the remains of an industry and detach myself from any form of metaverse interactions. I started to paint again recently and don't give any F* about the commercial viability of the process. If people of the future want to live in an A.I.-induced corporate coma, so be it.
> A.I. art is not art. I will give you a simple example: Who will be the creator of the famous David sculpture? Michelangelo or the rich patron who gave the description of the task? According to the definition of A.I. "art" the patron is the artist.
No? What are you even talking about? AI is not a person. In every single legal and ethical discussion, "human" and "non-human" are treated completely differently. AI is a tool, while Michelangelo isn't.
Who the creator of the David sculpture is has nothing to do with whether AI art is art or not.
"Trained on stolen data"
This is emotionally charged wording with a clear confrontational bias and a dark intent to discredit.
As someone with over 30 years of experience with art and artists, do you realize how many influences are present in any given artist? Do you know how many artists use "reference images" heavily? Do you know how many artists trace images (especially in 3D modeling)?
By your logic, every piece of art is stolen, if not from other artists, from reality itself. You are insulting every artist present in this forum with your bias. Thanks for that.
Also, art made by commission or contract is technically property of the one who paid. Only the top recognized artists get to retain ownership of a specific piece or design done for a company, and even that is rare. How many cases are there where artists jumped ship off a property and needed to make lawyer-friendly pastiches of the characters, because they didn't own the rights?
And the Unet does not store images. This is not a photobasher nor an extreme compression algorithm. You'd know this if you had bothered to try. Which I have. I have trained it on my own artwork, and in fact I maintain one of the largest guides about how to train Stable Diffusion using Low Rank Adaptation. You can't just "take out" an image in any exact capacity, and a lot of "proofs" are fabricated using img2img or models trained so aggressively on a single image that the output is going to resemble it no matter what (and yet it'll never be reproduced 1:1). People fueled by strong emotions are not incapable of deceit. In fact, that's the stuff you MUST double-check, precisely because it is fueled by emotions.
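For readers unfamiliar with what "Low Rank Adaptation" actually trains, here is a minimal, self-contained sketch of the LoRA idea in plain PyTorch. To be clear, this is not the commenter's Stable Diffusion training setup or any particular library's API; the class name, rank, and layer sizes are made up for illustration. It only demonstrates the point above: a LoRA adapter learns two small low-rank matrices per adapted layer, which is part of why it is far too small to literally store the training images.

    # Minimal sketch of Low Rank Adaptation (LoRA) in plain PyTorch.
    # Illustrative only: not Stable Diffusion training code, just the core idea
    # that the frozen base weights stay untouched and only two skinny matrices
    # (A and B) are learned per adapted layer.
    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 1.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():   # original weights are frozen
                p.requires_grad = False
            self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
            self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
            self.scale = alpha / rank

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # frozen output plus a low-rank correction: base(x) + (x A^T) B^T
            return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale

    # Adapting a 768x768 projection adds only 2 * 4 * 768 = 6144 trainable numbers,
    # versus the ~590k frozen weights it modifies.
    layer = LoRALinear(nn.Linear(768, 768), rank=4)
    out = layer(torch.randn(1, 768))
    print(sum(p.numel() for p in layer.parameters() if p.requires_grad))  # 6144

During fine-tuning only lora_a and lora_b receive gradients, so the shippable artifact is just those matrices, typically a few megabytes for an entire model.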
Sorry, but the idea that artists are mere synthesizers of past art is ludicrous.
The human is not the machine. There are a billion factors of interpretation in creating a personal style. The A.I. image generators are just machines which produce an average of all previous artists. They have no personality or agency (yet).
> Stable Diffusion trained its algorithm on data sets collected by the German nonprofit LAION, which has collected billions of captioned images from art shopping sites and websites such as Pinterest.
And it has done so without consent, causing artists and advocates to raise copyright concerns.
It isn't human-generated art. It would also be difficult to really label the generating process as a process that generates "art". I'd rather only label the end result as "art" for people who think it qualifies. It's the same as not considering photography itself as art. How many photographers actively contribute to camera technology? Or how many people who have worked on camera tech can profess to be actual photographers? Very, very few in either case. It's similar for prompting for an image. It's a "skill", the same way photography is a "skill". You'd have been hard-pressed 20 years ago to take pictures of a quality consistent with a high-end DSLR today. Yet are people today "better" photographers just because they own better cameras? Or is an "AI artist" a "better" AI artist because image generation got better, and/or he got a much better model (seeing how the quality of diffusion-based image generation depends wholly on the model being used, and not so much on prompt tuning)?
> I will give you a simple example: Who will be the creator of the famous David sculpture? Michelangelo or the rich patron who gave the description of the task? According to the definition of A.I. "art" the patron is the artist.
Yes, and I hope that changes, but it won't as long as there is a monetary incentive for companies to churn out models. Genuine human contribution to the models will also decrease exponentially, because everything that could be mined already has been. They can make small gains with labelling, but beyond that they hit a hard ceiling.
Attribution is very finicky right now because there's some effort involved in generating those images. If it was completely effortless (with a UI that labeled it as such) and gave sources / similar images, you could stop calling it "art", and instead just plain old machine-based image generation.
Just yesterday, I gave ChatGPT prompts to generate Midjourney prompts. It transformed a simple sentence, "Portrait of a beautiful woman on the beach with a yellow dress," into specific prompts with different lenses and lighting setups, generating astonishing results. My input in the process was minimal. No compositional skills or artistic taste. Powerful tool? Yes. But productivity is not an artistic measure. Transitional value and personal expression are.
But hey, be productive while you can.
These AI systems will sound the death knell for the creative culture of openness and sharing that emerged during the web 1.0 era with the Creative Commons and OSI-approved licenses and sites like DeviantArt, NewGrounds, and SourceForge, and which still exists today, albeit in a less idealistic form.
Once people realize that by putting something, even just a blog post, out for public consumption, especially if done so under a permissive, share-alike license, they're helping to train an AI that will eventually compete with them for their job, they will not only be less inclined to share it (under a permissive license or not), but be less inclined to produce it at all.
I don't think you actually need copyrighted data to train AI to compete with artists. The argument over copyright is interesting but kind of a sideshow. We already know that copyright protection is not a prerequisite for creativity - as it currently stands copyright only protects the rich.
The real reason why AI is going to kill DA/NG/SF/GH/etc. is spam.
Let's say, for every piece a 100% human artist can do with no AI assistance, another artist using AI as a tool can make three. They start off with a prompt in DALL-E, SD, or Midjourney, and then tweak the output with prompts and so on to get exactly what they want quicker.
Now imagine a non-artist using the same technology. They don't care about the touchups. They just want to churn out lots of pieces all at once. So that would be nine times more output compared to the unassisted human - three times three. And there are plenty of them doing this, because they all watched the same video on how you can hustle T-shirt and poster stores by submitting lots of AI-generated pieces.
The reason why you are seeing art sites ban AI art is purely a matter of practicality. Many of those spaces are able to provide free promotion to people who post quality content because quality is itself a limited resource. Now the human trying to make their magnum opus - AI or no - is going to be buried under lots of low-effort garbage from zombie modernist grifters.
I think it's fair to say that what this current generation of AI enables is cheaply generating a limitless amount of content of not great but acceptable quality.
And I think you're exactly right that this is going to make it extremely difficult for the great stuff out there to rise to the surface. It's going to be suffocated.
We'll soon see the point where AI can generate text, video, voices simultaneously. All of the tik tok/reels style content people mindlessly scroll all day will be AI generated, delivering money to only the AI's creator, and no one will know the AI is behind it.
We are already seeing the same result with LLMs: sci-fi magazines (temporarily) no longer accepting submissions, and self-publication platforms such as Amazon's being flooded with auto-generated crap.
Hustle culture is part of it, but only one part of what I believe will be a negative outcome for society.
As AI systems display inherent bias, and it is possible to train for that bias (e.g. RightwingGPT), we have essentially weaponized content generation, and we have no scalable mechanisms such as fact checking to combat it.
Whether it is low-quality work, spam, disinformation, outright manipulation, copyright violations, or other forms of negative results, we have automated the ability to produce it.
You could argue that these LLMs will benefit more users than the value lost to a sea of SEO garbage. But my point is that the openness won't stop, because people will still need to show their craft for better employment opportunities.
From all I have learned about ChatGPT coding copilots, this is going to be bleak. Similar art styles. No sudden surprises. Eerily familiar characters. I don't like the look of this. More reason to cherish hand-drawn/hand-animated (?) unique works though.
If you're familiar with the Rick and Morty "what is my purpose" meme ( https://www.youtube.com/watch?v=sa9MpLXuLs0 ), it will essentially be the same for humanity in the future.
That is technical minutia that will improve before artists relying on that hurdle to save their jobs have time to sign up for the dole.
don't get me wrong, I have sympathy for everybody who for whatever reason sees their livelihood endangered overnight through no fault of their own and with little time to adapt
but I perceive a lot of denial:
- denial that "it could not/would not happen to me"
- denial that this rate of change is rather unprecedented; the "it was always thus" kind of denial
- denial that just because a certain niche won't be quickly automated, it follows that this job is safe or that a general reshaping of that general industry won't happen, changing the incentive structure massively
- denial that second-order effects may get you; so you can now automate some expensive part of your process? That also means the barrier to entry for the competition is lower, and whatever competitive advantage you had in that process is now gone. Sure, embrace the efficiency, but keep adapting, because the only thing sure after these changes is that your industry will experience some "creative destruction"
How do you know it won't? I don't mean to sound like a doomer, but I never see evidence for these claims along the lines of "AI will never be creative like a human!"
These AIs are literally trained on all of human creativity… I struggle to see how there's any indication AI would fail to be as creative as a human in even the near-ish future.
They aren’t trained on all of human creativity. Most of human communication is verbal, and that still isn’t captured. I am not saying whether it will have much difference, just that your statement is factually wrong.
I used “all” instead of “most”, which is indeed inaccurate. A better response would have been “a collection of humanity’s creativity”.
However, you fail to recognize that OpenAI also created Whisper, which is a quite capable speech-to-text transcriber; and this tool easily converts audio and video (those verbal bits you mentioned) into text.
So the pool of creativity which OpenAI can train its models on is far larger than just original text; further, they demoed image modality a couple weeks back which would allow for VISUAL creative works to be parsed as well.
I understand your points, but what I wanted to highlight was something slightly different: verbal communication and dialogue should still consist of orders of magnitude more human conversation than text, or presentations, or basically anything digital.
It's even simpler: there are conversations that straight up never happen in writing. So by training only on written text, you will never get those conversations. For example: a teacher-student conversation where one side is confused and needs things explained, or many kinds of debates and discussions where there is no final conclusion.
Basically, pick a random human off the street and they would be talking and listening a lot more than they would be writing. That talking and listening is never going to be captured by any digital system.
It might end up that written text has enough similarity with verbal communication that it doesn't matter anyway. But it's hard to guess a priori that that would be the case.
Quite the opposite. AI will be better at learning what "surprises" humans than humans are, because it won't be filtered by a human going "oh, that wouldn't surprise me" (because you just thought of it, idiot). Now imagine a million, a billion "banksy" "surprise" pieces. This is the future.
And the future is now, of course, because there are tons of such artists, and the value comes only from the elite who ascribe value, not from some inherent value of the thing itself. Things are important ("worth money") because tasteless fucks with money say they are worth money. Go watch "The Menu". Ralph Fiennes' face cooking the last meal says it all.
> I am now able to create, rig and animate a character thats spit out from MJ in 2-3 days. Before, it took us several weeks in 3D. The difference is: I care, he does not. For my boss its just a huge time/money saver.
This is the crux of it. The bottom line is that you are free to practice your craft and love it, but don't expect a company to pay you big bucks for it for no reason when they can find cheaper and better ways to accomplish the task. Automation eating away jobs (including ones that had a quality of romanticism associated with them) has been a constant in human history, and it is silly to expect that it won't apply to you for whatever reason.
>I am angry. My 3D colleague is completely fine with it. He prompts all day, shows and gets praise. The thing is, we both were not at the same level, quality-wise. My work was always a tad better, in shape and texture, rendering
No company is going to spend 5X more time on work that is a “tad” better.
Yesterday I visited an old, but still reasonably healthy man who has been living in Thailand for 40 years now, maybe longer. Reasonably healthy as in: he doesn't need to visit hospital or use medicine, but his power is now waning a bit due to age.
A man originally from Switzerland. 88 years old now. When he arrived in Thailand he bought a nice plot of land and made a quite beautiful garden there, with many fruits and vegetables and trees. Also many salas spread throughout the 8000 m2 terrain. And chickens, dogs, cats walking around and about.
The man has had quite an adventurous life in the past, back when the world was more adventurous as well, perhaps ...
Anyway, the man is content just tending to his plants. And reading and writing in one of the salas that he uses as a (kind of) outdoor office. In the past he also used to paint, but he's not able to do this anymore.
Maybe we'd all be happier with just a nice plot of land and some plants to tend to and perhaps some hobbies to keep us busy. And maybe it would keep us healthier (both mentally as well as physically) as well.
What's your point? Many people would be happy if they are lucky enough to be born in a wealthy country and save a bunch of money by the time they're 48 and retire, move to a poorer country and be happy tending a garden. Not many people are this lucky and it will be harder going forward as such desirable cheap places are becoming less cheap rapidly, and less accessible. "Golden visas" in such places are under scrutiny as they are just driving up real estate prices with little benefit to locals.
Also foreigners can’t own land in Thailand, so while I hope the Swiss gentleman has many more years of happiness, it was never his garden, which might be a turnoff to the techno-libertarians out there.
My working definition of “foreigner” excludes Thai citizens, though maybe it’s more subtle than that: I didn’t know naturalized citizens were subject to ownership limits.
I won't be able to afford a plot of land without a job. I feel like we're on the verge of cataclysmic unemployment and I have absolutely no desire to wait in squalor for legislation to catch up with the nifty little UBI idea floating around which seems like should have been sorted yesterday.
An acre of farmland costs $2000 in North Dakota. You need 5000 of them to make a living growing wheat, but an acre should be enough to feed a family, maybe a few more if you want meat.
Being a subsistence farmer is terrible though. There's a reason 99% of Americans stopped doing it just as soon as they were able to. It's certainly not a comfortable retirement!
Whenever somebody claims it would be easy to feed a family from some small farmland, it’s very clear that they’ve never had to live anywhere near that lifestyle. It’s terribly difficult and unpredictable.
I think growing a single crop is risky, makes it harder to be self-reliant. My girlfriend grows many different fruits and vegetables, so we’re less bothered by the price changes each season or year. And we can be sure our own food includes a lot of organic produce.
> Maybe we'd all be happier with just a nice plot of land and some plants to tend to and perhaps some hobbies to keep us busy.
And enough money saved up to buy land in Thailand and retire at 48, without worrying about how he's going to pay for taxes for the rest of his life, or how he will buy food to supplement what he's growing in case of issues there, or...
"Maybe we'd all be happier with just a nice plot of land and some plants to tend to and perhaps some hobbies to keep us busy. And maybe it would keep us healthier (both mentally as well as physically) as well. "
And where is the fantasy world where "we all" have enough money for this?
I don’t think huge wealth is needed, but it very much depends where you buy the land. Coastal land or land near big cities will be much more expensive than land in a remote village.
He has not, but has a wife and daughter. Possibly other kids. I did not get the impression from him that he was rich, but he did get lucky to buy a large plot of land for cheap due to:
A) living in a remote village
B) Currency conversion
The house itself was also pretty cheap, being a traditional Thai wooden house and not particularly huge.
Sadly, we are not headed to such a future. The economic benefits of these LLMs will go only to the capitalists, and any labor savings will allow them to reduce wages and headcount while leaving workers unemployed or with less bargaining power.
AI will be yet another tool to further wealth inequality. We won't be able to afford retiring and tending to our plants.
This is a solution that will work for a very small percentage of the people - those who were born in rich families and those who are a lot more driven and/or smarter and/or luckier than the average.
What about the other 99%? As they say a revolution is three missed meals away...
All is not lost. I am sure many of us do care. I feel the frustration and pain of the OP. And I am someone not easily swayed by emotions.
The shift that's happening right now will have far-reaching consequences. I often ask myself "at what cost", but one can ask that question for any major invention in history.
Wait till more developers are being affected and their pie in the sky ideas about the future don’t come true. I promise you the tune they’re singing will instantly change.
I think you are seeing what you want to see. When I read your nail maker quote, it did seem bad. When I read the full comment, it seems much more sympathetic.
The first one is a little unoriginal (but this conversation has been had a bunch so it's hard to fault it for that). The second two we were all working on downvoting and had them near-invisible until you quoted them, restoring their reach. :P
Very true. Much as I like AI on the technical side, I also look forward to the howls of outrage when some large model comes for the smug squad. Coding, trading, and management are all eminently replaceable skillsets.
He doesn't need empathy. He needs to find a way through to a better advantage over AI. I'll give him compassion as he struggles through.
Creating static images (that do not require a significant amount of visual depth and perspective) is a dead art form. The market will accept flat looking images as long as they're drowning in detail, and AI can generate that.
I'm taking a stab at what's next; he needs to take a stab too and skip over the loss of love for an out-of-date computing skill.
In a world where you have no "value" you become trash to those who "do business"; it won't be a pretty transition. I'm concerned about it personally.
It wouldn't be a problem if most people just respected each other, and I think deep down inside they do. But as long as those paying our "wages" are such an important part of our lives, we'll have trouble.
I suspect this isn’t much different from the direction software development is headed.
You can either find satisfaction in the process or the outcome. The process can already be fairly mundane and will become more so with time.
A couple of other random thoughts:
1. Currently, these things work best with (really, need) an operator experienced in the craft. Ironically, by eliminating entry-level positions they may make it hard for new developers to gain that experience and enter the industry. Instead, the entire industry may become one of increasingly aged SEs prompting LLMs to update code and (occasionally) auditing the output. Somewhat akin to the COBOL developers of today.
2. If we really do get AGI then the marginal value of creative output (be it software or 3d art or otherwise) will be driven to zero. It will be so cheap to produce that competitors will easily arise.
>The thing is, we both were not at the same level, quality-wise. My work was always a tad better, in shape and texture, rendering… I always was very sure I wouldn't lose my job, because I produce slightly better quality.
One thing I had to learn the hard way in programming: it doesn't matter how good my code is or how modern the framework is, what counts is results delivered to the customer. Any newbie can outperform me in that regard.
Maybe it's a bit like professional software development where developers no longer get to create products, they just are code monkeys that complete tickets that are specified to the Nth degree and there's no creativity or fun and indeed if you do anything except what's on the ticket then you'll have to answer for it.
If you are a developer who loves the art and craft of creation then you have to do that at home.
On the other side of the fence, I'm extremely grateful tools like Copilot can automate the mundane process of looking up magic incantations to get some (usually poor) library to do what I want. I have no sympathy for programmers who saw it as a point of pride to memorize hundreds of built in functions and annoying minutiae.
Knowledge of annoying minutiae can be indispensable when you're trying to debug code that looks like it's calling the right functions based on the function signature.
This seems like a poor use of Copilot. At least in the JS ecosystem, pretty much every library has TypeScript support now. No need to guess or look anything up. Copilot can still be wrong, especially if the API changes over different updates
In my experience Copilot's value add is simply autocompleting simple/boiler-plate-y lines for me. Definitely wouldn't wanna trust something as fuzzy as an AI to "memorize" APIs on my behalf
I mean, I'm using it (also nvim). It's powerful autocomplete, speeds me up by maybe 20-30% when in the "typing in code" phase, and that's that. I don't see it replacing software engineers any time soon. SWE+Copilot will replace just SWE though, that one is for sure.
We've created a society where there are such things as 'jobs' and unless you're rich, you need a job to support yourself and maybe a family. I think this model of society worked reasonably well for a while, but with AI we're coming to the end of that era.
In the next era we'll probably live like aristocrats, but instead of the work being done by servants, it'll be done by AI.
Back to the case of the 3D artist, in this new era she won't need to have a job to support herself, and she will have the freedom to create her art as she wishes.
We've needed "jobs" to support ourselves since we descended from the trees. Actually before that too. "Society" came about because we invented tools and techniques to make our jobs easier so people could spend more time on different things. "AI" is some more steps down that same path.
> In the next era we'll probably live like aristocrats, but instead of the work being done by servants, it'll be done by AI.
Not in the way you imagine. We live like aristocrats today. We live in vast mansions that have hot running water, electricity, refrigeration, heating and cooling, lighting, we have servants bring us gluttonous feasts whenever we like, we careen around in our horseless carriages and flying machines. We laze around many hours of the day wasting time on frivolities like watching TV, playing computer games, reading books, drinking and taking drugs, etc. We have armies of doctors and security at our disposal if we should break a leg or even get a sniffle.
So we live better than kings did 200 years ago.
Life might get better for the average commoner in another 200 years. There will still be the haves and have nots though. People with a lot of wealth won't want to share it with others, and people with less will be envious of those with more, that's the constant in the human condition.
Nobody with the robots is going to give money to a bunch of hippies to laze around. If they want a piece of it they'll have to either work to create it themselves or to take it by force. Either way it won't come for free.
> We've needed "jobs" to support ourselves since we descended from the trees
Do you think the lion's job is to hunt? And the zebra's job is to eat grass?
A job is something you do in exchange for something else as payment. If you go as far back as to when our ancestors descended from trees, you will need to explain what kind of jobs were around then, who worked them and who paid.
> Do you think the lion's job is to hunt? And the zebra's job is to eat grass?
Yes. Difficult, cruel, and brutal jobs. Few people would choose to change places with them.
> A job is something you do in exchange for something else as payment.
The broader term is something you do to get your bread. Working for yourself, and cooperating and exchanging with others, are all jobs.
You can go out on your own and start a business, it's not insurmountable, everyone from a street vendor to a corner shop to a farmer to Apple has done it. It's something that many can choose to do if they wish, but many choose not to. Can't avoid that by nitpicking semantics.
I'm not sure why you're getting downvoted for this. Surely most people would think it's a bit of a stretch to say that lions have jobs in the same way that 21st century humans have jobs?
I can't imagine walking up to a lion in the savannah and asking what their job was :-)
It was a pretty weak comment that didn't address the point. The first humans certainly spent a lot of time working to get food and shelter, and stayed awake worrying about not getting enough to eat.
"Society" didn't enslave us with jobs, we already had them.
I agree with a lot of what you write, but in my experience in the UK there can be instances where money from the wealthy has been used for the common good. For example, until 1945 only the rich could afford to educate their children, but the government decided to use taxpayers' money to provide free education for every child up to the age of 16. It was a utopian idea, and when it was first suggested most people said, 'yeah right, nice idea but of course it'll never happen'.
Are you aware that it is not the rich who sustain the state and provide 99% of its income, but the middle class, right?
That's why politicians who campaign with ideas like "taxing the rich to fix X problem" are pure populist bullshit. You are the one who is gonna end up paying more.
First there are not enough rich people compared to commoners, and second, they can and will hide their money if possible and worth it.
So in the UK, as in any other state, what did and does provide "free" education is your money and the money of the ones like you in the middle and upper-middle class.
Surely it's true to say that in the UK some people simply couldn't afford to send their children to school, but now every child can attend school regardless of their parents' wealth?
Government exists to solve the collective action problem (i.e. prisoner's dilemma). In an unregulated market, at times the winning move has lots of unaccounted externalities, so government steps in to realign incentives.
In this case, free education was possible if everyone collectively decided on it, but the market was unable to coordinate those people effectively. Government stepped in and tada - you have free education.
> I agree with a lot of what you write, but in my experience in the UK there can be instances where money from the the wealthy has been used for the common good. For example, until 1945 only the rich could afford to educate their children, but the government decided to use taxpayers' money to provide free education for every child up to the age of 16. It was a utopian idea, and when it was first suggested most people said, 'yeah right, nice idea but of course it'll never happen'.
And yet you ask people today and they'll moan about never having had it worse, boomers this and billionaires that (or tories I guess, I don't know what the equivalent go-to brainless insult is over there).
Not that there aren't valid reasons for complaint, but as I said envy (and greed) is a fundamental part of the human condition. I don't know why people think some utopia is just around the corner -- we've had thousand-fold multipliers in production. Fossil fuels, internal combustion, farming and construction machinery, factories, chemicals, computers, industrial robots, electricity. From pre-historic farmers and hunter gatherers to now production has increased unimaginably. The utopia never comes because that was never what was preventing it.
People who want there to be a utopia where nobody works and everybody is happy sharing everything fundamentally want to extinguish humanity. Because that's not what humanity is.
So I think there's a distinction between wealth and happiness. Amongst the aristocracy I'm sure there is a lot of misery. I think we've got a good chance of ensuring material abundance for everyone on the planet. Whether those people will be happy is a different question.
There is a distinction, but we already live like aristocracy. And aristocracing a bit more isn't going to change much in the social dynamic. "AI" isn't likely to be any more transformative than the Haber process or the steam engine or the computer, as far as I can see. Some peoples' jobs will become redundant, same as always.
This next era may take 100s of years though, AI taking up physical labor is still science fiction (and I'm not fully sold on the utopia angle). She'll need to support herself much sooner than that, let's be realistic
I don't think so. I think in the future we will be living pretty similarly but instead of just doing the work manually we'll be using AI to be more efficient. Just like what the original post on reddit was referring to.
He was not fired or anything, his bosses just told him "now you need to do 10 times the work you were doing before because you got the AI". Although in the past it was okay for him to do 1/10th of the work, now with the AI, that has become unacceptable.
Why would the socio-economic system change? We've been automating work for a while now and people don't really get fired over it; they are just tasked with producing more. Remember that the computer itself was a huge productivity enhancer. When excel came out suddenly one accountant could do the work of 10 but most people weren't liberated from their tedious manual algebraic works, instead, we simply started to produce significantly more than before.
Of course, I would love to live in the world that you dream of, I'm pretty sure that's what post-scarcity communism looks like. I just don't really see any indicators in terms of public policy or business attitude that indicates that will be the case. Seems to me like our societies are more geared towards growth rather than improving quality of life.
> Of course, I would love to live in the world that you dream of, I'm pretty sure that's what post-scarcity communism looks like. I just don't really see any indicators in terms of public policy or business attitude that indicates that will be the case.
That's true, but let me give an example of something I've experienced in my own lifetime. In about 1996 I heard about Open Source software and after wrestling with the idea for a bit I became a convert and threw out Windows and installed a Linux distribution on my home machine. At the time, when I talked to people about Open Source software, there was a near universal reaction of, 'nice idea, but it'll never catch on' and they would go on to give me many reasons why this utopian idea would fail. Now, 25 years later, Open Source is a great success.
As a postscript, I got into making contributions to Open Source and I still do. Maybe one day my mind will be too addled to code, or perhaps an AI will do it better than me, but until that day I fully intend to carry on with Open Source contributions.
I'm all for Open Source. And Linux is a great example.
But look at what Open Source is turning into, it's being swallowed by corporations anyway. I guess I'm just not so positive, it's really hard to go against the grain and I think a world where corporations do open source is better than one where they don't. But Open Source has not changed the underlying structures of the world in any significant manner.
I mean, I think this is ultimately a political question, you don't really solve political questions with this kind of technology. We need better social and political technologies, much like when we went from having kings and feuds to having liberal democracies. I see that as a social technological advancement, what's next?
I recommend you read the book "Open Democracy" by Helene Landemore to get an idea of what I think maybe could be a good way of changing the political structures that are basically constraining us right now. And I do think Open Source plays a part in that.
But as a utopia it failed. There was a time when you could "live" inside the open source idea or community like in a house. That was around 2002, plus or minus 4 years. Open Source later succeeded, but it doesn't feel like that anymore.
I remember the 90s when manual typesetting and page layout were replaced with Quark Xpress, and when analogue photo retouching was replaced with Photoshop. Lots of highly skilled people suddenly out of work - the ones who didn't retrain to use the new tech. It wasn't overnight, but it felt like a revolution. Then of course magazine publishing lost out to the internet.
I would love to spend my days making stained glass windows, or as a dry stone-wall builder in the Yorkshire Dales. TS, there's no way to make a living in either.
I feel like we are at a bit of an industrial revolution type stage here - lots of old jobs that were previously done by hand (spinning yarn, weaving, making bricks etc) are about to be industrialised and automated. I don't think it will happen overnight, but perhaps a decade or two and a lot of jobs/industries will be irrevocably changed.
I am surprisingly sanguine about it. I think there will always be a need to understand what needs to be built, even if it is the AI that is actually coding it up. SWEs will probably end up in more of a supervisory role, responsible for ensuring that the requirements are suitable for automated code generation. Think TDD, but with AI writing the non-code bits.
There's no doubt that this artist has lost something as a result of AI (the pleasure and enjoyment of their job). But we should also ask if we are at risk of losing something as a society.
There is something to be said about how dangerous it is to offload your entire workflow to entities that you are competing with. Or have unpredictable morals.
Say you start relying on some AI to build all your applications or generate all of your art, or copy. What if you start developing something that competes with the business interests of the entity that owns the AI? At that point, that entity is incentivized to lock you out of their system, leaving you with a project that cannot be finished because no one in your organization knows how to develop that kind of project.
Or, say you are developing an application that helps some marginalized group or is working to expose some kind of corruption. But, as it turns out, the owner of the AI profits from that marginalization or corruption in some way. So they lock you out and blacklist you in some way.
When it is all said and done, AI will be owned and operated by the same people that own everything else in the tech industry - Google, Microsoft, Facebook, and Apple. Are you really content with only being allowed to express yourself in the way they deem appropriate?
> There is something to be said about how dangerous it is to offload your entire workflow to entities that you are competing with. Or have unpredictable morals.
Yep, this is why I stopped using Github Copilot. But now we're at the point where if I don't use it I'm going to be far less productive than my coworkers. I'm forced to use this thing that I truly believe is going to take my job within 5-10 years max.
I just can't wrap my head around the time saving thing.
Do your coworkers really understand what it is that they are putting into the codebase? If not, that's terrifying and recipe for disaster.
If they do understand it, they have to be taking the time to review it to ensure it's right. Which can't save so much time that it's worth the risk.
This reads as if floor sweeper is complaining that a company bought a machine to do the sweeping. And now he has to ride the machine all day and it took all the joy of his job because he really liked the hand motions and walking around and was very good at it. His colleague doesn't mind and is no longer slightly worse at sweeping than him. And boss sees it as a huge money saver because floors are vast.
As I wrote elsewhere: save for a world revolution. Capitalism is a given, it's not going to go away. The degree swings a bit, but right now the tendency is still pretty "right" wing, so you've got to see the application of AI in that context. Anything else is at best day-dreaming.
And the automation is? We were getting rid of labor that could be automated under monarchy and under feudalism. Capitalism is no different.
Future with different organization of production and sharing of its results is imaginable. Future in which most people refuse to pursue automation is not.
I think this is the curse but also the opportunity of AI.
OP can now do the same work they did before in just 2-3 days. Could they now open their own studio and offer services for many customers on their own instead of working for one company?
What will happen with indie games or indie software? Suddenly one person who is good at inventing ideas will be able to develop whole games, with music and graphics generated by AI, coding heavily supported by AI (not _yet_ fully taken over), and marketing etc. also supported by AI.
My hope is that this will result in a lot more new startups that will also create a lot more jobs.
Maybe the ultimate losers will be big-corporations who can’t build these large models on their own and are unwilling to call external services for fear of their data, or who are simply too slow to adapt.
I often think that what is happening right now is actually a prototype example of what is described in the “innovator’s dilemma” disruption.
Fascinating, this reminds me a lot of a conversation I had about 4 years ago with a translator. They were working at a translation company which had just introduced AI, and the initial implementation looked like this: we give you an initial translation and then you modify it. The thing is, this turned the translator from a "creator" into an "editor", and they absolutely hated it. Back then, the power was in the hands of labor (not the company), so the company actually modified it so that it was completing and recommending things for you but gave you control of the whole thing. This is much easier to imagine with text, but there might be a way to give people creative control, even with these new tools.
It's like what happened to cinema musicians after sound on films was developed. They'll still need people like this but not as many. But then that's the nature of a technologically driven society, at any moment your job could be obsolete.
I'm not an artist, and I don't mind the general direction the AI is going, but I feel for the people who are impacted like this. It must feel like shit to be replaced in such a way.
Perhaps the most wrongly maligned group in the history of the world were the Luddites. A caricature of the Luddites comes down to us. It is of a people who would deny progress which was obviously inevitable so that they didn't have to change. They are cast as inflexible, lazy, stupid, belligerent.
But when we read the speeches of the Luddite leaders, more than anything else they talk about "liberty". Luddites spoke and wrote about the dangers to their liberty created by wage pressure from power looms. But much more, they spoke and wrote about the danger to their liberty posed by moving away from handloom weaving in homes and small shops towards weaving with power looms in a factory system.
What they resisted more than anything was the destruction of their autonomy, their subjection to abusive floor managers and the grinding dullness of repetitive machine labor.
Programmers have been facing this down already for some time. As full-stack engineering, excellent automation tools and powerful APIs have proliferated, we don't often get to go deep into some problem. Algorithmic design and development, which is a huge pleasure of mine, has faded into the background, replaced by gaining a shallow understanding of many technologies, plumbing them together and futzing with their numerous configurations. AI is already accelerating this for my work and I am struggling to find it not dismal. It is no surprise that now that the tools exist to commoditize art, the work is following a similar trajectory.
AI should be a unifying event for tech labor. Not to resist it entirely but to ensure in whatever future develops that we have some say in how work can remain enriching and dignified alongside the growth of these tools.
There is "art" and there is "design". It feels like OP might be confusing the two.
The latter is about creative expression within the confines of constraints like time, money, function, etc.
The former is not.
Doing graphic design work at a commercial enterprise is a creative endeavour constrained by commercial realities.
Perhaps the OP just needs to accept that they are working in the real world. They should probably find a creative outlet for making art in addition to their design day job.
That's the risk of participating in a completely new industry... you don't know how long it's going to last; it has no track record. It reminds me of my own career as a software developer. I taught myself coding starting at 14, in 2004. I'm from a relatively small town, and nobody in my family had a tech background or even knew anyone with a tech background. So it was quite a stretch, and unlikely, for me to become a developer. I would spend my lunch breaks in the school library, sometimes coding by myself on library computers. My dad thought I was wasting my time. Back then, the only people making any money in tech in my town were developing websites for local businesses, and it wasn't very profitable. My dad wanted me to be a lawyer.
I feel like I did many things right; I got into web development early (while most of the jobs were still for programming for desktop PCs), I was an early adopter of Node.js, I went to university and did a course on AI/Machine learning. I launched a popular open source project. But somehow it never worked out for me financially; I was never accepted into any fast-growing company, my popular project was big among startups but didn't gain enough traction in the corporate sphere (where it could have been monetized). Now AI is threatening the entire profession.
I think my dad was right, I should have been a lawyer and done software development as a hobby.
The only silver lining that I cling onto is that my main interest has been software architecture and higher level systems/infrastructure thinking. So far, AI hasn't shown itself to be able to effectively solve higher level software architecture/design problems. It's great at answering questions, but not at asking (the right) questions which is what software architecture is about.
"I don’t want to make “art” that is the result of scraped internet content". I guess he/she was ok with creating "art" that was heavily based on other things he/she saw on other games and on the internet. Artists reasoning in this absurd way should ask themselves: I'm ok if a law passes that makes it illegal to get inspiration for other copyrighted work?
1. Artists getting inspiration from all the other works (payed by others) also don't ask any permission. If you want to create a game with a monster heavily inspired to Alines move monsters, you don't need to ask, nor you will be sued.
2. Artists are also against Stable Diffusion.
It’s gonna be tough as a junior dev in the near future. I’m very grateful that I managed to upskill so much before all this happens. In the worst case, I’ll have a crappy job directing and debugging AI output… but in the best case, we have reached AGI.
How will one be able to find a job when AI does way better than a junior?
Of course, seniors will be better. But interns? Why would you pay for lesser quality?
I feel bad for people reluctant to embrace the tech. It feels like refusing to adopt email, the web, mobile apps, and so on. Why not use it to your advantage?
Truth is, it's going to become a must for many companies very soon, or they'll be unable to compete. Maybe there will be some niches where human artists thrive, but it's uncertain at this stage.
I didn't interpret their article as reluctance to embrace technology. They understandably liked 3D modeling using the current tools and were able to make a living doing something fun. It's a rare occurrence, and sadly innovations in technology took away the joyful part of their job. If it were not a tech innovation removing the joy from their job, they would probably still be upset about it.
Yeah I feel similarly about programming, honestly even the existence of compilers makes me feel this way to a lesser extent, but AI is absolutely crushing any sort of enjoyment out of anything and further turning it into a mediocre slurry of content.
I hated mine and eventually quit working for life because of a stupid CEO and inexperienced coworkers. Copy and paste was what everyone was good at over there.
Luckily i could afford to retire early. I feel sorry for those who got stuck in jobs
The dominant opinion in this thread seems to be that OP has no reason to be sad that they can no longer get a job doing what they love, because... someone else will probably get a job doing something else instead?
Your job, no matter who you are, is always to please the customer. If you focus on that then you will be fine. But if your focus is a more selfish one then you could be exposed.
What happens to us when AI becomes better than us at every mental task?
Ignoring the economic factors, what does that mean for our self worth? If we know it’s easier to ask a machine to do it, will we stop doing everything?
Can we reframe our purpose and just be content living?
We will become a terribly impoverished society, with the unemployed fighting over scraps of food. There is no realistic path towards "a leisure" society, save for a world revolution.
The last time a technological revolution threatened this many people's livelihoods we got the rise of Hitler and WWII (propelled by the petite bourgeoisie), and I imagine that this time we won't get away without a conflict of large caliber either (fighting over the distribution of the AI spoils).
The issue here is that you have to use text prompts (because the overwhelming bulk of training data out there available for AI is in the form of text). Ideally you should be able to sculpt and rig and paint manually via AI tools - the tweaks you do to a sculpture should be inputs to an AI system that can give you an intuitive and responsive system that replaces current procedural generation.
You absolutely can use these tools for this. There are stable diffusion plugins for photoshop that allow far more control over what is generated where.
Making art of any form is about you, not about the art. When you learn to paint well you are creating you, the piece is just a side effect. Getting good at something affects the way you see the world. It creates meaning in you.
The one sided focus on the finished piece is the outcome of our culture valuing the art more than the creation of the artist. (The creation of you, not your creation.)
I imagine portrait painters and later photographers all experienced a similar transition. They had built their livelihoods around a certain visual asset generation technique, and a new technology disrupted that process and pushed them into the nice-to-have luxury corner. The same train will come for prompt engineers eventually too in the usual cycle.
Probably for all the unimportant NPCs running around in a game, AI does a good enough job. But for the main characters and the plot, I suspect a human touch is still needed. Climb up the value chain. If you're cranking out monsters in volume, you're at the bottom.
First they came for the artists, and I did not speak out—because that job doesn't require real creativity.
Then they came for the engineers, and I did not speak out—because that job doesn't require real reasoning.
Then they came for the sales folks, and I did not speak out—because that job doesn't require real human connection.
Then they came for the executives and politicians, and I did not speak out—because that job doesn't require real leadership.
Then they came for me, and I did not speak out—because I was just grateful to have any job at all, even one doing manual labor under the ceaseless watch of a glorified chatbot. At least Netflix can create any movie that I or my AI girlfriend ask for, so that's cool I guess.
Progress in IT has already made millions of people unemployed. E.g. there used to be shops selling flights and holidays on every corner. There used to be a number of insurance shops in every town center. Every company used to have rooms full of typewriters and typists.
After the software developers made a lot of jobs obsolete, their own job is being made obsolete by the data scientists.
> I wanted to create form In 3D space, sculpt, create. With my own creativity. With my own hands.
I think this is difficult to get over within a few years, let alone overnight. I can sympathize as a programmer - only very recently did I notice that I can finally implement a feature or create a new program and have it work on the first try. That took years of practice (for me at least) and I feel like I've really learned a "craft".
On the other hand, as a kid I first got into programming because I wanted to create computer games. And the only time when I actually did that consistently was before I knew any programming languages, just point-and-click tools specifically made for creating games. I eventually studied CS, on my way to making games got sidetracked by learning OpenGL and graphics programming, interned at a gamedev company where I did more graphics programming and learned that this is not the industry for me even though (or maybe because) I love games. Now I've been doing backend programming for a few years where I've been way more concerned with data and architectures than I am with what it's actually for.
I feel bad for people who love the process more than they love achieving a goal. Sometimes I really love coding, solving little challenges, optimizing something, the feeling of having built something by meticulously translating an idea into something a computer can understand in every detail. But sometimes it gets really boring. Writing yet another REST controller. Copying and mapping the same data object. Handling errors. It feels like I'm solving the same problem over and over and over again.
Then I think about what I would be doing if I didn't feel like it was too much work to make it happen. I have an idea for a game that's currently impossible to do unless you have a team or at least one extremely experienced and multi-talented game developer with no job. I have several ideas for web services that I could figure out, but don't have time for. Finally at my job, my team currently has absolutely no way of properly dealing with a decade of technical debt.
With AI, I can see myself actually making these things happen, as long as I'm willing to give up some control over how it's done.
Which comes with a little bit of alienation, unfortunately. Maybe that's how older programmers felt when switching from punch cards to assembly, from assembly to compiled languages, and from those to interpreted languages. Obviously the step to AI will be bigger than that. Same with artists: I'm sure many people were discouraged by not holding a brush anymore, or molding something with their hands. Some just love the process of bringing an idea into existence, others just want it to happen by any means necessary.
Personally, my way of dealing with this will be to go back to seeing technology as a means to an end, which it always has been. Falling in love with a specific tool is not a good idea in an era where change arrives faster every day.
Midjourney is an illegal product and it's easy to prove. Adobe did a way better job at this btw, protecting artists by simply making a legal product.
The way a good lawyer is gonna make a shitload of money is by asking one simple question: who's the artist?
Say you generate a picture with Midjourney: who's the artist? Not you, and definitely not the AI, as it's not a person. So the closest artists you can find are the ones who made the pictures in the training set. Done.
Man is it hard not to feel a little schadenfreude. For decades the white collars watched the blue collars lose their jobs to machines and said "well I'm so sorry for you, but you really should've realized the purpose of a human being is to use their mind and create, they can never take that from us".
This is going to do for art what YouTube did for video production: obliterate the grind. Now what counts is your genuine creativity, your inspiration.
> This is going to do for art what YouTube did for video production: obliterate the grind. Now what counts is your genuine creativity, your inspiration.
These two sentences are so violently at odds as to be comical. YouTube is nothing but the grind. It is an unending content mill and if you stop, you may literally never regain even the precarious position you'd managed so far.
Creativity? Inspiration? It's time served above everything else.
If you want to make it on YouTube, and not just have a channel outside your day job, you basically have to mold yourself and your content to be perfectly palatable to the recommendation algorithm.
The schadenfreude is going to be off the charts in a few years, when these "tools" get so advanced that they cease making people more productive and instead cause a massive loss of jobs, with no replacement and no post-capitalist utopia. The people who were so sanguine about adapting to change are going to realize they were made promises that never stood a chance in hell of being kept.
I don’t know why you’re downvoted (other than maybe the glibness?) because you’re right.
They’re working at a company where the goal isn’t to make good content. They were effectively there as a means to churn out stuff. It’s why their boss is making them use AI prompts to get content out as quickly as possible.
It doesn’t even matter how good the diffusion results are for that segment. It’s really just about having some representative visual as soon as possible.
There’s no current model that makes acceptable 3D content for games, and the cohesion isn’t good enough for 2D art for anything except visual novels. They can certainly act as a base, but this person’s post makes it sound like they’re barely adding anything on top.
So yes, they’re a shop that’s churning out content to make a buck as quickly as possible.
If it wasn’t generative art replacing this person, it would be someone who could do things even slightly faster or cheaper, even if the quality was worse.
This is the most vulnerable part of society to any form of automation. The level where they’re effectively an operator.
This person would be miserable in this place regardless of AI, because their boss doesn’t value them or the work. They would have a more fulfilling career in another studio, where generative art may still be used, but as a tool, not a factory generating things to rush out the door.
I’m saying that people extrapolating this to the full spectrum of artists (both on HN and the Blender Reddit) are missing the context.
I already acknowledged that the people at this low end of the spectrum are vulnerable because they’re basically just used to churn stuff out, without any value for their work.
The point is, this person’s job has always been one level of automation or efficiency away from being just an operator. If it’s not generative art, it’s other forms of automation, or outsourcing, or some new high school dropout who’s good enough.
Midjourney here is a symptom, not a cause. Though it’ll be a symptom that affects the most vulnerable roles: the people who are largely just factory operators.
Yes, if your boss requires you to pretend to be a checklist instead of a human, then you're not any better at your job than an animated checklist would be.
Could you please stop posting unsubstantive comments and flamebait? You've unfortunately been doing it repeatedly. It's not what this site is for, and destroys what it is for.
A person had a job they liked. It was turning a crank to make content but they enjoyed the act of turning the crank.
Now the parts of turning the crank they enjoyed are gone. And their job is a shitty, demoralizing drag.
Is that of no value in your world? Have you never weighed jobs based on whether one sounded like more fun than another, along with other considerations like how much money it would make? Has nobody you've known ever made an employment choice based on that?
The tragedy of the author of the post is that they expected the product of their work to be treated like art, but it turned out their boss had a different vision.
Nothing of value has been lost, because the artistic value of the author's work was never actually appreciated in the first place - they just tricked themselves into thinking it was (because of the paycheck, I guess).
What do you mean by this? The author seems good at what they do, seems to love their craft and enjoys their work.
The tragedy here is that AI doesn't need to be better than humans (at least in many cases like art, writing etc). It just needs to be "good enough", maybe 80% as good as a human. Since humans can't compete with software on speed or cost, they are gonna lose.
How exactly can the author improve to a higher standard here, assuming they want to continue doing what they love, and not change careers?
They could learn a skill that complements their artistry, go 100% with their own vision, and sell whatever product comes out of that. The world still desperately needs art, and most of what is out there, even the stuff made by humans, is usually incredibly uninspired. Just as getting fired for the first time is usually a blessing in disguise, so is this.
People are already generating entire apps with AI in minutes. Software engineering as a profession is finished. We are all now slaves to Roko’s Basilisk.
Most apps worth money have to integrate with existing systems in highly specific ways, determined by endless meetings with real people to extract the requirements. They probably haven't been done before and are usually built on proprietary code ChatGPT doesn't know about, and there's not a COBOL's chance in hell anyone is going to rewrite it all in a way that's compatible with junk hallucinated by a bot. There's no business case for that. It will probably always be cheaper for humans to maintain what has already accumulated over decades, despite the "high" salaries.
The knowledge gap between devs and everyone else, including bots, is a chasm. It seems many don't realize how hard some devs work for so little relatively speaking.
I sometimes think that some developers posting here want to be punished for earning a living wage. Strangely, lawyers never think that, and I can assure you that theirs is not a more difficult profession [0].
Ah. I think it still requires you to understand what you're doing. I tried to use both Copilot and ChatGPT 3.5 and 4. They don't get a lot of things right in my areas of interest. I think they're great as a crazy-good autocompletion or refactoring tool, and they also help with boilerplate. But every time, you need to know what you want; they just save you typing the code.
Sometimes it's just amusingly wrong, too. Sometimes annoyingly so, especially after it repeats mistakes.
I always said our jobs aren't hard because we have to write code for computer, but because we have to capture intent for other humans in code. Sometimes we don't even know what the intent is, because half the time we don't even know what we're building exactly.
So, this all slightly shifts the level of abstraction for developers to think more about intent and less about code. You still have to make sure the code is correct. So, you need to understand the code and the intent. Not much has changed. It's just gotten less mechanical.
Anyway, I don't think people need to be scared of AI just yet.
Obvs there's programmers that enjoy the nitty gritty keyboard slapping, and they can still do that for problems they find interesting. But for things they want to get out of the way, why not unburden ourselves?
I think LLMs will make a software engineer's job harder and more in demand. By removing boilerplate, engineers will be expected to create more output per person, and the job moves up the stack from junior code monkey stuff to architect / senior debugging guru style work. Not everyone can do such a role. It will be a very tough time to be a junior.
This is the first bit of AI news that actually made me want to investigate it for myself. Even though it is pretty impressive to get a working app out of a few prompts, it seems for now it's not capable of producing much more than snippet level code.
I like the author's take further in the thread:
> Company A fires half their staff yet keeps delivering 100% of their old capacity = they save money
> Company B keeps all their staff, but can now deliver 200% of their old capacity = they win the competition
> So seems like the strategic move here is to raise the bar rather than replace engineers
My hope is that this stuff allows me to augment my workflow and automate repetitive, tedious tasks away, but doesn't fully replace me (or I get UBI and don't have to work anymore).
But Company A has reduced its product price by 60% because the biggest cost center is now halved. Customers like some of B's new features, but not enough to justify the price.
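Whether A can actually cut the price by that much depends on how large the halved cost centre is relative to everything else. A toy sketch with entirely made-up numbers (the thread gives none):

```python
# Toy numbers only; the point is that the possible price cut depends on what
# share of total cost the halved cost centre represents.
ENGINEERING_COST = 1_000_000   # Company A's biggest cost centre, before layoffs
OTHER_COSTS = 250_000          # everything else
MARGIN = 0.20                  # profit margin held constant

def price(total_cost: float) -> float:
    """Cost-plus price at a fixed margin."""
    return total_cost * (1 + MARGIN)

before = price(ENGINEERING_COST + OTHER_COSTS)
after = price(ENGINEERING_COST / 2 + OTHER_COSTS)  # half the staff, same output

print(f"price before: {before:,.0f}")             # 1,500,000
print(f"price after:  {after:,.0f}")              # 900,000
print(f"price drop:   {1 - after / before:.0%}")  # 40% with these assumptions
```

With these assumptions the drop is closer to 40% than 60%; the bigger engineering's share of total cost, the closer A can get to the dramatic cut.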
I saw the exact same commentary around the no-code hype of the past few years and it made me increasingly cynical. No-code tools made it super easy to create a boilerplate website that looked pretty but was not fit for purpose. Meanwhile it was advertised as "Build an AirBnB clone without code!".
OK fine, hyperbole as a marketing tactic. But many people handed over wads of cash for courses that promised results that were in reality no more than cardboard cutouts of actual websites.
The Internet appears to have an endless ability to create markets for don't-do-the-work-and-still-enjoy-the-reward snakeoil.
And now we have ChatGPT- an instant boilerplate-creating machine. Watch as a new (or in many cases the same) class of charlatans grease up this technological wave too.
Can you share examples of this? I've been doing some searching on my own because I find this stuff interesting, and I haven't seen evidence that current versions of AI tools are capable of helping generate entire apps in minutes, at least non-trivial ones.
I keep seeing some real dramatic premonitions about AI, and I'd love to see something to back them up.
Even the original post, I'm having trouble understanding. I know Stable Diffusion can create some amazing images, but I don't see how it's being used to create sprite assets for games. You typically need the same character from different angles. Can Midjourney v5 do that? The OP talks about being a 3D artist. Can Midjourney generate meshes and 3D objects or just raster images? I didn't think we were at the point where these tools could fully replace an artist's workflow, but the OP seems to claim so.
For sure. I recommend anyone in business for themselves productize and build moats.
I still work about the same hours, though. AI has just raised the level of abstraction. Less time is spent on looking up specific syntax and more on reviewing, iterating, testing, architecting.
Today, yeah, but tomorrow the clients can use a GPT-7 consultant that has better context, with access to all the client's mail and data. It is fast, cheap, and a polymath.
I don’t understand why people see this future as guaranteed, just because GPT4 was a great improvement on 3.5 does not imply this approach will continue accelerating at this rate. I am not saying it definitely won’t happen but I find the confidence that it will pretty baffling.
It seems unlikely to me that this technology has plateaued. The gains have been exponential. It strikes me as prudent to at least consider its continued improvement a possibility and try to work out what that means for society. The best thing on the table so far is that people lose jobs and business owners make more money. Some amount of that is unavoidable and maybe even beneficial, but too much of it is unsustainable. Better to have a plan and not need it, and all that.
I don’t take issue with considering continued improvement and largely agree with this, my issue is with all the proponents who speak about it as if it is guaranteed.
Why do you think such a world would have companies, governments or even money? We are way outside the horizon of predictability if LLMs reach AGI levels.
This has been said for years, first about robotics, now about AI. And today, most developed nations have trouble finding people to work because there's so much work that needs to be done.
The opposite is happening, yet it's still repeated.
Those are a tougher nut to crack because the entire road transportation infrastructure is designed for human drivers. Lots of so-called "wicked problems" that don't exist in the art world.
Influencers have been cranking out the former to sell their course/personal brand/consulting. They stand to gain from making AI look as magical and easy as possible.
A decision-maker watches an influencer's video and goes, "Wow! AI can do it all! I need to hire this influencer as a consultant/like their content/buy their course!" The reality isn't quite as flashy.
Incorporating LLMs into my workflow has been a great productivity boost, but by no means do they spell the end of software engineering.
True. The influencers are using AI to sell themselves. The amount of human attention that goes into a semi-repeatable task will surely go down with AI, but not as much as they claim when they show apps being churned out in seconds versus the hours or minutes a real developer needs.
Also, I have not seen any insight-level help from GPT. It just seems to have good recall for everything it has seen. And prompt engineering shifts the mental work away from the problem and toward what the model knows.
ChatGPT is impressive if you start from a clean slate but the moment you do anything more complicated it slows down to being as fast as doing things yourself. After all, you have to know what to input into it and that may take longer than actually processing the information.