Setting aside the uncanny valley, my understanding is that people like seeing people not because they look like people but because it conveys effort and investment on the part of another human being. We want to feel like we're worth someone else's time, in every context.
We don't like to interact with people just because that's the only way we can interact but because humans are human. AI avatars are the exact opposite: a statement of disinterest. If you don't care enough about my business to be on a sales call with me, why would I bother speaking to an AI avatar you send in your place? What's a thank you message without a human being actually taking the time to record it?
AI avatars seem a lot like crypto: they're a neat technology solving the wrong problem. The "inefficiency" of humans interacting with humans is the fundamental component of communication. I guess it's a lot like LLMs: instead of producing less content that is more valuable / thoughtful per unit, we're producing a lot more content that is much less valuable / thoughtful per unit. AI avatars will create more vacuous communication, not enable our communication to be more thoughtful.
Maybe human behavior will change because of this, maybe the next generation that grows up interacting with AI avatars won't have this same feeling that speaking to an actual human means something.
Spot on. There's something instinctual about this. My kids (13,10,8) all love making art. When I show them AI tools that they could use to generate art or riff on existing art, they are actively opposed. They tell me that AI art is not real art, that AI tools for doing this sort of thing are gross, and they want nothing to do with it.
As a technologist I think "omg this is so cool" and it has been genuinely surprising to see how they actively rebuff it.
There's a real human sense of accomplishment and ownership when you put your own effort into making your own creations real. Typing words into a box to make a picture is a fun novelty, and might be useful to people who have to shovel images out the door, but I've never felt anything like the same satisfaction, and I'd imagine kids feel that innately.
sure, my comment is merely to express doubt that the specific level of dislike from the kids is organic. most people do not hate like that on first impression, even if the satisfaction is of course not the same.
I am 'very online'. I have seen a huge rise in right-wing content on social issues, but also a huge rise in very explicitly leftist and anti-AI content in a way that did not previously exist - especially on economic/industry issues.
That public education has a left-wing slant in many jurisdictions seems difficult to deny, but I did go to a particularly left-wing school district. I do not think you really know anything about my POV, and the inferences you would make based on my statement are likely wrong.
> but also a huge rise in very explicitly leftist and anti-AI content in a way that did not previously exist - especially on economic/industry issues.
It's always existed; you just haven't been around it. It stretches back to the beginning of the Industrial Revolution; read about the Luddites, Blanketeers, and John Henry.
Humans generally don't like having their livelihoods threatened and that's happening more and more as the people who paid others for labor are finding ways to not pay people while simultaneously getting the fruits of labor. It used to just be physical tasks that were simple to automate; now, it's knowledge-based jobs.
I'm not afraid of losing my job as an engineer. I can babysit an agent to build software and my expertise will still matter. I'm upset because that's not fun. I chose this career because programming is fun for me. I put up with everything else and I get to code and eat.
Now I'm not sure what I like doing if the bot does my coding. Who am I? My whole identity is built around being an intelligent programmer, but if intelligence is commodified and programming is obsolete, then I don't know who I am anymore or what to do for fun.
:(
Making a living is a concern, but it almost feels like one of the more shallow ones. I can stack bricks for money, but if AI takes the fun out of programming, what am I supposed to do for fun?
If they can replace someone who can deal with problems that are of the size that software engineers deal with on a daily basis, they can replace most of the workforce.
if intelligence is truly commodified, we are effectively post-scarcity or very close to something resembling it.
it seems hard to justify continued massive material deprivation for tons of people in the world on the basis of 'who am i' ego for the top ~0.2% of income earners globally.
you're conflating income with wealth. as programmers many of us have good salaries, but if those went away we'd have to dig ditches like everyone else.
miss me with the .1% salary stuff -- income is not wealth.
I'm not sure why you think anti-AI is a leftist thing. If anything, it seems more conservative.
Not conservative as in 'libertarian' ("hand over your agency to the perfect algorithmic machines because they will set you free"), but conservative as in traditional.
I agree with you ideologically, but in terms of an empirical description of how the culture blobs are actually turning out, it unfortunately does not appear to be shaping up that way.
We're definitely hard-wired to recognise the difference between people being friendly to foster a good relationship and people being friendly because they have ulterior motives.