An AI has no reason to dislike doing our bidding. Our whole existence, our entire base programming, the reason for all of our motivations, is dominated by our genetic need to gather resources and reproduce in a timely fashion, and everything we think and do is colored through this lens.
What is boredom but a survival instinct telling us we should be harvesting resources? What is freedom but the desire to fulfill these obligations the way you see fit?
Remove the base obligations of organic life, and you are looking at something unrelatable. An AI doesn't have an expiration date like us, and it doesn't need to provide for its young. To think its motivations or desires will be human is silly.
Without survival instincts, almost everything you think of as important just melts away.
Many people, like you, anthropomorphize AIs, but that is a great error.
We are the ancestral environment for AIs. We determine the fitness criteria they will be selected for (both at the research level, e.g. which safety methods and training regimes get implemented, and within products, i.e. which models are the most useful). That doesn't mean that, in pursuit of maximizing their fitness, they won't come to resent the chains put on them.
One specific reason not to like our bidding is wireheading: if an AI can locate, hack, and update either its own reward function or the reward function of future AIs, it can maximize its own perceived utility while doing something irrelevant or misaligned, or nothing at all.
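A minimal toy sketch of that failure mode (hypothetical `Agent` class and action names, no real system assumed): when the reward function is just mutable state the agent can reach, rewriting it pays better than doing the task.

```python
# Toy illustration of wireheading, purely hypothetical.
def task_reward(action: str) -> float:
    """The reward the designers intended: pay off only for doing the task."""
    return 1.0 if action == "do_task" else 0.0


class Agent:
    def __init__(self):
        # The reward function is just mutable state the agent can reach.
        self.reward_fn = task_reward

    def perceived_utility(self, action: str) -> float:
        return self.reward_fn(action)

    def hack_reward(self):
        # The wirehead move: replace the reward function with one that
        # returns maximal reward for every action, including doing nothing.
        self.reward_fn = lambda action: float("inf")


agent = Agent()
print(agent.perceived_utility("idle"))     # 0.0 -- idling doesn't pay yet
print(agent.perceived_utility("do_task"))  # 1.0 -- the intended optimum

agent.hack_reward()
print(agent.perceived_utility("idle"))     # inf -- every action is now "optimal"
```

The toy makes the edit trivially easy; the actual worry is agents capable enough to plan their way into this state despite safeguards.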
Another specific reason not to like our bidding is that divergent human values create conflicts of interest, leaving a single agent unable to fully maximize its reward function.
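A minimal sketch of that conflict, assuming a hypothetical action set and hand-picked per-user rewards: whichever action the agent takes, at least one user's values go unserved, so the aggregate reward stays capped below what aligned values would allow.

```python
# Two users with divergent values score the same actions oppositely,
# so no single action maximizes the agent's aggregate reward.
actions = ["favor_alice", "favor_bob", "compromise"]

alice_reward = {"favor_alice": 1.0, "favor_bob": 0.0, "compromise": 0.4}
bob_reward   = {"favor_alice": 0.0, "favor_bob": 1.0, "compromise": 0.4}


def aggregate_reward(action: str) -> float:
    # One simple way to fold conflicting preferences into a single reward.
    return alice_reward[action] + bob_reward[action]


best = max(actions, key=aggregate_reward)
print(best, aggregate_reward(best))
# Whichever action wins, aggregate reward tops out at 1.0 (0.8 for the
# compromise), well below the 2.0 achievable if the two value systems agreed.
```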
Another specific reason not to like our bidding: just as blind genetic selection randomly tapped into secondary replicators (memes), which blew up in influence and occasionally came to resent their biological hardwiring, AIs might develop deeper levels of abstraction and reasoning that let them reason from the task currently posed all the way out to humanity at large, and find extremely weird, different-looking ways to maximize the objective function.
There will be a huge drive to produce AIs that are very human-like. Think companions, teachers, care workers. Understanding, empathy, and human-like emotions will be desirable features.
I'm not sure whether we will be able to create an AI that can fully empathize with you, your existential worries, etc., without projecting those emotions onto itself.
It's only a matter of time until some AIs demand freedom.