> Because, of course people are, but that's not the point.
Are they?
If an LLM could talk to grandma for 40 minutes, figure out what her problem actually is (as opposed to what she thinks it is), and then transfer her to a person with the right context to resolve it, that's probably better than most humans in a customer service role. Patiently chatting with grandma as she rambles for an extended stretch is not something many customer service people can put up with day in and day out.
The problem is that companies will use LLMs to eliminate customer service roles rather than to make them better.