AI has a lot of edge cases and caveats. It can be trivial, like not being able to count the Rs in "strawberry". Or it can be more nefarious, where it simply makes stuff up (e.g. fake precedents cited in legal opinions). AI is still incapable of explaining its reasoning or dealing with its own errors.
Yes, I know some of these problems, like the "Rs in strawberry" one, have been solved, but (IMHO) you're going to be dealing with edge cases like these forever.
Another issue is response time. Currently, a query has to go through several steps: query -> embedding -> LLM inference -> answer -> decoding back into text. Each of these steps takes time.
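To make that concrete, here's a rough latency budget for an LLM-backed query. Every number below is just an assumption for illustration, and the stage names are my own labels, not any particular system's pipeline:

```python
# Rough, illustrative latency budget for an LLM-backed query.
# All per-stage numbers are assumptions, not measurements.

STAGE_LATENCY_MS = {
    "parse_query":      5,     # tokenize / normalize the user query
    "embed_query":      20,    # embedding model forward pass
    "retrieve_context": 30,    # vector / index lookup
    "llm_generation":   1500,  # autoregressive decoding dominates
    "format_response":  10,    # turn tokens back into displayable text
}

total_ms = sum(STAGE_LATENCY_MS.values())
for stage, ms in STAGE_LATENCY_MS.items():
    print(f"{stage:<18} {ms:>6} ms  ({ms / total_ms:5.1%} of total)")
print(f"{'total':<18} {total_ms:>6} ms")
```

The point isn't the exact figures; it's that generation dominates and every stage you bolt on only adds to it, whereas a classic keyword lookup skips most of this entirely.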
But here's the big one: energy. The sheer scale of Google search has to be put in the context of how much energy is consumed, i.e. how many queries can be answered per unit of energy. With all the steps involved in an AI query, we need orders-of-magnitude improvements to compete.
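A back-of-envelope sketch of that gap, with both per-query figures as placeholder assumptions rather than measured values (plug in whatever estimates you trust):

```python
# Back-of-envelope comparison of queries answered per kWh.
# Both per-query energy figures below are assumed placeholders.

WH_PER_KWH = 1000

search_wh_per_query = 0.3   # assumed energy for a classic web search
llm_wh_per_query    = 3.0   # assumed energy for an LLM-backed answer

search_queries_per_kwh = WH_PER_KWH / search_wh_per_query
llm_queries_per_kwh    = WH_PER_KWH / llm_wh_per_query

print(f"search: {search_queries_per_kwh:,.0f} queries/kWh")
print(f"LLM:    {llm_queries_per_kwh:,.0f} queries/kWh")
print(f"gap:    {search_queries_per_kwh / llm_queries_per_kwh:.0f}x")
```

Even if the real ratio is half or double my guess, closing it at Google-search volume is the hard part.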
Most searches are fairly simple. They just don't need a large model to answer them. There will absolutely be a place for AI queries, and they will continue to get better, but displace search? We're not even remotely close to that outcome.