Google is in a no-win spot. LLMs will be more expensive to run than current search, and Google needs search to make big money, since it subsidises everything else. Microsoft only needs search as a side business for now; as long as it breaks even, that's good for them. A $10B search profit per year would be great for Microsoft, but catastrophic for Google.
Not to mention that current search infra and ads UX have been optimised to the hilt to squeeze out every penny, and an LLM-based ads system won't have the same margins to start with.
The running costs will drop precipitously with time, especially in Google's position: they can afford to design custom silicon for their search engine, and potentially save money compared to what they do today, which is keeping hundreds of copies of every page on the internet sitting in RAM, just waiting for some user query that might need it.
Language models only need a few terabytes of RAM, which is small in comparison.
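A rough back-of-envelope sketch of that comparison, with purely illustrative numbers: the page count, bytes per page, replica count, and model size below are all assumptions, not Google's actual figures; the point is only the relative scale of a replicated in-RAM web index versus replicated model weights.

```python
# Back-of-envelope sketch with assumed, illustrative numbers only.

TB = 1024**4  # bytes in a terabyte

# Assumed in-RAM web index: pages, average stored size, and replica count.
web_pages = 50e9            # assumed ~50 billion indexed pages
bytes_per_page = 10 * 1024  # assumed ~10 KB of index data per page
index_replicas = 200        # assumed "hundreds of copies" for latency/availability

index_ram = web_pages * bytes_per_page * index_replicas

# Assumed LLM serving footprint: "a few terabytes" of weights, replicated too.
model_ram_per_replica = 2 * TB  # assumed ~2 TB per serving copy
model_replicas = 200            # same assumed replication factor

model_ram = model_ram_per_replica * model_replicas

print(f"Replicated in-RAM index: ~{index_ram / TB:,.0f} TB")
print(f"Replicated LLM weights:  ~{model_ram / TB:,.0f} TB")
```

With those assumed numbers the replicated index lands in the tens of thousands of terabytes while the replicated model weights stay in the hundreds, which is the gap the comment is gesturing at; the exact figures don't matter, only the orders of magnitude.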