Yes, it is battle-tested, and the result of that testing was a big failure: memory-related issues and vulnerabilities alone have caused billions of dollars of damage, and they are on their way to causing more.
It is generally not a good idea at all to use C++ in network-facing applications.
The first maxim of software architecture: You Are Not Google.
(And even at Google, most of the systems running C++ are the core indexing and analytics systems, which are not directly exposed on the Internet-facing perimeter; there are some exceptions, of course.)
That said, looking back at previous jobs, there are many places -- particularly in ad-tech stuff -- where I used Java on the server side and now think C++ (or, these days, Rust) would have been more appropriate. I wasted a lot of hours tuning garbage collection that I'd love to have back.
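(For the curious: "tuning" mostly meant endless rounds of juggling heap sizing and collector flags, something like the illustrative G1 settings below. The flags are real JVM options; the jar name is made up.)

```
java -Xms8g -Xmx8g \
     -XX:+UseG1GC \
     -XX:MaxGCPauseMillis=50 \
     -XX:InitiatingHeapOccupancyPercent=35 \
     -jar ad-server.jar   # hypothetical service
```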
And yes, there is a crapload of internet-facing stuff out there that is written in C++.
Well, at some point it is a matter of personal preference and tradeoffs. At one point I worked on a high-frequency trading system written in Java... yes, in an environment with zero tolerance for GC latency. The core system was written in an “off-heap Java” style, with memory blocks preallocated up front... and for the periphery everybody could use regular GC-enabled Java.
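Roughly, the off-heap style looks like the sketch below. This is a minimal illustration using plain java.nio rather than the modified toolchain we actually had, and the record layout and names are made up: one direct buffer is preallocated, fixed-size records are read and written by absolute offset, and the hot path allocates nothing, so the GC has nothing to chase there.

```java
import java.nio.ByteBuffer;

// Sketch only: a preallocated off-heap block holding fixed-size 24-byte
// records (long orderId, long priceTicks, long quantity), accessed by
// absolute offset so no objects are created per message.
public final class OrderRing {
    private static final int RECORD_SIZE = 24;
    private final ByteBuffer buf;
    private final int capacity;

    public OrderRing(int capacity) {
        this.capacity = capacity;
        this.buf = ByteBuffer.allocateDirect(capacity * RECORD_SIZE); // off-heap
    }

    public void put(int slot, long orderId, long priceTicks, long quantity) {
        int base = (slot % capacity) * RECORD_SIZE;
        buf.putLong(base, orderId);
        buf.putLong(base + 8, priceTicks);
        buf.putLong(base + 16, quantity);
    }

    public long orderId(int slot)    { return buf.getLong((slot % capacity) * RECORD_SIZE); }
    public long priceTicks(int slot) { return buf.getLong((slot % capacity) * RECORD_SIZE + 8); }
    public long quantity(int slot)   { return buf.getLong((slot % capacity) * RECORD_SIZE + 16); }

    public static void main(String[] args) {
        OrderRing ring = new OrderRing(1 << 16);
        ring.put(0, 42L, 10150L, 300L);
        System.out.println(ring.orderId(0) + " @ " + ring.priceTicks(0));
    }
}
```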
Could have written everything in C++, but everybody hated C++ so much they preferred heavily modified Java compilers and environments.
Afterwards, some parts of the core were rewritten in Rust... there were no significant performance or other gains, so the system was left as is.