CDP is great for testing. But one of the most basic checks for bot detection is checking for CDP/webdriver. It's always going to be a cat-and-mouse game. You'll see a bunch of solutions, captcha solvers, etc., but they're usually only good for a few weeks.
It sounds like you're thinking of window.navigator.webdriver, which is a WebDriver thing, not part of the Chrome DevTools Protocol. With CDP, as far as I can tell the detection mechanisms are more about heuristics, e.g. how fast a form is filled -- which this AI stuff will trigger immediately too.
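For what it's worth, the explicit flag is trivial for page JS to read; the behavioural side is where the real detection work happens. A rough sketch of the kind of check a site might run (purely illustrative, not any vendor's actual logic; the event list and the heuristic here are made up):

```ts
// Illustrative site-side check: one explicit flag plus one crude behavioural
// heuristic. Real bot-management stacks use many more signals, server-side.
let sawHumanInput = false;
for (const ev of ["mousemove", "keydown", "touchstart"]) {
  window.addEventListener(ev, () => { sawHumanInput = true; }, { once: true });
}

function looksAutomated(): boolean {
  // navigator.webdriver is the WebDriver-spec flag, set when Chrome runs
  // with --enable-automation (the Selenium / Playwright / Puppeteer default).
  if (navigator.webdriver) return true;
  // Heuristic: a form submitted with zero prior input events looks scripted.
  return !sawHumanInput;
}

document.querySelector("form")?.addEventListener("submit", () => {
  if (looksAutomated()) {
    console.log("possible bot"); // e.g. flag the session or serve a CAPTCHA
  }
});
```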
(And even if CDP had an explicit marker somewhere, surely patching that out is easier than piling up enough patches to "make a new browser".)
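To make that concrete, hiding the flag is roughly this much work (a sketch using Puppeteer purely for brevity; the same injection can be done over raw CDP with Page.addScriptToEvaluateOnNewDocument):

```ts
import puppeteer from "puppeteer";

// Drop --enable-automation so Chrome never sets navigator.webdriver at all.
const browser = await puppeteer.launch({
  ignoreDefaultArgs: ["--enable-automation"],
});
const page = await browser.newPage();

// Belt and braces: override the getter before any page script runs.
await page.evaluateOnNewDocument(() => {
  Object.defineProperty(Object.getPrototypeOf(navigator), "webdriver", {
    get: () => undefined,
  });
});

await page.goto("https://example.com");
console.log(await page.evaluate(() => navigator.webdriver)); // undefined
await browser.close();
```

Point being, it's a couple of lines, not a reason to fork a browser.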
Don't you need navigator.webdriver === true for CDP to drive automation? Maybe I need to update my understanding on this. This is usually a dead giveaway.
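A quick way to check would be to launch Chrome by hand (e.g. `chrome --remote-debugging-port=9222`, no automation switches) and attach over plain CDP, then read the flag; sketch below, using Puppeteer only as a convenient CDP client. If the flag stays false there, the giveaway is the --enable-automation switch (and headless defaults), not the DevTools connection itself.

```ts
import puppeteer from "puppeteer";

// Attach to an already-running Chrome over its DevTools endpoint.
const browser = await puppeteer.connect({ browserURL: "http://localhost:9222" });
const page = await browser.newPage();
await page.goto("https://example.com");

// Read the flag from the page context.
console.log(await page.evaluate(() => navigator.webdriver));

await browser.disconnect();
```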
With stuff like https://www.cloudflare.com/en-in/application-services/produc... and https://blog.cloudflare.com/ai-labyrinth/ there's big money on both sides, and the last thing you want is to be shadow-detected as a bot. It's all fine if you're scraping top-ranked SEO slop, which is usually static sites, but for anything beyond that it won't work well eventually. There are quite a few issues on the Browserbase, crawl4ai and similar repos around being detected as a bot.
It's built really well without exposing webdriver etc., can comfortably run JS and communicate with LLMs, and has full agentic capabilities.
Why a new browser instead of a robust extension?