yencabulator
Why a new browser extension for Chrome instead of an MCP operating Chrome over Chrome DevTools Protocol?

https://chromedevtools.github.io/devtools-protocol/

Not vouching for this project, but just an example of the category existing: https://github.com/AgentDeskAI/browser-tools-mcp


Tsarp
CDP is great for testing. But one of the most basic checks for bot detection is checking for CDP (webdriver). It's always going to be a cat and mouse game. You'll see a bunch of solutions, captcha solvers etc., but they usually are only good for a few weeks.
yencabulator OP
There's no reason why the same cat and mouse game wouldn't apply to this browser as a whole.
Tsarp
True, but it's orders of magnitude less of a problem, when the webdriver flag is an extremely basic bot check that is now considered 101.
yencabulator OP
It sounds like you're thinking of window.navigator.webdriver, which is a WebDriver thing, not part of Chrome DevTools Protocol. With CDP, as far as I can tell, the detection mechanisms are more about heuristics, e.g. how fast a form is filled -- which this AI stuff will trigger immediately too.

(And even if CDP had an explicit marker somewhere, surely patching that out is easier than piling up enough patches to "make a new browser".)
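
For context, the check being talked about is literally a one-property read in page JavaScript. A minimal sketch of what a site-side script might do (the function name and the logging are just illustrative):

    // navigator.webdriver is set to true by WebDriver-based automation
    // (e.g. Selenium / unpatched chromedriver). Driving Chrome purely over
    // CDP does not set it, which is the distinction I'm making above.
    function looksLikeWebDriver(): boolean {
      return navigator.webdriver === true;
    }

    if (looksLikeWebDriver()) {
      // A real site would score or challenge the session here;
      // logging stands in for that in this sketch.
      console.log("webdriver flag present -- likely automated");
    }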

Tsarp
Don't you need navigator.webdriver === true for CDP to drive automation? Maybe I need to update my understanding on this. This is usually a dead giveaway.
yencabulator OP
I see mentions that (unpatched) webdriver is easy to detect, but that detecting CDP only works via heuristics on timing etc.
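
To make "heuristics on timing" concrete, here's a toy sketch (thresholds and structure are made up, not taken from any real bot-detection product) of flagging a form that was filled faster than a human plausibly types:

    // Record how long each input field stays focused before its value is committed.
    const fieldFillTimes: number[] = []; // milliseconds per field

    document.querySelectorAll("input").forEach((input) => {
      let focusedAt = 0;
      input.addEventListener("focus", () => { focusedAt = performance.now(); });
      input.addEventListener("change", () => {
        fieldFillTimes.push(performance.now() - focusedAt);
      });
    });

    function looksAutomated(): boolean {
      // Humans rarely complete every field in under ~100 ms.
      return fieldFillTimes.length > 0 && fieldFillTimes.every((t) => t < 100);
    }

An LLM driving the page over CDP, an extension, or anything else would trip this just as easily, which is the point.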
Tsarp
With stuff like https://www.cloudflare.com/en-in/application-services/produc... and https://blog.cloudflare.com/ai-labyrinth/ there's big money on both sides, and the last thing you want is to be shadow-detected as a bot. It's all fine if you are scraping top-rated SEO slop, which is usually static sites, but for anything beyond that it won't work well eventually. Quite a few issues on Browserbase, crawl4ai and similar repos are about being detected as a bot.
