Well, if you're blocking access to their crawler, I'd imagine they'd have no need for an incognito crawler to check for malicious content. Why would they care about content that isn't ending up in their index anyway?
Presumably, the incognito crawlers are only used on sites that have already granted the regular crawler access. That's content that ends up in their index, which is exactly what they want to vet.
Google has numerous robots that don't say Googlebot in the user agent; they look just like Android phones. That's how they spot malicious sites, sites trying to game SEO, or whatnot. They don't fall within Google's published CIDR blocks and appear to just use ordinary wireless networks.
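The flip side of this is that genuine Googlebot traffic can still be verified, since Google documents a reverse-then-forward DNS check for it. A minimal sketch of that check (the function names here are my own, not an official API):

```python
import socket

# Google's documented crawler hostnames end in one of these domains.
GOOGLE_CRAWLER_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(host: str) -> bool:
    """Pure string check: does this hostname end in a Google crawler domain?"""
    return host.rstrip(".").endswith(GOOGLE_CRAWLER_SUFFIXES)

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS verification, as Google recommends:
    1. Reverse DNS on the IP must yield a Google crawler hostname.
    2. Forward DNS on that hostname must resolve back to the same IP.
    """
    try:
        host, _aliases, _ips = socket.gethostbyaddr(ip)
        if not hostname_is_google(host):
            return False
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        # No reverse DNS record, lookup failure, or timeout.
        return False
```

Note that the suffix check alone is not enough (anyone can name a host `googlebot.com.evil.example`); the forward-resolution step is what makes it hard to spoof. None of this helps with the stealth crawlers described above, of course, since those deliberately don't identify themselves.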
I'm picturing Google Street View cars driving around with a box of Pixels in the back, connecting to open WiFi and trying sites, and that being why Google can now narrow down your location from which SSIDs are visible.