Consider that any such detector, or even detection method, that is made available becomes a tool for fine-tuning models to evade it.
Likewise, "spinning" tools that create slightly modified content (for black-hat SEO, or to avoid duplicate detection) have been a thing for years, and you can bet that tools taking model output and applying small models to mutate its wording without corrupting the meaning will be a big market, with evading AI detectors as part of their feature set.
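To make the idea concrete, here is a deliberately crude sketch of what a spinner does at its simplest: substitute words from a synonym table while leaving the rest of the sentence intact. The `SYNONYMS` table and `spin` function are hypothetical illustrations; real spinning tools use small paraphrasing models rather than lookup tables, but the goal is the same — change the surface form, keep the meaning.

```python
import random

# Hypothetical toy synonym table; real tools use paraphrasing models.
SYNONYMS = {
    "big": ["large", "sizable"],
    "fast": ["quick", "rapid"],
    "use": ["employ", "utilize"],
}

def spin(text, seed=0):
    """Replace known words with a randomly chosen synonym."""
    rng = random.Random(seed)
    out = []
    for word in text.split():
        choices = SYNONYMS.get(word.lower())
        out.append(rng.choice(choices) if choices else word)
    return " ".join(out)

print(spin("use a big fast model"))
```

Even this trivial substitution shifts token-level statistics, which is exactly what frequency-based detectors key on; a model-based paraphraser shifts them far more thoroughly.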
It's an unwinnable battle that just produces competitive pressure toward making AIs write more like humans.