That may be true in principle, but irrelevant with respect to Apple's CSAM process. Unless the exact material is explicitly catalogued by NCMEC or another child safety organisation, there won't be a hash match.
This isn't a porn detector strapped to a child detector.
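To illustrate the point: Apple's system uses NeuralHash, a perceptual hash, but the core logic is plain set membership against a catalogue of known hashes. This sketch substitutes SHA-256 for NeuralHash (an assumption for brevity; a real perceptual hash tolerates resizing and re-encoding), and the catalogue contents are hypothetical:

```python
# Minimal sketch of catalogue-based hash matching (illustrative only).
# SHA-256 stands in for Apple's NeuralHash; the point is that detection
# is set membership, not content analysis: no catalogue entry, no match,
# regardless of what an image depicts.
import hashlib

# Hypothetical catalogue of hashes supplied by NCMEC or another
# child safety organisation.
known_hashes = {
    hashlib.sha256(b"catalogued-image-bytes").hexdigest(),
}

def is_match(image_bytes: bytes) -> bool:
    """Return True only if this exact image's hash is catalogued."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# A novel image produces no match, however objectionable its content:
print(is_match(b"some-new-uncatalogued-image"))  # False
print(is_match(b"catalogued-image-bytes"))       # True
```

A classifier would have to decide what an image *is*; a hash lookup only decides whether it has been seen and catalogued before.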