> Certain images are CSAM by _context_. They do not necessarily require those within the image to be abused, but rather that the image at one time or another was traded alongside other CSAM.
At the risk of sounding like a broken record, how can we know this is actually true? Every description of the NCMEC database's contents that I've seen is incredibly vague, and as of 2019 it seems like there were fewer than[1] 4 million total hashes available. I would think that if it genuinely did include innocent photos of people's kids, the number would be much higher.
> ...certain well-known images are known to flag. Such as Nirvana's controversial cover for Nevermind.
I've heard this multiple times now, but I've never been able to find any evidence of it actually happening. The only instance I could find was Facebook removing[2] that Nirvana cover once for containing nudity.
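For what it's worth, "flag" here means a perceptual-hash match, which is a similarity comparison rather than exact equality, so false positives on lookalike images are at least conceivable in principle. A rough sketch of that kind of matching using the open-source imagehash library (the filenames and threshold are made up for illustration; the actual NCMEC/PhotoDNA hashes and tolerances aren't public):

    import imagehash  # pip install imagehash pillow
    from PIL import Image

    # Hypothetical files; this demonstrates the matching mechanism
    # only, since the real hash database isn't available to test against.
    known_hashes = {imagehash.phash(Image.open("known_flagged.png"))}

    candidate = imagehash.phash(Image.open("candidate.jpg"))

    # Perceptual hashes are compared by Hamming distance against a
    # threshold, not with ==, so near-duplicates also match.
    THRESHOLD = 8  # bits; an illustrative value, not PhotoDNA's
    if any(candidate - h <= THRESHOLD for h in known_hashes):
        print("would flag")

That similarity tolerance is exactly why claims like the Nevermind one are hard to evaluate without access to the database itself.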
1. https://inews.co.uk/news/technology/uk-us-collaborate-crack-...
2. https://www.theguardian.com/music/2011/jul/28/facebook-nirva...