I'm not sure why people are using AI for a task far better/easier done with a normal image editor. The badge could have been just superimposed at the top; it doesn't have to look like it was part of the original image. And if the photo carried legal implications (not this time), AI alteration would ruin that.
Edit. I'm pretty sure every AI photo editing app out there explicitly advertises that they use AI. Nobody wants to hide that feature. So I rather doubt the police claim that they thought they were using a normal photoshop app.
I don't see it as lying, just not being very handy with the tooling. As for why the patch wasn't there in the first place — that sometimes happens. Was it needed/required for a shared photo? Dunno. It doesn't look like a big deal, considering the photo wasn't used in legal proceedings.
> "This is NOT an AI-generated photo," Westbrook Police declared on Facebook when first questioned about oddities in their photos of seized meth and fentanyl. They doubled down, insisting "Westbrook PD is not and would never generate an AI photo to try and depict evidence."
If it were just a simple copy/paste job with the patch, that's one thing. But the whole image has been turned into AI slop, with certain items moved, changed, or excluded. This just isn't an acceptable pattern for law enforcement communication of any kind.
You don't even need specialized software for this. You could literally do this in Microsoft PPT. This is like bringing an atomic bomb to a fist fight — it's totally insane how AI is being treated like a Swiss Army knife.