Hacker News

If you see talking heads with static, simple, or blurred backgrounds from now on, assume they could be fake. In the near future these fakes will come with realistic backgrounds and even fewer detectable artifacts; we will have to assume any video could be faked.



I wonder how video evidence in court is going to be affected by this, from both a defense and a prosecution perspective.

Technically, videos could have been faked before, but it required a ton of effort and skill that no average person had.


Just as before, a major part of photo or video evidence in court is not the footage itself, but a person testifying: "on that day I saw this horrible event, where these things happened, and here's the footage I filmed, which illustrates some details of what I saw." That testimony would carry weight even without the photo or video, though the added detail obviously helps.

Courts already wouldn't generally admit random footage without clear provenance.


There will be a new cottage industry of AI detectives serving as expert witnesses, attesting to the authenticity of media in court.
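One concrete building block for attesting that footage is unaltered is cryptographic hashing: if a digest of the file is recorded at capture time, anyone can later verify the submitted clip is bit-identical to the original. A minimal sketch in Python (the capture-time digest registry is an assumption here, not a real service):

```python
import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# A submitted clip matches the recorded original only if the digests
# are identical; any re-encode, trim, or edit changes the hash.
```

This only proves integrity against a trusted original digest; it says nothing about whether the original itself was synthetic, which is where the expert-witness analysis would come in.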


I still find the faces themselves really obviously wrong. The audio is just off: close enough to tell who is being imitated, but not particularly good.


Especially the hair "physics" and sometimes the teeth shift around a bit.

But that's nitpicking. It's good enough to fool someone not watching too closely. And the fact that the result is this good from a single photo is truly astonishing: we used to have to train models on thousands of photos for days, only to end up with a worse result!


It's interesting to me that some of the long-standing tells are still there: for example, lots of people with an earring in only one ear, unlikely asymmetry in the shape or size of their ears, etc.



