Hacker News

> why wouldn't we trust that these image hashes can't be reversed back into images?

You should avoid making political or moral arguments when your technical knowledge is 10 feet below your ego. An image is orders of magnitude larger than a password, meaning that many more collisions exist. Also, because it's a perceptual hash, even more collisions. One does not simply reverse a perceptual hash. Keep your hat on ;-)




>You should avoid making political or moral arguments

I am not being political or moral. Why would you think that? Is this one of those "since you don't immediately agree with me it means you are the enemy and disagree on everything" type things?

>An image is orders of magnitude larger than a password, meaning that many more collisions.

The whole point of hashing algorithms is that they produce unique hashes for different inputs. Wouldn't a larger input value (bits of the image) make it harder to have collisions?

>One does not simply reverse a perceptual hash

So what actually is the privacy concern here? There is no way to get the original image from the hash; at best you can get an approximation of it.

>Keep your hat on ;-)

I guess this is supposed to be some kind of dig at me. No need to be an asshole even if you don't agree with someone.


> The whole point of hashing algorithms is that they produce unique hashes for different inputs. Wouldn't a larger input value (bits of the image) make it harder to have collisions?

Standard cryptographic hashes should have certain properties, like the avalanche effect: https://en.wikipedia.org/wiki/Avalanche_effect
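To make the avalanche effect concrete, here's a small sketch (using SHA-256 as a stand-in for any standard cryptographic hash): flipping a single bit of the input changes roughly half the output bits.

```python
# Sketch: the avalanche effect in a cryptographic hash.
# Flipping one input bit should change roughly half the output bits.
import hashlib

def bits(b: bytes) -> str:
    """Render bytes as a binary string for bit-by-bit comparison."""
    return "".join(f"{byte:08b}" for byte in b)

a = b"image data"
b_ = bytearray(a)
b_[0] ^= 0x01  # flip a single bit of the input

ha = bits(hashlib.sha256(a).digest())
hb = bits(hashlib.sha256(bytes(b_)).digest())
diff = sum(x != y for x, y in zip(ha, hb))
print(f"{diff} of {len(ha)} output bits differ")  # typically around 128 of 256
```

A perceptual hash deliberately lacks this property: a one-bit change to an image should leave its hash essentially unchanged.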

However, a perceptual hash desires the opposite. Due to the pigeonhole principle, a larger space mapping to a smaller space must involve collisions. In fact, any hash over unbounded inputs has infinitely many collisions; one just tries to design hashes that don't have many collisions on smaller length inputs. Ultimately though, if the hash has an n-bit output, then for inputs m bits long (m > n) there will be on average 2^(m-n) inputs mapping to each hash value. You can easily calculate this by looking at the excess in size between the input space and the output space.

https://en.wikipedia.org/wiki/Pigeonhole_principle
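The pigeonhole counting above can be sketched directly: feed 1000 inputs through a toy 8-bit hash (SHA-256 truncated to one byte, chosen here just for illustration) and count how many inputs must collide.

```python
# Pigeonhole principle sketch: 1000 inputs into 256 possible hash
# values guarantees at least 1000 - 256 = 744 colliding inputs.
import hashlib
from collections import Counter

def tiny_hash(data: bytes) -> int:
    # Truncate SHA-256 to its first byte: only 256 "pigeonholes".
    return hashlib.sha256(data).digest()[0]

counts = Counter(tiny_hash(str(i).encode()) for i in range(1000))
collisions = 1000 - len(counts)  # inputs that share a bucket with an earlier one
print(f"{len(counts)} distinct hashes, {collisions} colliding inputs")
```

The same arithmetic applies to any real hash; the gap between input space and output space is just astronomically larger, so the collisions are spread thin.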

Perceptual hashes are essentially designed to collide on similar data. The fine details are lost. An ideal perceptual hash algorithm would quantize as many alterable properties of an image as possible. Contrast, brightness, edges, hue, fine details, etc. In the end, you have a bunch of splotches in a certain composition that form the low dimensional eigenbasis of the hash.
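As an illustration of "collide on similar data," here is a minimal average-hash ("aHash") sketch, one of the simplest perceptual-hash schemes (not the scheme under discussion, which is far more sophisticated). Fine detail is discarded: the image is shrunk to 8x8 and each pixel contributes one bit, set by whether it is brighter than the mean.

```python
# Minimal average-hash ("aHash") sketch on a pure-Python grayscale
# image (2D list of brightness values). Similar images -> similar bits.

def average_hash(gray, size=8):
    """Return a 64-bit perceptual hash of a 2D grayscale image."""
    h, w = len(gray), len(gray[0])
    # Crude nearest-neighbour downscale to size x size: fine detail is lost.
    small = [[gray[y * h // size][x * w // size] for x in range(size)]
             for y in range(size)]
    mean = sum(sum(row) for row in small) / (size * size)
    bits = 0
    for row in small:
        for px in row:
            bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two "images": a diagonal gradient, and the same gradient brightened.
img1 = [[x + y for x in range(32)] for y in range(32)]
img2 = [[x + y + 3 for x in range(32)] for y in range(32)]
print(hamming(average_hash(img1), average_hash(img2)))  # -> 0
```

Brightening every pixel shifts the mean by the same amount, so the hash is unchanged: exactly the kind of alteration-invariance a perceptual hash is built for, and exactly why unrelated images can also land close together.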


Thanks for that. I still don't get why this is such a big privacy concern for everyone.


You're welcome. I think the crux is that people are worried about the police knocking at their door and/or inspecting their photos due to mismatching on this purportedly unreliable hash method.


That feels like purposefully misunderstanding the situation. The hash match only escalates the matter to a human inspection, and in no sane world is that inspection done physically. It means that if your picture approximately matches a known abuse picture, someone gets to view it to verify whether it contains abuse or not, and then potential legal action can be taken.

Police knocking down your door will only happen if you are found to possess multiple abuse pictures and they suspect you might be endangering people.


Could be as simple as you bathing your kid in the tub. Why cede power to arcanists?


For one, I don't think you should be taking photos of your kids in the tub; no one is going to want to look at those. Secondly, if you manage to capture the same position and the same angle as an actual abuse picture, there might be something wrong with your bathing routine.

This just feels like people constructing an imaginary boogeyman out of this.


I both understand and acknowledge your position. I agree that this technology can do a lot of good if used for its intended purposes. However, it does strike me as a bit overreaching.

The question really is, will it be used for its intended purposes, or will it become like roadside drug-sniffing dogs? A common nickname for those is "probable cause on 4 legs."

https://reason.com/2021/05/13/the-police-dog-who-cried-drugs...

If it is overly sensitive, or if Apple or the government expands the list to include much more than what it was initially slated for, it's possible that the false detections will be used for escalation and ultimately result in parallel constructions for crimes like drug possession.

To clarify - here's an arcane scenario. Authorities receive an alert and are given permission by Apple to look at all photos taken the same day as the photo in question. The photo in question is a false match but the rest of the photos that day are still inspected. The agents observe a photo of a bag of white powder, and use it to procure further investigations/warrants. The target is later arrested for possession once physical evidence is found.

https://en.wikipedia.org/wiki/Parallel_construction

I find the laws in this nation, particularly drug laws, to be quite obtuse. I don't think it is wise to give any additional enforcement capabilities to those that are continuing to deprive citizens of life and liberty over archaic drug laws.

I hate pedophiles and child abusers as much as anyone else should. I'd execute the abusers if we are being totally honest. It's just that this technology strikes me as a potentially Orwellian device.


I don't get why they would get all of the images taken that day if the flagged image doesn't actually contain any abuse material.

This might not be true, but I've always assumed that if I am suspected of something, the police can already access all my iCloud (or any other cloud storage) images with a court order.


Yes, they'd get access to every photo you've ever taken if you suffer a hash collision.

What court order?

Apple spots the hash collision, snitches on you, and then police get all your photos. Isn't that how this is supposed to work?


Has it been confirmed that police get all of your media in case of a collision, and not just the offending image? Do they even get the offending image?


I'd be willing to bet they get the entire archive if there's an offending image.



