Telling people you're doing something, but giving them no other options, isn't good enough.
There are roads near me which do "toll by mail" where they snap a picture of your license plate, and then bill you the toll even if you don't have EZ-Pass. There are signs which announce this, but unless you're planning to drive 90 minutes out of your way, those sections of highway aren't avoidable.
Even if there are other viable options, the average citizen doesn't really understand the implication of losing their privacy, so they can't make an educated decision about whether to give it up. A simple sign doesn't solve that fundamental problem.
Sure, it's a step in the right direction, but you said "I'd be content with...", i.e. you're content with stopping there, which isn't nearly a good enough spot to stop.
So in the end, people who don't want their movements tracked by the government are left with small back-in-the-woods roads, while the new "highway toll" is having your data harvested?
"Nothing to hide" is indicative of condoning inverted/actual totalitarianism, until the Chinese-inspired tracking system follows you around every moment of your life and gives you a social credit score that tells you that you can't fly on a plane or take a train. Or looks extra hard for any technicality felonies you commit unwittingly in your professional career as reprisal if you speak out against their abuses. So maybe you might want some privacy now?
Also, I guess you won't mind publishing all of your passwords, broadcasting your physical location at all times, never wearing clothes, living in a transparent house, and being video and audio recorded 24x7 either. (Yes, it's unreasonably absurd ad infinitum.) Still don't want any privacy? How about other people get to have as much privacy as they want, and stay the heck off my lawn? ;-P
Having a person follow you around is not the same as an AI; you're comparing apples and oranges. I think having laws to control how and when the data is accessed, a degree of transparency into that access, and safeguards/oversight is reasonable. To simply say that we shouldn't use the technology is Luddite fearmongering.
Should we have kept filing cabinets full of papers instead of databases for tracking criminals across state lines as well? See? I can make bullshit comparisons too!
2. I suppose you would be OK, then, with a police officer tailing you everywhere you go, recording everything you do, everything you say, and to whom, and peering in the windows of your home if you forget to keep your blinds drawn? Just in case that information happens to be useful to a government at some point in the future?
Yep, and it's also not reasonable to say "if you go outside, you accept total surveillance," since going outside is a requirement for life.
I find it acceptable that if I go outside I might get caught on some store security camera, and that the video sits on a hard drive for a week and then gets written over. I do not find it acceptable that the store camera recognizes my face and stores my presence in an easily searchable, internet-connected database.
There seems to be a lot of conflation on this subject between "expectation of privacy" in the practical sense and "expectation of privacy" in the legal sense. Like, you'd better fuckin believe I expect not to be followed by cops or secret agents everywhere I go. I expect it on the persistence forecast basis and because I understand that it's practically impossible to allocate manpower in that way, assuming they don't think I've done anything wrong in particular.
I also expect that the government, local and federal both, will try to erode my effective privacy in any and every way they can afford and get away with.
None of that has much to do with e.g. the fact that legally, if a cop peers in my window and sees bales of cocaine stacked on the floor of my living room, or whatever, he can come in and take them from me and arrest me.
> Expect: 1. To look for (mentally); to look forward to, as to something that is believed to be about to happen or come; to have a previous apprehension of, whether of good or evil; to look for with some confidence; to anticipate; -- often followed by an infinitive, sometimes by a clause (with, or without, that).
Yeah, that's the point. In justifying this sort of ubiquitous surveillance, the phrase "expectation of privacy" (and related verbiage) is being grossly abused.
Your mobile phone provides much better tracking than anything that can be achieved with FR. Your mobile is a giant microphoned fink you're carrying around, already providing exactly your worst imaginings of what FR might eventually become.
Counter-question: do you expect, or are you OK with, the idea that the moment you leave your house you are constantly followed by a person with a camera recording everything you do, which shops you visit, etc.?
Because that is basically what this will sooner or later boil down to, except more subtly (i.e. instead of having a single person follow you, you have innumerable cameras sharing the recording burden but producing the _same end result_).
Also, you might want to say you have nothing to hide.
But consider how "god" the cyber security of the police is (or most times is not). Also consider that there are frequently cases of police misbehaving by e.g. stalking or discriminating. (I mean there is a lot of police and they are human so it would be surprising if there wouldn't be such cases). Lastly consider how long it will take until police stations would want to sell some most likely very badly anonymized data about people in the city to e.g. shops.
People expect a degree of anonymity in public as well--disappearing into crowds or groups of people.
That too will be gone soon, once biometric recognition services are in ordinary people's hands. I suspect it won't be long until you can arbitrarily pull out your phone, grab some video/image and/or audio data, pass a few (hundred) signature markers off to a service, and identify people along with a variety of metadata associated with them.
>> Counter-question: do you expect, or are you OK with, the idea that the moment you leave your house you are constantly followed by a person with a camera recording everything you do, which shops you visit, etc.?
>> Because that is basically what this will sooner or later boil down to
Worth pointing out that this applies predominantly to the US. In many countries, pictures in public can only be taken with the individual's consent; Germany, for example, along with many other European countries, has a 'right to your own image/likeness' or a concept along similar lines.
That said, the difference between law enforcement collecting your data and private citizens doing so should also be obvious.
Only if you purposely ignore the power of new tech. You might as well say nothing changed with the invention of earth-moving machinery because you could have done the same thing with a shovel.
I don't think he's missing the point so much as you're missing his, possibly intentionally.
The point is that the degree to which something is possible at scale has an impact on the practical applications of that thing, and therefore on the people subject to its application. Viz., earth moving machinery (and related engineering) has made possible things that would previously have required orders of magnitude more time and/or manpower, likely making them economically infeasible except in rare cases where money was no object.
He didn't say anything about property rights. That was you... for some reason.
Our existing laws and ideas of privacy were built around the limitations of the tech of their time. Yes, you could take a photo of a random person in public, but it wasn't very useful or harmful to do so. Now that we have supercharged spyware tech, we have the ability to cause a lot of harm without violating any existing laws, because such a thing simply wasn't possible before.
We actually had the anonymity of crowds back then, which facial recognition tech circumvents.
The problem is also that there are a LOT of laws, and it's very easy to manufacture a crime.
I would also suggest reading about people who have been subject to unwarranted surveillance just because of their views, and the damage it does to them. A good example is the environmental groups in the 90s in the UK.
Saying technological changes don't subvert the underlying structure of a concept is like saying war didn't change after the introduction of gunpowder or flight.
These technological changes add additional dimensions. They may not radically change the initial structure, but they lead to radically different outcomes than anticipated, outcomes which society doesn't have to accept.
The camera was invented and this became a problem for public figures. It became more of a problem with digital photography. Much more of a problem with camera phones. And became the privacy nightmare we are now dealing with around the rise of Twitter and Instagram.
I'm expecting not to be tracked by 50 Axis cameras running analytics software the moment I step out my front door and go for a walk or run in my city. To not have my path tracked or reviewed should someone unaccountable decide to do so.
Snowden has a great quote that points out how ridiculous that line of thinking is:
"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say."
Title is somewhat misleading, although not intentionally so:
> The Privacy and Civil Liberties Oversight Board (PCLOB), an independent agency, is coming under increasing pressure to recommend the federal government stop using facial recognition.
The PCLOB exists to advise the government on compliance with civil liberties and privacy principles, not to recommend bans on private activities. This proposal would only restrict federal agencies, not third parties--unlike the EU's proposals, which would also apply to private data processors and companies.
Does this pertain specifically to facial recognition or does this also include other biometric markers? If solely facial recognition, then this means jack shit, as people will be identified with other measurements.
Indeed, since public officials are actually accountable for the things they do, not the other way around. I would be cautious though, because many people just wouldn't want to become public servants anymore.
If they don't want to be tracked while performing their job responsibilities, then I think we don't want them as public servants, so that's a good outcome. "Power comes with responsibility" is a good system.
Her comments were targeting human recognition, and I speculate that one of the reasons she was allowed to share her disguise techniques is that maybe they're ineffective against automated facial recognition and therefore obsolete.
And bad journalism gets repeated: 'gait recognition' is a fraud at this point. It was something until it was proved unreliable and easily defeated several years ago. Nobody peddling gait recognition is taken seriously; it's almost a line in the sand separating technical idiots trying to cash in, mere tech opportunists, from actual members of the security industry.
Easily defeated if you make the effort. Do you want a world where we have to consciously change our gait to "easily defeat" the recognition AI? If not, then it should be banned too.
So, one has to be on a scooter and inside a half-transparent balloon. Almost like the haut-women of Cetaganda. Naturally, it was allowed only for that higher caste.
It would be helpful to have centralized reporting on which entities possess hashes or content that can be used to derive biometric telemetry; tools that allow opt-in for of-age adults and offer accessible, no-hassle options to destroy said data; and severe penalties for violating said agreements (e.g. cannot do business in the region until data is purged to standard).
I’d err on the side of “DELET THIS” ahead of worrying about the identity of the requestor.
I'm using OpenCV with face detection running in the browser via JavaScript. The technology is here and isn't going away.
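For a sense of how little code that takes, here is a minimal sketch of the same idea, written in Python with OpenCV's bundled Haar cascade rather than the in-browser JavaScript setup described above. The camera index and cascade file are assumptions, and this only detects faces; it doesn't identify anyone.

    # Minimal face *detection* (not recognition) using OpenCV's bundled
    # Haar cascade. Camera index 0 and the stock cascade file are assumptions.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam; adjust for your setup

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # detectMultiScale returns one (x, y, w, h) box per detected face.
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()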
Realistically, smartphones are tracking everyone all the time and provide rich datasets. Camera, microphone, and GPS. If you turn off location services your phone still connects to cell towers, WiFi access points, and Bluetooth beacons, as you drive or walk around throughout your day.
I was just thinking the other day how you won't even have to tap your card to pay for something - just walk up to the register. The future is bright, but more likely dark and terrifying.
The future is dark and full of terrors. 1984 and Fahrenheit 451 only scratched the surface. I truly expect that in the future we'll voluntarily put ourselves into the Matrix or the Oasis. It will be the only way to cope, because real life will have become so controlled.
And for your correct understanding: FR is not authoritative; it is not secure for financial transactions. Any company implementing FR alone as authentication to access sensitive, personal, or financial data is stupendously foolish. Apple's FaceID is not 2D; it uses additional 3D data and multiple frames, so it is more secure, but not enough, in my opinion, for financial access.
What are you saying? Do you oppose emissions controls because Volkswagen lied? If I rob a bank despite that being against the law, perhaps that makes you think bank robbery should be legal.
Not especially - although I also feel they were unfairly villainized, because the cars were still WAY cleaner than stuff built the year before the new emissions laws. I'd also note that subsequently just about every other manufacturer was caught doing variations of the same thing.
Vehicle tests (emissions especially, but MPG also) are highly synthetic measurements that aren't very representative of real-world conditions.
But to come back to the point at hand - a ban on "facial recognition" is so vague that it doesn't actually accomplish anything - the companies will still do it, they'll just call it something else. "Biometrics", "Posture analysis", "Location monitoring".
Historically, government gets tech wrong, and getting it wrong fast just makes the situation worse.
I was around when the emissions rules were less strict. If you compare the air back then with today's, you quickly see that environmental regulations were a big success. Companies fought them tooth and nail at every step, but the rules worked.
Facial recognition tech could save so much time and money though. Imagine how many man hours would be saved if teachers no longer had to take attendance? Shoplifting would hit all time lows since you wouldn't be able to get away with it short of wearing a mask into the store.
Students? You're proposing we track students? It says something that the first example that's used (not just you, I've seen it many times elsewhere, too) is that we track the only group of people who by-and-large cannot consent, and not only that, but in a way that has permanent implications for their right to privacy.
Further, the idea of facial recognition being used in every store is atrocious. Not only can they track you online, now advertisers and the government can track your offline spending habits and location at all times, too!
Why hasn't anyone proposed a use for facial recognition that isn't:
1. Useless (like it is in iOS)
2. Completely scummy (like it is in both of your examples)
3. Using it to increase surveillance on populations that cannot legally offer consent in any way, in areas that they're forced to go to?
I kind of expect it here on HN, given how large of a proportion of adtech and other surveillance-oriented employees there are here, but I see it everywhere, and it's just confounding.
> Students? You're proposing we track students? It says something that the first example that's used (not just you, I've seen it many times elsewhere, too) is that we track the only group of people who by-and-large cannot consent, and not only that, but in a way that has permanent implications for their right to privacy.
As if we don't already track students? You just think it's somehow better if a fleshy neural net violates their consent and privacy by manually checking up on them when they're truant because... why?
And who says everything must always be taken to the extreme? Just because facial recognition technology exists doesn't mean it will always be abused. I'm a volunteer secretary at a local community center, and I'm sick of manually performing facial recognition and updating a google doc to track attendance. It wastes an hour of my time every week. Is it so bad I want a raspberry pi attached to a camera that can auto-update the google doc for me?
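For what it's worth, a minimal sketch of that kind of setup is below. It assumes the third-party face_recognition and gspread libraries, a Google Sheet (rather than a Doc) reached via a service account, and hypothetical file and sheet names; whether a Raspberry Pi is fast enough for the encoding step is exactly the question raised further down, so treat it as illustrative only.

    # Sketch of a camera-to-spreadsheet attendance logger.
    # Assumptions: face_recognition and gspread are installed, a service-account
    # credentials file exists, the sheet is named "Attendance", and there is a
    # folder of known member photos named members/<person>.jpg.
    import datetime
    import glob
    import os

    import cv2
    import face_recognition
    import gspread

    # Load one reference encoding per known member.
    known_names, known_encodings = [], []
    for path in glob.glob("members/*.jpg"):
        image = face_recognition.load_image_file(path)
        encodings = face_recognition.face_encodings(image)
        if encodings:
            known_names.append(os.path.splitext(os.path.basename(path))[0])
            known_encodings.append(encodings[0])

    # Open the attendance sheet (Google Sheets, via a service account).
    sheet = gspread.service_account(
        filename="credentials.json").open("Attendance").sheet1

    # Grab a single frame from the camera and look for known faces.
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if ok:
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # face_recognition expects RGB
        for encoding in face_recognition.face_encodings(rgb):
            matches = face_recognition.compare_faces(known_encodings, encoding)
            for name, matched in zip(known_names, matches):
                if matched:
                    sheet.append_row([name, datetime.datetime.now().isoformat()])

Running it once per session (e.g. from cron at the start of each meeting) keeps it to a single frame grab rather than continuous tracking, which is also the least invasive way to do what the comment asks for.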
There's a difference between "We're writing your name down in a spreadsheet" and the inevitable "Your facial structure is being processed by a private company that is absolutely going to share it with anyone who asks" that happens with the types of companies that contract with schools.
Even assuming that they were decent companies, the Fed has shown a tendency to force companies to give up fingerprints, DNA, and so on: there's no chance that they wouldn't do the same for facial recognition data.
> And who says everything must always be taken to the extreme?
Your only proposals were extremely extreme, and neither of them were particularly rare desires for this tech.
> Just because facial recognition technology exists doesn't mean it will always be abused.
Sure, like I said, there are some useless uses of it, too, like facial unlocking. What you're proposing is an abuse, however.
> Is it so bad I want a [Raspberry Pi] attached to a camera that can auto-update the [Google Doc] for me?
While your proposed use, facial recognition for presumably consenting adults, is more mundane than the initial two proposals, I spent an hour looking for an example of facial recognition that was accurate and detailed enough to tell the difference between individuals and that ran on a Raspberry Pi, and came up blank. It seems as if the hardware is too underpowered to do so (or, at least, the current tooling is too bloated), which means that you'd end up processing it in the cloud or similar, which would still mean their data was leaving your control; that is pretty bad, I'd say.
> There's a difference between "We're writing your name down in a spreadsheet" and the inevitable "Your facial structure is being processed by a private company that is absolutely going to share it with anyone who asks" that happens with the types of companies that contract with schools.
I don't see why that is inevitable. My high school in 2007 had its own IT team and kept all of its data on-premises (this was before "the cloud", but still, you could just mandate that all facial recognition models have to be stored/processed on-premises).
> Even assuming that they were decent companies, the Fed has shown a tendency to force companies to give up fingerprints, DNA, and so on: there's no chance that they wouldn't do the same for facial recognition data.
Yeah, it's called warrants. I don't see what is wrong with that. If the FBI wants to know if a student was in school on a certain day, I don't see how a warrant for checking facial recognition logs is any different from checking a manually maintained spreadsheet, other than the latter being more error prone.
> I spent an hour looking for an example of facial recognition that was accurate and detailed enough to tell the difference between individuals and that ran on a Raspberry Pi, and came up blank
I haven't done it yet, but this seemed good enough for my purposes[1]? As far as I can tell, that runs completely on a Pi with no cloud resources...
> which would still mean their data was leaving your control; that is pretty bad, I'd say.
Even if I needed to use cloud compute, I take issue with your phrasing. I don't believe people "own" information about themselves. If we are both in public, and I take a picture of you, write down facts about you, or otherwise observe you, I have not stolen anything of yours, and I do not need your consent to have done so.
The annoying thing about the emphasis is that there is an obvious use for it that won't suffer from an overload problem: personal use for those of us with terrible face memory/recognition and a terrible ability to manually attach a name to a face.
That would pair naturally with AR. But AR has already flopped due to several issues, even without the added camera needed, plus the issues of combining it with existing eyeglasses.
Really, the Glass backlash was bizarre, along with the post-9/11 attitude of "anyone pointing a camera near anything man-made must be a terrorist," yet nobody bats an eye or does more than snark at the surveillance cameras everywhere. There is probably some psychological principle behind it, but the simplest explanation is that "normal people" are fundamentally fucking insane and only care about conformance to norms instead of actual harm.
The Glass backlash was for the wrong reasons, but right overall. There are plenty of individuals who've made their own setups for this, and setups that don't share with Google, at that. Glass, though, was definitely harmful.
I hate to cross-reference as that's technically breaking the rules, but their post history doesn't exactly suggest that they're being sarcastic, and their (three-person, all related) company's privacy policy seems to be "We're taking your data, and hell yes we're going to sell it."
The privacy policy I have on my website is used for exactly one app in the Google/iOS App store and is based on a template one I found on GitHub. FWIW, that one app collects exactly zero data on users other than whatever Unity collects for serving opt-in ads.
As far as I can tell, the core axiom of privacy activism is that it's a given that the trade-off is not worth it. It's why so much of the digital privacy movement is people who don't live in bad neighborhoods.
Of course, that is more a Maslow's-hierarchy-of-needs issue. The truth is that giving away privacy wouldn't help in bad neighborhoods, because the reason it is a bad neighborhood is that nobody with power cares. You will still get some combination of murdered, robbed, and raped in front of a camera, and they might get around to it eventually.
Why waste all that money on cameras when you can just have a mandatory, government-issued ankle bracelet with GPS gear? Way cheaper to mass produce and businesses don't need to invest in their own hardware.
Systems like this could be very nice and useful, but as long as companies make shady data deals and can't do basic data protection, implementing such a system is just a plain bad idea.
Tracking students should be opt-in anyway ("go study someplace else if you don't like our tracking system" isn't opt-in), and I doubt you're going to get much support for tracking someone's exact movements every minute of the school day as long as the Equifaxes of the world keep proving that data protection is none of their concern.
Many local cities are building out vast networks, including my very own Miami Beach Police Department.