
Neither of you are really using supporting evidence in this discussion, you're both just two sides of a value argument increasing in volume.

On the one hand, the other guy is legally correct - the GDPR's purpose is to give individuals legal control over data about them (pictures they upload, addresses they input, whatever). That control is exercised on a site-by-site basis - if a person's naked picture is leaked online, every single host must take it down when requested, or violate the GDPR.

You're making a functional argument - if a person's nudies are leaked online, they don't functionally have control over that data. Morals and laws be damned, that picture is staying on the internet.

You can both disagree about the morality of this but simply restating both of your points with more "full stops" is pointless. If you're both really having trouble understanding each other's positions, step back and try to defend the opponent's decision.




> You're making a functional argument - if a person's nudies are leaked online, they don't functionally have control over that data. Morals and laws be damned, that picture is staying on the internet.

Which is my point. You no longer retain any control over it. A law saying you have control over it is silly, because it's worse than worthless: it makes me think I have control over something I don't have control over.

Laws cannot magically manufacture things which cannot ever be created.

Moreover, the GDPR doesn't prevent any of the problems that caused past data breaches. Breaches like Target's and Equifax's (both of which could easily claim the data they held was essential to their business: Target with credit cards, and Equifax being used by banks to coordinate information) are just as likely under the GDPR, and just as unpunishable.

As for Facebook and Cambridge Analytica, how would this have been prevented? Facebook can just ask you to opt in to its usage of your info in order to use its service. Facebook can share information with other entities that claim to be GDPR compliant. Those entities can then share the information with other people outside of Facebook's control.

I just don't see how the GDPR changes a fundamental fact: you no longer control something someone else has. Laws cannot change that. Laws can give you recourse, but they cannot change it. I actually believe that it's dangerous to believe that I have control over things I don't: it's a false sense of security.


So what's the point of your argument? Should we scrap the GDPR because it isn't 100% effective at curbing personal data misuse? You could argue along similar lines that almost any law gives you a false sense of security. Why do copyright laws exist - do they give the music industry, for example, a false sense of security that its audio files can't possibly be copied or shared? No, but they do provide legal recourse and they do set up a framework for acceptable use. If Facebook shared your information with another entity without your consent, that would be a violation of the GDPR and you would again have legal recourse.


Perhaps the public could be forced to realize that they are responsible for what data they share and with whom?


And now we're back to the argument that we shouldn't protect the public against systemic unethical behaviour, and rather we should protect the bad actors while blaming the public for not being educated enough. Wanting better education is a fine goal (and one I agree with), but the reason why we have seat-belts (as well as driving lessons) is that sometimes you also need other protections for the public. The same logic applies for consumer laws. We don't blame the public for not doing enough research to know that their new phone charger blows up after 3 months -- we blame the manufacturer.

Also there's the fact that companies can end up sharing data to other parties, or a company can be acquired and change their mind about what the data will be used for (which is allowed because of the originally nebulous scope of their T&C which was specifically designed to allow for expansion without asking for user consent explicitly when usage changes). GDPR provides methods for users to be protected in both of those cases -- while just enforcing education does not.

Not to mention that if education was mandatory, then the same companies complaining about GDPR today would be complaining about educating users how their services abuse their dignity. Cutting Google/Amazon/Facebook/etc slack for making hundreds of billions from users' personal data and creating "Big Brother"-esque profiling systems for their billions of users doesn't really seem rational to me.


What if we make companies behave responsibly instead? There are fewer companies than people, and it is easier to go after them.

If I owned a site, I would not have a problem deleting someone's personal data.
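To make that concrete: for a small site, honoring an erasure request really can be a few lines against your own database. A minimal sketch in Python, assuming a hypothetical SQLite `users` table keyed by email (names and schema are illustrative, not from any real site):

```python
import sqlite3

def delete_personal_data(db_path: str, email: str) -> int:
    """Erase all records for a user, returning how many rows were removed.

    Sketch of a GDPR erasure ("right to be forgotten") handler for a
    hypothetical schema: a single "users" table keyed by email.
    """
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute("DELETE FROM users WHERE email = ?", (email,))
        conn.commit()
        return cur.rowcount  # rows actually deleted
    finally:
        conn.close()
```

Real compliance is harder than this sketch, of course: backups, logs, and data already shared with third parties don't disappear with one `DELETE`, which is exactly the functional-control point being argued above.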

Also, your suggestion is that if you want to keep your data protected, then you should not use anything on the Internet or make any deals, because once you have ordered something on Amazon it can sell your data to everyone else? Or when you rent an apartment, the realty agency should be allowed to share your name, SSN, bank card number and address with everyone? No, I don't think it should be this way.


You are correct that GDPR means "you cannot functionally control something someone else has," but you are missing the aspect of "force."

If I find the person that owns the server hosting my data, and I put a gun to his head, and I say, "remove my data," do I now control that data?

What if I instead pay someone else to go around putting guns to the heads of server owners? If I build an army?

What if instead of that I communitize my resources into a legal system that doesn't put guns to people's heads, but will take their money away and put them in jail if they don't follow the laws?

Don't get me wrong, I'm with you in the hacker-culture sense: fuck the system, man, if Google wanted to it could probably blackmail individual US government officials to the point that it took the country over. I get that. I guess we can get deep into a political science debate about governments and social contracts.

Put it this way: Is your sense that you can walk to work without getting mugged a false sense of security? If you think it is, the only alternatives are homesteads with militias (no more walking to work), or arming entire populations (putting the burden of self-defense on the people). In the past, this has been tried, and it led to gang rule.

If we "give up" on legal systems, we have ample evidence of what happens. Applying those lessons to the digital space maybe isn't 1:1 - I guess some countries will be learning that for us, while others try things like the GDPR.

It's all a journey for human civilization. People like you that promote self-defense are great because we get amazing government-agnostic tools out of the deal. People that support GDPR are also great because we can test out "social contract" methods.

What's wrong with dancing around both sides of the aisle?


It's not about giving up on the legal system, it's about taking responsibility and not sharing data with people you don't trust, or don't trust to keep it secret.

The GDPR makes people think that they don't need to think about what they share and with whom. Do you really think companies are going to significantly change just because of this? I highly doubt it. Sure, there will be some changes, but in the end many of the same patterns and uses will emerge.


> It's not about giving up on the legal system, it's about taking responsibility and not sharing data with people you don't trust, or don't trust to keep it secret.

You keep writing as if everyone has a meaningful choice about who gets data about them, but clearly that is not always the case. Someone may obtain data about someone else from a third party, and you can't avoid sharing a certain amount of data and still function as a normal member of society.

The idea of absolute, black-and-white privacy, where either you share personal information or you keep something completely to yourself, isn't very useful in the modern world. Our conventions must be more nuanced than that, and in practice that means what really matters is who gets access to data about you and what they're using it for.


> not sharing data with people you don't trust

That basically means don't share it with anyone, don't sign any contracts, don't work, and live on the street. Because in your model even your employer or real estate agent can sell it to anyone else.


Is there an example of "control" that is not of the "you have legal recourse" nature? Outside of crypto, I can't think of any.


Yes, not giving ownership away to a party with no contractual obligations to you.


You'll find living this way to be difficult and frustratingly annoying.



