California Eyes Data Privacy Measure (npr.org)
179 points by _zhqs on May 28, 2018 | 120 comments



> "Every industry sector [that] has looked at this initiative considers it a very serious threat to the ability to do business in California," says Robert Callahan, vice president of state government affairs for the Internet Association. The group represents major tech companies, including Google, Facebook and Netflix.

It's not hard if you don't base your business on doing what many people consider creepy.

Maybe they should have thought of that before doing so.


Some companies don't sell ads or user data at all, but use related user information to detect things like scam rings on dating sites, spam rings on forums, and fraud rings on finance sites. If that information can just be deleted at the request of the user, say hello to way worse user experiences on sites like those, and many more you haven't yet thought of. The only way around it is to lock down signups, provide constricted service, or let the communities rot.


Data privacy laws generally don't give blanket opt-out rights, and this proposal doesn't seem to include one. (Think about it: criminals, for example, obviously can't opt out of police databases.)

The full text of the proposal is here: https://oag.ca.gov/system/files/initiatives/pdfs/17-0039%20%...

> A. Giving California consumers the right to know what categories of personal information a business has collected about them and their children.

> B. Giving California consumers the right to know whether a business has sold this personal information, or disclosed it for a business purpose, and to whom.

> C. Requiring a business to disclose to a California consumer if it sells any of the consumer's personal information and allowing a consumer to tell the business to stop selling the consumer's personal information.

> D. Preventing a business from denying, changing, or charging more for a service if a California consumer requests information about the business's collection or sale of the consumer's personal information, or refuses to allow the business to sell the consumer's personal information.

> E. Requiring businesses to safeguard California consumers' personal information and holding them accountable if such information is compromised as a result of a security breach arising from the business's failure to take reasonable steps to protect the security of consumers' sensitive information.

I don't think the sky is going to fall if this gets passed.


"D" is surprising to me. It's either going to create a high incentive for dark patterns or it will kill the business of selling data. Why doesn't the proposal make it illegal to sell data if that's the case?

Past that, it's also surprising that a lot of this is about selling data, but what about using it internally? It seems like Facebook and Google don't want to sell your data as they consider it their proprietary asset.


That's cool, and I agree. Though as mentioned in another comment, I wasn't referencing the article, I was responding to the criticism that collecting user data doesn't serve a real purpose, and that it's just "creepy."


It's creepy if it's done behind your back. The tasks described upthread can be done openly.


But why can't users be asked and given the option? If that's the result of their data being deleted, then I'm sure the company in question would make users well aware of the consequences of opting out.


The problem is that even a few users opting out can create a bad experience for other users of the service.


I don't think you can expect users to be anything but selfishly motivated, and concerned with their own experience only. I don't think most will care about the experience of others. Certainly not if they're deleting their own account from the service. Plus the point is that in these cases, potentially the users who would most want to delete are the ones most motivated to do so: the scam/spam/fraud ringsters.


> would make users well aware of the consequences of opting out.

People don't even read the ToS (terms of service). I'm pessimistic that they would read such a warning either.


If you read the article you would know that this is perfectly legal under the proposition. What it makes illegal is selling your data to third parties without user consent.


I wasn't referencing the article, I was responding to the criticism that collecting user data doesn't serve a real purpose, and that it's just "creepy." Also, I could be wrong, but GDPR does mandate data deletion on request, so even if it isn't a concern for this bill as written right now, it's still something to think about, since these regulations are headed in that direction.


GDPR only applies to people in the EU. This would apply to California residents. It also covers a different behavior.


Consumer perception of what is and isn’t “creepy” has somewhat shifted in light of recent issues, however. Many of the businesses to which this law would apply have established their business models in a time when privacy may have been less in the front of a consumer’s mind.

Businesses just starting out have an easier time knowing which lines not to cross in their business models than those who began their journeys sometimes over a decade ago.


>Many of the businesses to which this law would apply have established their business models in a time when privacy may have been less in the front of a consumer’s mind.

I think the degree to which your business model is future-proof in the face of changing consumer preferences should be on the minds of entrepreneurs, and if it isn't, I think it's a good thing that new companies will eventually supplant them.

No different from companies that kept their eye on increasing environmental standards and planned accordingly.


I agree that we're somewhat overdue for a shakeup, although I'm not entirely sure any companies founded 10+ years ago could have anticipated the changing of the tide with respect to consumer privacy. I think the larger companies (notably Facebook) are learning as they go. And, frankly, doing a fairly good job of that.

An example is Facebook's reaction to racial profiling in the real estate and renters market. In my opinion, they handled that as well as could be reasonably expected.


I think a lot of people have always considered some of the practices employed in the tech sector to be creepy and unwelcome. It's just that awareness has increased, and now there's a sense that you don't just have to put up with the intrusions and abuses to live a normal life any more.


You have the choice not to use the major tech companies. In fact, a favorite pastime of HN is recommending all of the alternatives to use.

Another example, you don't have to buy an Amazon Echo or Google Assistant. If you find cloud connected microphones in your home creepy, don't do it. Why is that not sufficient? Or do the people who are creeped out want to make the choice for other people and remove products from the market they'd like to use?


You have the choice not to use the major tech companies.

Sure, but it is increasingly the case that by doing so you are excluded from normal daily activities, because in some cases these technologies are now how our society communicates.

For example, you can no longer park legally in many UK streets or public car parks without using a smartphone app, which means working with Apple or Google (Android) technology for the device, and working with a mobile network operator for the data connection. In theory, you can usually call or text from a feature phone instead, which means you only have to deal with the mobile network, but typically those facilities are among the worst user experience you will ever encounter, and taking upwards of 15 minutes to make a simple payment is not unusual, if you can even pay successfully at all. You no longer have any option to pay quickly, reliably and anonymously in cash the way we used to.


In situations where you have no choice, I agree. This has long been an issue with credit card payments too. But IMHO, the government could fix this by simply having its own payment mechanism. I mean, every store has its own rewards card; why can't I just buy a parking card for the whole city that I top off once in a while and just use at the meters?


But IMHO, the government could fix this by simply having their own payment mechanism.

Honestly not sure if you're joking here... The government in my country has a somewhat well-established payment mechanism, which relies on small metallic or paper-plastic tokens to represent value, and that is what we used to pay for parking until these new-fangled things came along.


I'm not joking, physical coins and tokens are irritating and inconvenient to carry. Cash has other portability annoyances. A cash card that can be refilled electronically is more convenient than having to hit ATMs constantly or deal with merchants having exact change.

If you tell me to give up credit cards or mobile payments, I won't. I'd rather credit cards and mobile payments be made secure and private, but in the worst case, I'll trade convenience for a marginal loss of privacy, since in decades of walking this earth, I've yet to be harmed by it.

I'd be more concerned if I lived in a country with a more fascist authoritarian government, but the cost/benefit tradeoff for me, personally, is well worth it. Other people have the choice of not doing so; they just have to suffer the irritation of cash and loss of convenience.


Well, that's fair enough, if you're choosing to accept that trade-off. My concern here is that often it is no longer a meaningful choice, and everyone else is being forced to accept the same trade-off whether they favour their privacy and security or not. In that particular case, with a simple payment card or the like, perhaps it's not such a big deal. However, when something like your phone, your home or your car is compromised, the potential harm is more significant.


Stop using subjective terms like “creepy” and define exactly what you mean.


The subjective nature of the term creepy is exactly what is important here because it all depends on context. Most people would consider it creepy when Target knows their daughter is pregnant before they do. [1] I would also bet most people would be perfectly happy if Target figures out they have a new puppy and starts sending coupons for dog food. Those things are fundamentally the same when it comes to what data is collected and what process predicted it. However the human element of privacy makes those two situations worlds apart. That human context is what is important and why it is so hard to come up with universal algorithms or rules that can adequately handle any privacy situation.

[1] - https://consumerist.com/2012/02/17/target-figures-out-teen-g...


Somewhat unrelated, but is anyone else having trouble loading that consumerist page? It keeps redirecting me off to https://consumerist.com/remote-login.php?login=.... which 404s. Looks like a WordPress plugin (WPRemoteLogin) gone amok.

EDIT: looks like this is because I'm logged into my own Wordpress blog. Weird that that should have any bearing on my ability to load this blog.


Not being clear enough about what data they collect, and how and when they process it.


As well as being quite clear that they know much more about your life than you are comfortable with.


> Mactaggart recalls the moment about four years ago that turned him into a privacy advocate. He asked a Google engineer at a cocktail party whether he should be worried about his privacy. "He said, 'Oh if you just knew how much we knew about you, you'd be really worried,' " recalls Mactaggart.

They know perfectly well that if people knew how much was being collected about them, they would not like it, and we know that too.

Creepy: doing something to someone that you know they would not like if they knew about it, but doing it anyway.

What the tech industry et al. is doing is deeply creepy, and the only reason we're not up against the wall is that people haven't quite understood yet what's going on. It's so egregious that people are incredulous, but as they start to get a clue it will eventually be pitchforks and torches time. (Ahem, GDPR...)


I think a fair number of people would agree that Google pre-G+ is not the same as Google post-G+.

These businesses can survive and do well without needing to know every habit of everyone and target ads based on everyone's predilections and peccadilloes. Sure, it would be more like classical newspaper ads and broadcast TV ads, which, while less effective, are acknowledged to be effective enough to support the ad and consumerist economies.

It's probably not too late to return to that business model, if people demand it enough and legislators don't cave in to business demands.


So the standard of illegality now is what people consider creepy or cringey? Maybe this is actually the solution to the Fermi Paradox, technological civilizations dwindle as they engage in navel gazing and precautionary restraints.


Yup, by and large societies determine the parameters in which they would like to operate. These take the form of social agreements, laws, constitutions etc...

In the olden days it was normal and legal to own people. Then (most) societies "decided" that they are not comfortable with the idea anymore and the norms and laws changed.


That's a rather bad analogy, considering that slavery is an initiation of force against an individual, giving them no choice, over a physical good (their own body). Information is a non-rival good, someone's possession of it doesn't diminish your possession of it, unlike say, a slave master possessing you on a plantation.

Philosophically, some of these regulations run afoul of negative rights. When you interact with other people, or any entity, they will retain a memory of it. A lot of this data collection used to exist, but was informal, on pen and paper, or simply retained in the minds of local establishments, since their customers were locals. Saying that you own information that is by nature joint information, like an entangled particle, and that another party must erase it, may be a pragmatic and utilitarian policy to deal with the increasing probability of bad actors (State or otherwise) misusing information, but the philosophy behind it rubs me the wrong way.

Let's say I run a video arcade, and I keep track of which games everyone plays when they come in through the door. Though I hold no PII on these people, through some mechanism I can assign them a unique ID and recognize them on return (e.g. a token card). Why should this joint information (which entities play which of my arcade cabinets) be exclusively owned by you, especially if I'm not even providing the games for free?

I'm asking this as a philosophical question, what's the moral justification that one side of a two sided exchange retains exclusivity to information? (and by exclusivity, I mean your right to ask me to delete it)


I was not comparing slavery to lack of data privacy. The point I was trying to make was that, in the greater scheme of things, there is no requirement for philosophical or moral motivation to change laws - only a critical mass of people who want the change.

In this context, the slavery example wasn't a good one as it is so charged with exactly that. Perhaps drug prohibition laws are a better example in general and Alcohol prohibition in the US in particular: https://en.wikipedia.org/wiki/Eighteenth_Amendment_to_the_Un...

With this in mind there is no need for moral justification that one side of a two-sided exchange retains exclusivity to information (and given the way personal data is used and abused, I think that there is such justification), only sufficient public opinion that this is how things should be.


> someone's possession of it doesn't diminish your possession of it

However someone's possession of data about you may indeed diminish other rights you hold and/or may be used to gain advantage at your expense. This is why we talk about data getting "into the wrong hands". I may not care for my employer to know about my sexual proclivities. I may not care for insurers to know my search history and draw (possibly incorrect) inferences from it.

The argument regarding privacy isn't about the potential that others' knowledge of your info diminishes the utility of the data to you (in most cases there is no _personal_ utility). It's about the power to restrict who can use that information as leverage in the advancement of their interests, often to the detriment of your own. It's about preserving agency.


I'm wondering if they did think of that, but were all rushing to cash out as much and as quickly as possible before the inevitable public backlash. A creepy-ness bubble, if you will.


> But Callahan [VP of a tech industry group] doesn't think a privacy law should be written by advocates like Mactaggart and put on the ballot.

> "Without any sort of process ... the proponents came up with this law," Callahan says. "[Mactaggart is] suggesting [this] should be the law of the land without any sort of public vetting or scrutiny and we think that's irresponsible and dangerous."

Isn't the ballot initiative process the "public vetting or scrutiny" this guy is talking about? Seems like he's unhappy his group didn't get a chance to water down or kill the measure behind the scenes.


The public aren't lawyers. One of the points of representative democracy is supposed to be that legislators are more experienced with the practice of writing effective laws - laws that will have their intended effect and stand up in court. With the initiative process, it's easy to find text that sounds nice when you describe it in 30 seconds to people on the street but is either ineffective or has awful unexpected consequences.

That's not to say that there's no place for the initiative process, but it's not without its problems.


Your criticism is valid, but keep in mind the alternative is generally legislation written by lobbyists.

https://www.theatlantic.com/technology/archive/2010/10/googl...


If laws can't be easily written and understood, that's a problem with the legal system (and the media explaining said laws and initiatives).

The CA implementation seems terrible, but in general direct democracies end up with better laws, and the populace in such democracies ends up better informed than a bunch of self-selected politicians (who happen not to be experts anyway). The US and UK lawmaking processes are a perfect example of why you need to give citizens more control (but you still want legislative bodies to do the grunt work).


"but in general direct democracies end up with better laws"

Is there something that indicates that?


>If laws can't be easily written and understood, that's a problem with the legal system (and the media explaining said laws and initiatives).

Do you write code by chance? Ever introduce a bug or unintended consequence? I bet 10-1 it was a logic problem, not a compiler problem.

The hard part of writing laws is in exactly and completely defining your terms. It has less to do with the legal system and more to do with the nature of language itself.

Gödel's incompleteness theorems come to mind.


> Isn't the ballot initiative process the "public vetting or scrutiny" this guy is talking about? Seems like he's unhappy his group didn't get a chance to water down or kill the measure behind the scenes.

This is the same electorate that banned gay marriage (prop 8) and caused the worst housing crisis in decades (prop 13). Even if his opinion is wrong, his criticism of the process is spot on.


California ballot initiatives have a pretty wonky history.

I'm not a legal or legislative expert, but I do think some measures are well suited to public vetting via a ballot initiative, while others require a bit more, or a different type of, vetting, such as going through a legislature with input from folks who pay attention for longer than a TV advert.


Especially since ballot initiatives, to my understanding, cannot be fixed after the fact by legislative actions.


This law seems much more reasonable compared to GDPR.

"(b) "Business" means: (1) a sole-proprietorship, partnership, limited-liability company, corporation, association, or other legal entity that is organized or operated for the profit or financial benefit of its shareholders or other owners, that collects consumers' personal information, that does business in the State of California, and that satisfies one or more of the following thresholds: (A) has annual gross revenues in excess of $50,000,000, as adjusted pursuant to paragraph (5) of subdivision (a) of section 1798.115; or (B) annually sells, alone or in combination, the personal information of 100,000 or more consumers or devices; or (C) derives 50 percent or more of its annual revenues from selling consumers' personal information; ..."

And "selling" is clearly defined as well.

" (q)(1) "Sell," "selling," "sale," or "sold," means: (A) selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer's personal information by the business to a third party for valuable consideration; or (B) sharing orally, in writing, or by electronic or other means, a consumer's personal information with a third party, whether for valuable consideration or for no consideration, for the third party's commercial purposes."

[https://oag.ca.gov/system/files/initiatives/pdfs/17-0039%20%...]
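The three thresholds in definition (b)(1) quoted above are an "any one of" test, which can be sketched as a toy predicate. The parameter names here are made up for illustration, and the $50M figure is itself subject to adjustment under section 1798.115 of the text:

```python
def is_covered_business(annual_gross_revenue: float,
                        consumers_sold_per_year: int,
                        revenue_share_from_selling: float) -> bool:
    """Return True if any of the three (b)(1) thresholds is met.

    (A) gross revenues over $50M, or
    (B) sells info on 100,000+ consumers or devices annually, or
    (C) 50%+ of annual revenue from selling personal information.
    """
    return (annual_gross_revenue > 50_000_000
            or consumers_sold_per_year >= 100_000
            or revenue_share_from_selling >= 0.50)

# A small ad broker: only $2M revenue, but 60% of it from selling data,
# so it is still covered under clause (C).
print(is_covered_business(2_000_000, 40_000, 0.60))  # → True
```

Note how the disjunction pulls in small data brokers via (B) and (C) even when they fall well under the revenue bar in (A), which is relevant to the sub-thread below about small businesses being "let off the hook."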


It looks like it was designed to be useless, since neither Google nor Facebook sell info under the definition of selling you posted.


There are many businesses that "sell" data under the second clause to both Facebook and Google and could be impacted.

The Facebook Pixel is definitely disclosing data to Facebook, even if they aren’t being paid for it. And they probably use data from the Facebook Pixel across many advertisers for Facebook’s own business purposes.


I guess the question is whether selling targeted ads counts as selling personal information under this law. My impression is that it doesn't, which makes me wonder what the author was thinking. In fact, this will be worst for the smaller companies that don't harvest, target and serve all in-house, while the behemoths will be untouched.


It's not about Facebook Pixel, but the fact that ad companies usually don't "sell the data". They share the data with third-parties, and they can also give access to their advertising platforms (where the third-parties don't have direct access to the data).


Sharing the data is explicitly in the definition of selling I listed so long as the entity receiving the data uses it additionally for their own purposes. If Facebook uses it to provide more accurate targeting to a different advertiser, that sounds like using it for their own purposes to me.

Edit: to be clear, the Facebook Pixel is basically fine under this law so long as Facebook only uses the data for facilitating ads for the company the data was collected from in the first place.


n/a to facebook and google != useless.

sounds like it would hurt equifax. that’s worth way more than google, which only gives access to demographic data and doesn’t directly share it.


And yet they still lobbied against it.


They might just be over the 50 million gross revenue threshold though.


Is the law only effective at protecting privacy in your mind if it kills Facebook's and Google's business model?

Because I suspect that is what GDPR advocates truly want.


I became an advocate for GDPR after I started implementing it for the company I work for. I am an advocate for it because it requires companies to think hard about what they need the data for and whether or not it is going to adversely affect their customers.

Before GDPR there was virtually no downside for gathering private information. There was no downside for using that information to profile customers for any purpose you want. Now there is a downside: you have to tell the customer what you are doing with the data and you have to get permission to do so if the use is not related to the service that you are providing to the customer.

IMHO this strikes a good balance. You can still use the data, but there is a cost. Even within the organisation where I work, it has completely changed the way they look at this data. Previously the attitude was, "Let's collect the data and use it, because why not?" Now we're being told, "These are the only things we want to collect data on because we don't want to piss off our customers".

This is exactly what I want. I have in the past used Facebook's services. I currently use Google's services. I don't mind if their business model is destroyed because IMHO, on balance, this way is better. I don't mind if people will have to pay for services like theirs. I'm old enough to remember a time when it was already like that -- it's really not so bad. Having thought about it (by way of being required to implement GDPR), I'm going to move away from Google et al. Having seen the transition in the company I work for, it's clear to me how much better it is.


If their business model is violating privacy, then sure.


Letting smaller businesses off the hook here is not a good idea, since the larger businesses could just buy data off of the smaller ones, or both could belong to the same umbrella corporation and the data could be shared for nothing.


What incentive would such a small business have for operating? They would have to deliberately cap their revenue to maintain the business size and not run afoul of the regulation. The moment their revenue tips over, the profit-maximizing behavior for Facebook and Google would be to use home-grown compliant solutions. Once they have a home-grown solution that complies with the law, they'd have no reason to ever again contract out the collection work.


Not only would this not affect Google and Facebook, who "sell your data" to their own internal marketing products (let's not pretend they aren't actually selling your data, they just have the marketing company in-house), but that minimum revenue bar also likely exempts your Unroll.mes who do literally sell your data, because they probably don't bring in enough revenue to cross the line. Companies burning investment money and not bringing in actual revenue would also be exempt.


Unroll.me would probably fall under either of the other two categories that are not revenue-based.


Ah, good point.


>let's not pretend they aren't actually selling your data, they just have the marketing company in-house

They aren't actually selling your data... Maybe if you decide to change the definition of the word "selling" they are selling your data.


Or, as pointed out in this very discussion, use the definition that is right there in front of you.


No, I'll use the actual definition of the word thank you very much.


Too many details in laws give judges less room to do their job. There's a reason why European countries aren't as specific.


I should know what's forbidden without having to go consult every judge.


Ironically, this article is not available without opting in to their tracking. Could someone post the plaintext link?


You just need to decline to be tracked, and then follow the appropriate link on the text-only home page.

https://text.npr.org/s.php?sId=614419275


Ah, didn't see the link was still under "Top news stories". Thanks.


Well, it isn't always. If it isn't, you have to copy the article ID from the URL. Also, you don't get pictures. This clearly isn't GDPR-compliant.


With uBlock Origin set to block all third-party assets, the article was fully readable without opting in to anything.


> Ironically, this article is not available without opting in to their tracking. Could someone post the plaintext link?

What are you seeing? I didn't see any consent popups on Firefox or Chrome (even with uBlock & NoScript disabled).


I'm using Firefox + uBlockOrigin + DuckDuckGo.

It's full page, not a popup or modal. Text:

By choosing “I agree” below, you agree that NPR’s sites use cookies, similar tracking and storage technologies, and information about the device you use to access our sites to enhance your viewing, listening and user experience, personalize content, personalize messages from NPR’s sponsors, provide social media features, and analyze NPR’s traffic. This information is shared with social media services, sponsorship, analytics and other third-party service providers.

Options:

- Agree and continue

- Decline and visit plain text site

Decline takes you to the homepage, which is literally plain text. Apparently it was too hard for them to provide some css. You have to search around for the content you were going to, ultimately ending up at this plain text page: https://text.npr.org/s.php?sId=614419275


One can manually assemble a text-only url by taking the long number at the end of a standard url and using it as the sId parameter in the text-only link.

Apart from that, I've found no way of getting text-only links to articles that don't happen to be linked from the frontpage.
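The transformation described above (pull the long number out of a standard NPR URL and use it as the sId parameter) can be sketched in a few lines. This is an unofficial trick inferred from the comment, not a documented NPR API, so the URL pattern is an assumption:

```python
import re

def text_only_url(npr_url: str) -> str:
    """Build the text-only NPR link from a standard article URL.

    Assumes the article ID is the last long run of digits in the
    URL path, e.g. .../2018/05/28/614419275/california-eyes-...
    """
    ids = re.findall(r"/(\d{6,})(?:/|$)", npr_url)
    if not ids:
        raise ValueError("no article ID found in URL")
    return "https://text.npr.org/s.php?sId=" + ids[-1]

print(text_only_url(
    "https://www.npr.org/2018/05/28/614419275/california-eyes-data-privacy-measure"))
# → https://text.npr.org/s.php?sId=614419275
```

The `\d{6,}` guard skips the short date components (2018, 05, 28) so only the article ID matches.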


I use uBlock Origin, uMatrix and Decentraleyes Firefox addons and had no trouble viewing it...


"He asked a Google engineer at a cocktail party whether he should be worried about his privacy."

Something similar is what led me to write my essay on ad tracking and Google that tried to trace it back to Larry and Sergey:

http://yuhongbao.blogspot.ca/2018/04/google-doubleclick-mozi...

Also see:

https://twitter.com/berendjanwever/status/775366191078641664


Your essay is a bit of a wall of text; I would suggest adding headers and cutting the paragraphs up into smaller chunks to make it easier to read.

Also I feel like the general essay format that is taught to you in grade school is actually a bad format. Grade school project style reports are usually easier to read and easier to write.

The essay format contorts most people's writing into a very awkward form. If people were allowed to write without the restrictions of essays, they would usually be clearer.

Better yet, try the typical medium or journalistic article format to get better reading comprehension.


And maybe have it proofread by someone? I always do this and still end up with typos and grammar errors but a proofreading session will usually get rid of 90% or more of those.


Yeah, the essay has other problems I want to fix too. What I'm most interested in, BTW, is the points about Larry/Sergey.


You are using an older blogspot theme that is not mobile optimized. You can update it to one of the newer, streamlined and mobile optimized themes with one click. Since you currently have no background graphics, you might like Emporio.

This would likely help make it a bit more readable for very little effort, though a grammar and spell checker and shrinking some paragraphs would help as well.


I actually just now took the time to read this. A good (20-30 min) read. +1


Data privacy is great and all, but I just want one-click unsubscribe from paid subscriptions.

I’m a practicalist.


I'd love a law that says [paraphrasing] "You must allow unsubscription via the same medium as subscription." The problem is there are far too many loopholes and potential issues; there's no way you could make the text of the law both precise enough and flexible enough to work in most circumstances.

It would be better to just have a consumer watchdog that had the power to ask companies to change anti-consumer practices and use a jury to determine if something is anti-consumer (like dark patterns).


Yes I’ve thought about this too - especially with cell phone/internet/cable/magazines - silly how you can subscribe online or in-store but need to call to cancel. With digital subscriptions it should be as easy as unsubscribing from an email list.


It's a tough issue to get right, because unfortunately some customers are clueless and some customers can be abusive.

I have some experience running online subscriptions, and my policy has always been that you can unsubscribe straightforwardly. Typically it takes a couple of clicks, one to start and one to confirm, and the whole process is immediate and fully automated. In some cases there's also a brief, optional exit survey, but never anything deceptive or anything that significantly obstructs someone who wants to cancel. As with declining to use conversion-optimisation techniques that cross the line into invading privacy, I just see this as treating our customers how we'd like to be treated ourselves.

Many people have used these facilities with no trouble, and those services have generally received few complaints, but you do still get the special people who, instead of taking ten seconds to do that, would rather send a page-long email demanding that we cancel their subscription right now and threatening to dispute the charge if we bill them again - and they'll literally send that email 5 minutes before their next renewal fee goes through.

When you're dealing with that kind of person, you want it to be stated very clearly and in writing that their subscription continues until cancelled in the proper way, that merely emailing is not sufficient to cancel and may not take effect immediately even if we do honour the request, etc. Of course, that doesn't stop us from being friendly and helpful if someone sends a polite cancellation request by email, even though it's an irritating waste of time to deal with those requests manually, but you can't be too casual about cancellations or, sadly, some people will exploit that.


A confirmation screen isn't allowed, in case of a mistake or confusion?

What if you've agreed to a contract and want to exit early and there are fees?

What if the company wants to offer an incentive for you to keep your subscription?

Should all those things be illegal?


Netflix is great that way. Really easy to unsubscribe.


This is great and all, but it requires individual action on the part of the consumer. I'd like to see a US version of the GDPR, and soon.


This is their homepage: https://www.caprivacy.org/about - they should at least implement the conspicuous "Do not sell my personal information" link on their own website, just to get an idea of the extra work they're asking millions of people to do.

*Edit: under the proposed law, websites like this wouldn't actually be required to post the 'Do not Sell' link


Their privacy policy states that they do not sell personal data at all. So what benefit would that button have?

Source: https://www.caprivacy.org/privacy-policy


Upon closer inspection, it seems this law would only apply to large businesses or businesses that actually sell personal information. If that's the case, then it's more reasonable than I initially thought.

I was concerned that every blog and startup might need to implement the functionality, but it seems like that might not be the case.


But what if the personal information is not sold but shared for free - will this law apply? I can see the big companies finding so many loopholes in this.


If I'm reading the quoted portion correctly, it'd apply to large companies whether or not they sell personal information.


But if the large company doesn't sell, then it would be a blank page for the most part?


I'd have to read more of the law than was quoted on HN to know for sure, but I suspect there would at least be rights to learn what data they have on you and rectify inaccurate data.

Maybe some deletion right too but the above is the part commonly found in other privacy laws, such as in Canada where I now live.

None of this depends on whether they share your data, let alone share in exchange for compensation.


> functionality

A link to a static page "we don't collect user data" is now "functionality"?


I was referring to the requirement of a page with an opt-out form, but since this law wouldn't impact small websites it's a moot point anyway.


> If voters approve the measure, businesses will be required to have a "clear and conspicuous link" on their website's homepage titled "Do Not Sell My Personal Information." The link would take users to a page where they can opt out of having their data sold or shared.

Ugh, but why though? Why not just have a law that says you cannot sell or trade personal information?


Because some people might believe that allowing companies to sell their personal information could provide them with some benefits that outweigh the costs and risks. Not everyone attaches the same value to privacy.


But why not force all sites to give you clear information, so you are informed? So when I follow some link and land on an "interesting" site, I should be told that continuing to navigate means I agree that Google, Facebook, and 20 other companies will be informed that I opened this link. If I do not agree, I will be forced to leave.

The only work the site owner needs to do is keep a list of all the companies they sell the data to, show that list to you, and save your acceptance in a cookie.

For more complex apps like Facebook that sell more data, or mine your private messages/emails, purchases, and the music and movies you watch, they should also display exactly what they sell or share for free.

My point: the companies would just have to inform you; they would not be forced to give you access.
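A minimal sketch of the disclosure-and-consent mechanism described above: show the list of companies the data goes to, and record acceptance in a cookie. The partner names and the `data_sharing_ack` cookie name are invented for illustration; this is one possible shape, not an implementation of any actual law.

```javascript
// Sketch of the consent flow described above. Partner names and the
// cookie name are hypothetical examples.
const PARTNERS = ["Google", "Facebook", "ExampleAdNetwork"];

// Build the disclosure text a site would show before any tracking begins.
function disclosureText(partners) {
  return (
    "Continuing to browse means the following companies will be " +
    "informed that you opened this page:\n- " + partners.join("\n- ")
  );
}

// Serialize acceptance into a Set-Cookie value, recording when the
// visitor agreed and which list of partners they saw.
function buildConsentCookie(partners, acceptedAt) {
  const value = JSON.stringify({ partners, acceptedAt });
  return "data_sharing_ack=" + encodeURIComponent(value) +
    "; Path=/; Max-Age=31536000";
}

// Parse a Cookie request header and check whether consent was recorded.
function hasConsented(cookieHeader) {
  const match = /(?:^|;\s*)data_sharing_ack=([^;]+)/.exec(cookieHeader || "");
  if (!match) return false;
  try {
    const parsed = JSON.parse(decodeURIComponent(match[1]));
    return Array.isArray(parsed.partners) && parsed.partners.length > 0;
  } catch {
    return false; // malformed or tampered cookie counts as no consent
  }
}
```

The point of storing the partner list inside the cookie is that a later change to the list would not be covered by earlier consent, which matches the "inform the user exactly who gets the data" idea above.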


You're basically reinventing the European cookie law. I'm pointing it out for two reasons: one, for those wondering how this "stupid law" was created - this is how. Two, we ended up needing the GDPR anyway, since merely informing is not enough.


The cookie law did not force sites to tell you exactly what data they sell/share and to which companies.

The way it was implemented was "This site uses cookies to store your preferences and it can't work without this essential technology".

I suggested this because I see some people here don't want to stop selling the data - so at least make the selling transparent to the user. Maybe adoption of ad blockers, browser containers, and other related technology would rise faster (it wouldn't solve all the problems, but it would at least stop the tracking).


Yes and no. If all you wanted to say was:

> "This site uses cookies to store your preferences and it can't work without this essential technology".

... then you didn't need to display anything at all. You only needed a cookie warning if you were doing tracking and other data collection that was not a technical requirement of the site (and "supporting the business model of selling user data" is not a technical requirement of a site).

Alas, the law was broken enough that everyone could get away with defaulting to show a vague "this site uses cookies for your own good" message.

Where it applies, the GDPR doesn't disallow selling data. It just ensures the user explicitly opts into that scheme. That creates an extra burden for those who don't mind their data being resold, but that's a small percentage of users. The ones who desperately don't want to be tracked are another small percentage. The vast majority of users don't know any better and don't even understand the topic, so they will go along with whatever is presented.


Requiring companies to have an opt-out version of their sites is essentially the same as saying that trading personal data for services is not a valid business model. It does seem that fundamentally that is the point of the GDPR; it would be a lot simpler if they just came out and said that, instead of the confusing obfuscation of "you CAN use PII if you absolutely need to and the user agrees, and you must let them use your site as usual if they don't agree".


Probably just saying that would be too vague. A law generally can't just point at something and say "don't do that", because it'll leave enough loopholes for antisocial people to drive a fleet of oil tankers through.


I'm ok with companies using my information for our mutual benefit - I have a relationship with them.

I'm NOT ok with companies selling my information.

I might be ok with companies selling derivatives or analysis of group information.


The fact that they require the name "Do Not Sell My Personal Information" is part of what I don't like in the bill.


And the fact that selling personal information isn't the only concern with personal information. I'm pretty sure that Facebook and Google don't sell my personal information to third party entities, but I still don't want them to have it unless I give it to them by choice.


This is broken in so many ways, it's ridiculous. There needs to be a programmatic opt out (at least something a browser extension can do on each page visit), or, better, a manual opt-in.

Also, opting out won't actually prevent any of the big data abuses people are actually concerned about...
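For what a "programmatic opt out" could look like, here's a sketch of the core of a browser extension that attaches a machine-readable opt-out signal to every request. The `X-Do-Not-Sell` header name is hypothetical - no such standard signal exists under this proposal - and in a real WebExtension this function would be wired into `webRequest.onBeforeSendHeaders`.

```javascript
// Hypothetical programmatic opt-out: attach an "X-Do-Not-Sell: 1"
// header to outgoing requests so sites can honor it automatically.
// The header name is invented for illustration.
function addOptOutHeader(headers) {
  // Copy so the original request headers object is left untouched.
  return { ...headers, "X-Do-Not-Sell": "1" };
}

// In a WebExtension, the hook would look roughly like:
//   browser.webRequest.onBeforeSendHeaders.addListener(
//     details => ({ requestHeaders: ... }), { urls: ["<all_urls>"] },
//     ["blocking", "requestHeaders"]);
// Here we just demonstrate the transformation on a plain object.
const original = { Accept: "text/html" };
const withOptOut = addOptOutHeader(original);
```

The appeal of a header over a per-site "Do Not Sell" page is exactly the point made above: the user states the preference once, and every page visit carries it.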


Although I agree with the sentiment that we need more privacy protections, I don't believe a ballot measure is the right way to do this. There have been several poorly written ballot initiatives that have led to various unintended consequences.


The article fails to mention that the proposed law doesn't actually impact the majority of startups and small businesses, so calling it 'sweeping' might be a bit much.


I think the basic issue here regarding privacy is that only the ones breaking it are doing the writing. There are literally millions who won't give an upvote but want it.

Google and Facebook have already launched their lobbyists there and are trying to undermine it; I wonder what they will do to Japan.


[flagged]


Would you please stop using HN for political and ideological battle? It's not what this site is for.

https://news.ycombinator.com/newsguidelines.html


> Otherwise California wouldn't have the worst homeless problem in history.

My understanding, from living in Vancouver BC (which also has quite the homeless problem), is that homeless people will migrate from other areas of a country to find the place with the best combination of 1. local weather amenable to living outdoors, and 2. social programs with high ROI (i.e. programs that give you the most benefits for the least bureaucratic following-up).

No matter how quickly or effectively you deploy solutions for the local homeless problem in a city, if said city is the optimum place for people to live while homeless, you won’t see the end of the problem. (Unless you also have the lobbying power to make changes that solve homeless problems for the entire country.)


This concept was immortalized in this seminal work of lyrical prose by Matt Stone and Trey Parker.

https://www.youtube.com/watch?v=b2zGFXaPn0o


Califor-nyaw-nyaw ;)


Could you please stop posting unsubstantive comments to Hacker News?


[flagged]


That's not how you argue a point.

California can have both migrating homeless and an uncaring political class.


Excellent news. Next step: reciprocal enforcement.

