The official title of the article is "How to Remove Your Google Search History Before Google's New Privacy Policy Takes Effect", and the writing itself is much more neutral than this submission title.
I'm still not sure why people are afraid of Google's new privacy policy. I understand that there are people who have specific privacy needs, but outside that scope I doubt you have anything to worry about.
It's doubtful at best that Google's "log" of you would become compromised (unless your personal account were compromised, but then this would have been a problem anyways!). It also isn't the case that some Google employee is reading row after row of Google's customer DB snooping on individuals.
Google isn't some unified entity; your data is being manipulated by advertising algorithms to tailor ads for you. Unless you care about a CPU "knowing" your secrets, or you have specific privacy needs/concerns, none of this is a problem.
Maybe someone can surprise me with some good reasons to be concerned, but until then I am trusting Google.
It's not that I'm worried about Google doing evil things with my data, I'm worried about it increasing the complexity of the identity management I already have to do.
I don't want Google targeting ads at me while I'm at work, based on searches I did at home. I don't want to have to worry about the carefully targeted adverts displayed on random webpages I show people on my screen.
I don't want my hypothetical future child having to worry about accidentally disclosing via content targeting that they are L, G, B or T before they want to.
Until it's obvious that Google are building tools that let me actually cope with these sorts of problems safely and easily, I'm happy not to have my thousands upon thousands of Google searches centrally logged, even if it is an occasionally useful feature.
Remember, people are not normal. The long tail is still long and there are a vast number of people out there with "specific privacy needs" that are "outside that scope" and definitely do have something to worry about.
You make excellent points, and I'd like to add one more:
Google's personalized search is just not as good as their unpersonalized search.
The time I was Googling for suitable Xmas presents for my Grandma should not influence my searches for Valentine's gifts.
Incidentally, am I the only one who mentally filters out search results that have the avatar of an online "friend" recommending them? Was that link ranked highly just because someone I've never met tweeted it?
I don't want to be put in a glasshouse where Google introduces bias into search results to suit its own agenda. The claim that it is for my own good is also a claim made by all the spammers. Google is also subordinate to US interests, and this is another problem for people like me who live outside the US.
Without saying so, Google is building its own Great Wall of China, but worse: one around each individual using Google search. I'm using DuckDuckGo for now.
I understand that SEO is corrupting the relevance of search results and that Google is trying to do something about it. But there is no guarantee that its new strategy will make things better.
My feeling, and the reason I (personally) don't want this, is that I spend all day doing online marketing for brands and retailers and have very different interests outside of work.
The system (at present) does not seem intelligent enough to distinguish which of my friends should suggest what film I should go to, which should be giving me financial advice, and which are just backing a product because they're a client.
Strictly regarding the web/search history thing: if I work on a campaign for Burger King and browse their site and related products for much of the day, I don't really want "Burger King" to come up first on a search with local intent for "restaurants in New York". In that instance I'd prefer to see the generic results.
For me it's not that I don't want Google to have the data or that I have anything to hide; it's that the "search" experience it creates, given my very different needs in and out of work, is poor (and likely to get worse with this data shared across platforms), and I don't want to spend my days keeping a super close eye on which account I'm currently logged into before doing a search.
It's all personal preference and for some this will provide a great experience, and it may get better. For now though, I like the ability to say "no thank you, I like generic search".
Have you discovered Google Chrome's helpful "Users" feature? I use it for keeping Facebook and other abusive sites away from my main web activity, but it sounds like you've got a good use case for "Work" and "Real life" users.
With all due respect, this attitude really disturbs me. It's becoming very common these days: the implication that only deviants, criminals, off-the-grid wackos, and the anti-social have any real need for privacy.
No. I reject that premise. We all have a right to privacy, and we shouldn't have to give our reasons. (That's why it's privacy, after all!). We're quickly turning into a society whose members need to opt into privacy, and in which doing so is somehow stigmatized as abnormal.
I'm not suggesting that you are championing such a viewpoint, or that you even mean to suggest it. But that line of yours -- the idea that only a subset of us have privacy "needs" or "concerns" -- is troubling, and it reflects the rapidly encroaching, popular attitude of which I'm speaking.
[FWIW, I agree with you that Google's data-gathering isn't exactly a cause for immediate alarm. It's hardly taking us on some ominous path toward 1984. But I'm more concerned about the slow, steady erosion of privacy in the digital space, of which Google's activities are only one small part.]
Fair. Whatever case you want to call it, the point is that privacy shouldn't have to be a conditional, special-case thing (a "specific" "concern" or "need").
I think the point he was making is that the average person doesn't need to (or want to) care about these kinds of things.
I mean, the mentality I'm seeing on HN and Reddit and other sites nowadays is "How DARE $company tie my activities using their service to the IP address I accessed it from!"
It always seemed like common sense to me that any data I give a company (whether implicitly or explicitly) would be combined and reused and remixed in almost any way possible.
Google is welcome to my web activity; as far as I'm concerned, that's my payment for some really awesome web services that don't really have an equal anywhere else, especially wrt. integration.
But then again, I'm not a political dissident and I don't really have anything to hide during an average day on the internet. (And if/when I do have something to hide, I take steps to anonymize myself)
There most certainly is such a thing as a need for a certain degree of privacy, but please do not confuse "need" with "justification". A soccer mom's average activities might not need privacy, but whatever justification she has is none of anyone's damn business but hers. If she wants to keep everything she does private, fine.
Someone who has a need for privacy would be the usual suspects: political dissidents, people with unpopular opinions, etc.
There is a very relevant webcomic for people who think that privacy is only for those who have something to hide: http://abstrusegoose.com/strips/missing_the_point.png and sometimes an image is worth a thousand words, right?
I think that is alarmist in the context of this conversation. Google is not 2 people watching me sleep, shower, drive, date, make love, etc. It is software that stores and analyzes data that I choose to send to it. I choose to send my search terms to Google because I value the responses I get from Google; this strikes me as a voluntary exchange of value between two parties, not Big Brother from 1984.
The Abstruse Goose comic is much more applicable to pervasive government surveillance. To the extent that I fear that the government would get its hand on Google's cache of data about me, I blame the government, not Google. I think if people spent as much time focusing their attention elected officials as they spend on hand-wringing about Google, we'd probably have better laws.
> Google is not 2 people watching me sleep, shower, drive, date, make love, etc.
It's irrelevant how many people are watching you; Google will watch you as much as technology and law (sometimes bent to the maximum) let them. Like you said, you value the responses, but it doesn't take rocket science to know that humans are unique enough that one person's search is irrelevant to another; hence, search needs to be customized. If they could turn your security camera on, they would watch you sleeping. You can learn plenty from that act alone: having problems sleeping, snoring, moving your legs (whatever that symptom is called), waking up too early or too late. All this information is gold to Google. Your driving behavior is important to insurance companies. Your dating life is important to dating websites: if you are single, you are worth money to them; otherwise they don't care. Making love is important too: can you handle the ride, are you impotent, do you like wearing kinky clothes, and so on. This is all gold to Google. Now, they may not throw Viagra ads at you when you search for a steakhouse in New York, but when their ad network follows you to other websites they may, at their discretion, use the information gathered about you to serve better ads and get better click-through.
> It is software that stores and analyzes data that I choose to send to it.
But you're not always aware of what you're sending; sometimes, if you knew what you were "sending", you wouldn't!
> The Abstruse Goose comic is much more applicable to pervasive government surveillance.
What's the difference, if Google has to abide by the laws of the countries where they do business? They can sometimes refuse a government subpoena for your searches, but I'd say more and more they just comply. I don't think the number of government inquiries will go down, and I don't think the percentage of denials by Google will go up.
> I think if people spent as much time focusing their attention elected officials as they spend on hand-wringing about Google, we'd probably have better laws.
How is this relevant to the subject, or to your comment, at all?
> The official title of the article is "How to Remove Your Google Search History Before Google's New Privacy Policy Takes Effect", and the writing itself is much more neutral than this submission title.
The submitter just removed the "How to" crud from the title! :)
If that encroaches too much on your ability to neutrally assess a topic title, well, this ain't gonna be your century.
> It's doubtful at best that Google's "log" of you would become compromised (unless your personal account were compromised, but then this would have been a problem anyways!). It also isn't the case that some Google employee is reading row after row of Google's customer DB snooping on individuals.
Both of these cases have happened in the past.
Personal accounts have been compromised. That this sucks for other reasons doesn't make it less of a problem--labelling your house keys with your home address is still a stupid idea, regardless of whether losing your house keys "would have been a problem anyways!".
Google employees have been snooping on individuals (minors, even) in Google's customer DB. They were fired on the spot, but afaik no legal charges were pressed.
So basically you just told yourself a pretty lie, writing what you want to believe, not the way things actually are. How can you speak of neutral writing?
And then there are, of course, the government requests:
See, on one hand I can see your point. IF all my data would forever remain under the sole control of Google AND will only ever be used in the ways they tell us it is used now THEN yes, I concede there is little practical concern to be worried about.
But if you believe those two requirements are met today and will remain so as long as Google possesses relevant data about yourself, you are deluding yourself.
The data is already being given to the US government and maybe others too.
Data leaks are not going to become less common, exploits will not stop being found, and at some point it'll be Google's turn.
Finally, there's the possibility that Google will have a change of "mind" and either turn evil, sell their data, start doing less conscionable types of data mining, or simply become utterly lax in protecting your data. It's a risk.
Combine all those risks and it'd be pretty naive to just ignore the facts and hope for the best.
My concern is that someone out there is accumulating a huge amount of very personal information about me. It happens to be a corporation that has a legal obligation not to me but to its shareholders and that obligation is to maximise profit. It also has a legal obligation to the US government and the US courts, a government I have no right to elect and a legal system I cannot afford.
And here's my point: I simply cannot know or control what profit maximising ideas they may come up with in the future or what kinds of ideas politicians may deem necessary in order to win an election.
Some say that it will always be more profitable for Google to "respect my privacy". The problem is that the meaning of "respect my privacy" is a matter of fast changing attitudes. It's not my personal privacy they need to be concerned about, it's what the vast majority of their users will passively accept.
Their privacy policy says they can do anything with my data to create new services and improve existing ones. That means they can do anything. Full stop.
The reason why I'm more concerned about their new privacy policy than about the old one is that lots of small piles of information are less dangerous than one big pile of information, exactly for the reason they are creating it: The value of accumulated information is more than the sum of its parts.
The conclusions that can be drawn from that combined dataset are a lot more reliable and robust. Right now, much of the data that is out there about us is basically garbage and everyone knows it. No one can know if it means anything that I supposedly visited this or that site. But if they can also analyse what I searched for, what my emails are about, what blogs I read, which people I communicate with, etc, they can learn something about my intentions, not just about my actions and they can throw out the garbage.
I don't want all my intentions to be known or to be knowable to anyone on earth, because I cannot know or control the intentions of those wanting to know my intentions. It's a one-sided shift of power away from me and I can never take it back.
Once the data has become so much more reliable and the conclusions so much more robust, the desire to use that information for all kinds of new purposes will grow accordingly. The pressure on Google to use that knowledge to increase profit will grow. The incentive for criminals to steal the data will grow. The pressure on governments to proactively spy on everyone to prevent this or that type of crime will grow.
This is not me saying "Google has turned evil". It's just a recognition of an inevitable social and economic dynamic ensuing from the possibilities that well integrated, high quality sets of personal data provide.
I know more than the vast majority of people about data analysis, and hence I will not delegate what "respecting my privacy" means to that majority of passive Google (or Facebook, ...) users.
> It happens to be a corporation that has a legal obligation not to me but to its shareholders and that obligation is to maximise profit.
That's a myth. I believed it was true until someone here on HN linked to a Harvard Business Review article debunking it. Unfortunately I can't find the article, but here's a blog post discussing the issue: http://truthonthemarket.com/2010/07/27/the-shareholder-wealt...
This article is all about errors that a management team could make, but I'm talking about their intentions. Google's management team cannot intentionally pursue goals that are bad for shareholders.
I doubt that it would be legal to do so, but I'm not a lawyer, so suffice it to say that it's not practical because they would be removed from their management positions very quickly.
All you have to do is define "good for shareholders" in terms of setting up long-term growth (finding less-local optima). I feel like a lot of people here know that -- no one is going to be suing amazon over its very liberal return policy, because it keeps customers very happy. Yes it needs to end up being profitable, but not necessarily on a case-by-case basis. Some customers may end up being a net loss, but it's worth it for giving customers the overall feeling of safety in making a purchase from amazon.
Google actually explicitly put something like that in their IPO report ("focus on the user and all else will follow" as facebook helpfully brought up recently).
Now, Mark Zuckerberg also put similar statements in his letter to investors, and you could certainly make the case that these statements don't guarantee that the user will be put first, but investors have still been alerted that decisions they don't like may be made if they're deemed important for long-term growth (Google's voting-shares scheme also helps in this regard).
For companies in general, though, the reality is that the judge will almost immediately throw these cases out unless there has been a major stumble (see, for instance, all the thrown out stupidity around demands for a documented and public Steve Jobs-succession plan). Even then you pretty much have to prove that the actions made were negligent without any kind of foreknowledge of how the market is going to behave. In other words, the only people that end up making money on these suits are the ones in cases where the company settles to avoid court time.
It just irks me that people often use the myth that companies must maximise shareholder profits at all costs as a reason why companies act evil. The Harvard Business Review article, which I still can't find unfortunately, also talked about how this myth is pervasive in management circles and leads to bad business practices and socially irresponsible corporations. I got the impression that you were insinuating this.
I didn't read the article, but I believe this varies state by state, and some states use the term 'stakeholder' instead of 'shareholder'. A stakeholder can be interpreted as anyone affected by the company's actions: if you live down the street from a manufacturing plant, you are a stakeholder because their emissions are going into the air you breathe.
> It also has a legal obligation to the US government and the US courts, a government I have no right to elect and a legal system I cannot afford.
Generally, if you're not in the US (and I'm assuming that's what you mean, not that you're disenfranchised but live in the US), then Google and other US companies have an obligation to comply with the laws of your country, assuming their terms of service allow use in your nation.
Ultimately, you have a choice to use or not use a service. The Company offering it has an obligation to disclose what information is collected and how it's used, and you also have to make an assessment of their ability to protect that data from misuse either internally or by external parties. We make these decisions all the time both online and offline.
I think it's fine for folks to opt out of using a service like Google if they no longer agree with the privacy policy. I don't think it's appropriate to villainize any company for good-faith efforts to update their privacy disclosure over time, as long as you can opt out of those changes and erase your data. And I think we have to be realistic and expect that 95% of users are not going to opt out.
I simply explained what my concerns are and why I'm opting out. I think that is a perfectly appropriate thing to do and has nothing to do with villainizing anyone.
I also think that it would be appropriate for you to disclose that you work for Google if you contribute to a discussion like this one taking the side of your employer.
I appreciate your view on that, but I'm expressly not posting on behalf of Google or in my role as a Google employee. I don't think it would make sense to clutter the HN threads with disclaimers considering that just about every topic here that involves a tech company will have folks from that company posting, or who have investments in those companies. It's not as if I keep it secret who I work for.
My rationale behind this is one of "I trust you but not your boss" or "I trust you today but I don't know if I can trust you tomorrow". I use this in telemarketing calls or whenever someone asks me for more information than they actually need.
I'm pretty sure EFF is wrong here. Google's Web History is basically a public version of your search history. You can turn on or off whether you want Google to rub your face in its knowledge about you. But it retains that knowledge even if you turn off the rub-in-your-face personalization part.
Google retains a complete history of your interactions with them, which is not subject to this Web History setting, not deletable, not removable, and will be shared across its properties.
Short reply: This doesn't remove Google's search history of your searches at all.
Yeah, I've had this feature disabled since it launched and google still seems to return "personalized" search results. If you don't want google to track your behavior, you probably shouldn't be logged into google in the first place.
Do they not track your behavior if you are not logged in (genuinely asking), using cookies or IP address for example? Sorry if you find this to be a naive question.
I doubt they would be using your IP address to track anything, since a house full of people can be behind the same IP address. I don't know if they use cookies to track you if you're not logged in, but to be safe I would use incognito mode if you are worried about that.
If Google states "Your web history is currently empty" and "web history is paused" but they continue to track your searches in a way that is tied to your identity, then I think they're in breach of several laws.
They could choose to withhold the ability to disable tracking. But by giving you the impression that they're not tracking you, I think they're definitely crossing the line if they continue to track you in spite of that.
> Google retains a complete history of your interactions with them, which is not subject to this Web History setting, not deletable, not removable, and will be shared across its properties.
That is a pretty heavy accusation to make without hard proof. And if it turns out to be true, I'd hate to be the one to have to explain to the EU regulatory body how "your search history is empty" can co-exist with "we have a complete history of your activity". If it is a history of my activity then my search history cannot be empty, and if it is empty then Google should no longer have it either.
Uh, this is not a secret. Google says specifically that web history is a way to personalize your search results, nothing more. Web history is an add-on to Google's normal tracking.
"You can delete information from Web History using the remove feature, and it will be removed from the service. However, as is common practice in the industry, and as outlined in the Google Privacy Policy, Google maintains a separate logs system for auditing purposes and to help us improve the quality of our services for users."
This refers you to the main privacy policy, which says very clearly that Google stores every request you make along with cookies that uniquely identify your account.
Web History is an add-on tracking system. Google does not permit opting-out of Google's regular tracking system, which at one point stored every request you had ever made to any Google server, but which I think now may be semi-anonymized after a couple of years, if I remember their changing policies correctly. Although, the Privacy Policy makes no such promise, so perhaps they do still keep every request ever made in Google's existence. Google absolutely has a perfect record of every interaction you've had with them for the past year or two. Even if Web History is "off".
I'm sorry but I distinctly got the impression that 'opting out' of the search history feature meant that google would not be saving my search history and I'm quite surprised that this is not the case.
It is of course possible that I'm alone in this but somehow I doubt that.
I'm surprised that you would come to that conclusion. I would think that most "mega power-users" (as I'm now labeling you :) would simply assume that they were turning off some kind of public-facing display of their search history, not opting out of Google's ability to target ads at them with greater precision.
I would not be surprised if many people got the same impression.
I think what blindsided me was a combination of two factors: (1) I want to believe that Google tries to be the 'good guy'; (2) the specific wording of the history feature, which says "Get results and recommendations that are tailored to your preferences", as in "before you enable this feature we cannot do that". That feels like an opt-in.
The fact that it is only available to logged in users and that - as far as I can see - it is not public facing, but just to you specifically further increased that expectation.
Yeah, I can see now why it would give off that impression. That feature is going to give them so many issues in the future. I wouldn't be surprised to hear about its removal one day.
Could the difference be that Google associates data with your Google account if you have Web History turned on, but only uses IP addresses in their "logs system"?
If this was the case, IMHO, it would be consistent enough with expectations.
Based on Google's "What Google knows about you" page[1] I would have to conclude that their logs associate your search behavior with your account either way.
That's the doubleclick cookie personalization, which they aren't allowed to tie to your account based on the doubleclick acquisition agreement (which is why "on the web" and "on search and gmail" are separate and why if you go in an incognito window or block doubleclick cookies it will say it has nothing on you).
I don't believe this is quite correct. The EFF article clarifies this in an update:
> [UPDATE 2/22/2012]: Note that disabling Web History in your Google account will not prevent Google from gathering and storing this information and using it for internal purposes. It also does not change the fact that any information gathered and stored by Google could be sought by law enforcement.
> With Web History enabled, Google will keep these records indefinitely; with it disabled, they will be partially anonymized after 18 months, and certain kinds of uses, including sending you customized search results, will be prevented. If you want to do more to reduce the records Google keeps, the advice in EFF's Six Tips to Protect Your Search Privacy white paper remains relevant.
I found this link in google's privacy FAQ, but there may be a better one out there:
Of course, they unhelpfully don't enumerate all the uses of that log data, but it seems pretty clear that it isn't for personalization and (as far as I can tell) it isn't even changed at all by the new privacy policy anyway.
edit: it also appears that they log by a cookie ID that isn't tied to the account. You could probably still reconstruct account information for many users (and even after anonymization a lot is revealable (see the AOL search logs debacle)), but, again, this is quite different than logging data associated with an account and using it to target advertising.
> We strike a reasonable balance between the competing pressures we face, such as the privacy of our users, the security of our systems and the need for innovation. We believe anonymizing IP addresses after 9 months and cookies in our search engine logs after 18 months strikes the right balance.
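To make that concrete, here is a minimal Python sketch of what partial anonymization along those lines could look like. The record fields and the drop-the-last-octet rule are my assumptions based on Google's public statements, not their actual pipeline; note that the query itself survives, only the identifiers get blurred.

    from datetime import datetime, timedelta

    ANON_IP_AFTER = timedelta(days=9 * 30)       # roughly 9 months
    ANON_COOKIE_AFTER = timedelta(days=18 * 30)  # roughly 18 months

    def anonymize(record, now):
        """Partially anonymize an old search-log record (hypothetical fields)."""
        age = now - record["timestamp"]
        out = dict(record)
        if age > ANON_IP_AFTER:
            # e.g. 203.0.113.42 -> 203.0.113.0
            out["ip"] = ".".join(record["ip"].split(".")[:3] + ["0"])
        if age > ANON_COOKIE_AFTER:
            out["cookie_id"] = None
        return out

    entry = {
        "timestamp": datetime(2010, 6, 1),
        "ip": "203.0.113.42",
        "cookie_id": "PREF=abc123",
        "query": "restaurants in new york",
    }
    # The query stays in the log; only the identifiers are blurred.
    print(anonymize(entry, datetime(2012, 2, 22)))
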
This is an interesting point and something worth digging into. Though the "super users" may know the difference between this and that, the average user, and even the above-average user, may not. The "reasonable expectation" of an average user might be that turning off web history would cause Google to stop tracking.
I for one am glad that Google at least provides this option. I'm sure Facebook, Zynga, and many of the current startup poster-boys would not, given the same opportunity.
It sounds good in concept, but I'm concerned that somebody searching for child pornography or similar could be associated with myself. I have the same fear of running a TOR exit node.
Actually it gives you plausible culpability, not plausible deniability. This idea that people actually don't get charged/tried/convicted because the evidence is a little flimsy is completely false.
What you say: "I did this cookie thing. I never searched for any sort of kiddie porn on the internet."
What the jury of your peers hears: "Child molester trying to pretend he didn't do it."
What the judge does: Extra time in prison because you aren't taking responsibility for your crimes.
It might give you plausible deniability in front of a jury (emphasis on might), but that says nothing about assumptions or biases made by law enforcement officials or prosecutors.
Yes, plausible deniability is pretty useless after the police have arrested you at 6am in your home after bashing down your door, and all your neighbours have found out that you've been accused of being a pedophile.
To go with what the sibling poster said: when you're so much as accused of paedophilia in this country, any sort of real expectation of due process or rational justice goes to shit. Plausible deniability won't be of much help.
Plausible deniability is needed when you want to hide your role in something. Since I'm not sharing kiddie porn, I have actual deniability. No TOR traffic needed.
As I understand it, the client plugin and the proxy are different programs. I don't think you have to run an "exit node" to use it. Though it obviously wouldn't work if no one ran one, but that's a different issue.
I don't see how they would associate that with you. Only the exit node for the google queries would be suspect, and I'm pretty sure most don't keep logs.
I have found the Firefox add-on OptimizeGoogle [1, 2] quite promising. I'm not sure if it does exactly what you are looking for, though. Unfortunately, development is being stopped due to resource issues.
Handing your search data off to other people still opens up potential privacy problems. There was a Firefox plugin that would run random web traffic in the background based on various RSS feeds you pointed it at. I forget the name, but I am sure you could easily modify it to do Google searches. This would obfuscate your real data just as well without handing it off to other people.
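If anyone wanted to roll their own, here is a rough Python sketch of that noise-traffic idea (similar in spirit to the TrackMeNot extension). The feed URL and timing values are made-up placeholders, and firing automated queries at a search engine may well violate its terms of service, so treat it purely as an illustration of the concept.

    import random
    import time
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://news.example.com/rss"  # any public RSS feed (placeholder)

    def fetch_decoy_terms(feed_url):
        # Use RSS item titles as plausible-looking decoy search terms.
        with urllib.request.urlopen(feed_url) as resp:
            tree = ET.parse(resp)
        return [t.text for t in tree.iter("title") if t.text]

    def run_decoy_searches(terms, count=5):
        for term in random.sample(terms, min(count, len(terms))):
            url = "https://www.google.com/search?q=" + urllib.parse.quote_plus(term)
            urllib.request.urlopen(url)          # fire off the decoy query
            time.sleep(random.uniform(30, 300))  # spread requests out like a human

    if __name__ == "__main__":
        run_decoy_searches(fetch_decoy_terms(FEED_URL))
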
I never realized you could access your search data history. It was really fascinating to see the different visualizations including my search concentration by hour and days of the week.
That said, even as a non-statistician, it wasn't hard to imagine the amount of information Google could infer by focusing on an individual. I had a data set exceeding 50k searches with the resulting click-throughs on my account. (Plus, they can of course leverage their massive db to help eliminate anomalies like Whitney Houston)
In addition to the basics like big item purchase history, hobbies, and problems (e.g. sickness); I wouldn't be surprised if Google could predict my relationship status and sexual preferences.
Custom search results are great, but I'm extremely happy to have been given the option to delete that profile. If anything, it is the single largest factor pushing me back to Firefox and possibly DuckDuckGo.
Google makes money from your ignorance. They could put a "Disable all Tracking" button in the black bar at the top of all Google properties, but their business model relies on the majority of people not knowing they're being tracked, or not caring enough to go looking into how to disable it.
Some people (like me) just don't mind. Tracking is not inherently bad, and they use that data to provide some value for users. If you had to inform everyone and provide opt-ins for every single feature that someone out there might not like, nothing would ever be done on the Internet.
Also, running analysis on this kind of data must be pure awesomeness from a scientist's point of view ;).
You'll note that I made no statement regarding the goodness or badness of tracking. I merely stated that Google's business model relies on the majority of people not being aware of the tracking, or not caring enough to disable it.
"Ignorance"? I don't understand this righteous indignation people have towards online companies who would dare to try make money off the awesome free services they provide. Some of us are completely aware of what Google does. Seeing some personalized ads in exchange for access to all the world's information seems like a fair trade-off to me.
Yes, "Ignorance". He was "ignorant" of the possibility. When you call somebody ignorant, for being ignorant of something, this is not an insult, it is a statement of fact. I don't know where you got "righteous indignation" from. There is nothing righteous about my saying he is ignorant, and there was no anger or annoyance, so I don't know where you got "indignation" from.
"Some of us are completely aware of what Google does"
And most are not. And that was my point. At no point did I state that 100% of people are unaware of what Google does, so I don't know why you felt the need to add that tidbit.
"Seeing some personalized ads in exchange for access to all the world's information seems like a fair trade-off to me."
Not really relevant to anything I said. (EDIT:) However, if "seeing some personalized ads" is the only cost you can imagine, then I'm afraid that Google is benefiting from your ignorance too. Here are two more costs:
1.) If your account is compromised, your search history is compromised and could be used against you
2.) If you get in legal trouble, your search history could be used against you
"If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged"
Likewise for me with the Google Apps account I'm usually signed in as when doing searches.
Going to the non-SSL page http://google.com/history (as suggested by someone here) doesn't result in the redirect, but gives the message "Web History is not available for mydomain.com."
As usual, it seems that Google Apps users are in a different (lower) class when it comes to Google functionality.
I had to sign out of my Apps account and sign in with my Gmail account. Switching accounts didn't seem to work. I'm assuming (perhaps naively) that Google is synthesizing all my history into that one "primary" account.
I hate that Google Apps accounts are second-class citizens. Especially knowing that you can pay for corporate accounts... seems backwards.
Same for me. What helped was going there manually. Click on the gear icon and select Web History. You will be prompted to confirm your login and then you'll proceed to the web history page. I guess it's for security reasons, so that some malicious javascript won't grab your history (that easily).
I didn't have that problem. Google did ask for my password again, which made sense, but I didn't end up on their home page.
I didn't specify HTTPS, but I use the HTTPS-Everywhere Firefox extension so the result should be the same. Even so, maybe there is some slight difference which altered the behaviour for me.
After doing what EFF recommended, the Web History is "paused" indefinitely. If you want to opt out of the service completely, you can use the following URL:
As far as I know, Google Takeout doesn't support Web History. If they supported it, I'd take out my search history in a heartbeat.
As it stands now, I do sometimes refer back to it, but I don't think it's worth having it if it means giving up the information to advertisers as well.
"You can delete information from Web History using the remove feature, and it will be removed from the service. However, as is common practice in the industry, and as outlined in the Google Privacy Policy, Google maintains a separate logs system for auditing purposes and to help us improve the quality of our services for users."
So.... why would I delete this information if Google still keeps it elsewhere? Do I just have no ability to control what data Google collects on me, even if I agree to stop using its services?
1) If someone breaks into your account, that person can steal all your search history. This is why I disabled search history recording a long time ago.
2) It was uncovered that Google passes some of its data about users to third parties, especially ad agencies. Now, with the new policy, you can't be sure that search data leakage will not occur. Moreover, ad agencies use mechanisms like cookie syncing to share user data among themselves.
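For anyone unfamiliar with the cookie-syncing mechanism mentioned in point 2, here is a toy Python sketch of how two ad networks can link their separate cookie IDs for the same browser. The endpoint, parameter names, and IDs are all hypothetical.

    from urllib.parse import urlencode

    def build_sync_pixel_url(network_a_uid):
        # Network A embeds this 1x1 pixel in an ad; the browser's request to
        # Network B automatically carries B's own cookie alongside A's ID.
        params = {"partner": "network-a", "partner_uid": network_a_uid}
        return "https://match.network-b.example/sync?" + urlencode(params)

    id_map = {}  # Network B's side: its cookie ID -> Network A's user ID

    def handle_sync_request(network_b_cookie_id, partner_uid):
        # B reads its own cookie from the incoming request and records the link.
        id_map[network_b_cookie_id] = partner_uid

    print(build_sync_pixel_url("a-12345"))
    handle_sync_request("b-98765", "a-12345")
    print(id_map)  # both networks can now trade data about the same browser
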
"It was uncovered that Google passess some of its data about users to third parties, especially ads agencies." Can you provide a source to back this statement? (I am not challenging you, just interested to learn about this)
I think it's just trying to tell you this isn't a silver bullet for getting yourself out of Google; there are other systems that keep records of when you log in or do things, and other systems that log searches but don't identify them to you. You just opt out of this specific system.
Without going back to read the privacy policy again, I believe a big difference is that normal search logs are anonymized after a certain amount of time, whereas search history is kept as long as it's enabled, I assume.
Sometimes I get the feeling in all the discussions around google's new privacy policy that I'm one of the few people around here that bothered to actually read it :)
I'm not convinced this works as it should. I turned off history a while back and have since repeatedly used "remove all history". But I still get ads related to searches and email that are too obscure to be coincidence. For example, lately I get shown ads (sometimes multiple times daily) for "Foreign SIM cards", especially on YouTube, because I googled this about a month ago. Anyone else notice this who has turned off their web history?
Are you sure they're Google Ads? If you followed a link from the search results, and that link had ads from another search network, that other network would've associated you with that search term.
or the same scenario, but with doubleclick ads. The FTC(?) agreement for acquiring doubleclick included a clause that they can't mix google account information and doubleclick logging, but as parent said, the sites you clicked through may have had doubleclick ads and associated that topic (or their brand specifically) with your doubleclick cookie. Deleting it should reset it.
Note that disabling Web History in your Google account will not prevent Google from gathering and storing this information and using it for internal purposes. It also does not change the fact that any information gathered and stored by Google could be sought by law enforcement.
With Web History enabled, Google will keep these records indefinitely; with it disabled, they will be partially anonymized after 18 months, and certain kinds of uses, including sending you customized search results, will be prevented.
Perhaps they removed the web history itself, but kept whatever inferences they made, which are associated with your Google account. That is about as bad as keeping the history, but it's probably a completely separate system, which just uses the history for training purposes.
I knew about the search history, but apparently a number of people didn't. This makes me wonder: Are there any other semi-hidden Google services which I can "clean out"?
There are probably better ways of doing this but I just use Chrome now exclusively for Google services. I don't login to Google on my primary browser anymore. I'm hoping this will limit their stalking a bit. I don't mind if they read my mail since that's the price I pay for using the service but I'm getting a little paranoid about the other things they might be doing. Better to keep it sandboxed away from the rest of my browsing.
I've tried this but find it quite a pain to remember which browser I should be looking at what in.
What I want is a browser where I can be logged into Google, and any link I click will open that link in my "private" (not logged into Google) browser. Same with Facebook, Twitter, etc.
So I've started developing XUL apps to run each in their own sandbox.
It's pretty simple right now. I only spent an hour or so on it and will post it to GitHub when I get some more time to iron out a few kinks.
Basically you open the XUL app for the service you want to access. It acts as a chromeless Firefox browser and intercepts any links you click. If it's an external link, it opens it in your default browser, where you wouldn't be logged into Google, preserving a bit of anonymity.
Wow, did not know that this existed. Now I know that across 11 thousand searches on Google, AAPL was my top search, and that my search activity spikes on Mondays and declines through the rest of the week: http://o7.no/wiL2jx
I use google apps for domains and I get the following message: "Web History is not available for MY_DOMAIN. Learn more about Google products you can use with MY_EMAIL."
Is Google collecting this data and not giving me the option to turn it off, or are they not collecting the data?
If it's not enabled, the logs are not associated with your account. You could enable the service in the apps control panel (assuming you're the administrator) and check for yourself, though.
Interesting. Although I'm not sure I care if Google applications have access to information about my location, interests, age, sexual orientation, religion, or health concerns. I'm a 42-year-old, male, married, Catholic software developer living in Brazil.
youtube.com doesn't have any obvious method of removing past watched videos and is even more nefarious than Google will probably ever be (before March 1st, 2012 at least) about tracking visitors.
Additionally, YouTube has become pretty much just a branch of the RIAA and MPAA and their local equivalents, and I'd rather burn down my house than allow them access to my data.
That's why I've been blocking YouTube from setting cookies on my computer, and that's what other people should do too.
I am not saying that the average user knows about them, though, but they're not that hard to find.
(Also, since you seem to talk about YouTube and Google as separate companies: They're not. Google owns YouTube and their new policy applies there as well.)
Call me naive, but I like the idea of having tailored Search results and integrated information and recommendations based on my data at Google products.
Google is one of the few companies that I trust. So while most people are freaking out about losing some sense of privacy, I personally think that it can open some very exciting opportunities and new features. I want to see where this is going and how Google will be able to use the information it has to provide something valuable to me. I'm happily keeping all my Google data and search history. And I'm also OK with Google selling my anonymized profile to third parties.
After my password was stolen in the PSN pwnage and my own Mom broke into my MSN account to delete my contacts on a rage spree, I don't trust many people anymore. So don't think I don't consider my info and privacy worth my caution.
But after reading Google's ToS carefully, I agree with everything they want to do and I'm OK with it. Most people certainly never read Google's new ToS. A lot of the information in the news in the last few months, and even here on Hacker News, is FUD and misconceptions. Google will not own you; they just want to use your data to provide more value to you and its advertisers. You are not exposed.
So stop freaking out. Read the ToS and think about it.
I've just done so, and I intend on keeping Web History turned off. I simply do not derive enough benefit from having this data around for myself (when's the last time you benefited from Web History?), and unlike other permanent data such as G+ posts, comments, etc., I do not self-censor searches that I would like to conduct, so I find it very plausible that someone with more processing power than me could find out more about me than I intend.
I was just about to and then realised that I had turned it off a while ago. The fact that I had completely forgotten it shows that I don't need this data.
However, it's also not something I'm hugely concerned about; as others have said, the main danger is from some unscrupulous individual gaining access to your account. In terms of Google having access to my data, I would imagine my data will never be seen by human eyes. Having said that, one does need to remember the inherent permanence of data on the internet; as Facebook's Timeline has shown, sometimes your data can surface in unexpected ways.
An amusing way to spy on your friends is to create a google account, and then log into it from their computer. Until they log out of your account, you will be able to view their web history.
Aside from this, I have avoided this Google feature.
I did it a few years ago, out of concern that someone could log into my account and see or steal that data. The second reason was that there wasn't actually any practical utility for me in having access to that data.
I likely won't. Reason being, if I want customized, "filtered" searches, Google is where to be. If I want privacy and/or unfiltered searches, I'll use Duck Duck Go and Blekko.
If trust becomes an issue with Google, I won't just clear my search history, I'll leave for good.
When somebody's life is horribly dismembered as a result of Google's insane privacy policy (new or old), please let me know. I'll start thinking about privacy and necessary safety measures at that point.
Until then, please stop whining about privacy, because frankly, I just don't see how any of this really matters. My life has yet to be negatively impacted by Google and I don't foresee it happening in the near future.
To be fair, I haven't seen a load of claims about how this is "insane" or "horrible" within the context of this article. It just provides useful instructions on how those who wish to opt out may do so.
It's fine that you don't see how this matters but it does matter to some and having the choice and understanding how to exercise that choice is essential.
Fair enough. I do get your point - people seem to go mad about privacy in general. It does sometimes feel (to me) like people aren't as bothered about it with Google as they are with Apple, Amazon or Facebook, but maybe that's just me.
No need to apologise, wasn't trying to call you out or anything, just felt like this was one of the few posts I've read that doesn't whine too much about it.
When people draw up your profile to sell you things, it is just plain manipulation. I don't know about you, but I don't like it. Have you read this article? It is quite convincing.
The issue is not with start-ups or established players; it's with the US Congress and the DOJ, who have consistently used 9/11 to over-reach and control things so that they do not have to hear dissent. It's sort of what the average Russian citizen experiences on a daily basis, but they are not alone in that experience.
Thus, when did we become a less free 'Third World country'?
My apologies to citizens of Third World countries; I lack the vocabulary this early in the morning to express it in a different way.
When I go to that page with my spam account, which I'm logged into by mistake when I do most of my searches anyway, I get one screen asking me to enable Web History, with a button saying "no thanks", and I never get to the calendar screen.
With my de facto account, it shows the calendar, and I can't find any way to get to that "no thanks" button.