(I'm totally speaking for myself personally here, not in any official capacity.)
As the post indicates, Thomasmonopoly's account wasn't disabled for something like misusing AdWords; it was disabled as part of an investigation into potential child pornography. Thomasmonopoly himself says in his write-up that "I too found the image bordering on the limits of what is legally permissible and hoped to highlight the fact that it is allowed to exist within a grey area of legality."
Google has a zero-tolerance policy for child pornography. I am glad that Thomasmonopoly got his account reinstated after a full investigation, but it's also incredibly important that Google take appropriate action on potential child pornography, and United States law compels companies to react to child pornography in certain very specific ways.
For what it's worth, I got a chance to do a question-and-answer session with some congressional staffers earlier this year, and one of the things I said was that (in my personal opinion) current laws on child pornography are suboptimal.
Here's a quick example from a few months ago: http://www.winknews.com/Local-Florida/2011-04-29/Fla-Senate-... "The Florida Senate voted to extend the state's anti-child pornography law to include not just possessing but also intentionally looking at such images." Looks like the text of the bill is here: http://www.flsenate.gov/Session/Bill/2011/0846/BillText/File... and I don't see any exemption for people who fight or take down child pornography. And that was literally the first link I found after doing a search on Google.
So in theory, looking at images in the process of trying to fight child pornography could itself be illegal. You don't need to dig far to find similarly brittle examples. That's why I'm glad that I work on webspam and not on trying to stop child pornography.
> Google employs an automated system to scan user storage for violations of their ToS
So if I take pictures of my children in the bathtub with my Android phone, I am risking having my entire Google account deleted with no recourse except to try to 'make a stink on the internet'?
Because frankly, I would never take photographs to be developed at Walmart because they're well-known for calling the cops on parents who took pictures of their children.
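For what it's worth, automated scans like the one quoted above typically work (as far as I know; this is an assumption about how such systems are generally built, not inside knowledge of Google's) by comparing hashes of stored files against databases of known, previously identified illegal images; NCMEC maintains such hash lists, and perceptual-hashing schemes like Microsoft's PhotoDNA are designed to catch re-encoded or resized copies. Here's a minimal sketch of the matching idea in Python, using a plain SHA-256 digest and a made-up placeholder hash list in place of any real database:

    import hashlib
    from pathlib import Path

    # Hypothetical hash list. Real scanners use curated databases of
    # known images and perceptual hashes (e.g. PhotoDNA) rather than
    # exact digests, but the matching logic has the same shape.
    KNOWN_BAD_HASHES = {
        "0" * 64,  # placeholder entry, not a real digest
    }

    def sha256_of(path: Path) -> str:
        """Hash a file in chunks so large files needn't fit in memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def scan_storage(root: Path) -> list[Path]:
        """Return files whose hash appears in the known-bad list."""
        return [
            p for p in root.rglob("*")
            if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES
        ]

The important caveat: hash matching can only flag copies of images that someone has already identified and catalogued. It says nothing about whether a brand-new bathtub photo is legal, which is presumably where human review, and the false positives that come with it, enter the picture.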
What exactly is your complaint here? How do you expect a multinational corporation to deal with this sort of issue? How do they know that you're the parent, and not just some pervert who likes pictures of kids in bathtubs? I'm fully supportive of both Google and Walmart in this.
I'm a parent of four young kids, so I can appreciate that those tub pictures are adorable. But if I really wanted that picture, I'd take the hassle of finding a smaller print shop willing to do it over enabling child pornography any day.
There's a big difference between:
-- the police came and charged my tenant with selling drugs, so I kicked him out.
and
-- every day I obsessively searched through my tenant's belongings while he was at work to make sure he wasn't violating any laws (in my sole judgment), and then when I thought I found something illegal, I kicked him out.
Google can't judge what is or isn't child pornography. Lawyers and judges can't even do it. Nothing is child pornography, no matter how explicit, unless it appeals to the "prurient interest".
And in fact I'm not even located in the United States. The child pornography laws in my jurisdiction are less vague and narrower than those of the United States. Is Google's crawler programmed with the laws of every jurisdiction worldwide? I rather doubt it.
It would be perfectly legal for Walmart to take a non-proactive approach to photo developing. Machines do it all anyway - the only human step is picking up the stack of photos and putting them in an envelope. But Walmart has directed its employees to search through all photos for kiddie porn and to call the cops. That's a personal stance of Walmart's CEO.
Google is similarly protected - it has no legal liability in the United States for serving as a passive conduit for anything its users care to distribute. It's unfortunate that Google's CEO is adopting a similar stance.
Can you give us an idea of the scale of the problem? Is handling flagged accounts something that could plausibly be improved by creating a "large enough", dedicated team? Or do legal and/or practical constraints (e.g. too many cases to look at) prevent that?
(Again, just answering for me personally, not with my official Google hat on.)
Good question. Just as a good programmer who finds a bug should ask "How could I prevent this bug from happening next time?", it's pretty common when a situation like this occurs for the relevant people at Google to ask "How could we prevent this situation from happening next time?"
In this case, I believe the solutions that have been proposed elsewhere on this thread (letting the outside person know about the suspected violation, letting the outside person have access to their data) can be fraught with legal difficulties.
I'm sure that people at Google will be discussing what different steps could prevent a situation like this from happening in the future.