Hacker News new | past | comments | ask | show | jobs | submit login
The Agency: An army of well-paid “trolls” in St. Petersburg (nytimes.com)
207 points by sergeant3 on June 2, 2015 | hide | past | favorite | 92 comments



> Volodin installed in his office a custom-designed computer terminal loaded with a system called Prism, which monitored public sentiment online using 60 million sources

Oh look, they've got prism too!

> According to the website of its manufacturer, Prism “actively tracks the social media activities that result in increased social tension, disorderly conduct, protest sentiments and extremism.”

Well that's comforting.

This is obviously still in its infancy, but they're only going to get better at this. Obvious leads and dead giveaways will be replaced with more subtlety and variance, and detecting what's bullshit and what's real will become even more difficult, if not impossible.

This is Russia's answer to people rioting over corruption in the government. Not transparency, not change, but social manipulation of information and the spreading of lies.

And they aren't the only ones doing this. They're just the only ones we know about. No doubt other countries, companies and criminal organizations are operating similar tactics already.

Certainly puts a new spin on the phrase "don't talk to strangers".


I'm not sure if you'd agree with it or not, but I'd want to emphasize something: this behavior is not limited to foreign governments that have antithetical interests to the USA. Even publicly, the CIA recognizes that it has teams of employees working on social media to push people (particularly in the Middle East) to support positions more consistent with American interests.

And it'd surely be pretty damn profitable in the USA domestically to offer a service where a bunch of trolls would push your product, service, or candidate on the Internet.

(This comment may or may not be brought to you by the Russian Foreign Intelligence Service.)


Israel commands a small army of these as well, it's almost comical to see them appearing in social media and comment sections, but I have no doubt they're effective.

http://www.usatoday.com/story/news/world/2013/08/14/israel-s...


Comparing an army of misinformation trolls pushing lies to students who are paid to "combat anti-Semitism and calls to boycott Israel" by telling the truth is pretty silly, and intellectually dishonest. There is a wave of antisemitism resurfacing in Europe, as well as antisemitic calls for the destruction of the state of Israel. And there is nothing wrong or dishonest about a country fighting back with information against attempts to delegitimize its existence.


And you've just conflated anti-Semitism and calls to boycott Israel, which I would certainly class as intellectually dishonest.


Actually, no, I did not. This is the stated purpose of the Israeli students being recruited, quoted verbatim in the article - not my conflation. Can I assume that you agree with my characterization of your comparison as intellectually dishonest (and factually inaccurate), since you did not disagree with my point about your post?


I do think it's strange that when Israel promotes nationalism in its clearly more controversial position -- being defamed by all of its neighbors and a large segment of the Western left and right -- suddenly this is a bad thing. If we had (serious) pressure to dissolve the U.S., I think we would see a backlash of equal or greater measure. Your use of language here also clearly shows your position: 'army' is a very harsh word for some students posting on social media.


...said the Israeli agent.


Not sure if this is joking or not, but the fact that I can't tell is itself indicative. This is an example of just how toxic state- and corporate-sponsored trolling is to communities. Any given position will promote winners and losers, and any position anyone takes could, in theory, be the result of dishonest, arational, apersonal influencers who lack the desire or drive for a rational civil society. And I think it's even fair to discount all potentially corrupted opinions because of that risk--but the end result is having to give up on the Internet as a forum for discourse, which pretty much kills off its liberatory potential.

We need to find a way around this ASAP, before it gets worse. And it will most certainly get worse.

(This message has been bought and paid for by... well, who knows?)


>This is an example of just how toxic state- and corporate-sponsored trolling is to communities.

Rather, this is an example of how toxic the fear of it is. It's like that episode of The Twilight Zone, "The Monsters Are Due on Maple Street," where the invaders only have to sow a tiny bit of paranoia and mistrust and then watch people tear each other apart. Except in this case, there don't even need to be shills; the result would be the same either way.


Every opinion is potentially corrupted, especially when propaganda exists. The result isn't giving up on the Internet as a forum for discourse, it's giving up on discourse.


I like the results of having a civil society with rational discourse. Not simply because it's the result of the inalienable rights of thought and speech, but because society as a whole has massively benefited from it. The marketplace of ideas bubbles the "best" ideas to the top, but it is almost a requirement that its constituent agents at least recognize the value of reason, even if they themselves are flawed and biased implementations of reason.

It may be the case that it's impossible to save discourse, and the future is a world where brute force, power, and hierarchy drive which ideas dominate society. We'd be lesser for it, though, and I don't think we've explored many ideas for preventing that end result.


I agree that a society with rational discourse is clearly the best option, and that this is an extremely difficult problem, notably because everyone likes to think that they're rational. We're always going to be subject to external inputs that change how we perceive reality and the world around us, and not always in a rational way. Some people recognize the value of reason, and make it their business to exploit it, while others recognize the value of reason but fail to acknowledge that the world they live in affects their views and perception of reality, and a third group recognizes the value of reason but sees their identity as immutable -- this is the most dangerous group, in my opinion, and a group I regret to say I've been a part of.

My point is only that our opinions are swayed and altered by propaganda, media, and the world around us in general. To say an opinion has been 'corrupted' is a murky concept to me, because I'm not sure how much of our opinions I believe are original, and what corruption actually means in this instance.


Israel's recruiting power in Georgia is unrivaled. I know at least twelve other Georgia natives who are part of the shadow cabal, using our WASP heritage as cover!


Poorly titled and therefore hard to share on Hacker News, but this is lengthy investigative reporting, worth reading, into a bizarre 'Internet Research Agency' that mainly produces pro-Putin propaganda, in both English and Russian.

I'm surprised, frankly, that they don't have better English-language proficiency.


> I'm surprised, frankly, that they don't have better English-language proficiency.

People with high English proficiency are probably employed at better-paying gigs.


> that mainly produces pro-Putin propaganda, in both English and Russian.

And also seems to be testing disinformation campaigns in the US, such as spreading panic about (fake) disasters.


I feel like this (testing disinformation campaigns within the US) wasn't addressed fully and is the most significant aspect of the "trolling" operations. Not many care if Russians are spreading lies to Russians about Putin, but attempting to cause panic in the US by faking a disaster is the most dangerous aspect of this practice.


It's also spreading lies about Russians to non-Russians. See for example The Guardian online forums - these guys dominate conversations which have something to do with Putin's policies (say, the Crimea situation, etc.).


Ugh, fail.

The Guardian comments sections are a trashbag of political censorship and general idiocy, but I see no evidence that they are being trolled by the Kremlin. A whole lot of people THINK they are, and the Guardian has levelled such accusations (without presenting evidence) but my own experience is that I am routinely accused of working for the Kremlin there, based on no evidence at all. Their view is simply "you disagree with me therefore you must be a paid troll". Making things worse: the Guardian moderators delete vast numbers of comments that would be considered completely acceptable anywhere else, merely for questioning what their articles say.

I think the only way to respond to government-sponsored trolling is to just ignore it. Who cares what someone's motivation is? The only practical response is the same anyway: answer back and be more convincing than they are.


I've found the CBC comment moderators like that as well. The moderation policy is essentially, "Don't be vulgar or threatening."

Yet comments that violate the policy but agree with CBC writers' views are allowed, while polite, ad-hominem-free dissenting comments are removed.

When there's too much dissent, comments are simply closed.


Right. The Guardian has a rule that you're not allowed to insult the journalists. Unfortunately, suggesting that they're wrong, biased or maybe didn't do their homework is routinely considered to be insulting the journalists :(


> The Guardian comments sections are a trashbag of political censorship and general idiocy, but I see no evidence that they are being trolled by the Kremlin. A whole lot of people THINK they are, and the Guardian has levelled such accusations (without presenting evidence) but my own experience is that I am routinely accused of working for the Kremlin there, based on no evidence at all.

For me the strongest evidence (still non-conclusive obviously) is the fact that the users who are heavily pro-Putin, while clearly being a minority on the forums, almost always get ridiculous amounts of upvotes. For example, for a typical political comment on the forum, getting 50-100 upvotes is rare, while these pro-Putin posts routinely pull 200-500.
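To be concrete about what that anomaly looks like (with illustrative numbers only, not measured Guardian data), a comment score that sits many standard deviations above the forum's typical range is exactly what a simple outlier test would flag:

```python
# Toy outlier test for the upvote pattern described above.
# The score lists are illustrative assumptions, not real data.
import statistics

typical = [52, 61, 75, 88, 94, 70, 66, 81]   # ordinary political comments
suspect = [240, 310, 505, 280]               # heavily pro-Putin comments

mean = statistics.mean(typical)
stdev = statistics.stdev(typical)

def z_score(score):
    """Standard deviations above the typical mean."""
    return (score - mean) / stdev

flagged = [s for s in suspect if z_score(s) > 3]
print(flagged)  # every suspect score is far outside the typical range
```

Of course, this only shows the scores are anomalous, not why; organic virality produces outliers too, which is part of why it stays non-conclusive.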


That's not evidence. That's another supposition. Again, you're just assuming that because a viewpoint you disagree with seems popular it must be due to some kind of manipulation.


That's the thing though - these posts clearly don't represent a majority in terms of volume, and yet they get an obscenely large number of upvotes.


Agreed. Why would they be doing that?

All I can think is that psychological operations always have been part of war (and conflicts short of war), and the obvious place to conduct them in 2015 is not on leaflets dropped from airplanes, but on the Internet. I don't think it's conspiratorial to expect that nations are developing capabilities and training.


I disagree. I believe that every cry of "Wolf!" regarding national disasters benefits the population at large, if only by desensitizing them to sensational reports.

An analogy: do you know anyone who still falls for inheritance scam emails? Ten years ago, there were many reports of victims succumbing to fraud emails.


Russians have a very low level of English by European standards, and people proficient in English cost more. The main profit model of these establishments has always been cutting costs while charging more — especially when the person you're charging is a government official (whom you know very closely), who is hiring your company to get some of that budget into his pocket in the first place. So quality is not at all the priority, and even when it is, the percentage of the money that gets stolen is still enormous (Sochi).


And perhaps people who are proficient in English are not only costly, but they might also be politically unreliable...


My dad claims that the level of brainwashing in Russia far, far exceeds that of the Soviet regime, and that the propaganda is far worse (worse even than WWII Germany's, he says).

Of course, statements like that evoke Godwin's law and are hard to swallow. However, my dad grew up in Soviet Russia and trained troops on the use of artillery equipment during the Soviet-Afghan conflict. He's been around the block and has seen Soviet propaganda firsthand in multiple Soviet republics.

If you think about the insidiousness of troll-style, FUD propaganda, it's far more psychologically insidious than the overt Soviet propaganda. You can't cut off access to facts and still appear "fair"--instead you make people question the truth.

Russia has just raised a whole young generation that's socially backwards by a generation. Homophobic, racist, blame-the-West feelings are rampant.

During the early days of the Ukrainian conflict, I was surprised to hear a close relative in Russia tell me how she had seen videos on YouTube of "blacks from Africa" killing innocent Russian youths ("gouging their eyes out," she said) in Eastern Ukraine.

This shit is real.


I was born and raised in Ukraine until I turned 14. I still have most of my family there, and they visit regularly. They were present during most of the revolution, and I can confirm nothing but love and pride in being Ukrainian. All that talk about Ukrainian Nazis is garbage. Yes, we aren't perfect, and we may have difficulties bringing order, but it was all done with good intentions at heart. The recent revolutions, including the Orange Revolution, have been some of the most peaceful times in Ukraine. People are truly united, crime rates drop dramatically, people make food and bring it to feed others for free. Both of my grandparents attended all of these events. I've never been more proud, and I hope peace and order can be restored without Russia's involvement.


Taken out of context, this type of comment makes for good nationalist propaganda.


Nothing but love here, I didn't even say anything negative about Russia. You kind of surprised me with that actually.


Yeah, good luck in Ukraine. I see the pro-Russian 'rebels' have gone from claiming the Kiev government are Nazis to claiming they are 'miserable Jews.' They seem a bit confused in their messaging. If it were just propaganda it would be comical, but sadly the Russians seem to be sending troops and weaponry as well.


Truly united? Sadly that's just your PoV.

I was too born and raised in Ukraine, in Odessa. As many Odessans, we will never forget May 2nd, 2014 and will never forgive.


[flagged]


Cool comment, what does it contribute?


[flagged]


I'm genuinely sorry you feel that way. What I offered was the first-hand experience of my family, not my government or some political garbage. Yet you chose to turn this into a discussion about our broken government and individuals within it.

If this at all helps, I've lived in the US for over 10 years now. I am a citizen; I finished college here, got married, and started my career. I'm never going back, because I don't like the turmoil, but nothing can invalidate my initial comment and the truth behind it.


The rest of the world has also gotten much better at propaganda. Listening to older broadcasts can seem almost laughable in comparison.

EX: Reefer Madness vs. modern anti-smoking campaigns. http://en.wikipedia.org/wiki/Reefer_Madness http://www.usatoday.com/story/news/nation/2014/06/24/cdc-smo...

I find it deeply concerning how easily a modern state can alter how its population thinks. Outright lies are a lot easier to counter than selective truths.


Some of the change has just come from the entertainment industry. As it has gotten used to working in film/video it has discarded more and more elements of the stage (styling it as a stage show is at least partly why something like Reefer Madness was so silly).


Embedding thoughts, mindsets, attitudes, and prejudices is very easy for a government to do when they control the curriculum requirements for the nation's children and discipline teachers when they stray from the official positions.


I dunno about sophisticated. The presumably failed experiment with the #ColumbianChemicals hoax was almost cute in its lack of direction or success[1]. It sounds like the sort of hoax Anonymous - teenagers without leadership or direction or payment - would have pulled off successfully. And the ironic thing is that even if they had the wherewithal to actually convince people, the rumour would barely have moved the needle compared with the hornets' nests stirred up on a daily basis by rumours (true and false) spread by the mainstream Western press (much of which is sufficiently unprofitable for the price of influence to be affordable, incidentally[2]).

If Russian propagandists are generally active in the Western media, it's difficult to view them as particularly effective: you'd have found more Western contemporaries willing to sympathise with Stalin than with Putin, even during the McCarthy era, and Obama's approval ratings aren't doing too badly. I can't judge their effectiveness in the Russian media, but I doubt that a particularly large proportion of the promoters of nationalism and bigotry are on Putin's payroll... it's not like their ilk is exactly nonexistent in the West.

Goebbels' message might have been more top-down and less subtle than Russia's social media spamming, but it did convince a reasonably well educated and well-to-do population that there was absolutely nothing wrong with rounding up all the Jews to deport, and convince the rest of Europe there was nothing to be particularly concerned about. That takes a little more effort than persuading Russians the West doesn't like them very much.

[1]to the extent that false explosion reports on September 11th can be considered cute

[2]Four mainstream British newspapers actually are owned by a wealthy Russian, though he's far too involved with Russian opposition parties to be inclined towards toning down their pretty standard critical editorial line on Russian policy


It has exceeded the level of the Soviet regime during its later years because it's quite effective. The Soviet regime was falling apart, so in its last few years everybody paid lip service to the propaganda, but nobody really believed it, because it was quite obvious that the system was collapsing. This is not really the case now in Russia. Yet.


The astonishing thing about this propaganda machine is not that it is technically proficient. It is not in any way; it is technically crude. It's just fairly direct political control of the TV and papers by the state, and, online, simply hiring people to spread propaganda on forums.

What is astonishing is its psychological effectiveness. It seems to encompass all levels of Russian society, especially the poorer people who are cut off from anything but state TV, but also the middle classes. And the elite are wise enough to shut up and pretend to believe, though they surely know what goes on. The conformity to social norms is so important: if you don't condemn the Kiev Nazis, you are a fascist yourself. Any disagreement will make even friends and family members turn against you, regardless of how good your evidence is for what you say.

I know Russians who live abroad and/or are net-connected and understand what goes on, but they simply stay quiet, because they don't want to lose contact with their relatives and long-time friends. In a way, it's an example of "The Only Thing Necessary for the Triumph of Evil."


My dad claims that the level of brainwashing in the USA far, far exceeds even that of WWII Germany.

Of course, statements like that evoke Godwin's law and are hard to swallow. However, my dad grew up in the USA and trained troops on the use of artillery equipment during the Vietnam War. He's been around the block and has seen US propaganda firsthand in multiple countries.

If you think about the insidiousness of troll-style, FUD propaganda, it's far more psychologically insidious than the overt US propaganda. You can't cut off access to facts and still appear "fair"--instead you make people question the truth.

The USA has just raised a whole young generation that's socially backwards by a generation. Homophobic, racist, blame-Russia feelings are rampant.

During the early days of the Ukrainian conflict, I was surprised to hear a close relative in the USA tell me how she had seen videos on YouTube of "blacks from Africa" killing innocent US youths in Nigeria.

This shit is real.


Well, they didn't really need propaganda in Russia. They had communism. Believe it or not, some people actually believed that communism was better than capitalism, and arguably, at one time, capitalism kind of sucked.


> Well they didn't really need propaganda in Russia. They had communism.

And how do you think they spread that and maintained it once it was spread? Propaganda.


The exact same way they maintain democracy and fascist dictatorships. Even though the fascists need more than propaganda to keep it up.


> arguably at one time capitalism kind of sucked.

Still does!


If you haven't read to the end of the story, I highly recommend finishing it; it has a pretty great ending.

Also! The article outs a couple of US-facing troll operations, one of which, "Spread Your Wings", is an unintentionally hilarious satire of US partisan politics, and another highly recommended read if you're up enough on politics to be in on the joke:

https://www.facebook.com/actoftruth?fref=photo

(My current favorite Spread Your Wings propaganda: GOP Senator and Presidential hopeful Lindsey Graham: warmonger and Adversary Of The 2nd Amendment.)

(Moments later: NO WAIT, this inspirational uplifting quote clearly wins the Internet today: https://www.dropbox.com/s/4r7r7nvbpn2kp3h/Screenshot%202015-...)


If you don't feel like reading it, this is an amusing tale of one arm of the pro-Kremlin propaganda machine. The kicker is at the end when Russian propaganda sources tried to make it seem as if the reporter were recruiting Nazis for a complicated CIA scheme.


Think about that: we know Adrian Chen did not in fact go to Russia to make common cause with neo-Nazi groups, so the most likely interpretation of those photos is that some or all of that neo-Nazi group is on the payroll of Russian propaganda groups.


Ah, the Russian trolls.

I run a crowdsourced news startup (https://grasswire.com), and along with it one of the bigger non-mainstream news Twitter accounts (@grasswire, 130k followers).

The Ukrainian conflict has died down a little bit, but when it was in full swing the Russian trolls were everywhere. They were pretty obvious, but that didn't matter.

For example, here are a couple of my favorites (and among the most active):

https://twitter.com/gogigogi12 https://twitter.com/steiner1776

I'm pretty sure now they're mostly set to automatically retweet anything with a pro-Russian sentiment, but back in the day they would pretty much respond to every tweet. For the casual user it was very convincing stuff: "Oh wow, maybe @steiner1776 is right and that report was full of lies, and I just didn't know better - thank goodness I have someone who is willing to stand up for truth."

Most people don't have time to research, and just follow public opinion, even if the public opinion is created by a bunch of trolls. It's especially easy to fall for the trolls when they tell you that "the man" has been lying to you all along (and in most cases "the man" means the United States). (To be clear: The United States does some messed up stuff and is far, far from without fault.)

There were a few days when they would seemingly download photos from any tragedy that had ever happened, and just add a generic line - something like "Stop killing Donbass people," insinuating that these deaths were caused by the Ukrainian army (and by extension, somehow, the United States). The photos were from Syria, Iraq, Palestine, Israel - wherever. I'm sure they were just being pulled from some file somewhere with a generic message attached. But that stuff _exploded_. Hundreds of thousands of retweets per day, driving public opinion, based on complete and utter bullshit.

So there are some Russian trolls; this isn't the end of the world, right? The problem is that for many of the people on Twitter, these were real people expressing legitimate concern. Some of their blatantly false tweets (easily shown to be false with a reverse image search) went legitimately viral (not just spun into their other accounts for retweets). Hundreds of thousands of impressions per tweet, from people who aren't going to run a reverse image search and who get their news from Twitter.
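For what it's worth, the detection side doesn't even require a full reverse-image-search service: comparing a perceptual hash of each incoming photo against previously seen tragedy photos catches most straightforward reuse. A minimal average-hash sketch (pure Python; images are assumed to already be 8x8 grayscale grids, whereas a real pipeline would decode and resize them with an imaging library):

```python
# Minimal average-hash (aHash) sketch for detecting reused images.
# An "image" here is just a 2D list of grayscale values, already 8x8.

def average_hash(pixels):
    """Return a 64-bit perceptual hash of an 8x8 grayscale grid."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means likely the same image."""
    return bin(h1 ^ h2).count("1")

# A known photo, a lightly re-encoded copy (slight brightness shift),
# and a completely unrelated image (checkerboard), all synthetic.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
recompressed = [[min(255, v + 3) for v in row] for row in original]
unrelated = [[255 if (r + c) % 2 else 0 for c in range(8)] for r in range(8)]

d_same = hamming_distance(average_hash(original), average_hash(recompressed))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
print(d_same, d_diff)  # 0 32
```

The point is only that a re-captioned copy of a known photo hashes close to the original even after recompression, so flagging reuse is cheap; the hard part is the corpus and the moderation workflow, not the math.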

I don't know what the solution is, but this is extremely dangerous for society. It's easy to pass them off as "trolls," but trolls change the way that people act and believe, even when they're only sharing false information.

We started Grasswire in part as a response to that. The fact-check feature on Grasswire isn't used as frequently now, but at the time about 50% of the stuff that went viral was completely and verifiably false - look at the history of the Grasswire Fact-Check account (@grasswirefacts) for examples. It helped, but I fear it was only a drop in the bucket.


I'm really worried about the course Russia is taking under Putin, but people like Ludmila Savchuk make me think there is still some hope... I wish her well.


I am sure this is not only happening in Russia.


Some PR strategies are similar. A company held its annual conference several weeks ago. The company was underrepresented in the HN user base, so it was eye-opening to see many new users join HN in a short timeframe to write very pro-company comments - it looked like a concerted effort to polish the company's reputation. Another example is a recent video game release. The marketing was in a gray area, bordering on fraud, and the launch was a debacle, though the PR team managed to get away with it with only a black eye.


Do they really need a marketing department in the video game industry? Game news sites are just that: an outsourced marketing department. Just look at IGN, Polygon, GS and co., heavily pushing Batman these days with something like 12 "articles" a day to build the hype - same thing with many games before. There is no criticism of the terrible commercial practices, the games riddled with bugs at launch, nothing. What's the point of "reviewing"? They sell these games long before launch, and the reviews are published a week or more after launch (like The Order, among others).

The game industry, in collusion with "game journalists", has succeeded in making people want less for their money. The dream of any big corp.


> The Columbian Chemicals hoax was not some simple prank by a bored sadist. It was a highly coordinated disinformation campaign, involving dozens of fake accounts that posted hundreds of tweets for hours, targeting a list of figures precisely chosen to generate maximum attention. The perpetrators didn’t just doctor screenshots from CNN; they also created fully functional clones of the websites of Louisiana TV stations and newspapers. The YouTube video of the man watching TV had been tailor-made for the project. A Wikipedia page was even created for the Columbian Chemicals disaster, which cited the fake YouTube video. As the virtual assault unfolded, it was complemented by text messages to actual residents in St. Mary Parish. It must have taken a team of programmers and content producers to pull off.


The article is really good, but this title is extremely vague.


We added part of the article's subtitle to make it more informative. We're open to suggestions for a better title.


Indeed. I assumed it would be about CIA.

The subtitle is much more descriptive:

From a nondescript office building in St. Petersburg, Russia, an army of well-paid “trolls” has tried to wreak havoc all around the Internet — and in real-life American communities.


> ISIS had claimed credit for the attack, according to one YouTube video; in it, a man showed his TV screen, tuned to an Arabic news channel, on which masked ISIS fighters delivered a speech next to looping footage of an explosion.

This linked to the wrong video, the actual link is https://www.youtube.com/watch?v=E2J6RvajSaA

Edit: He fixed it.


None of this is surprising behaviour (insert obligatory Tu quoque argument about other countries).

What seems funny is that the behaviour of the various workers is eerily reminiscent of various characters in Pelevin's "Generation P" http://en.wikipedia.org/wiki/Generation_%22%D0%9F%22



We should call them what they are: Sock Puppets.

Calling them trolls is an insult to trolls.


I doubt the "well-paid" part. From the article you can deduce they're paid around 25k RUR/month - about $500 at today's exchange rate.


Wikipedia says that's around the average Russian wage.


"average wage" != "well paid in a large city".


Astroturfing happens everywhere... news at 11. It's easy to laugh at the paid Russians and their broken English, but it wasn't too long ago that you had stories of British intelligence rigging online polls and the US government contracting out software to manage masses of social media personas to target jihadists (and I'm sure the same tactics are being used against Russia too).


The question for me is how to counteract this kind of trolling. If someone is saying something stupid, it's easy as an individual to ignore it (aside from the time wasted reading it), but it's genuinely deadly for communities.

I think some kind of web of trust, except this time for comments or social media artifacts, is probably the way to go.
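As a toy sketch of what that could look like (everything here - the vouch graph, the names, the weights - is an illustrative assumption, not a concrete protocol): trust in a commenter is the best chain of explicit vouches reachable from people you already trust, and it decays with each hop because the weights are below 1.

```python
# Toy web-of-trust sketch for comment authors: I only believe an
# account to the degree that someone I (transitively) trust vouches
# for it. Graph, names, and weights are illustrative assumptions.

def trust_score(graph, me, target, seen=frozenset()):
    """Max product of vouch weights along any path from `me` to `target`."""
    if me == target:
        return 1.0
    seen = seen | {me}
    best = 0.0
    for friend, weight in graph.get(me, {}).items():
        if friend not in seen:  # avoid cycles
            best = max(best, weight * trust_score(graph, friend, target, seen))
    return best

# alice vouches strongly for bob, bob moderately for carol;
# nobody vouches for the fresh "mallory" account.
vouches = {"alice": {"bob": 0.9}, "bob": {"carol": 0.8}}

print(trust_score(vouches, "alice", "carol"))   # ~0.72: trust decays per hop
print(trust_score(vouches, "alice", "mallory")) # 0.0: an unvouched stranger
```

A sockpuppet farm can vouch for itself all day, but unless one of its accounts earns a vouch from inside your existing web, its comments score zero; the open question is whether real users would ever bother maintaining such a graph.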


Read the comments on the nytimes article...


The Ministry of Truth in the making.


This sort of thing has happened recently on HN, where trolls derailed the conversation so badly that it was flagged into oblivion. (dang brought it back to life after an email appeal).

https://news.ycombinator.com/item?id=9570202

What's the right way for a community to permit free and welcoming discourse and simultaneously blunt the ability of malicious actors intent on spreading disinformation? It's a hard problem.


Be careful. The notion that opinions we don't like are generated by organized trolling campaigns is toxic, and insidiously serves the objectives of the trolls.

If you feel like comments on a thread are coming from an organized trolling/shilling campaign, try to collect some evidence for it and then notify hn@ycombinator.com --- that's what they've asked you to do.

But don't start a conversation on the thread itself about how specific commenters, or even a particular style of comment, are being made in bad faith. It's the easiest criticism in the world for a troll to deflect: just start an argument that you're commenting in good faith and being persecuted for unpopular opinions. 100 comments later, the thread will seem to casual readers like nothing but litigation over who the commenters are. Mission accomplished.


My biggest concern with the death of that thread was that it died due to flagging.

Totally agreed that even the prior belief that trolling campaigns might exist is toxic.

Thanks, everyone, for the comments. They're thoughtful.


> Be careful. The notion that opinions we don't like are generated by organized trolling campaigns is toxic, and insidiously serves the objectives of the trolls.

Yep. See also: Sue Basko and the Rustle League. The more she claimed they were a FBI/CIA Psyop, the more they laughed and poked at her.


> recently on HN, where trolls derailed the conversation so badly

We found no evidence of trolling (in the sense of subversive shilling) in that thread. As far as I can tell, one commenter was who he said he was—an individual who felt passionately—and other commenters accused him because they didn't like what he said. His passion didn't help his cause, but he didn't deserve that.

Groundless accusations of astroturfing and shillage are not allowed on Hacker News. If the only evidence one has is "these comments seem so wrong to me", that's groundless. Accusations based merely on the strength of one's own opinions cost this community considerable health points, and we're not going to let it become a thing. If it gets worse, we'll make a top post about this, explicitly ban it, and add a rule about it to the guidelines. In the meantime, please flag such comments.

HN has users with many divergent views. The community spreads across countries where perspectives and politics are different from one's own [1]. It's common for X to seem obvious to one user while ~X seems obvious to another. Occam has no trouble accounting for most of what people call trolling, astroturfing, and shilling in controversial threads.

When any of you have concerns about astroturfing, you're welcome to send them to hn@ycombinator.com. We'll take them seriously. We'll also be careful and stick to what we can find evidence for.

1. This cross-cultural aspect of HN is a big deal. We see it all the time; I think it's underappreciated.


You are certain that trolls derailed the conversation?

It could just as well be one person stating his opinions. His opinions are controversial, and therefore they generate lots of responses (derailing), and he is downvoted because they go against what most people here believe. That doesn't quite seem like an army of professional "trolls" manipulating the discussion. (If he were a troll, he would probably have presented his statements more effectively or subtly; the way he did it almost guaranteed downvotes.)

Those "troll groups" are surely not a good thing, and you can never be certain this person wasn't one of them. But I also wouldn't label everyone I disagree with a troll just because they dare to write down their opinion while my point of view is that of the majority.


Shills and astroturfers are the future witches of the internet. I think the people worried about them will do more to derail conversation than the people actually being paid. On reddit it's really bad: many users believe you literally can't hold certain opinions without being paid to hold them. Because of that, they won't even entertain opinions different from their own as sincere.

It really shouldn't matter if someone is paid to have an opinion -- if your own opinion is crappy enough that your only argument is that someone is being paid, then maybe they're doing you a favor by challenging your belief. Disinformation only works if you're too lazy to research the reality.

You'll never be able to confirm whether someone is a shill or not, so I don't understand why people obsess over it so much in comment sections. Just legitimately defend your view if it's that obvious.


> Disinformation only works if you're too lazy to research the reality.

Disinformation has an enormous advantage: it costs much less to state a lie than to determine the statement's truth. I'm not lazy, I just don't have enough time to check the truth of all these statements. In the end, I have to trust someone.


> your only argument is that someone is being paid, then maybe they're doing you a favor by challenging your belief. Disinformation only works if you're too lazy to research the reality.

On the contrary, an astroturfer might fail to persuade but still chalk their effort up as a success if they manage to derail a conversation. A bunch of people walking away from a stupid conversation without reaching any conclusions is a win for something like, for example, Putin's government.


>Disinformation only works if you're too lazy to research the reality.

That's like 95% of the time for 95% of people though.


For some reason (and I'm not sure it isn't a reasonable amount of paranoia, born of hard-won experience under the regime Putin is running), some people believe that anyone who expresses the Western minority view that the conflict over Crimea isn't entirely Putin's fault, or that its population's desire to cling to Russia is at least understandable, must be paid by the Russian government. I've been accused myself.

I'm a black American from Chicago, and that's honestly a majority sentiment amongst friends and family, for what it's worth.

Putin is terrible for so many other reasons than this struggle between duelling right-wing nationalists.


I disagree with whoever downvoted you. I think that worthwhile discussion here on HN is impossible without a well-articulated adversarial view. Otherwise, we're falling into an echo chamber.

I disagree with your opinion, but I'm glad you provided it. I wouldn't mind some support behind your assertion though.


That position seems to omit the very obvious problems with Russia's actions, including invading and conquering territory and people, abusing its inhabitants, and forcing a dictatorship on them.

It's a little odd to even discuss the desires of the Crimean population; they have no say in the matter.


The internet has always been full of delusional people with extreme opinions untroubled by logic or argument.

I wouldn't worry too much - good moderation systems were built to handle these from the start. The places where paid trolls are likely to be most effective are also the places where you see the largest quantity of ridiculous posts from ordinary people. I'm thinking newspaper comment sections here, but Twitter might count too.

Whether someone is posting nonsense because they're paid or just because they can't sleep (https://xkcd.com/386/) the response to it is the same: robust forum software combined with people who point out their fallacies.


Here is a better example:

https://news.ycombinator.com/threads?id=yeahyeah

At the time this was the top comment and completely derailed the conversation. Pretty sure that wasn't a Russian agency, take a guess which one it was.

EDIT: amazing, I provide the community a crystal-clear example of astroturfing and get downvoted.


Interesting enquiry and article, though obviously they are much worse at hiding their tracks than the US Government trolls.


This story pops up regular as clockwork, going back before 2012. It's propaganda in its own right. No doubt, with equally depressing regularity, I shall be downvoted into oblivion for having the temerity to point it out.


Got any references or links?


Here is one of many: http://www.businessinsider.com/putin-nashi-anonymous-opyoung... Simply search Google for "kremlin troll army", restrict the date range, and voila.


Expect a follow-up article in the Washington Post about a covert Chinese internet propaganda organization which successfully planted a story in the New York Times attributing a series of alleged online hoaxes to a Russian organization. The hoaxes seemed real enough - there were youtube videos and twitter accounts and websites - but in reality there was no evidence that the hoaxes had ever actually happened.


Can we expect another follow up from Huffington Post that it was the NK propaganda agency which successfully planted the story about the Chinese in the Washington Post?

And we can continue this story until it comes back that it all started with the NSA.

... Until we learn that it is really an alien civilization that wants our natural resources trying to get us all to kill each other so they don't have to fight us?

I kid I kid, but seriously...



