By removing “extremist content,” platforms are purging human rights evidence (nytimes.com)
308 points by jbegley on Oct 23, 2019 | 154 comments



YouTube and Facebook, while convenient, aren't really appropriate mediums for activists to share this sort of footage. Liveleak comes to mind as a service better suited for a public repository of this kind of video.

Archival of footage shouldn't depend on a third-party platform that can, at will, delete every single video on a whim. For this sort of material, a diligent archivist will likely need to put in the effort of storing it safely, because those videos will almost all end up being deleted on platforms that rely on ad revenue.
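
For anyone doing that kind of local archiving, here's a minimal sketch using youtube-dl's Python API; the URL is a placeholder and the output directory is whatever you choose:

  # Minimal archival sketch with youtube-dl; the URL below is a placeholder.
  # Keeping the info JSON next to each file preserves title, uploader and
  # upload date for later verification.
  import youtube_dl  # pip install youtube-dl

  urls = ["https://www.youtube.com/watch?v=EXAMPLE_ID"]  # hypothetical

  opts = {
      "outtmpl": "archive/%(upload_date)s-%(id)s.%(ext)s",  # stable filenames
      "writeinfojson": True,   # save metadata alongside the video
      "ignoreerrors": True,    # keep going if a video is already gone
  }

  with youtube_dl.YoutubeDL(opts) as ydl:
      ydl.download(urls)

Pair that with checksums of the saved files and you have something you can later show hasn't been altered.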


Been wondering about this phenomenon for a while. Spotify can delete music history. YouTube can delete video history. If it's only on a streaming service, it may as well not exist. We're at this strange point where convenience may as well be access, because the same service you use for discovery is the one you use for consumption. I don't know if this phenomenon has a name.


You better start unthinking before you're the one getting put down the memory hole.


Long ago only very rich people or clergymen owned books. They were read out loud to the illiterate at gatherings.


Likewise Amazon removing books from its site is much more effective than any book burning.


You can buy them (or if curious, simply see which books have been banned) here http://www.unz.com/bookstore/


Fahrenheit 451 is a good movie about that. I just found out Amazon owns Goodreads. They also own IMDB.


An oft-overlooked aspect of Fahrenheit 451 is that it's everyday people that report others to the firemen in the book, very similar to how content is censored these days.

Goodreads will list any book for the time being, even ones banned from Amazon. One example is The Culture of Critique - the link to Amazon goes to a broken page, but links to buy it from other vendors remain.

Link to the book - https://www.goodreads.com/book/show/182136.The_Culture_of_Cr...


My feelings exactly.


During the Arab Spring I saved gigabytes of YouTube footage of protests, because I 100% expected it to be removed.

"You should not show identifiable people being beaten!"

That's true, because it could cut either way depending on whether the revolution succeeds. If it does, though, the footage can help identify the perpetrators of brutality.

Publishing it is irresponsible, but destroying it is as well. This is potential criminal evidence.


Exactly. You need something like the Internet Archive or better: something that is archival quality, proof against random internet vandalism, and FBI-raid-proof.


Archive.is has constantly faced harassment from people filing DMCA claims and having domain names taken away for bullshit copyright claims.

I'm sure if they started doing this, the Twitter outrage warriors wouldn't be happy about it, and that would add to the difficulty of their job.


>the Twitter outrage warriors

Somebody should classify this as an addictive behavior.

'The immanent urge to get emotionally excited by interacting with an equally excitable peer group on issues only remotely related to one's personal life'.


Just in case there's confusion, the Internet Archive isn't Archive.is but Archive.org. They are two different organizations.

Internet Archive seems to be handling this kind of trouble quite well.


I noticed a month or two ago that certain sites seem to be excluded even from internet archives like the Wayback Machine. The first example I stumbled on was Snopes. I only noticed because I saw a Snopes link on Reddit where the thumbnail was clearly different from the live version; the thumbnail appeared to show a human-like baby in the womb, whereas the live version had that edited out. Example: https://web.archive.org/web/*/https://www.snopes.com/fact-ch...


I get "Sorry. This URL has been excluded from the Wayback Machine."

To my understanding, websites can choose not to be included in the Wayback Machine.


Not only that, if they are archived at one point and later decide to be excluded then all of the archives for that domain are deleted retroactively. This means that a valuable site can be archived for years but if the domain lapses and gets bought up by somebody who decides to exclude the Wayback Machine then all of that precious content is destroyed even though the destroyer never owned it!

I feel like we’re in the Internet equivalent of the early years of Hollywood. We’ve lost so many precious early movies because there were no backups of the film.


I don't think your interpretation of the latter case is quite right. In my understanding, the Archive is willing to disable display of the site content retroactively due to a new robots.txt, but they won't delete that content. It will still be present in their files.


What you’re describing may have been a recent change that I wasn’t aware of. If so, it’s at least some consolation. We could still do better.


I believe that that has been the case for the entire existence of Wayback, not just recently.


This policy was actually changed in 2017 to completely ignore robots.txt.

[0] https://blog.archive.org/2017/04/17/robots-txt-meant-for-sea...


Deleted or not made available?


If the site goes down and the robots.txt becomes unavailable, the content will become available on internet archive.


This is how it works, yes.

----> Why isn't the site I'm looking for in the archive?

Some sites may not be included because the automated crawlers were unaware of their existence at the time of the crawl. It's also possible that some sites were not archived because they were password protected, blocked by robots.txt, or otherwise inaccessible to our automated systems. Site owners might have also requested that their sites be excluded from the Wayback Machine.

----> How can I exclude or remove my site's pages from the Wayback Machine?

You can send an email request for us to review to info@archive.org with the URL (web address) in the text of your message.

https://help.archive.org/hc/en-us/articles/360004651732-Usin...


They've recently made the process intentionally more annoying. You used to be able to exclude a site you control via a robots.txt file. That wouldn't actually delete archived content and would merely hide it from viewing. They no longer honor that approach. You now have to send them an email and specifically request content removal.
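
For reference, the old mechanism was just a robots.txt rule aimed at the Archive's crawler, which it then honored when deciding whether to show a page. Here's a minimal sketch of how such a rule is evaluated; the rules string is illustrative, not taken from any real site, and "ia_archiver" is the user-agent the Archive historically used:

  # Sketch of robots.txt-based exclusion as it used to be honored.
  # The rules below are illustrative only.
  import urllib.robotparser

  rules = """
  User-agent: ia_archiver
  Disallow: /
  """

  rp = urllib.robotparser.RobotFileParser()
  rp.parse(rules.splitlines())

  # The Archive's crawler would conclude it may not fetch anything here.
  print(rp.can_fetch("ia_archiver", "https://example.com/some-page"))  # False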


I consider this to be acceptable.


Most rational people would.


That approach allowed domain spammers to (unintentionally?) disable access to content that predates their ownership of the domain, so it was a garbage approach. In the current situation, a content owner can still achieve removal, but a subsequent owner of the domain cannot. That's how it should be. If I buy domain x, I should not be able to request the removal of content that was present on domain x before I owned it.


Snopes kept quietly making politically-motivated changes, and they kept getting caught. Now they can get away with it.


Or even something more distributed. Like each public library has a storage cube and each cube has an overlapping portion of history. They checksum each other peer-to-peer to detect anyone trying to change history.

There are 110,000 public libraries in the United States. Plus thousands more private libraries. Seems like enough for U.S. history, at least.
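
The checksum part of that is straightforward. A minimal sketch, purely illustrative (the directory layout and the peer's digest map are assumptions):

  # Each library hashes its local archive files and compares the digests
  # against what a peer reports; any mismatch flags possible tampering.
  # Paths and the peer's digest map are hypothetical.
  import hashlib
  from pathlib import Path

  def digest(path):
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(1 << 20), b""):
              h.update(chunk)
      return h.hexdigest()

  def find_tampered(local_dir, peer_digests):
      """Return file names whose local hash disagrees with the peer's record."""
      mismatched = []
      for name, remote in peer_digests.items():
          local_file = Path(local_dir) / name
          if not local_file.exists() or digest(local_file) != remote:
              mismatched.append(name)
      return mismatched

With enough overlapping copies, someone would have to alter the same file at many independent libraries to go unnoticed.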


Shouldn't be the public libraries though. Public libraries will not save certain videos. Especially if the local police or the guys in the blue windbreakers with three yellow letters on them show up.

Maybe that's a problem that everyone has? But it would be best if there were some offshore, FBI-proof way to do this archiving.


Especially if the local police or the guys in the blue windbreakers with three yellow letters on them show up.

I wouldn't be so sure about that. After 9/11, the libraries were the first to stand up to the government's desperate search for information. The feds wanted everyone's lending history, and ended up backing off when the libraries put up a fight.


This is what BitTorrent and IPFS are for.


Those are ultimately transport and distribution systems.

At some point, you need curators and archivists.


The Islamic State posted their content for many years on the Internet Archive. I believe IA did take down content in some cases.


Modern-day jihadist groups are now utilizing Telegram, WhatsApp, and TikTok to preserve and spread material. It's quite easy to gain access to them, I've heard, as many researchers are gaining access to these messages to preserve said materials. Plus these apps give them an ability to easily communicate transcontinentally.


Here's the thing, we're never likely to find a place that allows everything. I think most people understand that.

But to find a provider of public access to archives more expansive than YouTube's, well, that's an improvement.


With IA, I feel that they probably de-listed the content but are hosting it for a time when it may be safer to reveal it as part of a greater historical record.

I somehow don't get the same impression from YouTube.


> Archival of footage shouldn't be dependent on a third party platform that can, at will, delete every single video on a whim

Thankfully censorship-resistant anonymous p2p file sharing systems exist, I hope they become more popular for archival work.


> Liveleak comes to mind as a service better suited for a public repository of this video.

Liveleak, seriously? I'm not convinced that a site known for hosting gory and violent videos is really what these human rights activists are looking for.

[Edit] Some context: https://www.businessinsider.com/profile-of-hayden-hewitt-fou...


It comes to mind for that very reason. The rules for uploading to it are less stringent than other services because it's there to provide a platform for things that don't fit on other video platforms on the web. If you're going to be hosting videos of the horrors of war, it's much more likely to stick on liveleak than any other platform.


I agree that it works from a technology perspective, but from the product/marketing perspective it's a non-starter.

People who want to watch violent videos != human rights activists. There's the "blood and gore" people, and the "social justice warrior" people, and not a ton of overlap.

In order to solve this use case there would really need to be a site that combines LiveLeak's content moderation policy with an activist ethos.


The reason I'm suggesting it as an alternative is because of its policies, not because the platform is a great place for discoverability. The intent isn't to help people find these videos, just that they are somewhere online where people can view them that's less likely to be taken down.


Or at least with a filtered tag that's fairly reliable. All around a pretty risky site for this use case, among other considerations.


Liveleak very knowingly removes any content that will embarrass the US' military-industrial-pharmaceutical complex, also.

So they're not really a great example.


> I'm not convinced that a site known for hosting gory and violent videos is really what these human rights activists are looking for

You mean gory and violent videos like war crimes and human rights violations? Sounds like exactly the right site.


In the media, you're damned if you do and damned if you don't. If tech companies weren't removing violent content, there would be twice as many articles complaining about it. For any choice, whether it's a blanket policy or a content moderator's personal call, either action can be painted as "dystopian" or "violating human rights" or whatever else is the flavor of the day. Reading hundreds of one-sided hit pieces over the years has totally desensitized me, to the point that I start feeling extreme skepticism whenever anybody starts talking about human rights or dystopias. That's not a good thing, but I would say it's not all my fault, either.

Across prestigious universities, mini-think tanks that study "tech and society" have been popping up. As far as I can tell, they exist solely to contribute to this tidal wave of narrative. They look at what is happening, and make up a reason it's bad. I can't think of a single example where they concluded that tech did something good.


It's not only YouTube: I've noticed a huge disparity between Duck Duck Go results and Google results when writing queries that touch on controversial, politically, or racially charged topics. YouTube recommendations and search ranking suppress results as it is; even if they were to leave the content up, most of it will never be surfaced through organic means without a direct link or a search with an exact title match.


The number of times I've taken a google search that gave me useless results, copied-and-pasted it into duck-duck-go, and immediately found what I'm looking for, is astounding.

I do think that they are starting to lose the power-user userbase when it comes to browsers and search engines. But their email, docs/sheets/whatever, drive, and now their DNS registrar services are all cemented into business flow.

It makes sense. They don't need the power users - they need the most users. As long as they have the most users, businesses (where the power-users are working) are going to use ancillary services like Google MyBusiness, Google Analytics, and embrace google-backed (and ranking-factor) tech like SSL certificates and progressive web apps.


> The number of times I've taken a google search that gave me useless results, copied-and-pasted it into duck-duck-go, and immediately found what I'm looking for, is astounding.

I'll bet it's dwarfed by the number of times I get useful results from Google and total garbage from DDG.

Earlier today I searched for "CADT syndrome" to figure out what another HN comment meant by it. Both engines automatically correct that to "cast syndrome" if you search without quotes. But only Google provided a relevant result after being corrected "no, I really meant CADT syndrome". DDG gamely provided the "cast syndrome" results after correcting the query.


To be fair, two points. One, DDG isn't perfect in all areas, and has lots of work to do. Their budget is a tiny sliver of what the Google search team's is/was, and they are much newer. Some topics get better results than others, and that is to be expected.

Two, the others are referencing controversial opinion content in particular. They are saying that if the search is about something controversial or politically charged, the Google result is noticeably lacking. This seems more concerning to many of us because this isn't just a newer search engine not quite up to par; this is the search engine purposefully pushing users into filter bubbles of Google's choosing!

I hope I cleared up the difference between what you are referring to and what the others are talking about. Anecdotally, I can confirm as a heavy YouTube user (both web and youtube-dl/mps) that YouTube search is also affected. Many searches for things that aren't even that "out there" return no results, even with the exact name of the video. If you don't have the actual link from somewhere else, you can't find it. I think what Google is doing is dangerous and a very slippery slope. Many of my datahoarding friends have been on YouTube downloading sprees because they are afraid Google is about to memory-hole important videos that don't exist anywhere else that's easily accessible.

P.S. This is also the nice part of DDG's bangs. Don't like the DDG results? Just do it again with !gyear etc. (vs. the Google GUI way, and I don't even know if Google's daterange: operator even works anymore).


searching for "CADT" syndrome yields better results in both search engines.

As of now, the best resource (https://www.jwz.org/doc/cadt.html) is avaiable on both search engines as the first result for this query, and somewhere in the first ten or so results for "CADT Syndrome". The best set of results (for my Google history) is in Google's verbatim mode searching for "CADT syndrome" (the link under the search bar with the autocorrected version) which includes mailing lists and release notes from various opensource projects

interestingly there is also another meaning of CADT (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4610895/) which is less favored in google, but shows second place in ddg. Perhaps if we were into medicine papers this would show up higher in google.

Only ddg results include this (Hacker News) page and the comment you referenced (https://news.ycombinator.com/item?id=21337404)

Bottom line, both gave me useful results, ddg needs a better verbatim mode, google tracks me


You should use verbose mode


I think you mean "verbatim". It's silly for that to be necessary to get reasonable results, though - if verbatim mode works better then those should be the results returned in the first place.

I primarily use DDG, and in my experience when its results are bad, Google's usually are for that query too.


are quotes around the search term not the command for verbose mode?


Not anymore. Google has a verbatim mode under Tools > All results.


Yeah, lately I've been shocked by the search result quality as well; way too often I'm getting almost nothing but SEO spam results on Google. I was actually reading their 10-K a while back, and Google specifically highlighted spam/SEO results as a threat to their business. Most Google users have probably never even heard of Duck Duck Go, so I doubt they're at any real risk of losing market share.


I'm starting to notice more results from ddg. Google doesn't seem to be indexing most sites anymore. Not sure I understand why.


Also, power users don't click on ads pretending to be search results...


In fact, exact title searches will usually not turn up videos with that title unless you use quotes. YouTube will return clickbait and Jimmy Kimmel, entirely unrelated to the search, for exact title searches of videos with only a few tens of views.


Heaven forbid I ever accidentally click on or get interested in some random talk show video, video roughly about video games or a big media news video. All of my normal engineering, business, programming, guns, history, and mechanics recommendations go right the fuck out the window. For the next week, half my recommendations are just filled with dogshit because of one accidental video. I stopped watching movie trailers because of this too. For some reason if I ever choose to watch a movie trailer, youtube thinks I want to watch makeup tutorials by teenagers. Drives me insane.

YouTube is better off with meta tag recommendations instead of their fancy little AI.


I think that's a problem on every platform; on Apple News, if I read an article that's actual news that involves a celebrity somehow ("Tom Hanks murders 12, dies in hail of gunfire"), I get a deluge of celebrity gossip articles ("OMG you won't believe what Kylie Jenner's cousin's brother's hairstylist said") for weeks.

I sort of feel like their recommendation AI is a bunch of if-then statements designed to funnel you into more heavily-monetized content if they think you display any interest at all.


Oh there was a LegalEagle episode on Kim Kardashian studying to pass the bar. He goes over the history and process of taking the bar without going to law school. But the title of the video has her name.

Reality TV recommendations... Jersey shore... celebrity gossip...everywhere...it took 2 weeks to disappear. Strep throat is cured in less time than contracting celeb gossip videos.


Yeah these aren’t content recommendation engines, they’re ad engines


I somewhat do agree with the ad engine... but it's a crappy one.

I'm a guy in my 30s. Nothing in my view history ever says, "Hey, this guy is progressive enough to wear makeup." On top of that, I watch English, Polish, and French videos. The ads I get? Spanish. I don't know Spanish. I'd accept general language-learning ads. But Tostitos and, I think, one for health or house insurance... in Spanish. It's like the ad bot is some redneck that thinks there are two languages, American and not-American.

The recommendation bot and the ad bot are just terrible. The moment there's a halfway decent competitor to YouTube, my ass is jumping ship.


Book a honeymoon, constantly Google administrative marriage questions ... still get ads for engagement rings.


You never know. Maybe people who got married recently are more likely to get married again soon?


I'm curious what happens then if you ever have to plan a funeral. Which, yea I'm being morbid... but it would be pretty messed up to continue getting funeral related ads over time.


Do you have the “tune ads based on view history” (or whatever it’s called) turned off by chance?


Already checked that a while ago when I complained to a friend who asked the same. I still get plenty of ads for stuff like CNC machines, mechanic classes and other stuff. I'm okay with some random stuff that feels like the marketer was going out on a limb. But some are just... selling paintings to Stevie Wonder.


Try deleting it from your watch history: https://www.youtube.com/feed/history

Your recommendations may return to normal.


There is a minor chilling effect to this. I have chosen not to follow a link because of the mess I thought it would make of my YouTube/Amazon recommendations.


On Desktop, you can avoid this issue by watching the video in a private browsing session. It doesn't work on mobile though.


On Android I use NewPipe, I can turn off the recommended videos, and even comments, for a more blissful YouTube experience.

Although one person did complain Google closed his account because he used that app: https://news.ycombinator.com/item?id=21247759


Ditto on the meta tag!


Exact title matches don't match on IMDB or Amazon books, either. I don't think it's nefarious, it's just IMDB and Amazon think I don't know what I'm searching for.


For anything not technical, I use DDG or Bing (if I need recency). Google results omit anything politically controversial.


Can you give some examples of politically controversial results google might be censoring?


I have no idea what Google is censoring in the search results, but compare the "search suggestions" for almost anything even slightly controversial. For example type "is stephen miller" (a Trump advisor) in Google and Bing and you can see how Google just removes stuff.

I suspect they are partly worried about results being gamed there...but everything is gamed. Maybe the main search results should be empty too.


I opened an incognito window and tried it.

Google suggestions for "is stephen miller":

* is stephen miller married?

Bing suggestions for "is stephen miller":

* is stephen miller gay?

* is steven miles married?

Maybe there's location stuff going on, or maybe they're guessing at what you're searching for based on previous searches or something.


I am getting racially and politically charged content suggested on YouTube and have no interest in it. Maybe there is some more extreme content I could not find, but this stuff comes to me without me wanting it.


Sounds about right. They demonetize blancoliorio's in-depth commercial aircraft and pilot industry analysis as "sad content." Oh and donutoperator gets demonetized despite blurring almost everything and not swearing. Non-corporate content creators and random people have NO rights on corporate platforms like YT. The only solution is another platform that respects creators, has sane/straightforward policies and doesn't permit rampant scams and extortions.


YouTube denies creators the chance to flourish through favouritism. I don't think "rights" is the correct way to talk about it; it might even hamstring the dialogue. Discussing opportunity and flourishing is more precise.


> The only solution is another platform that respects creators, has sane/straightforward policies and doesn't permit rampant scams and extortions.

What is that alternate platform?

For many people, web video is YouTube.


You're speaking for other people when you can only speak for yourself, and you need to get out more. A not-yet-existent, co-op-owned platform where creators share in monetization and have straight-forward/transparent content policies and processes, not secret ones, and not shady algorithms that curate user filter-bubbles without telling them how they're being shown recommendations. YT isn't the only web address on the internet. Apps, integrations, and plugins can be replaced if people are willing to support quality and minimally-censored content that respects creators' ability, autonomy and sometimes livelihoods, rather than a power-law income distribution beholden to random countries' censorship and corporate advertisers' nearly arbitrary whims.


> You're speaking for other people when you can only speak for yourself, and you need to get out more.

Ask 100 random people on any city street where they'd find videos on the web and I am betting 0 of them will come up with any answer resembling yours.

> A not-yet-existent, co-op-owned platform where

Yes, so "not-yet-existent" answers the question. The alternative does not exist.

But you seem like you have an idea fleshed out, so why don't you build this alternate video platform?


>You're speaking for other people when you can only speak for yourself, and you need to get out more

Millions upon millions of people watching YT, but GP "needs to get out more" in order to experience your fantasy platform that doesn't exist. Ok.


You know the type, we've all worked with them before.


I didn't downvote you, I feel the same way.

But YT is where channels I watch such as Linus TechTips, BigClive, Cody's Channel, Chef John, Glen and Friends, Andrew Camarata et al publish their content to.

Realistically, how do we change this?


> The only solution is another platform that respects creators, has sane/straightforward policies and doesn't permit rampant scams and extortions.

This is how every platform starts out. Success is a curse that drags companies into politics and money games, causing them to succumb to censorship and other unintended policies.

The only solution is a platform that can prove that it has no ability to censor content. This requires advanced cryptography knowledge. Never trust the "word" of a company or a platform.


Is this pure populism, or does the principle apply to the "little guy" as well?

If I set up a machine under my desk at home with a domain (courtesy of dynamic DNS), and provide a few terabytes of free-of-charge storage to a small bunch of people, am I "erasing history" if I can't keep all their stuff forever?

Or do I have to be a Youtube-sized corporate entity before I'm a fucking asshole?


If you just say "Haha, suckers, you trusted me but I'm deleting your stuff, you lose", then yes, you're a colossal jerk. On the other hand, if you can't do it forever, and you give them warning so they have time to find alternate places to store their stuff, then I don't see any problem.


Imagine offering a family member a couple terabytes to host pictures and video of their kid growing up. If you deleted all of their data without notice, you would be a huge fucking asshole for erasing history. Proper notice goes a long way.


Depends what you say to them.

If you say "hey guys, feel free to use this storage to host stuff" and then delete it, then yes, you're an asshole.

If you say "hey guys, I have some hard drive space, send me stuff you think I might like and if I like it I'll host it for a while before deleting it" then no, you're not an asshole.

If you say something that sounds like the first statement but is actually the second statement because of some obfuscated legalese attached to a footnote, and then delete it, then you're back to being an asshole.


What if you promote this service with a name that starts with "You" and ends with a word referencing one of the most dominant media of public record for over a half century?


The only way that is a "public record" is if we consider the perspective that anything that was ever broadcast is still radiating in space.


Why would you trust a for-profit company to archive anything and care? They don't. It's unreasonable to expect anything different.

If you want things archived, non-profit solutions exist, or you should be doing it yourself and making sure the content is shared among people who are also able to archive it.


"To organize the world's information and to make it universally accessible"

Because they claimed they cared about this sort of thing. For a while, it seemed as if they meant it.


[flagged]


Parent is voted down but the cynicism, unfortunately, is well founded. Not to victim blame but there is a certain amount of distrust and suspicion we all should have with regards to any profit seeking entity - especially one that does not rely on you paying them - that makes outrage at these entities for ‘betraying us’ laughable at best.

Maybe “brain cells” is harsh but do at least use your instinct.


If you want things archived, non-profit solutions exist

You seem to be under the impression that non-profits don't ever go out of business, which is completely untrue.

I'm not saying that for-profits are the solution, but maybe something like a trusted .gov or network of trusted .edus would be better.


Maybe I should've said not-for-profit.


Not-for-profit outfits also have budget constraints, agendas, and bow to outside pressures (donors, politicians, lawsuits, etc.).


> They don't. It's unreasonable to expect anything different.

This is modern day cynicism. I fully understand and am unsurprised by their lack of good storage, but we shouldn't help companies lower the bar of their own existence to be purely profit driven entities.


I don't see why you would think they're anything but profit driven entities. That's literally what a corporation is. If they didn't make a profit, they wouldn't exist. It's not cynicism, it's reality.


> That's literally what a corporation is.

But it's not all a corporation is.


Stop relying on for-profit companies to host your videos for free. If you need reliable hosting try paying for it.


This is a digital content issue, not so much an issue of extremist content. Almost 10 years ago now, Geocities was shut down and took 38 million of the most popular pages of the Internet 1.0 with it.


Wasn’t there an attempt, neocities, to bring back the aesthetic?


The aesthetic is one thing, the content is another. The former is easy to replace, the latter irreplaceable.


I believe so, yeah, and there's an archive out there. But the data was only saved because the site itself was so massive and notice was given that the end was coming.


I think most of the reason we have history is due to physical media, such as print. Newspapers, books, diaries, and scrapbooks would live for decades, even centuries, so investigators could elucidate the past and bring it to light.

Internet media in its current form is subject to great purges, both intentional and accidental. No matter the corporate intention, there is a great danger of major events being deleted from social memory, and the engineering attempts currently in play will easily become best practice in government parlance.


This is an insightful comment. The audio engineer Steve Albini often comments that one reason he only records artists using analogue tape is that digital is easily lost, whether through obsolete and proprietary formats or deliberate governance. Analogue tape, although inconvenient, requires a physical action, possibly in many physical places, to be destroyed, and isn't subject to evolving digital formats.


Oh yeah! Now they're taking down videos showing the Chilean soldiers and Chilean police unleashing their rage and violence against their own civilians.


Sadly the flip side of "one man's terrorist is another man's freedom fighter" is "a freedom fighter is always someone's terrorist."


> "Ogrish was a very serious site, it wasn't like a lot of the gore sites you might see now that are based off that model. It was tremendously serious, everything was researched, there was no laughing at dead people or anything like that"

...Just a bit undermined by the screenshot of Ogrish video titles in the article... https://media.businessinsider.com/images/54117ca9ecad04406f2...


I have said this before and will say again:

Honestly, why do platforms need to delete anything? Why not just have options like most search engines sort of have: let the users decide if they want to see objectionable or even violent content. They are adults; they can make up their own minds.

Consider DuckDuckGo's search results, where you can choose:

  Strict
   *No Adult content*
  Moderate
   *No Explicit images or video*
  Off
   *Don't filter content*
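
A user-side setting like that is trivial to model. A minimal sketch (the levels mirror the list above; the numeric content rating is an assumption):

  # Hide-don't-delete: content above the user's chosen filter level is
  # simply not shown, rather than removed from the platform.
  from enum import IntEnum

  class FilterLevel(IntEnum):
      STRICT = 0    # no adult content
      MODERATE = 1  # no explicit images or video
      OFF = 2       # don't filter content

  def visible(content_rating, user_level):
      # content_rating is a hypothetical per-item rating on the same 0-2 scale.
      return content_rating <= user_level

  print(visible(FilterLevel.MODERATE, FilterLevel.STRICT))  # False: hidden
  print(visible(FilterLevel.MODERATE, FilterLevel.OFF))     # True: shown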


Go ahead and add Reddit into the mix as well. It's starting to be really concerning that only a few people control these important discussion and meeting places, and (often) abuse that position of power to push their agenda. History isn't pleasant; that doesn't mean Reddit and YouTube should be whitewashing it.


The sheer number of pro-Chinese-government moderators on Reddit is absolutely insane. So much dissent is being either erased or forced into "megathreads." This is not how we should be reacting to genocide.


Thanks for helping me think differently about megathreads. Political containment and pruning are the result.


Yup, another classic tactic is a moderator stepping in and doing some variation of "This thread has gone out of control! Locked." This has become very common in Hong Kong threads. I strongly believe that TenCent needs to be rooted out of U.S. tech and media, and fast.


I'm starting to come around to the idea of rejecting Chinese investment. They clearly have a negative influence. See Hollywood, political coverage, historical programming, video games, current affairs... If nothing is done, in a few decades they'll have manipulated the entire social political landscape of the West. All of us will be using the Social Credit system. Worse, we'll defend it, too.

Future is bleak if we do nothing.


Inappropriate content can be flagged, and continuing to view or partake in discussion of extremist things that are counter to the interests of every living thing on the planet waives the user's right to privacy, and logs of activity automatically get passed on to public protection agencies.

Or something better.

Keep refining and optimizing the system.


They still haven't understood that they're the product being sold, not the customers.


Very similar topic discussed about 6 months ago.

Tech Companies Are Deleting Evidence of War Crimes (theatlantic.com) https://news.ycombinator.com/item?id=19864994


Which brings us to a couple of good practices: never trust the cloud or any other online storage, and always back up sensitive information offline, multiple times and in multiple places, before putting it online.


YouTube could solve this problem by throwing money at it, i.e. investing much more heavily in human moderators. They are extremely unlikely to ever do so, as the market would punish them for it in the short term and Google is unlikely to prioritize human rights over revenue.

This presents an opportunity for another Internet Archive-like repo as mentioned in earlier comments. Unfortunately the barrier to entry (just getting people to know about it) would be super high.

The WITNESS movement associated with one of the authors is worth checking out: https://www.witness.org/


Isn't it well documented that the human moderators who do exist at places like facebook and youtube end up having their mental health screwed after they watch so much disturbing content?

Just throwing human moderators at the problem probably has more cost than just the salaries involved.


How does 4chan do it? Last I checked they have moderators (well "janitors"). They are volunteers, work for almost no recognition (still Anon), and probably do not have the mental health problems that regular people do.

There is also clearly an ability to follow direction and enforce appropriate rules (e.g. the janitors of /b/, /pol/, /g/ and /fit/ are clearly all operating on different sets of rules).

I am 100% in agreement that you cannot hire an army of temps off the street and tell them to moderate the internet, but it's clearly not something beyond human capability. You just have to be hiring in the right places.


I’m sorry, if we’re looking to 4chan for inspiration on how to do the internet properly we’ve already lost.


What makes you think 4chan volunteers don't have mental health problems "regular people do"?


4chan is extremely tiny compared to social media platforms.


I am well aware that human moderators are currently treated like crap by Big Tech, and that their mental health is suffering as a result. When I say "throw more money at the problem", I mean, hire a lot more of them, and treat them all better.

The alternatives are

1. Algorithms, which are always going to fail in human rights type "edge cases" like the ones described in the article, and which really shouldn't be seen as edge cases IMO, or

2. Bury your head in the sand and allow terrible content to exist.

Do you have any better ideas for how to deal with processing traumatic content at scale?


Some people are much better able than others to deal with such content. Do YouTube, Facebook, etc, implement some sort of psychological screening? I would think we know enough about PTSD, depression, etc to be able to filter out many of those most likely to suffer significant trauma, even if the screening was crazy imprecise. Plus, I assume we have many decades of experience with handling psychological issues among police and emergency personnel. Perhaps a large part of the issue is money; why bother with the expense if you can use ML, even if the false positive rate is abysmal.


> Perhaps a large part of the issue is money; why bother with the expense if you can use ML, even if the false positive rate is abysmal.

If I understand your position: you don't have any better ideas, and you're okay with using ML even if it means deleting evidence of human rights violations. Is that accurate?


Just thinking aloud. No position, as the article offers very little context and detail.

I'm not particularly sympathetic to the notion that YouTube should be obligated, either legally or ethically, to host content in perpetuity. OTOH, if rapid filtering results in such content being taken down so quickly that even archivists and reporters don't have a reasonable chance at identifying and saving it, then I do see that as a potential ethical (but not legal) dilemma, and an interesting technical challenge.

My initial thought upon reading the title was that this story--such as it is--was more a commentary on the unintended consequences of the censorship of terrorist-related propaganda videos. From that perspective, I don't think the distinction between ML and human filters is that important; the more interesting issue seems to be the scope and creep of censorship and how we can expect tech companies to draw equitable lines within modern political cultures that suffer from hyper-partisan political discourse. It's not hard to imagine a situation where simply permitting any atrocity videos will be seen as partisan--supporting one side or another by promoting particular narratives.


Bigger problem with human moderators is that the companies often keep them at arms length.

i.e. they will typically use contractors or contract houses for moderation work. Why? Because it allows for opportunities to minimize the cost of counseling (If you're doing serious moderation, they must provide mental health support services. But fixed-term contracts make this easier to contain/control, at the cost of the employee.)


As far as I'm aware, knowingly destroying evidence of a crime is illegal. So would this be illegal also?


Do they actually delete the files? Or do they just remove public access to them, but keep them on their servers?

Maybe they send the data to law enforcement automatically? (I've heard that they do that for child pornography; maybe they do, or could do, the same for terrorist material as well.)


This.

It's naive to think these videos are deleted. Homeland Security shows up and YouTube has no problems giving them the videos. (They probably have an entirely separate search interface that does nothing but surface that extremist content for the Feds. So Homeland Security probably doesn't even have to leave their office.)


Only if Google has a branch in Syria that is destroying the evidence for a legal action in Syria. Laws are linked to territories; US law is not applicable in Syria, and vice versa.


> Laws are linked to territories, US law is not applicable in Syria and the other way around.

Doesn’t stop countries from trying. The CLOUD Act in the US is the US trying to extend its influence into other countries.


Are these videos newsworthy? If so, why isn't NYT hosting them?


People should go back to understanding the difference between a public service (i.e. operated by the people for the sake of the people) and a private company (i.e. operated by some people for their sake).


They don't have to delete it, just stop publishing it for consumption by their users. I would expect them to turn over evidence of crimes to appropriate law enforcement.


This is sad, I'm waiting for YouTube and others to f*ck up even more to move masses to decentralized solutions.


This is a volume problem that has no real solution. On one hand, it's a shame that this kind of content is getting removed, but on the other, it's also untenable for YouTube to hire thousands upon thousands of video moderators. It's a sad reality that if anyone uploaded content akin to Shoah[1] -- the 9-hour-long Holocaust documentary; a personal favorite and an absolute masterpiece -- it would get promptly removed. But it's a reality nonetheless.

YouTube is trying to straddle the line between platform and publisher and it feels like the hammer will soon drop.

[1] https://en.wikipedia.org/wiki/Shoah_(film)


>This is a volume problem that has no real solution. [...] it's a shame that this kind of content is getting removed

It's not a volume problem though, because the videos are never removed. They are just effectively delisted. Homeland Security shows up and I guarantee YouTube hands the videos over to them. So YouTube has plenty of space.

Of course that also means that the original complaint of YouTube destroying evidence is also bogus, but answering a bogus complaint with a bogus excuse as the pro YouTube crowd seems to be doing doesn't really get us anywhere.


Meanwhile, the NYT itself just shadow-edited the article about HRC calling Tulsi a Russian asset, changing it to a Republican asset.

Video from Tim Pool about it: https://youtu.be/tzTod38VS8c



“Paper of making up the record”, as they say.


For those who don't want to sit through 16 minutes of ranting...

At [1] the quote is:

Hillary Clinton waded into the Democratic primary on Friday by suggesting that Russia was “grooming” Representative Tulsi Gabbard of Hawaii as a third-party candidate for president.

By the next day at [2] it had changed to:

Hillary Clinton waded into the Democratic primary on Friday by suggesting that Russia was backing Representative Tulsi Gabbard of Hawaii for president and that Republicans were “grooming” her as a third-party candidate.

The NY Times published a correction:

Correction: Oct. 23, 2019 An earlier version of this article described incorrectly an element of Hillary Clinton's recent comments about Representative Tulsi Gabbard. While Mrs. Clinton said that a Democratic presidential candidate was "the favorite of the Russians," and an aide later confirmed the reference was to Ms. Gabbard, Mrs. Clinton's remark about the “grooming” of a third-party candidate in the 2020 race was in response to a question about the Republicans’ strategy, not about Russian intervention.

This matches other reporting of the same interview. Politico reported it like this [3]:

Clinton, the 2016 Democratic nominee, made the comments on the podcast hosted by David Plouffe, who managed former President Barack Obama’s 2008 campaign. Clinton also said that Republicans could boost Gabbard to become a third-party “spoiler”

Conclusion: The NY Times did the correct thing in updating their reporting, and in publishing the correction. The reporting is now more accurate that it was initially.

[1] https://archive.is/0G9uI

[2] https://archive.is/ZPZCU

[3] https://www.politico.com/news/2019/10/18/hillary-clinton-tul...


> Conclusion: The NY Times did the correct thing in updating their reporting, and in publishing the correction.

No, the correct thing would have been to not publish fake news in the first place. Oh, but they said whoops, so I guess all's well that ends well?


Given that the OP is arguing that the original version was correct (!!) because they didn't see the distinction either, I'm unsure that calling it "fake news" is useful.

But yeah, people do like easy answers rather than digging into complexity.


Edit: original title was about YouTube "erasing history"

They're erasing a good chunk of the present as well. It is widely known and observable that they algorithmically throttle and aggressively demonetize anything even remotely conservative in the US. E.g. Tim Pool (a liberal who nevertheless likes to point out the lunacy and hypocrisy of the leftist "free press") often gets demonetized under the flimsiest of pretenses.



