YouTube, the company, will be fine. They'll just ban any content that triggers their enforcement systems. I'm worried about YouTube, the platform, and the bastion of free exchange of information and ideas that it has been until now. A response-type video or movie review will almost certainly be false-flagged by a system like this.
Of course, I agree with you that independent platforms are going to have a rough time. They either have to manually review all content themselves, not allow any user-uploaded content, or pay a company to do this for them.
> When defining best practices, special account shall be taken of fundamental rights, the use of exceptions and limitations as well as ensuring that the burden on SMEs remain appropriate and that automated blocking of content is avoided.
Well, of course the law applies to them too. But what happens when they don't comply? Laws are enforced by incentive structures, such as punishment. Who gets punished or otherwise incentivized to make PeerTube comply with the law, and what effect will that have on how PeerTube functions?
There are three main ways to do P2P content distribution. The first is like BitTorrent, where you only host content you yourself have requested. This has an obvious privacy problem: anyone can tell what you read or view based on what you host.
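To make the leak concrete, here's a toy sketch. The "tracker" is just an in-memory dict rather than a real announce protocol, and all the names are invented, but the point carries over: the peer list for any known infohash is effectively public.

```python
# Toy model of the BitTorrent-style privacy leak. The tracker here is an
# in-memory dict, not a real announce protocol, but the peer list for any
# known infohash is effectively public in the same way.
tracker: dict[str, set[str]] = {}

def announce(infohash: str, peer: str) -> set[str]:
    """Join the swarm for `infohash` and receive the current peer list."""
    tracker.setdefault(infohash, set()).add(peer)
    return tracker[infohash]

# Alice hosts what she watches, so hosting reveals her viewing habits.
announce("infohash_of_sensitive_documentary", "alice.example:6881")

# Any observer can enumerate who is in the swarm for a given title.
print(announce("infohash_of_sensitive_documentary", "observer.example:6881"))
# -> {'alice.example:6881', 'observer.example:6881'}
```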
The second is that you distribute content to random hosts who don't even know what it is (so it can't be associated with whoever downloads it). This solves the privacy problem and performs adequately, but it only works if your laws don't impose liability on people who aren't knowingly hosting anything illegal. Otherwise the government can prosecute a couple of random innocent people and put enough fear into everyone else that they move back to Facebook.
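A minimal sketch of that model, assuming content-addressed storage (the XOR keystream below is a stand-in for real encryption, not anything secure): hosts store opaque ciphertext under its hash, and only someone holding both the locator and the key can recover the plaintext.

```python
# Blind-hosting sketch: chunks are encrypted, then stored under their
# content hash at randomly chosen hosts. A host holds only opaque bytes;
# recovering the plaintext requires both the locator and the key.
import hashlib, os, random

hosts = [dict() for _ in range(8)]  # eight storage nodes: locator -> bytes

def _keystream(key: bytes, n: int) -> bytes:
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def store(plaintext: bytes) -> tuple[str, bytes]:
    key = os.urandom(32)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))
    locator = hashlib.sha256(ct).hexdigest()
    random.choice(hosts)[locator] = ct   # the host never sees key or title
    return locator, key

def fetch(locator: str, key: bytes) -> bytes:
    for host in hosts:                   # a real network routes by locator
        if locator in host:
            ct = host[locator]
            return bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct))))
    raise KeyError(locator)

loc, key = store(b"to its host, this is just an unremarkable blob")
assert fetch(loc, key) == b"to its host, this is just an unremarkable blob"
```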
The third is onion routing. Then it's hard to shut down specific hosts (you don't know who they are), but it's slow, and if your laws are sufficiently bad it can be made illegal to use at all, even if you aren't doing anything wrong. At that point you go down the road of Tor Project vs. the Great Firewall, which is a disgraceful way to have to operate your communities in a democracy. And for every bug, an innocent person goes to prison.
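For the curious, here is roughly what the layering looks like, using the third-party `cryptography` package's Fernet. The shared-key `relays` dict and the JSON framing are invented for illustration (real onion routing does per-hop key exchange); the point is that each relay peels exactly one layer and learns only the next hop.

```python
# Onion-layering sketch (pip install cryptography). Each relay decrypts
# its own layer and learns only where to send the rest; no single relay
# ever sees both the sender and the final destination.
import json
from cryptography.fernet import Fernet

relays = {name: Fernet(Fernet.generate_key()) for name in ("r1", "r2", "r3")}

def build_onion(path: list[str], message: str) -> bytes:
    blob, next_hop = message.encode(), None  # innermost layer: no next hop
    for name in reversed(path):
        wrapped = json.dumps({"next": next_hop, "payload": blob.decode()})
        blob = relays[name].encrypt(wrapped.encode())
        next_hop = name
    return blob

def relay_step(name: str, blob: bytes):
    layer = json.loads(relays[name].decrypt(blob))
    return layer["next"], layer["payload"].encode()

hop, blob = "r1", build_onion(["r1", "r2", "r3"], "hello")
while hop is not None:
    hop, blob = relay_step(hop, blob)
print(blob.decode())  # -> hello; r1 never learns that r3 was the exit
```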
It's so depressing that we have to have serious discussions about technical countermeasures against our oppressive EU regime. Just because of some old, ignorant, evil assholes. Time to leave the EU, I suppose.
If the solution were technical, we would need a combination of 2 and 3: distributed hidden services. Is this even possible?
The problem with technical solutions is basically Child Abuse Images. I am a big believer in freedom and privacy. I am also a big believer in protecting children. Many people understandably prefer protecting children over concepts that seem abstract to them, like freedom. Any technical solution needs a method to remove certain content - and as soon as such a method exists, people will want to abuse it for political reasons.
The solution has to be political, not technical - somehow we need a political situation where basic freedoms are respected. That can only come from revisions to countries' constitutions; simple laws protecting freedom are too easy to overturn. And we can't carry on resisting re-heated versions of the same stupid law every two years.
> If the solution were technical, we would need a combination of 2 and 3: distributed hidden services. Is this even possible?
It is possible. Freenet has been doing distributed, deniable storage since around 2000, and Tor's onion services show that the routing half works.
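Here's a sketch of the combination, loosely in the spirit of Freenet-style key-based routing (toy only: a fully meshed network, hypothetical node names, no real crypto or transport). A request hops toward whichever node is "closest" to the content hash, and a node cannot tell whether its upstream peer originated the request or is merely forwarding it for someone else.

```python
# Key-based routing sketch: requests converge on the node closest to the
# content hash; every hop sees only its neighbor, so requester and
# forwarder are indistinguishable.
import hashlib

def dist(name: str, key: str) -> int:
    h = lambda s: int(hashlib.sha256(s.encode()).hexdigest(), 16)
    return h(name) ^ h(key)

class Node:
    def __init__(self, name: str):
        self.name, self.store, self.peers = name, {}, []

    def request(self, key: str, ttl: int = 8):
        if key in self.store:
            return self.store[key]               # serve from local store
        if ttl == 0:
            return None
        nxt = min(self.peers, key=lambda p: dist(p.name, key))
        return nxt.request(key, ttl - 1)         # nxt only ever sees us

nodes = [Node(f"n{i}") for i in range(6)]
for n in nodes:
    n.peers = [p for p in nodes if p is not n]   # toy: full mesh

chunk = b"opaque encrypted chunk"
key = hashlib.sha256(chunk).hexdigest()
min(nodes, key=lambda n: dist(n.name, key)).store[key] = chunk
print(nodes[0].request(key))                     # -> b'opaque encrypted chunk'
```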
> The problem with technical solutions is basically Child Abuse Images.
This is a fake reason, used only as a justification for censorship technologies. In practice it's better to allow distribution to happen in the open, so that new images can be discovered sooner and traced back to the perpetrators who created them. If you shut down a distribution network every time you find one, then the only networks that exist are the ones you don't know about, which means you have no source of evidence for building cases against unknown, active pedophiles.
The FBI ran a successful campaign where they quietly seized a distribution server and kept operating it so they could collect evidence and build cases against its users. Naturally, the headlines read "FBI distributes child pornography" rather than "FBI arrests many child pornographers," as though preventing distribution were somehow more important than arresting the creators and rescuing the children.
That's the EU, after all. They will mass-punish everybody, then try to ban the traffic.
> “Thousands of Swedes have received threatening letters from law firms which accuse them of illegal downloading. They are asked to pay a sum of money, ranging from a couple of thousand Swedish Kronors up to several thousand, to avoid being brought to justice,” Bahnhof Communicator Carolina Lindahl notes.
> “During 2018 the extortion business has increased dramatically. The numbers have already exceeded last year’s figures even though four months still remain.”
> This year to date, 49 separate court cases have been filed requesting ISPs to disclose the personal details of the account holders behind 35,711 IP-addresses. As the chart below shows, that’s already more than the two previous years combined.
> Also, the number of targeted people exceeds that of all US and Canadian file-sharing cases in 2018, which is quite extraordinary.
No, that's not true. If the law mandated running a content-filtering system that costs $10M/year (on top of any existing systems), it would be worse for YouTube, but they would survive. It would absolutely kill smaller providers dead.
Unless, by participating in a free, decentralized, federated platform, you expose yourself to legal liability for any content present on the network or passing through your node. In that case you probably can't afford it.
> A response-type video or movie review will almost certainly be false-flagged by a system like this.
To be fair, that hypothetical problem is caused by a broken classifier, not by the law. After all, YouTube already blocks and removes content like you've described, yet no one accuses the classifier of stifling free speech.
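A toy example of why such classifiers false-flag quotation, assuming (my assumption, not anything documented about YouTube's system) a naive matcher that blocks any upload sharing a long-enough run of frames with a reference fingerprint: a 30-frame quote inside an otherwise original review trips the rule exactly like a wholesale re-upload would.

```python
# Toy overlap matcher: "frames" are just tokens, and the 25-frame block
# threshold is invented for this example. A short quoted clip inside an
# original review trips the same rule as a full re-upload.

def longest_shared_run(upload: list[str], reference: list[str]) -> int:
    best = 0
    for i in range(len(upload)):
        for j in range(len(reference)):
            k = 0
            while (i + k < len(upload) and j + k < len(reference)
                   and upload[i + k] == reference[j + k]):
                k += 1
            best = max(best, k)
    return best

movie = [f"frame{i}" for i in range(200)]
review = ["host_talking"] * 50 + movie[40:70] + ["host_talking"] * 50

THRESHOLD = 25  # made-up fixed threshold
print(longest_shared_run(review, movie) >= THRESHOLD)  # -> True: blocked
```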