OnionShare makes it easy to publish anonymous, uncensorable websites (micahflee.com)
620 points by input_sh on Oct 15, 2019 | 231 comments



I love Tor, but it's almost suspicious how fast it's become; I'm wondering who's paying for all this bandwidth. At home, at businesses, in hotels, you name it, Tor Browser almost never takes more than 2x as long to load anything on the web as the raw connection does. I admit I don't have an understanding of how Tor works mathematically. There are organizations funding Tor exits and relays (torservers.net, Emerald Onion, the Tor Project itself), but has anyone added up the bandwidth they provide and compared it to the total?


The Tor Project tells relay operators to set a value called MyFamily to declare which relays are run by the same person or group, so that Tor users won't use more than one relay with the same operator in the same path.

https://2019.www.torproject.org/docs/tor-manual.html.en#MyFa...
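As a sketch, an operator of two relays would put the same line in both relays' torrc (the fingerprints below are placeholders, not real relays):

```
# In torrc on BOTH relays: list every relay this operator runs,
# by fingerprint, so clients never pick two of them in one circuit.
MyFamily $FINGERPRINT_OF_RELAY_A,$FINGERPRINT_OF_RELAY_B
```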

Of course, a malicious relay operator that wanted to increase its chances of being used as both the entry and exit nodes in a single path (and thereby easily being able to correlate traffic between its origin and destination) could add a lot of nodes and not own up to their relationship.
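The payoff of that strategy scales roughly with the square of the adversary's share of the network. A toy Monte Carlo sketch (deliberately simplified: bandwidth-weighted selection is reduced to a single probability per hop, and positions are chosen independently, which real Tor path selection does not quite do):

```python
import random

def compromise_probability(total_bw, malicious_bw, trials=100_000, seed=0):
    """Estimate how often a bandwidth-weighted 3-hop path picks a
    malicious relay as BOTH entry and exit (toy model)."""
    rng = random.Random(seed)
    p_bad = malicious_bw / total_bw
    hits = 0
    for _ in range(trials):
        entry_bad = rng.random() < p_bad   # adversary owns the entry?
        exit_bad = rng.random() < p_bad    # adversary owns the exit?
        if entry_bad and exit_bad:
            hits += 1
    return hits / trials

# With 10% of network bandwidth, roughly 1% of circuits are
# end-to-end correlatable in this toy model (c * c).
print(compromise_probability(1000, 100))
```

This is why adding undeclared capacity matters: each extra unit of bandwidth raises both factors of the product at once.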

Some people in the Tor community try to watch for relay-creation behavior that they consider suspicious. A common example is a large number of new relays that appear within a short period of time with similar characteristics and don't declare common ownership.

Edit: one of the main tools for this is OrNetRadar, which seems to primarily use the autonomous system number in which the relays are located, as well as the timing of their creation: https://nusenu.github.io/OrNetRadar/

In this case, the relays can be given a flag like BadExit by the Tor developers, which will stop any Tor client from selecting those relays as exit nodes in a path. However, a sufficiently malicious attacker could add a lot of network capacity in a way that isn't recognizably associated as belonging to the same entity, for example by adding nodes in different data centers, with different speeds, with different software environments, and not all coming online at the same moment. In that case, there wouldn't be a way to easily infer that these nodes are associated with the same operator.

As someone has said elsewhere in this thread, there's still the hope that if different organizations add network capacity with malicious intent, they tend to undermine one another's chances of succeeding, at least as long as they aren't directly colluding (because the basic security goal in Tor is that clients choose paths whose constituent relays don't share information with one another).


It seems like I can query BadExit status (which of course makes sense, as the client wants to avoid bad exits), so that makes it a bit useless as a protection mechanism against determined malicious agents operating at scale. In fact, it makes it quite easy to set up experiments to find the best ways to circumvent it.

Experiment A: fire up a bunch of nodes one way and see how many are marked as BadExit. Experiment B: vary one variable in how you fire them up (time delay, node server OS, etc.), measure, repeat.


True. In fact, many of the discussions within the Tor community related to detection of colluding nodes even happen in public, so you could observe them on mailing lists and try not to repeat your mistake! This mechanism is very fragile.

By contrast, being marked as a BadExit due to tampering with content can be due to tests whose exact nature isn't disclosed and changes over time, and it doesn't happen instantly, so it might be hard for an individual deliberately malicious exit to deduce which action it took that resulted in the BadExit flag.

Determining whether nodes are secretly colluding (or, equivalently for some purposes, whether their communications can be closely observed by the same adversary!) is a mostly unsolvable problem, and that's an important limitation for Tor's security. There have been some papers that do a statistical analysis about the probability of an individual adversary winning against an individual user, given some assumptions about that adversary's capabilities (what fraction of nodes the adversary controls or observes). Tor has changed its path selection algorithms a bit based on ideas from these papers.


Do Tor circuits always use 3 nodes? It seems like it should be randomized/configurable (maybe 3 or 4 by default), such that no node can know whether it's an entry node or somewhere in the middle.


They do always use three nodes.

https://2019.www.torproject.org/docs/faq.html.en#ChoosePathL...

If you're using public Tor nodes, as opposed to bridges, it's not really feasible to prevent nodes from knowing whether they're being used as middle nodes (they can just check whether the previous node corresponds to another node in the public directory consensus).
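The check described above is cheap enough to sketch in a few lines (relay IPs here are documentation-range placeholders, and a real relay would match by identity key, not IP):

```python
# Toy version of the check a relay can run: is the previous hop
# itself a published relay? If yes, this relay is a middle or
# later hop; if not, the previous hop is likely a client.
consensus_ips = {"198.51.100.7", "203.0.113.9"}  # hypothetical consensus

def position_hint(prev_hop_ip: str) -> str:
    return "middle-or-later" if prev_hop_ip in consensus_ips else "entry"

print(position_hint("198.51.100.7"))  # middle-or-later
print(position_hint("192.0.2.55"))    # entry
```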


I'm operating three 10Gb Tor exits in a DC in Amsterdam. [1] It seems like a meaningful contribution. I assume the other operators who are not IAs have similar reasons.

[1] https://metrics.torproject.org/rs.html#search/family:38A42B8...


What datacenter do you use? How much does it cost? Did you run into any legal trouble?


Why would you get in legal trouble for hosting Tor exit nodes? There's nothing illegal about that, is there?


The exit node could be the last known trace of illegal activity so someone could perhaps take a shot at you for aiding & abetting or similar. Right or wrong, can you afford that fight?

This is one of those cases where the question is often not whether your actions are legal, but whether you can afford the time and resources required (and, in some cases, the reputational risk) to successfully defend yourself against someone (commercial, law enforcement, political, ...) who says they aren't and tries to make you stop.


Hosting an exit node is not illegal, but some of the users of that exit node may be doing illegal things.


Tell that to the SWAT...after the smoke grenade clears out.


Did you miss the part where it wasn't the US?


Every country has its own version of SWAT; some will make US SWAT seem like angels. Also, I replied to "there's nothing illegal in running a Tor exit."

In short, my reply was: maybe it's not illegal, but you might suffer quite a bit by the time it's all cleared up.


Are smoke grenades a legit thing outside of counter strike?



Personally or on behalf of a company? I ran two in the US and folded due to legal pressure.


If you don't want legal trouble, don't run exit nodes. I've configured my relay to serve as a middle node. I've been using a provider that's known to be hostile towards Tor relay operators, with no repercussions, for about two years now.


Personally


Good job! And thanks for running it. How much is that NForce server per month? I'm looking to move my nodes away from Hetzner.


Thank you.


I run 3 nodes. They have a 1TB traffic cap per month, so they serve 30GB per day per node and then hibernate. It costs $15 per month to run. I also use those nodes for personal use, so it's not that expensive.

Also, I've convinced my company to run nodes with their VPS credit: 6 nodes, 30GB per day per node. It costs them $30.
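A daily cap like that maps onto Tor's hibernation settings. A sketch of the relevant torrc lines, using the numbers above:

```
# Stop relaying after ~30 GB in a day; Tor hibernates until the
# next accounting period begins.
AccountingStart day 00:00
AccountingMax 30 GBytes
```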


DigitalOcean?


The Tor Project asks relay operators not to host on them, so no.


Why is that?


A Tor exit node is nearly guaranteed to attract a flood of DMCA notices (relevant to DigitalOcean, since they're in the US), subpoenas, legal threats, national security letters, regular search warrants, and the rare but not unheard-of no-knock bust-down-the-doors search warrant.


I've seen what I remember to be credible information, possibly leaked via something like WikiLeaks, saying that there was a tremendous number of US-government-owned Tor nodes, to the point that your traffic likely wasn't actually anonymous. I wish I could remember where I saw it.


You can see at least the first half of any Tor circuit, though, and more often than not I get mostly nodes with IPs from European countries. Are those US-government-owned nodes that just happen to be in Europe?


Physical location is not really relevant to who owns a node.

Most countries allow anyone to set up a company and operate servers. Any government organization with a small amount of resources can set up a front company and turn on a server.


That would be insanely easy for the US government to set up.


I also wish you could remember, as my memory is something along the lines of: they ran a few nodes on EC2 for a few weeks.

I'll try to find a source when I'm off mobile.


That doesn't matter unless they operated all nodes in the routing chain.


Operating the entry and exit on a circuit is almost as bad as operating the entry, middle, and exit.


Don't all those crazy FBI/CMU/(NSA..?) correlation attacks require you to control some majority percentage of the relays and/or exit nodes?

It would definitely be interesting to come up with some estimates of how much was being spent on aggregate bandwidth, say, 7 years ago, compare that with what is being spent at present, and see if it matches what would be expected from 'normal user growth', although normal user growth for a relatively niche project like this would be pretty hard to quantify.


I wonder if there aren't multiple US agencies competing for a majority share, unbeknownst to each other, all of them limiting the abilities of the others, and all of them making it a better platform for those wishing to remain anonymous.


They're most likely working together.


In modern times I bet you are right, but historically information-sharing wasn't common across the IC

(based on my understanding of post-9/11 reality)


More likely US agencies are competing with Russian and Chinese agencies.


I've run exit nodes at home and in DCs for the past 10 years.

https://metrics.torproject.org/rs.html#details/786926E8C497A...


Well, the NSA is funded by taxes, so ... the US taxpayer pays for it.


Home internet speeds have improved dramatically in the last decade, especially outside of the US. Most of the EU has gigabit available, and in many countries it's crazy cheap.


That's a great initiative. I was hoping IPFS would be the de facto way to publish uncensorable websites, but I have grown more and more disappointed in it, as it fails a lot for me in its most basic purpose (fetching files). Hopefully one of the other competitors will do better, but IPFS got a bunch of theoretical stuff right, in my opinion (mainly immutability of objects).


I've worked with IPFS and I completely agree with you here. There are so many basic bugs that haven't been fixed because of the over-abstraction of libp2p.

Currently, it connects to bootstrap nodes and tries to discover peers asynchronously. Because it doesn't finish handshaking with the bootstrap nodes in time to start discovering peers, it waits another 1-5 minutes (depending on the setting) before trying to discover again.

Startup usually takes a few minutes due to this trivial problem and an issue has been open for it since May: https://github.com/ipfs/go-ipfs/issues/5953


IPFS person here - there are a ton of improvements to the libp2p DHT in the works (many of which also relate to bootstrapping efficiency), but a key component of shipping these to an active 100K+ node network like IPFS is testing and validation. It's taking us longer to ship these improvements while we build a testground able to simulate network conditions, so we can increase network _reliability_ as well as performance. IPFS is still actively WIP and directly focused on improving performance, reliability, and usability - so feedback and collaborators are welcome to help us address these issues head-on. =]


I like where IPFS is going and I think it's an important bit of software, so it's good to hear you're working on it.

I do have a question, though, about the immutability: how do you address problems where people contribute harmful content, such as revenge porn or doxing? If the content is immutable, does that mean these things would always exist and never be removable? Is there any moderation that can stop this kind of stuff?

Just wondering, I still like the idea, it's just a question I have been thinking about.


What IPFS means by 'immutability' is that the mapping from id-->content will never change. That doesn't mean that the content will always be available on the network and accessible to any viewer. If you remove the content from the network (aka all nodes stop hosting the content), then no one will be able to find and access it.

For example, if bad person A adds something illegal to a node they host on IPFS, then person B can legally compel them to stop providing that content (ex by suing them). While the id-->content mapping would still be immutable, the content can no longer be found/viewed because bad person A is no longer hosting it. There is no "automatic persistence" in the IPFS network, so only nodes that choose to host a piece of content will become providers (aka no automatic process will keep bad content around if all hosts have taken it down).
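The addressing/availability split can be shown in a few lines. A toy sketch (a plain sha256 hex digest stands in for a real IPFS multihash CID, and a dict stands in for a node's block store):

```python
import hashlib

def cid(content: bytes) -> str:
    # Toy content id: same bytes always hash to the same id.
    return hashlib.sha256(content).hexdigest()

store = {}  # one node's local block store

data = b"hello ipfs"
store[cid(data)] = data

# The mapping is immutable: same bytes, same id, forever.
assert cid(b"hello ipfs") == cid(data)

# But availability is separate from addressing: once every host
# drops the block, the id still "points" at the content, yet no
# lookup can retrieve it.
del store[cid(data)]
print(store.get(cid(data)))  # None: id still valid, content gone
```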


This is actually where I feel there's only trivial purpose to IPFS. Any content can still be trivially taken down simply by going after whoever is hosting it. Yes, there are legal (and moral) benefits here, but it no longer represents a technology to free information that is restricted for personal or political reasons.


Immutability on ipfs just means a given address will always point to the same content. It doesn't mean that content is or will always be available. Nodes who host harmful content are responsible for hosting it whether it's on a peer-to-peer network or a server/client network.


Unfortunately if you want an uncensorable network you won't be able to censor the nasty stuff either.


This is great to hear, I'd be extremely happy to see IPFS become fast and reliable, and I think it could be a hugely important piece of software.


Absolutely. In fact, I've been following those developments very closely. I hope to see (and possibly contribute to) some significant improvements to IPFS, especially the DHT.


That's awesome! See active work in progress on DHT bootstrapping here: https://github.com/libp2p/go-libp2p-kad-dht/issues/387


I've been working with Dat, which I think is similar to IPFS, and really enjoying it. Being able to modify your site after you publish it is hugely useful (previous versions are always accessible; edits are stored in an append-only log). It has its own problems (editing from multiple computers is risky, but that's being worked on), but in my opinion that's essential for a decentralized web


Been working with libp2p, IPLD, etc. (IPFS stuff). It's improved a lot. Still rough around the edges, but definitely improved.

Fundamentally, the specs are good. These guys might not have the most mature development practices, but it's a completely different category than Tor. The projects are broken down in a very [theoretically] elegant way, and one project that truly exemplifies this is IPLD: https://github.com/ipld/specs

IPLD (among other protocol labs projects) is applicable to so much more than just IPFS!


From what I can tell, one of the main goals of IPFS is decentralization. Anyone can be hosting the file you want and it will download from any of the peers. Does OnionShare do this?

As you mentioned, I've also found IPFS to be fragile, particularly in that it is unlikely for stuff to stay up reliably, as ironic as that is. If I'm trying to host a website with not too much traffic (nobody else replicating), I have to keep my computer online or it will become inaccessible.


There's an annoying misconception that IPFS is "free storage" - it's not. If you care about something sticking around on the network, you need to take action to ensure that happens: either as an individual pinning it to a persistent node you run, as a developer paying a distributed pinning service to host your dapp data, or as a community creating a social agreement to all peer their nodes to replicate each other's data. All these solutions are doable today, but you shouldn't expect others to host your content for free (if it isn't popular enough for other people to be proactively caching it). IPFS is the distributed network for addressing files, you have to bring, buy, or share your own storage - anything else (ex forced hosting of content you don't choose) would have problematic liabilities!
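For an individual node, "taking action" is just pinning. Illustrative commands (the CID is a placeholder; pinned content is excluded from the node's garbage collection):

```
ipfs pin add <cid>             # fetch the content and keep it locally
ipfs pin ls --type=recursive   # list what this node has promised to keep
```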


Isn't that the point of Filecoin? A payment mechanism to incentivize node operators to host content.


> Does OnionShare do this?

IIUC, it's onion-routing request traffic to your computer, but your computer is still acting as the sole server of the website. Turning off your machine gives incoming requests nowhere to resolve to.


In practice that is the case for IPFS too, most of the time.


> it is unlikely for stuff to stay up reliably, as ironic as that is

Well, it's kinda inevitable with these systems, since there's limited available space. Freenet lets you "insert" data into the network (copy to other nodes) and disconnect your computer, but if nobody requests that data, it'll also eventually disappear from every node.


Does IPFS even try to provide anonymity? Even if it would be technically "censorship-resistant", what good is it when everyone can see who hosts the content, track them down, and subpoena/arrest them?


No, anonymity isn't a goal of IPFS. If you add anonymity (for both consumers and publishers) to IPFS, you get the design of its older sibling: FreeNet.


Why does it have trouble fetching files?


I'm not sure why, but if you have a node that has been running for a while and you add some files to it, some gateways will find them right away and others will keep timing out for many minutes. Discovery seems to be very hit-or-miss, and it's the most important component of IPFS...


Maybe OP is talking about IPNS? Which is unfortunately quite slow... I had to fuss with my TXT entries in DNS a bit to speed things up to an acceptable level (and reference the site via IPFS instead of IPNS)


Unfortunately, even normal fetching is iffy. Content I add on one node isn't discovered by other nodes quickly.


Yeah that's IPNS as the GP suspected. It's definitely the hardest part of working on top of the IPFS ecosystem. Direct IPFS lookups are much, much better.

On the OpenBazaar project we're experimenting with a centralized IPNS record server that is used to provide a fast response while a real IPNS lookup is done in the background to ensure we're not censoring.

Because records are signed the only thing we could do is censor newer IPNS records and serve old ones, but then nodes just revert back to the slower-to-update records like they had before.


I'm not talking about IPNS, I'm talking about normal IPFS lookups. Hopefully this will improve (maybe has improved, I haven't tested lately), but it used to be bad, to the point where some gateways wouldn't discover content that already existed on other gateways.


IPFS has built-in denylists so it isn't very uncensorable.


Not sure where you got that info, but AFAIK that is incorrect. While everyone running their own infra has the responsibility to comply with local laws (or face local consequences) - there is no central network allow/denylist. Every single node has the power to decide what content they pin and make accessible to other nodes in the network, or how to consume content coming from other nodes (ex they might choose to filter out certain types of content). There is no central ipfs authority involved in any of that. If an individual node wanted to opt-in to a collaborative denylist (ex to block spam, etc), this is the most recent proposal for how that might work: https://github.com/ianjdarrow/content-decider (still in design phase)


https://github.com/ipfs/faq/issues/36#issuecomment-140567411

My understanding is that it is not possible for Tor nodes to block (or even know about) what you are accessing.


A Tor exit node can block destinations openly by declaring limitations in its exit policy (in principle by IP address or port, although I don't know if any exit policies use IP address this way) or by simply firewalling connections from itself to particular destinations. In the latter case, the node may be marked as a BadExit if a scanner detects that it intentionally blocked a connection that it offered to relay.
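For example, a hypothetical exit that wanted to declare its limitations openly might carry torrc lines like these (the IP is a documentation-range placeholder):

```
# Declared in the relay's published descriptor; clients see this
# and won't route blocked traffic to this exit.
ExitPolicy reject 198.51.100.7:*   # block one destination address
ExitPolicy reject *:25             # block SMTP everywhere
ExitPolicy accept *:*              # relay everything else
```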

A node in an earlier position in a circuit can't know the ultimate destination of the circuit, or the content exchanged over it, although it could in theory blacklist individual nodes as a next hop (which doesn't typically really accomplish anything for content-blocking purposes, and could also be detected if it were a pervasive practice).

I think in HSv2 (the old hidden services protocol) there were a number of different opportunities for relays to selectively disrupt hidden service connections, while in HSv3 (the new revision) there are fewer.

A difference between Tor and IPFS in regard to what you're mentioning above is that in Tor, a specific entity (a specific clearnet or hidden service site operator) has to continue actively hosting a resource in order for it to remain accessible at its original location, while in IPFS, it's more like that there need to be some participants who haven't actively declined to host it, but they don't have to be the same as the original publisher.


Unfortunately, I think projects like this that make it easy to create onion resources are quite dangerous. The reduction of friction to host material online in a completely anonymous way only enables adverse actions like the sharing of CP or other illegal content. Please look here for a good explanation of friction in this space: https://stratechery.com/2019/child-sexual-abuse-material-onl...

If it is used by political or marginalized communities, then this project is also not useful: it does not have the support or attention required to make it secure enough, the way the Tor Project does. This application opens legitimate users up to incredible risk.

If I were a dissident, I would take my time and do the research required to set up a system like this in the correct way. This project may give people a sense of false security.


While IPFS has been used successfully for censorship resistance (ex the Catalonian independence movement: http://la3.org/~kilburn/blog/catalan-government-bypass-ipfs/), I completely agree that anonymity is not something that IPFS claims to support right now - and users should be very careful in how they use the network for censorship resistance. While there have been some experiments in this space (ex OpenBazaar's onion/tor transport for IPFS: https://github.com/OpenBazaar/go-onion-transport), none of them have been audited for reader/writer privacy.


Ah, yes. The good old "don't make that tool because bad people might use it" argument.

Similarly, we should ban sharing information on hacking/security because bad people might use it, and require that all communications be visible to the government to prevent abuse.


That is not the argument being made. It is an argument that having friction in the process to do something that may have dangerous outcomes is sometimes positive.

It is much the same as arguments that there should be gun training required to purchase a firearm. Adding friction to a process sometimes provides positive benefits.


> That is not the argument being made. It is an argument that having friction in the process to do something that may have dangerous outcomes is sometimes positive.

The argument being made, as far as I can tell, is that this tool will make doing bad things easier, and thus this tool should not be made. As any tool to resist censorship will also enable said bad things, I do not find this to be a valid concern. (And since some governments would list censorship resistance itself as one of those bad things, this is always going to be true)

> It is much the same as arguments that there should be gun training required to purchase a firearm.

I think the tech equivalent to this would be mandating a "don't sexually exploit children" class before allowing purchase of a computer. (As you might guess, I'm not a fan)

> Adding friction to a process sometimes provides positive benefits.

Sure, but you have to make sure the collateral damage is minimized AND that you're actually catching the offending segment of the population. Mandating gun training before purchase is actually a great comparison here - the vast majority of gun crime in the US is committed with illegally acquired weapons (which would be unaffected by the mandate), just like I imagine that the majority of CSE imagery shared on the internet uses something more robust than OnionShare. So in both cases you'd have high collateral damage with minimal impact to the population you actually care about affecting...


> I imagine that the majority of CSE imagery shared on the internet uses something more robust than OnionShare

The majority is shared on normal social media and Bittorrent, although the worst is shared on dark web sites of the same robustness as OnionShare.


> and Bittorrent

What? No, far from it. Bittorrent is pretty clean regarding illegal material (excluding piracy, of course), which makes sense: after all, you leak your IP address and the torrent that you are downloading to the whole network (if you have DHT enabled) and/or to your tracker, plus your ISP can see what you are sending and receiving (while there is a standard for encrypted transmission in Bittorrent, I don't think it is widely used). What do you consider "CSE" anyway? Would a picture of a girl at the beach be considered "CSE"? If so, you might find such torrents (I wouldn't know), but I think calling that "CSE" is dishonest.


I'm a detective who works exclusively on online child abuse; I regularly arrest people who have downloaded and/or distributed IIOC over Bittorrent.

The definition of IIOC is provided by the Home Office and split into three categories. Ultimately it is decided by a jury although the categorisation is rarely contested.


> > I imagine that the majority of CSE imagery shared on the internet uses something more robust than OnionShare

> The majority is shared on normal social media and Bittorrent, although the worst is shared on dark web sites of the same robustness as OnionShare.

By what measure?


Number of files shared


Number of files shared that you know about, which I think is a pretty important distinction. Bit harder to track sharing over Tor, after all. (Which is the point!)


I meant the distinction between majority and worst. What is "worst" in this context? And by what measure?


The argument for friction was lost the moment we got cheap access to home recording, which at this point means any mobile phone produced in the last 20 years.

From all my history in computer security, conferences, and just picking up trends in what criminals do, my educated guess is that the most common channel for sharing CP is plain-text email. A smartphone, a camera app, and the email app that is already installed are, I'd guess, the tools of choice in the vast majority of cases. The remaining portion are people who do not take the most frictionless method, and here I doubt OnionShare will cause any change in the availability of CP.

A bit of a tell is that during the crypto wars there were officials who predicted that there would be an explosion of CP if free encryption was allowed. CP and nuclear proliferation were the standing bet between Eben Moglen and Phil Zimmermann on what the opposing side would bring up first, and I think it's fair to say that neither issue materialized after Phil Zimmermann won.

Taking a perspective from behavioral science, CP production and sharing is unlikely to be a rational decision where risk and reward are fairly balanced. As an example, I would predict that increased jail sentences do not actually reduce the crime rate, nor would turning a very hard-to-use tool into an only slightly hard-to-use one. Effective measures would have to either address the emotional state of the person right before the crime, or implement catching mechanisms in the lowest-friction tools like email and its replacements like chat.


> like the sharing of CP

A victimless crime.

> or other illegal content

Such as for a chinese citizen to post criticism of the party.

> The reduction of friction to host material online in a completely anonymous way only enables adverse actions like the sharing of CP or other illegal content

More like "only enables the less tech-savvy and saves time for the more tech-literate"; I do not see how it correlates with the sharing of illegal material.

> It does not have the support or attention required to enable it to be secure enough like the TOR project does

It is basically a front-end for Tor; it inherits Tor's security properties. Moreover, just because a project is made by one person does not mean it is any less secure. Consider libsodium/nacl/monocypher vs most other crypto libraries, for example. Also, by the same logic it would not be of use to those who share CP, because "It does not have the support or attention required to enable it to be secure enough like the TOR project does"; it's not like people who share CP need any less security than a dissident in China.

> legitimate users

This implies that those who share illegal content are not legitimate users. Please be more clear with your terminology; usually illegitimate users are people such as spammers and attackers.


> > like the sharing of CP

> A victimless crime.

Most certainly not. A lot of victims who were exploited in the production of child porn have to live with the knowledge that, for the rest of their lives, that material will be out there, viewed by every dirty slimy degenerate wanting to do so. It's also one of the reasons a lot of them commit suicide. The fact that the justice system makes even viewing this material illegal is not for nothing; it comes from a proper understanding of how child porn victimizes people.


That isn't quite the case; it is more due to negative attention in the media from child killers and the like. Most of it actually tends to be thirty-year-old magazines, nudes, and things the minors (perhaps foolishly) upload themselves.

This may also seem morally wrong, so to speak, but the consequences for such are exactly the same as if you were to watch someone getting whipped and violently raped. There isn't really an incentive to go for anything "less bad", especially considering it is an innate attraction you can never get rid of.

In some states, the sentences for outright molesting dozens of kids are lighter than for looking at some images or videos on your computer. It is ludicrous, and it is almost as if the state cares more about "out of sight, out of mind" than anything else.

They're even throwing people in prison for looking at cartoons or having the wrong books.

The question at the end of the day is... What exactly are people supposed to do then? Commit suicide? Perhaps genocide 0.1% to 1% of the population who have these "wrong thoughts"? The paranoia has gotten so bad that some viruses even plant CP on your computer.


> A lot of victims that were exploited in the production of child porn

Good thing sharing existing CP is not the same as exploiting people to create CP.

> by every dirty slimy degenerate wanting to do so

You are aware that you managed to insult even the people who want to but refuse to watch CP for moral reasons, right? You are basically calling a group of people with a certain attraction that they themselves can't change as "dirty slimy degenerate"s.

> It's also one of the reasons a lot of them commit suicide

I honestly doubt that, but if we assume that was indeed the case, shouldn't the authorities simply not inform them that their pictures/videos are being distributed online? Even then though, I still can't see this as a reason to illegalise it. Should we also make rejections(from jobs/relationships/etc) illegal because some of the people who receive said rejection commit suicide?

Also, I find it dishonest of you not to consider the people who have been unjustly jailed if not for distributing at least for possession of CP, some of which have committed suicide.

> The fact that the justice system makes even viewing this material illegal is not for nothing, but comes from a proper understanding of how child porn victimizes people.

Just like how the justice system in most countries until recently illegalised homosexuality and sex before marriage, right? There are many unjust laws, especially in the area of personal freedoms/victimless crimes (growing weed, owning cocaine, sending a nude picture of yourself from when you were underaged, etc)/copyright (piracy/unauthorised modification/breaking drm).


> > A lot of victims that were exploited in the production of child porn

> Good thing, sharing existing CP is not the same as exploiting people to create CP.

No it isn't the same thing. Sharing child porn is victimizing people though, it is victimizing the people that are abused in that material.

> > by every dirty slimy degenerate wanting to do so

> You are aware that you managed to insult even the people who want to but refuse to watch CP for moral reasons, right? You are basically calling a group of people with a certain attraction that they themselves can't change as "dirty slimy degenerate"s.

I didn't mean to insult those people. I like however how you've turned this around from the victims of child porn on to the victims of a comment on HN. And it's absolutely not a fact that people that suffer from an attraction to children can't change that. Some cases seem induced by excessive porn viewing and revert when those people stop doing so for a couple of months. Though there are also people who seem to be incurable.

> > It's also one of the reasons a lot of them commit suicide

> I honestly doubt that, but if we assume that was indeed the case, shouldn't the authorities simply not inform them that their pictures/videos are being distributed online?

This idea is laughable. Do you honestly believe just not telling the victims the material is out there will convince them the material isn't out there?

> Even then though, I still can't see this as a reason to illegalise it. Should we also make rejections(from jobs/relationships/etc) illegal because some of the people who receive said rejection commit suicide?

This is a straw man; I didn't cite the suicides as a reason to make child porn illegal. It's part of the picture, though.

> Also, I find it dishonest of you not to consider the people who have been unjustly jailed if not for distributing at least for possession of CP, some of which have committed suicide.

Oh yeah, how dishonest of me not to consider people punished for possessing child porn. Do you hear yourself?

> > The fact that the justice system makes even viewing this material illegal is not for nothing, but comes from a proper understanding of how child porn victimizes people.

> Just like how the justice system in most countries until recently illegalised homosexuality and sex before marriage, right?

No, this is absolutely not the same thing. Straw man again. Practicing homosexuality doesn't victimize people. Sex before marriage doesn't victimize people. Watching child porn does victimize people, namely the ones being exploited.


You do realize that the FBI and similar state agencies have shared CP online to catch people? So did the FBI victimize children, and should it face appropriate punishment (the same as other people who share it)?

And about your last point: the FBI and other people who investigate it absolutely do watch CP. Again, who is the victim in this specific case? Are children somehow not victimized when "good" people watch CP?


It is worse than that. In one case, the Australian Police kept a site up for an entire year (the site is said to have had as many as a million accounts, although account numbers often don't map precisely to actual people) to try to collect evidence on people.

A lot of these sites only tend to stay up for a year or two before they get shut down or their operators get skittish, so keeping it up for a year is very significant and aids in the proliferation of this content.

They also deliberately circulated quite a few relatively uncirculated images to try to gain members' trust after they took over the site.


> This idea is laughable. Do you honestly believe just not telling the victims the material is out there will convince them the material isn't out there?

I'm not really agreeing or disagreeing with your viewpoints, but your logic here seems weird. In that particular case, nothing would ever convince them that people aren't looking at it, even if they weren't.

You can never really tell if someone is or isn't doing something on the internet, especially with the proliferation of strong encryption.


Shameless plug (one I haven't had much time to work on lately): you can also do something similar programmatically in a few lines of Go: https://github.com/cretz/bine


It's good to see Tor becoming more easily accessible to non-technical people.


Next thing we need is a search engine specifically for those websites, hosted in the same way. Reinventing the internet all over again!


There are a few already :) https://www.reddit.com/r/onions/ has some .onion links on the sidebar to search engines.

They aren't decentralized, though; some people are running YaCy (a decentralized search engine) over Tor to create a decentralized index of onion sites: http://wiki.yacy.net/index.php/En:YaCy-Tor


Very cool! I created urlpages [0] as a proof-of-concept with a similar goal. It is always interesting to read about how others try to solve the same problem.

[0]: https://github.com/jstrieb/urlpages


It's a very elegant implementation. I wonder, though, if it's accurate to call it "uncensorable" when there's a single point of failure, given that searching or seizing the machine hosting the site is still possible (even if only by coincidence). Obviously for the side of the aisle here thinking of criminal uses, that's a feature, not a bug, but for a dissident who is disappeared it looks different...


To everyone saying "but this will be used for child porn!"... guess what else is used for that? The internet.

Should we just ban the internet as a whole?

I understand the sentiment, but I hard disagree with it. Especially because this tool (and similar to it) can be used for good, and might be necessary in the future.

Try to see the glass half-full.


But what is the need for Tor outside of illegal activity?


The more illuminating question is "what is the need for anonymity outside of illegal activity"?

One answer is that you might someday like to say something which some other party would have a problem with.

We might say that if you truly believe in what you're saying, the principled course would be to say your piece and let the other party be upset. But if the other party has a lot of power (for example a parent, an employer, a university, a government official, etc) they can threaten retribution for your words. If the threat is serious enough, you will likely decide not to speak.

The ability to say things anonymously removes the threat of retribution. It helps people speak who otherwise wouldn't, increasing the range of perspectives and information the rest of us have access to. This doesn't mean that it's beneficial in every single case, but on balance it's a social good - in fact, it's a necessary component of a free society.


Is that mostly what Tor is used for?


It's pretty hard to evaluate. If you're asserting that Tor is used more for criminal purposes than non-criminal purposes, I think you might be right. But that would be a simplistic way to evaluate the impact of Tor (or really any tool) on society.


Plenty of reasons. The most obvious use that comes to mind is to avert government censorship. Then there are whistleblowers. There are more reasons to desire anonymity than illegal activity.

Edit: also, ISPs can sell your browsing information now: https://www.usatoday.com/story/tech/news/2017/04/04/isps-can...


Don't those governments just block Tor?


You can use the Tor network through bridge nodes, and in fact the Tor Browser Bundle will ask you on first run whether you need to connect this way.
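For those wondering what that looks like under the hood: bridges are just unlisted entry relays you configure by hand. A minimal torrc sketch might look like the following (the bridge address, fingerprint, cert, and binary path below are placeholders, not real values; actual bridge lines come from bridges.torproject.org or the built-in ones Tor Browser ships with):

```text
# Connect through unlisted bridges instead of publicly listed guard relays,
# so a censor blocking the public relay list can't block the connection.
UseBridges 1

# obfs4 is a pluggable transport that disguises the traffic so it doesn't
# look like Tor on the wire; the path to obfs4proxy varies by distribution.
ClientTransportPlugin obfs4 exec /usr/bin/obfs4proxy

# Placeholder bridge line: address, fingerprint, and cert are handed out by
# bridges.torproject.org so they can't all be scraped and blocked at once.
Bridge obfs4 192.0.2.1:443 <FINGERPRINT> cert=<CERT> iat-mode=0
```

Tor Browser wraps exactly this kind of configuration behind its "Tor is censored in my country" prompt, so non-technical users never have to touch torrc.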


Even perfectly legal activity can result in undesirable retribution. For example, some rich company could decide to sue a person in response to a negative review. Whistleblowers will almost certainly face retaliation if their anonymity is not guaranteed. Anonymous discussion boards are inherently more free: because there is no identity associated with opinions, there are no consequences for being wrong or politically incorrect and this leads to people speaking their minds without fear.

Also, not everything that is illegal should be illegal. Civil disobedience is impossible in a monitored society.


Journalists, whistle blowers, people in oppressive countries, unjustly censored information, the government, etc.

The Navy didn't build Tor for criminals, so obviously there is a use case beyond that.


Maybe the Navy is using Tor to trap criminals.


If that's the case they've enabled a whole lot of criminal activity to trap relatively very few criminals so far.


The Navy originally invented the protocol for clandestine communication.

One of the contractors who wrote the software later open sourced it.


Sometimes illegal activity is an ethical necessity.


Nice! I feel like this project is mirroring the development of the internet itself. :D


It would be nice to at least see some hazy consideration of the impact of this. I know a probation officer for sex offenders and she tells me trying to keep up with their technology use is a nightmare. I can only imagine how this complicates her job.

That's not saying that on balance this shouldn't exist. There are of course good uses of technology like this. But I'd at least like to see builders grappling with that balance. Especially given recent prominent reporting around this: https://www.nytimes.com/interactive/2019/09/28/us/child-sex-...


There is a fundamental, irreconcilable tension between the need for people to be able to protect themselves against abusive governments and the need for governments to be able to catch criminals. While it seems good to consider the full impact of anonymity, it's not going to generate any meaningful discussion. Either you can be anonymous online or you can't. There isn't much middle ground there. There is no balance that can exist.


>There is a fundamental, irreconcilable tension between the need for people to be able to protect themselves against abusive governments and the need for governments to be able to catch criminals.

Aren't these the same thing to the technology? To someone labeled a criminal, the government doing so is an abusive government. To a government that considers itself right, those who don't follow the rules are criminals.

We each have our own personal views on what should and should not be allowed which we judge individual cases with, but technology does not have a sense of morality to make such judgment as to when the criminal is in the right and when the government is in the right.

I think people far too often try to humanize computer algorithms with notions of right and wrong. The technology to evade an oppressive regime is the exact same technology that protects the horrible person from justice.


If that's truly the case, I'd like to see the developer grapple with that.

However, I don't think that's the case at all. We can see in the real world there are many ways to strike a balance between safety and privacy.


> We can see in the real world there are many ways to strike a balance between safety and privacy.

So how would this work online? "We have private chats, but the government can look into any of them at any time"? I suppose it depends on your definition of privacy ... are your chats still private if I'm unable to see them but your friendly neighborhood police department reads them?


As I said, this gets handled a variety of ways in the real world. If this is your first time encountering the topic, maybe a good place to start is a talk: https://securityboulevard.com/2019/06/dataedge-2019-alex-sta...


Some people use cars to crash into large crowds of people on purpose. Should automakers implement devices to make sure the cars can't be misused?

The reality is everything can be misused.


Your analogy would only work if it were possible to do that:

1) With a car without a VIN number / registration

2) With a car without a license plate

3) With no driver and no way to know who was controlling the car.

The point is, it’s not good to be able to commit serious crimes with zero risk of punishment.

The issue under debate is whether the gains from the legitimate (legal) use cases of technology like this outweigh the cost of making it easier for sex offenders / drug traffickers to subvert authority.


The police have other ways of catching people doing crime. We don't need to give up all of our privacy and rights for them to do their job. If you think having privacy or rights stops the police from catching criminals then you are quite mistaken. If the police can catch el chapo and the rest of the drug cartel bosses then any other criminal can be taken down regardless of the technology they're using.


I would agree with you if the debate were “should we abolish SSL so we can MITM all internet traffic to catch more criminals?”

In that example, there is a legitimate argument that SSL-encrypted connections do more good than harm (they make logging into your online bank possible on public Wifi, along with thousands of other examples like it).

But I’ve still yet to see a clear argument for absolute internet anonymity for anything that would benefit the average person (yet, I can think of dozens of ways it harms people).

It’s very likely that I’m just not aware of the legitimate (practical) use cases of TOR.


Anything the government knows about you COULD become a liability (and in many nations already is). Even things that are completely harmless.

On top of that, let's say that you do become a suspect of a crime you didn't commit. Every little shred of information that can be found is going to be used to paint a false narrative against you. Why would you want anybody to have ANY information on you at that point?

Surely you've heard of "the right to remain silent" and "pleading the 5th". Anonymity online is the internet version of that.


Sometimes the crime is pointing out that the police are violent thugs, or that your government is interring people in "reeducation" camps.

It's true that anonymity is a double-edged sword, but it's better than having no blade when one is wielded against you.


> But I’ve still yet to see a clear argument for absolute internet anonymity for anything that would benefit the average person (yet, I can think of dozens of ways it harms people).

anonymity doesn't make it impossible to be held accountable for your actions. people tend to (unintentionally) dox themselves without any outside help. what it does is make it a lot harder for law enforcement to catch people without specifically targeting them and investing resources.

I know you're an upstanding, respectable citizen who would never do anything illegal, but just think of privacy/anonymity as a hedge against that tiny chance that the government might actually make an unreasonable law.


There are over a billion people under a particular authoritarian regime that could potentially benefit from practical use of this technology.


> it makes logging into your online bank possible on public Wifi

It makes internet banking possible, period. Cleartext banking, in any public Internet context, would be simply unworkable.


I disagree, authentication is not tied to encryption at all.


Why would we care about the benefit of the average person? How is that even an argument? Like imagine forbidding to publish some books or music, because we somehow decided that they might not benefit the average person.


> I’ve still yet to see a clear argument for absolute internet anonymity for anything that would benefit the average person

Absolute anonymity should be the default. There is no need to justify it. It's the government that needs to justify its need to know stuff about specific persons.


It is possible to deface or alter the VIN number, steal a license plate, and rig a car for remote control.

If you commit or attempt mass murder this way, you'll probably be caught due to the amount of resources a law enforcement agency will spend investigating such a crime. If you put up an anonymous website distributing child pornography, but you're not the one producing it, law enforcement will spend several orders of magnitude fewer resources investigating that crime.


It's called an investigation and it worked for hundreds of years without the internet.


Exactly. A screwdriver and a file can make a deadly weapon, a piece of car tire inner tube paired with a Y-shaped piece of wood and a few bearing balls can make a slingshot capable of killing a person at distance. Bows and arrows are easy to build, and the necessary materials are easily obtainable in nature. If we restrict everything that can be used also to harm others, we'll end up needing a weapon permit just to enter a hardware store or take a walk in the park.


The counterargument to this is always: what is the tool designed to do?

- Should the layperson be allowed to own weapons-grade plutonium?

- How about being able to buy and cultivate your own anthrax cultures?

- Why shouldn't I be able to spray DDT when it's super effective?

Sure, a screwdriver is designed to drive screws, just as fentanyl is designed to kill pain, but in the hands of a bad actor they can cause different levels of harm.

I'm not saying Onionshare is good or bad, just that you need to step back and look at any tool and consider the net potential impact if bad actors are able to exploit it.


That's why I emphasized the "also"; one thing is a tool whose primary purpose is not to harm people but can be misused to do so, and a whole different thing is something which can only be used to hurt people or restrict their liberties.

"I'm not saying Onionshare is good or bad, just that you need to step back at look at any tool and consider the net potential impact if bad actors are able to exploit it."

True, and that's why we must think very carefully before enacting laws that make this or that tool illegal just because some criminal found a destructive use for it; there are so many things potentially dangerous in the wrong hands that blindly banning them all would either be impossible or would bring us back to stone age.


No, "designed to do" and "functions it can provide" are never the same.

You shouldn't limit smart people because some OEM is stupid and didn't think up an innovative use case when deciding on and designing the intended use of a product.


Let's not forget that unlike cars, data on the internet cannot actually physically harm people. Especially for crimes with an overwhelming moral component such as sexual abuse, people seem to have a tendency to conflate someone deriving pleasure from pictures of the act with the actual act. (See also the UN's push against pornographic manga, where there has not even been an actual act.)


Something feels wrong about this analogy. There is a difference between something that's designed to maim and something that could maim if it's misused.

I am not sure where Tor fits into that distinction, and I think that's the issue.


There are traffic laws and police officers watching the roads, trying to reduce reckless behaviour. How do you police and bring down misuse of onion sites for hideous crimes?


Well, for one thing, in order for the hideous crime to have real-world impact it needs, by definition, to exist somewhere beyond just the onion site.

Otherwise it literally just is information.


Information can be harmful. For example, people who were raped as a child do not want footage of that rape distributed as a masturbation aid, and yet numerous dark web sites are dedicated to exactly this.


It reminds me of how the EU can order Facebook to delete posts for the entire world, because the EU wants to control what people outside the EU are allowed to see.

https://www.euractiv.com/wp-content/uploads/sites/2/2019/10/...


They might not want it, but I do not see how it is harmful; it does not affect them at all, after all. They could as well live their whole lives without knowing that their images are online. It is pretty similar to piracy if you think about it: some people thinking they are entitled to dictate if and how certain numbers are used and distributed.


You are essentially advocating for a society without a right to privacy, which, while logically consistent, is an extremist position not shared by the vast majority of people.


I would consider a society in which you are not allowed to communicate with others without the government getting access to be one without a right to privacy. It's interesting how we can come to polar opposite conclusions about the implications of this technology.

(It seems that the difference may be in what exactly we consider to be a privacy violation. I would say that in the child-pornography case, the victim's privacy has been destroyed the moment the pictures have been observed and recorded by the abuser, and further reproduction is not a privacy violation because there is no privacy in the pictures left to violate. Since Tor and co. have no impact on the ability of a child pornographer to make the initial recording, they are an unalloyed good for privacy.)


> further reproduction is not a privacy violation because there is no privacy in the pictures left to violate

Is this honestly how you'd feel about footage of someone you know being raped?? It should not be illegal to distribute it...? Can you explain precisely what you think the law should be?


I don't know how you get that out of my post. Privacy violation is not the only grounds on which something may be illegal.


You just infiltrate the site and use social engineering to hack it. A Tor site does not stop police from investigating and catching the people running it.


No one employs perfect opsec at all times, and all it takes is one small mistake. Ross Ulbricht (the Silk Road guy) got caught because of one silly forum post that linked his two identities.


It's a politico-technological arms race. Governments make laws. People make technology that nullifies those laws. The government reacts and makes laws that circumvent the technology. People make even more technology to deal with the new laws. This can end in one of two ways:

1. Subversive technology becomes so ubiquitous and easy to use it's impossible to fight it. The government stops trying.

2. The government's control increases to the point of tyranny and people are no longer free.

I want the government to give up. Anything else will mean the destruction of the free internet and free computing we all enjoy today. I don't want a future where encryption is military use only because of laws that make it impossible to run code not signed by the government.

> But I'd at least like to see builders grappling with that balance.

There is no balance to be had. The internet contains the full spectrum of humanity. The very best and the very worst of humanity are both found on the internet. Governments were oblivious for a while, but now they want to narrow down the spectrum. As they impose laws and exercise control, the internet becomes safer, but it also becomes less international and more regional. The flow of information is no longer free and unrestricted. The spectrum is narrowed down from both sides.

Sadly it will only get worse from here. I am happy to have enjoyed the internet while it was in its infancy, and I am sad that future generations will probably never experience it.

> Especially given recent prominent reporting around this:

Every article about encryption regulation is like this. Pedophiles and terrorists. Children in particular are the perfect political weapon. The exploitation of children provokes reactions in people that are so visceral they are ready to accept any law that would supposedly make it stop. Anything can be justified with these arguments and anyone who disagrees is labeled a pedophile apologist and instantly shunned.


> and she tells me trying to keep up with their technology use is a nightmare

This sounds like a good thing. It might come as a surprise, but a lot of people on the sex offenders registry are there for victimless crimes (such as sending nude pictures of themselves when they were underage - consider https://metro.co.uk/2019/06/27/boy-15-made-sign-sex-offender... as an example of the >10 instances in news reports that I have read of such situations). The probation officer being unable to keep up with the technology these people use will guarantee them similar levels of privacy to what people like you and me have.

> https://www.nytimes.com/interactive/2019/09/28/us/child-sex-...

Just an offtopic note, this site is painful to read (on the desktop at least). Whatever happened to simple text sites?


Law enforcement will always chafe at the chains of privacy. Hell, if the police could read your thoughts, they would.


Got to be honest, that was my first thought - if anybody actually comes up with something that's actually anonymous and actually uncensorable, it will be shut down, by whatever means necessary, by the government with the full blessing and support of the public.


The Tor network hasn't been shut down, has it?

Dark sites of all types still continue to run uninhibited for the most part. From my readings, the major, or at least known/researched, security holes have been plugged already by the developers, and the rest of the convictions have happened because of user error, not the technology involved.

If memory serves me correctly, one particular pedophile abuser was caught after years of being investigated because he was identified through skin marks (I want to say it was freckles on his hand, though I don't recall the exact details of the article) that led investigators to him, not anything technological.

I'm a proponent of total anonymity online because it allows for whistleblowers to leak information that is untraceable to them (again, without consideration of human error). In the case of Wikileaks, for example, even though they weren't on an anonymous network, it still led to the largest classified information leaks known, from NSA tools to national and international communications between heads of states, diplomats and everyone in between. The general consensus was that it was a good thing. I don't necessarily agree with that, but one can't discount its value.


There was a large bust of a pedophile forum years back. Police took over the site and ran it for years, and it ran on Tor. Pretty interesting story overall:

https://www.theguardian.com/society/2016/jul/13/shining-a-li...

From what I recall, most of these Tor sites are only taken offline because of some server misconfiguration or some other opsec weakness in the people who run them.


The interesting part about that to me is the authorities seem to break the law with impunity under the guise of enforcing the law.

Seems a bit disingenuous to do this.

Why not sell drugs as a cop to arrest people for buying them... er wait... they do that too.


If they just shut down the site, those who weren't busted would just migrate to a new one. If you run the site under the guise of the owners, you can keep catching more and more users as they come in. Not just the people who are seeking the kids, but possibly also the ones supplying them. It's a grander version of the old trope of an FBI agent posing as a 14 year old girl in a chat room to nab a predator.

I agree with you that selling a dimebag to a teenager to bust them is shitty and shouldn't be done. But there's a world of difference between busting kids for weed and catching pedophiles. If they're taking these people off the streets I really do not care if they're "disingenuous" to get them.


We criminalize possession and distribution of child pornography on the basis that doing so constitutes further abuse of the child in question.

If we allow the FBI to distribute child pornography, under this rubric we are basically allowing them to commit child abuse. We allow law enforcement to commit some acts -- like buying or attempting to sell illegal drugs -- which would normally be crimes. But we would find acts of actual harm to innocents by law enforcement abhorrent. For example, if an FBI agent posed as a hitman online and then actually committed the contracted murders, that would be abhorrent and not a legitimate law-enforcement use.

Either distributing CP is child abuse or it isn't. If it is, the FBI shouldn't be doing it irrespective of how many criminals they may catch by doing so, because they are further victimizing innocents. If it isn't, that calls into question the basis on which distribution of CP is exempted from First Amendment protection.

Oh wait, I forgot. Different rules apply to those in power.


>It's a grander version of the old trope of an FBI agent posing as a 14 year old girl in a chat room to nab a predator.

If it was I wouldn't have a problem with it. The reality is, by running the sites, they are actively enabling what they consider harmful.

Consider other examples of police stings. In drug stings, they often use actual drugs, because the mere presence of drugs at a sting is not an issue. But when dealing with more dangerous material, they use fakes to ensure that people don't get hurt, and the times when they haven't used fakes, it has been considered extremely bad and had people push for policies to change it.

In this particular case, the sites are not like drugs, which are harmless merely by being present, but are far more dangerous material which results in significant harm to those the police allow to continue to be victimized.

Imagine if police ran a human trafficking front and actually kidnapped people, justifying it by saying they were working to get the guys running the whole thing. Would that be acceptable?

I think not.


The comparison in your last paragraph doesn't really work unless the police were the ones producing the child porn in the first place...


Your explanation only works if you assume only production, and not distribution, is harmful. Current moral and legal standards strongly disagree with this, including those of the police enforcing the laws.


You should care and at some point you will.

When you let the rules slip, eventually that slippage will be used against you.

History shows this...

First they came for the socialists, and I did not speak out— Because I was not a socialist.

Then they came for the trade unionists, and I did not speak out— Because I was not a trade unionist.

Then they came for the Jews, and I did not speak out— Because I was not a Jew.

Then they came for me—and there was no one left to speak for me.


I have seen and appreciate your quote, but I will not be the one speaking for pedophiles.

If you think that running a sting operation to catch pedophiles is going to lead to a Third Reich-esque roundup and extermination of minorities I really don't even know what to say.


If you think breaking the law to get bad people doesn't set a precedent that will be abused, you haven't been paying attention at all.

The patriot act and the revoking of rights without due process was sold as something to fight terrorists. Now we have common street criminals being charged as street terrorists to circumvent due process.

It simply is fact that govt is the most serious threat to citizens. This is why the constitution was created and the nation was architected as a constitutionally bound republic.

This tripe of "think of the children" has always been used to get more power by govts and will continue to happen.


You know that you can easily replace "pedophiles" with most groups in your post, right? Such as, for example, "Jews". The sad part is that there are people, past and present, who actually think this way, and it is a real danger (as history has shown).

I suggest an alternative: how about judging people by the harm that they have committed to others rather than their race, attraction, opinions, etc?


The US Government developed Tor so that covert agents around the world would be able to effectively hide their internet traffic. See https://www.torproject.org/about/history/

They need other people to use it to be effective. They aren't going to shut it down until they have a replacement. They might even control >50% of the network.


The reason why it hasn't been shut down is because they already have ways to compromise it, so it's better to act like they want to get rid of it, all the while knowing that they can crack it if they really need to. It creates a false sense of security.

If there's truly a guaranteed secure way of doing things, the government would act a lot quicker.


This makes the assumption that the government has the ability to go after everyone using it to harm people, including tens of thousands of children, but chooses not to. Under such an assumption, one can no longer consider the government the good guy.


The USG routinely ignores crimes that would reveal sources and methods. Some known moles have gotten off scot-free while they were fed disinformation because it was more valuable to keep their adversary in the dark about what the US knows.


And you've reached this conclusion how?

The sense of security in anonymity is not false as long as the dark web continues to proliferate its services.

Your conclusion is invalid without further proof. If you want counter-proof, just visit the dark web one of these days. It's as vibrant as ever.


I'm not so sure. .onion sites are already close to anonymous and uncensorable. the people who do illegal stuff on the darknet seem to get caught because of personal mistakes much more often than a vulnerability of the medium itself.

even if you had some perfectly anonymous, uncensorable medium with zero security flaws, people would still make mistakes and get caught. it's never impossible for the state to catch someone, but it can certainly be too expensive to be worth it. this is a feature imo, not a bug. it should be expensive to investigate people.


" ... if anybody actually comes up with something that's actually anonymous and actually uncensorable, it will be shut down, by whatever means necessary, by the government ..."

This is correct.

However, the building blocks of circumvention technologies (and the like) and the building blocks of basic consumer Internet services and communications channels are indistinguishable.

You can't preemptively disable secure, covert communications and also provide rich, secure Internet services (such as banking, corporate email, etc.)

It is very much "Free Speech or Stone Age".[1]

[1] https://blog.kozubik.com/john_kozubik/2009/06/free-speech-or...


You do not need covert communications for things like banking or corporate email. You only need a safe way to perform authentication.


"You do not need covert communications for things like banking or corporate email. You only need a safe way to perform authentication."

Agreed. However, the building blocks of that authentication (and other security protocols involved) are the same building blocks used for covert communications.


It already exists, and it's called Freenet. If you look for arrest cases tied to Freenet, they are almost non-existent.


Make a list of allowed technology items. Any internet connection out of the house must go through the specified router, which monitors traffic and proxies SSL/SSH to allow monitoring. Only allow a phone that the probation officer controls. Any attempt to tamper is a violation.


Please don't do this: https://xkcd.com/793/


I’m sure the correct solution is to attempt to control the entire rest of the world.


If you are that worried about children you should think about inventing new ways to monitor children, so that it becomes impossible to do something to a child without a lot of people knowing about it. Instead of trying to find excuses to spy on everyone on the internet.


Fredrick Brennan calls this the "Pedo Problem", whenever you create a free space for people, pedophiles will flock to it.


I guess it makes sense in countries that have an authoritarian government, to be able to subvert it, but in others it's hard to see why a person should be totally able to untether themselves from meatspace and not be held culpable for their actions. That article on child sex abuse is absolutely sickening. (It's terrifying how common child abuse/pedophilia is.)


What country doesn't have an authoritarian government?


Sounds like http://arweave.org


Definitely. OnionShare requires your own computer to be connected to the interwebs - the data is only accessible while your OnionShare app is running. With Arweave the hosting is taken care of (the data/website/etc is hosted on the Weave). Plus, OnionShare sites are not verifiable/reliable, as they are a website running on someone's computer. An Arweave site is timestamped and unforgeable; it's also resilient (replicated on many nodes) and permanently stored.


So, I used to be on board with these types of things, but now I don't understand the true purpose of an uncensorable website. It just seems like a place for hate groups to unite and bad things to happen... I would love a different perspective on this :)


It's easy to feel that way in, say, the U.S., where only hate groups are getting censored right now, and even that only by private companies. If you think about a country like China, where for example it's hard to find anything online saying a massacre at Tiananmen Square ever happened, then the purpose of these things becomes more obvious.

The time to set up uncensorable communications is before you're being censored. After, it's a lot harder to get traction.


Websites that publish child sexual exploitation material are getting heavily censored in the U.S. too. Think through what uncensorable implies.


> Websites that publish child sexual exploitation material are getting heavily censored in the U.S. too. Think through what uncensorable implies.

To me, it implies that things you agree with and things you don't agree with cannot be censored. This seems like a prerequisite for a free society.


This is a bit meta and for that I apologise, but it's kind of interesting that the "drug addicts, paedophiles and terrorists" rhetoric seems to have lost almost all of its potency.

I found the reference to hate groups to be a much more effective argument, I think purely because everybody is so used to the kids/drugs/terror angle being overused and abused to chip away at liberties.

Surveillance-state dystopians take note.


Hate groups also seem to be much more abundant and visible. You don't see your local pedophilia lobby getting involved in politics.


That material could also be loaded on a flash drive and sent via the mail. I suppose the USPS and flash drives should be banned.


If I'm not yet censored, then why wouldn't I be better off saving time by using much easier tools to publish my speech?


So that's essentially a public goods problem. If the censorship-resistant tools aren't popular, they won't be there for you if the censorship does start. But you'll likely have little effect on their overall popularity by publishing your own content on them, so you do better by publishing on the easiest tools, and hoping to freeload on others building popularity of the resistant tools.

The best way to address this is probably to make the resistant tools as easy and compelling as possible.


FWIW, Tor is blocked in China.


Isn't "But think of the dissidents in China" the libertarian version of "Think of the children"?

In any case, OP was visibly grappling with a tradeoff. The question is if there are compromises, i. e. methods to make such projects more useful for the good dissident, and less so for criminals.

One idea would be to restrict the types of files: video would seem to be more useful for illegal pornography than anti-government organisation. Please note: more useful. I'm not saying video is completely useless for good causes.

You could even make features dependent on the language used, or do some rudimentary geolocation. Or correlate directly with freedom: "if Wikipedia is not blocked, your upload speeds are severely limited".

None of these are perfect, and most are easily circumvented by people with a working compiler. But they might help mitigate the possible misuses. Denying even the possibility is just lazy.


I would say video is one of the most useful tools for dissidents. A lot of good has come from video of police abuses, for example. And if the news media isn't covering your protest, it might as well not exist unless you're able to publish your own video of the event.


This is such a Western-centric point of view. The West (largely) enjoys free speech, other countries do not. Other countries have incredibly repressive regimes...where merely questioning your government means you are either under surveillance or disappear.


That's due to Western philosophy, which itself seems to be OK with the idea of mob rule for short periods of time before it burns out due to idiocracy. The Bonnie and Clyde phenomenon is one example, and Trump is another. No Trump supporter could stand to enter a rational debate on the subject, and even if they did, they would attempt to be irrational anyway.

Think of our philosophy as starting a fire on purpose to prevent an out of control fire later. We know we've got a bunch of rowdy idiots here, and love them for it, but every once in a while they find an amazingly lazy way to grab power and shake things up a bit - like a randomization function.

I'm still hoping that in 5 years all of them will forget what was so important and go back to discussing Jennifer Lopez's butt.


Sure. Basically, it all comes down to whether or not you trust the government to be good, always. If you believe this is the case, you're fine. If you don't, you will probably prefer un-censorable technologies. America has (historically) tended to fall pretty far to the "free speech" side of the line, though there is a push to change that.

Even if you trust the American government, people in repressive regimes can still make use of this. This could enable a western news outlet to make itself accessible to those who are brainwashed by state media.

Trying to prevent "hate groups" from congregating or criminals from opening marketplaces doesn't usually work, either. It's the prohibition principle: if you ban the tech, criminals will use it, but an American NGO won't be able to reach those with censored internet.

You may not use it, but I don't see why your "being on board with it" ought to affect what others create.


>Even if you trust the American government, people in repressive regimes can still make use of this.

I think an important point to make is that just because you trust the American government now, doesn't mean that this won't change in the future. Which is why the core tenets of free speech and privacy should be protected at all costs.


Even if you have "free speech" you could still end up on some form of subversive persons list. We're not far from McCarthy 2.0.


It is a place to discuss anything without someone removing it. Hate groups, LGBT people, people who for legal reasons cannot disclose things elsewhere, someone releasing information on powerful entities...

It is a place to find perspectives that are not allowed.

Reminds me of a Star Trek: Voyager episode where bad thought has been outlawed, so there is a black market for those types of thoughts. From the start there are people who want to express themselves and someone who wants to stop them. This stops the person stopping them.


Indeed, I don't understand why anyone thinks they need freedom anymore, we've practically won. There's nothing left to have freedom from, other than the remaining vestiges of hatred we must always be fighting. There can be no freedom from justice and accountability. Obviously, these people are using freedom to mean something other than what society already provides, as if their interests were somehow more legitimate, or they were trying to preserve the seeds of hatred that we've finally made progress against. We live in a society and an international system of co-ordination, while not perfect - it is against terror and hate, and so only hate groups and other terrorists could want to limit its reach or find alternatives.

/s

I could go on, but that's about where we are.


> It just seem like a place for hate groups to unite and bad things to happen... I would love a different perspective on this

Just because the world's loudest whiners have decided that people disagreeing with them is censorship, doesn't mean that there aren't legitimate examples of actual censorship and oppression in the world.

There are plenty of countries where criticizing the government on social media results in prison, beatings or being disappeared. There are regimes that aren't above using a dissident's family members as collateral or for retribution.


You must be really excited about the current administration's effort to fight end-to-end encryption then. Can't have them thoughtcriminals spreading their wrongthink¹.

¹ definition is subject to change depending on which entity is currently in power.


Because one may want to express an opinion that isn’t sanctioned and approved by a government agency/Twitter mob/middle class moral crusaders in general?


Historically, oppressive regimes have killed and raped far more than racist groups and child pornographers.

There are other ways to investigate or deradicalize fringe extremists. There is only one way to have free speech.


Historically, a lot more people died from smallpox than aircraft crashes. Does that mean we should focus on eradicating smallpox (again) and not on airplane safety?


Oppressive regimes are not "eradicated" [0]. They exist even in our own time. There is nothing geographically special about Western countries that makes our governments better, that credit belongs to civil liberties and a culture of respecting mutual freedom.

[0] https://en.wikipedia.org/wiki/Whig_history


free speech is bad when people i dont like say things that make me uncomfy :)


Free speech includes hate speech. Ban that and you may be next.


Free speech requires anonymous speech in many places.

On the other hand where we already have free speech, I’m not a big fan of anonymous speech. Speech should be subject to the self censorship that comes from thinking about whether someone is hurt/offended/etc. Free speech means being free to say what you want without legal consequences, not without social consequences.


Thanks for the different points of view, everyone. Wish this hadn't been down-voted by two points. I did not consider parts of the world that have government censorship; it makes more sense in that case.


Any tool that can be used by dissidents will be used by hate groups.

Conversely, any method for censoring hate groups will be used against dissidents.

I used to be anti-censorship, now I'm pro-censorship because let's face it, the dissidents are fucked regardless and the fascists (and worse) are the only ones who are going to get much use out of these tools. But technological solutions to social problems aren't going to work. The computer knows nothing about the meaning of the bytes it processes. They're all just ones and zeroes.


If only it were this easy to publish a website from your home computer to the regular web.


Yeah a small Raspberry Pi hosting the website at home is my dream


In the book I am reading, the main character is named Rohan. #synchronicity


What book are you reading?


It was "The Invincible" by Stanislaw Lem


How does it compare to ZeroNet?


ZeroNet is sites-over-BitTorrent: you connect to people seeding the site, download it, and then you also become a seeder for other people. By default everyone can tell that you (that is, your public IP) are serving a particular site, but you can use Tor to make it much harder to find out what your IP is.

OnionShare is just a regular webserver (every visitor accesses the site by connecting to a single machine: yours), but it automatically only serves it behind Tor, for the same reasons as above.

The advantage of this seems to be mostly simplicity (both in implementation and in setting up Tor and such) and possibly speed of first download for lightly used sites.

The advantage of Zeronet is that more people can serve the site, even if the original machine is offline or overwhelmed.
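For intuition, here is a toy sketch of that tradeoff (illustrative class and method names only, not real protocol code): a single-origin site in the OnionShare style dies with its host, while a ZeroNet-style site gains a seeder with every successful visit and can outlive its publisher.

```python
# Toy model contrasting the two distribution models described above.
# OnionShare-style: one origin; ZeroNet-style: visitors become seeders.

class OnionShareSite:
    """Single origin: the site is reachable only while the host is up."""
    def __init__(self, content):
        self.content = content
        self.host_online = True

    def fetch(self):
        if not self.host_online:
            raise ConnectionError("origin offline: site unavailable")
        return self.content

class ZeroNetSite:
    """Peer-seeded: every visitor who fetches the site seeds it afterward."""
    def __init__(self, content):
        self.seeders = [content]  # starts with only the publisher's copy

    def fetch(self):
        if not self.seeders:
            raise ConnectionError("no seeders left")
        content = self.seeders[0]
        self.seeders.append(content)  # the visitor now holds a copy too
        return content

    def publisher_goes_offline(self):
        self.seeders.pop(0)  # drop the original publisher's copy

zero = ZeroNetSite("hello")
zero.fetch()                   # one visitor; now two seeders
zero.publisher_goes_offline()
print(zero.fetch())            # still reachable via the visitor's copy

onion = OnionShareSite("hello")
onion.host_online = False      # the OnionShare host closes their laptop
try:
    onion.fetch()
except ConnectionError as e:
    print(e)                   # single origin down means the site is gone
```

The toy ignores everything real (chunking, signatures, Tor circuits) but captures why ZeroNet-style sites survive the original machine going offline, at the cost of every seeder being observable, while OnionShare trades availability for a simpler single-server model.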



Amazing!


very cool!


I've been in tech my whole life. I really enjoy technology, I'm pleased with the career I chose. But man am I tired of technologists "moving fast and breaking things" without any concern whether those things being broken might be better left whole. Just because you can smash social norms, doesn't mean you should.


The social norm that is "privacy is bad"?


Privacy for pedophiles is bad. Privacy for terrorists is bad. Privacy for illegal acts is often bad (this is entirely the point of whistle-blower legislation, after all)


Sure, sure, my basic assumption is that you can't have privacy for certain acts only. If the idea is that somebody watches and completely ignores all legal behavior but steps in when they witness illegal behavior, that's not privacy, it's something else.

Privacy means that nobody watches, so if you want to build tools for privacy, you'll always end up also enabling people to use them for illegal things. We (as Western societies) have mostly decided that general privacy is so valuable to citizens that we accept the collateral damage of crime being possible. I've not heard a lot of arguments against privacy in general, the arguments tend to be "I don't need privacy in this particular part of life, so it better be removed".


Sometimes it is.



