
That's a great initiative. I was hoping IPFS would be the de facto way to publish uncensorable websites, but I have grown more and more disappointed in it, as it fails a lot for me in its most basic purpose (fetching files). Hopefully one of the other competitors will do better, but IPFS got a bunch of theoretical stuff right, in my opinion (mainly immutability of objects).



I've worked with IPFS and I completely agree with you here. There are so many basic bugs that haven't been fixed because of the over-abstraction of libp2p.

Currently, it connects to bootstrap nodes and tries to discover peers asynchronously. Predictably, it hasn't finished handshaking with the bootstrap nodes by the time it starts discovering peers, so it waits another 1-5 minutes (depending on the setting) before trying discovery again.

Startup usually takes a few minutes due to this trivial problem and an issue has been open for it since May: https://github.com/ipfs/go-ipfs/issues/5953
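To illustrate, here's a minimal sketch (in Python, with entirely hypothetical names - this is not go-ipfs code) of gating discovery on the handshake event instead of a fixed back-off timer:

```python
import threading

# Hypothetical sketch, not go-ipfs code: gate peer discovery on the
# bootstrap handshake completing, instead of racing it on a timer.
handshake_done = threading.Event()

def on_bootstrap_connected():
    # Called by the transport once the bootstrap handshake finishes.
    handshake_done.set()

def discover_peers(timeout=5.0):
    # Wait for the handshake event rather than sleeping 1-5 minutes.
    if handshake_done.wait(timeout):
        return "querying DHT via bootstrap peers"
    return "handshake pending, retry later"
```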


IPFS person here - there are a ton of improvements to the libp2p DHT in the works (many of which also relate to bootstrapping efficiency), but a key component of shipping these to an active 100K+ node network like IPFS is testing and validation, so it's taking us longer to ship these improvements while we build a testground able to simulate network conditions, to increase network _reliability_ as well as performance. IPFS is still actively WIP and directly focused on improving performance, reliability, and usability - so feedback and collaborators are welcome to help us address these issues head-on. =]


I like where IPFS is going and I think it's an important bit of software, so it's good to hear you're working on it.

I do have a question though on the immutability, how do you address problems where people contribute harmful content, such as revenge porn pictures, or doxing someone? If the content is immutable, it means these things would always exist and never be removable? Is there any moderation that can stop this kind of stuff?

Just wondering, I still like the idea, it's just a question I have been thinking about.


What IPFS means by 'immutability' is that the mapping from id-->content will never change. That doesn't mean that the content will always be available on the network and accessible to any viewer. If you remove the content from the network (aka all nodes stop hosting the content), then no one will be able to find and access it.

For example, if bad person A adds something illegal to a node they host on IPFS, then person B can legally compel them to stop providing that content (ex by suing them). While the id-->content mapping would still be immutable, the content can no longer be found/viewed because bad person A is no longer hosting it. There is no "automatic persistence" in the IPFS network, so only nodes that choose to host a piece of content will become providers (aka no automatic process will keep bad content around if all hosts have taken it down).
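A toy sketch of what that immutability means, using plain SHA-256 as a stand-in for a real CID (which also encodes the hash function and codec):

```python
import hashlib

def content_id(data: bytes) -> str:
    # Simplified stand-in for a real CID, which also encodes the
    # hash function and codec via multihash/multicodec.
    return hashlib.sha256(data).hexdigest()

# A node's store: content exists only on nodes that choose to host it.
store = {}

def add(data: bytes) -> str:
    cid = content_id(data)
    store[cid] = data
    return cid

cid = add(b"hello world")
assert content_id(b"hello world") == cid  # id -> content never changes
del store[cid]                            # ...but availability can
assert cid not in store
```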


This is actually where I feel IPFS serves only a trivial purpose. Any content can still be trivially taken down simply by going after anyone who's hosting it. Yes, there are legal (and moral) benefits here, but it no longer represents a technology to free information that is restricted for personal or political reasons.


Immutability on ipfs just means a given address will always point to the same content. It doesn't mean that content is or will always be available. Nodes who host harmful content are responsible for hosting it whether it's on a peer-to-peer network or a server/client network.


Unfortunately if you want an uncensorable network you won't be able to censor the nasty stuff either.


This is great to hear, I'd be extremely happy to see IPFS become fast and reliable, and I think it could be a hugely important piece of software.


Absolutely. In fact, I've been following those developments very closely. I hope to see (and possibly contribute to) some significant improvements to IPFS, especially the DHT.


That's awesome! See active work in progress on DHT bootstrapping here: https://github.com/libp2p/go-libp2p-kad-dht/issues/387


I've been working with Dat, which I think is similar to IPFS, and I'm really enjoying it. Being able to modify your site after you publish it is hugely useful (previous versions are always accessible - edits are stored in an append-only log). It has its own problems (editing from multiple computers is risky, but that's being worked on), but in my opinion that's essential for a decentralized web.


Been working with libp2p, IPLD, etc (IPFS stuff). It's improved a lot. Still rough around the edges, but definitely improved.

Fundamentally, the specs are good. These guys might not have the most mature development practices, but it's a completely different category than Tor. The projects are broken down in a very [theoretically] elegant way, and one project that truly exemplifies this is IPLD: https://github.com/ipld/specs

IPLD (among other protocol labs projects) is applicable to so much more than just IPFS!


From what I can tell, one of the main goals of IPFS is decentralization. Anyone can be hosting the file you want and it will download from any of the peers. Does OnionShare do this?

As you mentioned, I've also found IPFS to be fragile, particularly in that it is unlikely for stuff to stay up reliably, as ironic as that is. If I'm trying to host a website with not too much traffic (nobody else replicating), I have to keep my computer online or it will become inaccessible.


There's an annoying misconception that IPFS is "free storage" - it's not. If you care about something sticking around on the network, you need to take action to ensure that happens: either as an individual pinning it to a persistent node you run, as a developer paying a distributed pinning service to host your dapp data, or as a community creating a social agreement to peer your nodes and replicate each other's data. All these solutions are doable today, but you shouldn't expect others to host your content for free (if it isn't popular enough for other people to be proactively caching it). IPFS is the distributed network for addressing files; you have to bring, buy, or share your own storage - anything else (ex forced hosting of content you don't choose) would have problematic liabilities!
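As a rough sketch of that model (hypothetical names in Python, not the go-ipfs API): cached blocks are disposable, and only pinned blocks survive garbage collection:

```python
# Toy sketch of pin semantics (hypothetical names, not the go-ipfs
# API): a node's blockstore is a cache by default; only pinned
# blocks survive garbage collection.
class Node:
    def __init__(self):
        self.blocks = {}   # cid -> data, cached opportunistically
        self.pins = set()  # cids this node has committed to keep

    def fetch(self, cid, data):
        self.blocks[cid] = data  # caching is not a hosting promise

    def pin(self, cid):
        self.pins.add(cid)

    def gc(self):
        self.blocks = {c: d for c, d in self.blocks.items() if c in self.pins}

node = Node()
node.fetch("QmMeme", b"popular meme")
node.fetch("QmSite", b"my website")
node.pin("QmSite")
node.gc()
assert "QmSite" in node.blocks and "QmMeme" not in node.blocks
```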


Isn't that the point of Filecoin? A payment mechanism to incentivize node operators to host content.


> Does OnionShare do this?

IIUC, it's onion-routing request traffic to your computer, but your computer is still acting as the sole server of the website. Turning off your machine gives incoming requests nowhere to resolve to.


In practice that's the case for IPFS too, most of the time.


> it is unlikely for stuff to stay up reliably, as ironic as that is

Well, it's kinda inevitable with these systems, since there's limited available space. Freenet lets you "insert" data into the network (copy to other nodes) and disconnect your computer, but if nobody requests that data, it'll also eventually disappear from every node.
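A toy model of that behaviour, assuming a simple least-recently-requested eviction policy (real Freenet data stores are more involved):

```python
from collections import OrderedDict

# Rough model (not real Freenet code): each node's store is fixed-size
# and evicts the least-recently-requested item, so unrequested data
# eventually disappears from every node.
class NodeStore:
    def __init__(self, capacity=2):
        self.capacity = capacity
        self.items = OrderedDict()

    def insert(self, key, data):
        self.items[key] = data
        self.items.move_to_end(key)
        while len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used

    def request(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key)  # requests keep data alive
        return self.items[key]
```

Inserting new data into a full store pushes out whatever hasn't been requested recently, which is exactly why unpopular content fades away.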


Does IPFS even try to provide anonymity? Even if it would be technically "censorship-resistant", what good is it when everyone can see who hosts the content, track them down, and subpoena/arrest them?


No, anonymity isn't a goal of IPFS. If you add anonymity (for both consumers and publishers) to IPFS, you get the design of its older sibling: Freenet.


Why does it have trouble fetching files?


I'm not sure why, but if you have a node that has been running for a while and you add some files to it, some gateways will find them right away and others will keep timing out for many minutes. Discovery seems to be very hit-or-miss, and it's the most important component of IPFS...


Maybe OP is talking about IPNS? Which is unfortunately quite slow... I had to fuss with my TXT entries in DNS a bit to speed things up to an acceptable level (and reference the site via IPFS instead of IPNS)


Unfortunately even normal fetching is iffy. Content I add to a node isn't discovered by other nodes quickly.


Yeah that's IPNS as the GP suspected. It's definitely the hardest part of working on top of the IPFS ecosystem. Direct IPFS lookups are much, much better.

On the OpenBazaar project we're experimenting with a centralized IPNS record server that is used to provide a fast response while a real IPNS lookup is done in the background to ensure we're not censoring.

Because records are signed, the only thing we could do is censor newer IPNS records and serve old ones, but then nodes just revert to the slower-to-update records like they had before.
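Roughly, client-side record selection can be sketched like this (a symmetric HMAC stands in for the public-key signatures real IPNS records carry, and all names are illustrative):

```python
import hashlib
import hmac

# Sketch only: a shared-key HMAC stands in for the public-key
# signatures real IPNS records carry. KEY is hypothetical.
KEY = b"publisher-key"

def make_record(value, seq):
    payload = f"{seq}:{value}".encode()
    return {"value": value, "seq": seq,
            "sig": hmac.new(KEY, payload, hashlib.sha256).hexdigest()}

def valid(rec):
    payload = f"{rec['seq']}:{rec['value']}".encode()
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(rec["sig"], expected)

def resolve(records):
    # A cache can withhold newer records, but it can't forge them:
    # clients verify the signature and take the highest sequence number.
    good = [r for r in records if valid(r)]
    return max(good, key=lambda r: r["seq"])["value"]
```

The worst a misbehaving record server can do is hand back a stale-but-valid record; any forged or tampered record fails verification and is ignored.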


I'm not talking about IPNS, I'm talking about normal IPFS lookups. Hopefully this will improve (maybe has improved, I haven't tested lately), but it used to be bad, to the point where some gateways wouldn't discover content that already existed on other gateways.


IPFS has built-in denylists so it isn't very uncensorable.


Not sure where you got that info, but AFAIK that is incorrect. While everyone running their own infra has the responsibility to comply with local laws (or face local consequences) - there is no central network allow/denylist. Every single node has the power to decide what content they pin and make accessible to other nodes in the network, or how to consume content coming from other nodes (ex they might choose to filter out certain types of content). There is no central ipfs authority involved in any of that. If an individual node wanted to opt-in to a collaborative denylist (ex to block spam, etc), this is the most recent proposal for how that might work: https://github.com/ianjdarrow/content-decider (still in design phase)


https://github.com/ipfs/faq/issues/36#issuecomment-140567411

My understanding is that it is not possible for Tor nodes to block (or even know about) what you are accessing.


A Tor exit node can block destinations openly by declaring limitations in its exit policy (in principle by IP address or port, although I don't know if any exit policies use IP address this way) or by simply firewalling connections from itself to particular destinations. In the latter case, the node may be marked as a BadExit if a scanner detects that it intentionally blocked a connection that it offered to relay.
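As a toy illustration of such a policy (real tor ExitPolicy syntax is richer; rules are evaluated top-down, first match wins):

```python
# Toy model of a Tor-style exit policy (real tor ExitPolicy syntax is
# richer); rules are evaluated top-down and the first match wins.
POLICY = [
    ("reject", "*", 25),             # e.g. block SMTP to cut spam
    ("reject", "203.0.113.7", "*"),  # block one destination openly
    ("accept", "*", "*"),
]

def allows(ip, port):
    for action, rule_ip, rule_port in POLICY:
        if rule_ip in ("*", ip) and rule_port in ("*", port):
            return action == "accept"
    return False  # default to reject if nothing matches
```

Blocking via a declared policy like this is visible to clients, which route around it; silently firewalling a destination the policy claims to accept is what risks a BadExit flag.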

A node in an earlier position in a circuit can't know the ultimate destination of the circuit, or the content exchanged over it, although it could in theory blacklist individual nodes as a next hop (which doesn't typically really accomplish anything for content-blocking purposes, and could also be detected if it were a pervasive practice).

I think in HSv2 (the old hidden services protocol) there were a number of different opportunities for relays to selectively disrupt hidden service connections, while in HSv3 (the new revision) there are fewer.

A difference between Tor and IPFS in regard to what you're mentioning above is that in Tor, a specific entity (a specific clearnet or hidden service site operator) has to continue actively hosting a resource for it to remain accessible at its original location, while in IPFS, it's more that there need to be some participants who haven't actively declined to host it, but they don't have to be the same as the original publisher.


Unfortunately, I think projects like this that make it easy to create onion resources are quite dangerous. The reduction of friction to host material online in a completely anonymous way only enables adverse actions like the sharing of CP or other illegal content. Please look here for a good explanation of friction in this space -> https://stratechery.com/2019/child-sexual-abuse-material-onl...

If it is used by political or marginalized communities, then this project is also not useful. It does not have the support or attention required to enable it to be secure enough like the TOR project does. This application opens up legitimate users to incredible risk.

If I were a dissident, I would take my time and do the research required to set up a system like this in the correct way. This project may give people a sense of false security.


While IPFS has been used successfully for censorship resistance (ex the Catalonian independence movement: http://la3.org/~kilburn/blog/catalan-government-bypass-ipfs/), I completely agree that anonymity is not something that IPFS claims to support right now - and users should be very careful in how they use the network for censorship resistance. While there have been some experiments in this space (ex OpenBazaar's onion/tor transport for IPFS: https://github.com/OpenBazaar/go-onion-transport), none of them have been audited for reader/writer privacy.


Ah, yes. The good old "don't make that tool because bad people might use it" argument.

Similarly, we should ban sharing information on hacking/security because bad people might use it, and require that all communications be visible to the government to prevent abuse.


That is not the argument being made. It is an argument that having friction in the process to do something that may have dangerous outcomes is sometimes positive.

It is much the same as arguments that there should be gun training required to purchase a firearm. Adding friction to a process sometimes provides positive benefits.


> That is not the argument being made. It is an argument that having friction in the process to do something that may have dangerous outcomes is sometimes positive.

The argument being made, as far as I can tell, is that this tool will make doing bad things easier, and thus this tool should not be made. As any tool to resist censorship will also enable said bad things, I do not find this to be a valid concern. (And since some governments would list censorship resistance itself as one of those bad things, this is always going to be true)

> It is much the same as arguments that there should be gun training required to purchase a firearm.

I think the tech equivalent to this would be mandating a "don't sexually exploit children" class before allowing purchase of a computer. (As you might guess, I'm not a fan)

> Adding friction to a process sometimes provides positive benefits.

Sure, but you have to make sure the collateral damage is minimized AND that you're actually catching the offending segment of the population. Mandating gun training before purchase is actually a great comparison here - the vast majority of gun crime in the US is committed with illegally acquired weapons (which would be unaffected by the mandate), just like I imagine that the majority of CSE imagery shared on the internet uses something more robust than OnionShare. So in both cases you'd have high collateral damage with minimal impact to the population you actually care about affecting...


> I imagine that the majority of CSE imagery shared on the internet uses something more robust than OnionShare

The majority is shared on normal social media and Bittorrent, although the worst is shared on dark web sites of the same robustness as OnionShare.


> and Bittorrent

What? No, far from it. Bittorrent is pretty clean regarding illegal material (excluding piracy ofc) - which makes sense; after all, you leak your IP address and the torrent that you are downloading to the whole network (if you have DHT enabled) and/or to your tracker, plus your ISP can see what you are sending and receiving (while there is a standard for encrypted transmission in Bittorrent, I do not think it is widely used). What do you consider "CSE" anyway? Would a picture of a girl at the beach be considered "CSE"? If so you might find such torrents (I wouldn't know), but I think that calling it "CSE" is dishonest.


I'm a detective who works exclusively on online child abuse; I regularly arrest people who have downloaded and/or distributed IIOC over Bittorrent.

The definition of IIOC is provided by the Home Office and split into three categories. Ultimately it is decided by a jury although the categorisation is rarely contested.


> > I imagine that the majority of CSE imagery shared on the internet uses something more robust than OnionShare

> The majority is shared on normal social media and Bittorrent, although the worst is shared on dark web sites of the same robustness as OnionShare.

By what measure?


Number of files shared


Number of files shared that you know about, which I think is a pretty important distinction. Bit harder to track sharing over Tor, after all. (Which is the point!)


I meant the distinction between majority and worst. What is "worst" in this context? And by what measure?


The argument for friction was lost the moment we got cheap access to home recordings, which at this point is any mobile phone that was produced in the last 20 years.

From all my history in computer security, conferences, and just picking up trends in what criminals do, my educated guess is that the most common channel for sharing CP is plain-text email. A smartphone, a camera app, and the email app that is already installed are the tools of choice for the vast majority of cases. The remaining portion are people who do not take the most frictionless method, and here I doubt OnionShare will cause any change in the availability of CP.

A bit of a tell is that during the crypto wars there were officials who predicted there would be an explosion of CP if free encryption was allowed. CP and nuclear proliferation were the standing bet between Eben Moglen and Phil Zimmermann on what the opposing side would bring up first, and I think it's fair to say that neither issue materialized after Phil Zimmermann won.

Taking a perspective from behavioral science, CP production and sharing is unlikely to be a rational decision where risk and reward are fairly balanced. As an example, I would predict that increased jail sentences do not actually reduce the crime rate, nor would turning a very-hard-to-use tool into an only-slightly-hard-to-use one. Effective measures would have to either address the emotional state of the person right before the crime, or implement catching mechanisms in the lowest-friction tools like email and email replacements like chat.


> like the sharing of CP

A victimless crime.

> or other illegal content

Such as for a chinese citizen to post criticism of the party.

> The reduction of friction to host material online in a completely anonymous way only enables adverse actions like the sharing of CP or other illegal content

More like "only enables the less tech-savvy and saves time for the more tech-literate" - I do not see how it correlates with the sharing of illegal material.

> It does not have the support or attention required to enable it to be secure enough like the TOR project does

It is basically a front-end for Tor, so it inherits Tor's security properties. Moreover, just because a project is made by one person does not mean it is any less secure. Consider libsodium/nacl/monocypher vs most other crypto libraries, for example. Also, by the same logic it would be of no use to those who share CP either, because "It does not have the support or attention required to enable it to be secure enough like the TOR project does" - it's not like people who share CP need any less security than a dissident in China.

> legitimate users

This implies that those who share illegal content are not legitimate users. Please be more clear with your terminology; usually illegitimate users are people such as spammers and attackers.


> > like the sharing of CP

> A victimless crime.

Most certainly not. A lot of victims that were exploited in the production of child porn have to live with the knowledge that, for the rest of their lives, that material will be out there, viewed by every dirty slimy degenerate wanting to do so. It's also one of the reasons a lot of them commit suicide. The fact that the justice system makes even viewing this material illegal is not for nothing, but comes from a proper understanding of how child porn victimizes people.


That isn't quite the case, it is more so due to negative attention in the media from child killers and the like. Most of it actually tends to be thirty year old magazines, nudes and things the minors (perhaps foolishly) upload themselves.

This may also seem morally wrong, so to speak, but the consequences for such are exactly the same as if you were to watch someone getting whipped and violently raped. There isn't really an incentive to go for anything "less bad", especially considering it is an innate attraction you can never get rid of.

In some states, the sentences for outright molesting dozens of kids is lighter than looking at some images or videos on your computer. It is ludicrous and it is almost as if the state cares more about "Out of sight, out of mind" than anything else.

They're even throwing people in prison for looking at cartoons or having the wrong books.

The question at the end of the day is... What exactly are people supposed to do then? Commit suicide? Perhaps genocide 0.1% to 1% of the population who have these "wrong thoughts"? The paranoia has gotten so bad that some viruses even plant CP on your computer.


> A lot of victims that were exploited in the production of child porn

Good thing, sharing existing CP is not the same as exploiting people to create CP.

> by every dirty slimy degenerate wanting to do so

You are aware that you managed to insult even the people who want to but refuse to watch CP for moral reasons, right? You are basically calling a group of people with a certain attraction that they themselves can't change as "dirty slimy degenerate"s.

> It's also one of the reasons a lot of them commit suicide

I honestly doubt that, but if we assume that was indeed the case, shouldn't the authorities simply not inform them that their pictures/videos are being distributed online? Even then though, I still can't see this as a reason to illegalise it. Should we also make rejections(from jobs/relationships/etc) illegal because some of the people who receive said rejection commit suicide?

Also, I find it dishonest of you not to consider the people who have been unjustly jailed if not for distributing at least for possession of CP, some of which have committed suicide.

> The fact that the justice system makes even viewing this material illegal is not for nothing, but comes from a proper understanding of how child porn victimizes people.

Just like how the justice system in most countries until recently illegalised homosexuality and sex before marriage, right? There are many unjust laws, especially in the area of personal freedoms/victimless crimes (growing weed, owning cocaine, sending a nude picture of yourself from when you were underaged, etc)/copyright (piracy/unauthorised modification/breaking drm).


> > A lot of victims that were exploited in the production of child porn

> Good thing, sharing existing CP is not the same as exploiting people to create CP.

No it isn't the same thing. Sharing child porn is victimizing people though, it is victimizing the people that are abused in that material.

> > by every dirty slimy degenerate wanting to do so

> You are aware that you managed to insult even the people who want to but refuse to watch CP for moral reasons, right? You are basically calling a group of people with a certain attraction that they themselves can't change as "dirty slimy degenerate"s.

I didn't mean to insult those people. I like however how you've turned this around from the victims of child porn on to the victims of a comment on HN. And it's absolutely not a fact that people that suffer from an attraction to children can't change that. Some cases seem induced by excessive porn viewing and revert when those people stop doing so for a couple of months. Though there are also people who seem to be incurable.

> > It's also one of the reasons a lot of them commit suicide

> I honestly doubt that, but if we assume that was indeed the case, shouldn't the authorities simply not inform them that their pictures/videos are being distributed online?

This idea is laughable. Do you honestly believe just not telling the victims the material is out there will convince them the material isn't out there?

> Even then though, I still can't see this as a reason to illegalise it. Should we also make rejections(from jobs/relationships/etc) illegal because some of the people who receive said rejection commit suicide?

This is a straw man; I didn't cite the suicides as a reason to make child porn illegal. It is part of the picture, though.

> Also, I find it dishonest of you not to consider the people who have been unjustly jailed if not for distributing at least for possession of CP, some of which have committed suicide.

Oh yeah, how dishonest of me not to consider people punished for possessing child porn. Do you hear yourself?

> > The fact that the justice system makes even viewing this material illegal is not for nothing, but comes from a proper understanding of how child porn victimizes people.

> Just like how the justice system in most countries until recently illegalised homosexuality and sex before marriage, right?

No, this is absolutely not the same thing. Straw man again. Practicing homosexuality doesn't victimize people. Sex before marriage doesn't victimize people. Watching child porn does victimize people, namely the ones being exploited.


You do realize that the FBI and similar state agencies have shared CP online to catch people? So, did the FBI victimize children, and should it face appropriate punishment (the same as other people who share it)?

And about your last point: the FBI and other people who investigate it absolutely do watch CP. Again - who is the victim in this specific case? Are children somehow not victimized when "good" people watch CP?


It is worse than that. In one case, the Australian police kept a site up for an entire year (the site is said to have had as many as a million accounts, although account numbers often don't map precisely to actual people) to try to collect evidence on people.

A lot of these sites only tend to stay up for a year or two before they get shut down or their operators get skittish, so keeping one up for a year is very significant and aids in the proliferation of this content.

They also deliberately circulated quite a few relatively uncirculated images to try to gain users' trust after they took over the site.


> This idea is laughable. Do you honestly believe just not telling the victims the material is out there will convince them the material isn't out there?

I'm not really agreeing or disagreeing with your viewpoints, but your logic here seems weird. In that particular case, nothing would ever convince them that people aren't looking at it, even if they weren't.

You can never really tell if someone is or isn't doing something on the internet, especially with the proliferation of strong encryption.



