Hacker News

That’s literally what git is designed to do from the outset. That’s why it’s called a decentralised version control system.

I’m sure this project brings something new to the table but they’d need a better elevator pitch than “peer-to-peer” to explain that.




But git is not user-friendly. You have to figure out how to push to people directly, and that isn't obvious to a commoner like me. It would be great to have a nice front-end that makes the experience seamless, like Transmission does for BitTorrent.


Git works over email; it works across folders (including remote ones); and it can read/write several standalone file formats (patches, bundles, etc.) that can be sent by any means, carrier pigeon included. As far as I'm aware, there are no technical problems with "push[ing] to people directly" (or likewise with letting others pull from you).

There is a technical problem in the P2P setting, where it's not just one individual connecting to another. P2P protocols that just replicate data, like BitTorrent, can't negotiate a delta when pushing/pulling, so users have to keep transferring the entire repo. Radicle's network is smarter, allowing deltas to be calculated.

They also seem to be proposing many other things, which I'm more skeptical of; but at least that point seems valid.


Bittorrent, can't negotiate a delta

What do you mean by that? As long as you don't recompress the git storage, BitTorrent will only transfer the missing blobs. The only problem is that a new torrent hash needs to be created for each change; but you can pre-populate a new torrent with the git repo you already have, and BitTorrent will then only transfer the missing blobs, just like git does.
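A toy sketch of the idea (this is content-addressed dedup in general, not the actual BitTorrent wire protocol; the piece size and data are made up): both snapshots are keyed by piece hash, and only pieces absent from the old snapshot need to cross the wire.

```python
import hashlib

def pieces(data: bytes, size: int = 4):
    """Split data into fixed-size pieces, keyed by hash (as a torrent does)."""
    return {hashlib.sha1(data[i:i + size]).hexdigest(): data[i:i + size]
            for i in range(0, len(data), size)}

old = pieces(b"blob-A, blob-B")          # what the peer already has
new = pieces(b"blob-A, blob-B, blob-C")  # the updated repo snapshot

# Only pieces whose hashes are absent from the old snapshot get transferred:
missing = {h: p for h, p in new.items() if h not in old}
```

Note that this only works well if piece boundaries line up between snapshots; shifting content by a byte changes every downstream hash, which is one reason naive piece-based sync is fragile for evolving repos.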


Yes, only the missing file data needs to be sent. Working out what's missing is the problem, since protocols that aren't aware of git's structure (blobs, trees, commits, etc.) cannot exploit that knowledge to e.g. walk up from each ref and stop when we hit existing ancestors. Instead, they're stuck comparing the file contents of two entire copies of a repo (via their Merkle trees). That's probably not a big deal for smaller projects (e.g. I've played with hosting my own git projects on IPFS), but it's a lot of overhead for projects like the Linux kernel with massive histories, lots of refs, many developers frequently pushing and pulling many changes, etc.
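The git-aware approach described above can be sketched as a simple ancestry walk (the commit DAG and ids here are invented for illustration, and real git negotiation is more involved than this): start from each ref the sender wants to push and stop descending as soon as a commit the receiver already has is reached, so only the new part of history is collected.

```python
# Toy commit DAG: commit id -> list of parent commit ids (made-up history).
parents = {
    "c4": ["c3"], "c3": ["c2"], "c2": ["c1"], "c1": [],
    "f2": ["f1"], "f1": ["c2"],  # a feature branch forked from c2
}

def missing_commits(refs, they_have):
    """Commits reachable from refs that the peer doesn't already have."""
    to_send, stack = set(), list(refs)
    while stack:
        c = stack.pop()
        if c in they_have or c in to_send:
            continue  # common ancestor (or already visited): stop walking
        to_send.add(c)
        stack.extend(parents[c])
    return to_send

# The peer has history up to c2, so only the commits past it are sent:
print(sorted(missing_commits(["c4", "f2"], {"c1", "c2"})))
# → ['c3', 'c4', 'f1', 'f2']
```

The point is that the walk never touches c1 or c2's subtree at all, whereas a structure-unaware protocol would have to compare hashes over both entire copies of the repo.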


I agree. However, there are already plenty of front-ends to git, and this isn't marketed as one of them. Perhaps that's intentional, given my earlier point that there are lots of front-ends already. But focusing on the P2P aspect doesn't do a better job of explaining what this project does.



