Transmission 4.0.0 beta 1 (github.com/transmission)
527 points by Torlan on Oct 7, 2022 | 254 comments



Good to see it in active development. But the one feature I'm waiting for is a network whitelist. For example, it should never connect over my home or office network, only via an anonymous VPN.

(Edit: I know you can do this already with firewall rules, or containers, but I don't trust myself to not make a mistake. Ideally this should be part of the OS. You should be able to right-click somewhere in the window and say "use this network for this app".)

The other feature I'd like to see is sequential downloads, but the developers object on principle, so that is never going to happen.


This is a plug-and-play solution bundling Transmission and the most common VPN configurations in a Docker container, ensuring that all torrent traffic goes via an active VPN connection. After configuring it, you just run "docker-compose up" and can then access Transmission via its web UI. It could be nice to have this built into the client, but using a container feels safer.

https://github.com/haugene/docker-transmission-openvpn
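For reference, running it looks roughly like this (all values are placeholders; the environment variable names follow the project's README at the time of writing and may change):

    docker run -d --cap-add=NET_ADMIN \
        -v /your/storage/path/:/data \
        -e OPENVPN_PROVIDER=PIA \
        -e OPENVPN_USERNAME=user \
        -e OPENVPN_PASSWORD=pass \
        -e LOCAL_NETWORK=192.168.0.0/16 \
        -p 9091:9091 \
        haugene/transmission-openvpn

After that, the Transmission web UI is reachable on http://localhost:9091.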


This is a great container! Been using this for years. (I've contributed!)

The only downside is that @haugene has gone missing for some time now, so there are no firm releases. If you want new features/bug fixes, you have to pull dev, which isn't ideal. Otherwise the maintainers are doing a great job (shout out to @pkishino!)


For me, the real power of containers is that all the mistakes I could've made during the install are in ~a single file instead of a long forensic goose chase


I’ve been using this for a while now, and it has been working like a charm!


I’ve got a home lab with proxmox as the hypervisor. I’m running openwrt (x86) for my household router (in a qemu VM) and I’ve got transmission running in an LXC container. When the transmission container boots up, it gets its IP address via DHCP (server in openwrt). I configured DHCP to allocate the same IP address to the container every time. You could also just use a static IP config in the container.

Within openwrt, I’ve installed WireGuard and OpenVPN for accessing different vpns.

Finally, again in openwrt, I installed the policy based routing package[1]. This package makes it super easy to route all traffic from the transmission container to the VPN network.

There are probably many ways you could do this, but this setup is great for me!

1: https://docs.openwrt.melmac.net/pbr/
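For illustration, a minimal sketch of what such a policy looks like in /etc/config/pbr (the address and interface name are placeholders; check the pbr docs for the exact option names):

    config policy
        option name 'transmission-via-vpn'
        option src_addr '192.168.1.50'
        option interface 'wg0'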


Sounds like a lot of fun, and a huge time sink.


Tldr: use OpenWRT to create a static IP reservation for a dedicated torrent instance, then use source-based routing to send traffic over a VPN running on the router.

Sounds pretty awesome. I've not dabbled with PBR on home equipment. We did use it at a previous job to force all PCI traffic over a dedicated physical network between datacenters.


Agreed. Users shouldn't have to understand firewall rules, containers, VMs, or even VPNs in order to tunnel traffic for a specific app through a (more) trusted third party.

We need an open protocol for establishing tunnel/VPN connections. Apps implement the protocol, which lets you enter your tunnel provider and go through a quick OAuth flow to establish a tunnel. This would be a big win for VPN providers as well because people would be sending only specific traffic through them instead of everything.


users that don’t understand those things are doomed to make the wrong trust choices. vpns are decidedly not more trusted. differently trusted, sure.


Users should understand how to use their computer. Asking users to comprehend a firewall or a vpn (if using a vpn) is not unreasonable.

Coddling users is harmful.


Circa 1989 I remember people complaining that the NeXT computer (whose operating system eventually turned into the current macOS) was needlessly coddling users, and that anybody who wanted the power of Unix needed to know how to recompile their kernel. I also remember people objecting to MS Windows because people really should be using MS-DOS so that they understand what's going on.

The main point of computers is to make things easier for people. Most of us make our livings because computers are very good for that. If most users really had to deeply understand the technologies involved in solving their needs, a lot fewer of us would have jobs and much more time would be wasted by people who were just trying to get something done. So yes, let's absolutely coddle users.


The main point of computers is not to make things easier for people; that mindset leads us to cede control of our lives to technology.

The point of computing is to be a tool to make people more powerful and efficient, an extension of our abilities, able to do more or better.


Yes, I think we agree. That would fit within my definition of "things" in "make things easier". They can only do more/better with computers if the software is built to support them in that. If instead we insist that people focus on understanding the computer rather than doing the things they want to do, then they will do less and/or worse.


I would be fine with users doing less with their computers, if they in turn understood how they worked and why we all get angry here about things like: politicians likening cryptography to criminal activity, companies hoovering your data from every site you visit, etc.


I see. In what areas are you prepared to make the symmetrical deal? That you'll do less and get less while investing more time in understanding areas other people are less expert in?


I’d cook more if we had less shitty pre-made food in stores and less fast food. I could give several other examples, but that's just the one I thought of.


> The point of computing is to be a tool…

The point of a tool is to make things easier for people.


There are millions of job openings for coders now. Illiteracy was not a good idea.

Also, YouTube just served me an un-skippable 12-minute ad 30 seconds into a video. It was the 3rd ad. The thought police are all over the web now. My Windows can't download photos from my iPhone. My Windows reboots whenever it feels like it. The glued-in phone battery's life is garbage after 2 years. I must have purchased 30 chargers by now. My new laptop requires a Microsoft account. Google search doesn't work anymore. AdSense is driving up prices.

It's like trying to get nutrition from McDonald's. The drive-through is convenient, though. Very easy for users.


This isn't about coddling users, it's about not reinventing the wheel.

Sure you could spend hours reading pf man pages. But maybe it would be better use of time if one person figures out how to do it, writes code to do it automatically, and shares that with other people so they can spend more time watching all the movies they download instead of configuring their bit torrent client.


Making secure practices simple (or automatic) for less-technical users is not “coddling”.


"Hey why are my torrent downloads running super slow?"

"Oh, well, your torrent client decided to automatically use your vpn because it saw one running and somebody decided that was the right behavior because they thought you were dumb."

Users have preferences. Users can read. Let them learn and sort it out. Don't ruin a simple program with baked-in assumptions about what is correct for users. Prefer simplicity.

Also don't assume using a vpn is more secure. That is a wrong assumption.


I don't have a horse in this race, but I think you're presuming simplicity where there is none.

A bittorrent client, even in CLI form, is not simple software at all. Have you ever taken a look at the myriad of available settings?

Most GUI clients have a sort of "wizard" that helps you pick acceptable connection parameters. This feature alone has a truckload of assumptions coded in. And that kind of feature exists for a reason: no-nonsense users. They're the majority. Just let me download my file, bro.


It also makes sense for the developer to choose parameters. They are probably the person with the most detailed knowledge of the problem, and therefore the person who knows best what the parameters should be.

It boggles my mind when developers just expect users to tweak low-level settings. Sure, it's good if stuff is configurable, but it should work correctly out of the box. Like, you spent days or weeks working on this feature; don't expect your users to do the same...


Yes, I agree that your assumptions about how you think I think this should work are bad. :^)

But to be more specific with my feedback, it seems a bit silly that Transmission (I'm trying 4.0.0b1 right now) doesn't allow me to specify that it must use "VPN X" (as opposed to "VPN Y", which is also available) for downloads.

I understand that I could mess with namespaces or containers or VMs and probably make that work, but "use Bittorrent client with VPN" is more or less a universal use case and should be straightforward.


> Also don't assume using a vpn is more secure. That is a wrong assumption.

I agree with this statement, but what should a user ‘read’ in order to understand this?


Under Linux you can run transmission as another user and have an iptables rule that only allows that user's outbound traffic through a specific interface.
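A minimal sketch, assuming a dedicated 'transmission' user and a VPN on tun0 (both names are placeholders):

    # reject anything from that user not leaving via the VPN interface
    iptables -A OUTPUT -m owner --uid-owner transmission ! -o tun0 -j REJECT
    # then run the daemon as that user
    sudo -u transmission transmission-daemon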


you don't even need the extra user: use a network namespace.

- ip netns add vpnonly # create an empty namespace

- ip netns exec vpnonly wg-quick up ... # connect to your VPN

later, launch transmission inside this namespace:

- ip netns exec vpnonly transmission

has the nice property that as long as you do that exec step right (or even half right), the failure mode is no connectivity rather than accidentally sending traffic in the clear.
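For reference, the variant documented on wireguard.com creates the interface in the default namespace and then moves it in, so the tunnel's encrypted UDP socket keeps its connectivity while the namespace itself only ever sees wg0 (names and the 10.0.0.2 address below are placeholders; the config file must be plain wg format, without wg-quick-specific keys):

    ip netns add vpnonly
    ip link add wg0 type wireguard
    wg setconf wg0 /etc/wireguard/vpn.conf
    ip link set wg0 netns vpnonly
    ip -n vpnonly addr add 10.0.0.2/32 dev wg0
    ip -n vpnonly link set lo up
    ip -n vpnonly link set wg0 up
    ip -n vpnonly route add default dev wg0
    ip netns exec vpnonly transmission-gtk

Same fail-closed property: if any step is missed, the namespace simply has no route out.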


Or just a uidrange-based policy for a separate routing table that only routes to the VPN. Just one thing to set up.
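A minimal sketch, assuming the torrent user is UID 1001 and the VPN is wg0 (both placeholders):

    ip rule add uidrange 1001-1001 lookup 100   # match traffic by owning UID (kernel >= 4.10)
    ip route add default dev wg0 table 100      # table 100's only way out is the VPN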


Or as your user in a cgroup, probably.


It already has the ability to bind to a specific IP address. I use an OpenVPN up script to stop any running instance, replace the bind address with the current VPN interface address, and start a new instance.
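A sketch of what such an up script might look like, assuming a systemd-managed daemon (the settings.json path varies by distro, and OpenVPN exports the tunnel's local address to up scripts as $ifconfig_local):

    #!/bin/sh
    # hypothetical script referenced from the OpenVPN config as: up /etc/openvpn/rebind.sh
    systemctl stop transmission-daemon    # Transmission rewrites settings.json on exit
    sed -i "s/\"bind-address-ipv4\".*/\"bind-address-ipv4\": \"$ifconfig_local\",/" \
        /etc/transmission-daemon/settings.json
    systemctl start transmission-daemon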


I have to agree with them, it’s antithetical to what torrent is about.


Depends on what torrenting is about. For me, it is about watching shows that I can't officially watch where I live yet. Once a week, I hit up a site, click a magnet link, and a couple of minutes later I can watch my favorite show. (The cost of the VPN is incidentally about the same as a streaming subscription. I think for this reason it is tolerated by the powers that be.)

I don't expect torrents to stay available for long. Only very few people keep the torrent entry in their client alive for long (even though I, for one, keep all the files around). This is as opposed to eMule and older systems, where you can download any file in a user's shared media folder.


>(The cost of the VPN is incidentally about the same as a streaming subscription. I think for this reason it is tolerated by the powers that be.)

I really don't follow you there


I believe the previous commenter meant that sequential downloads are orthogonal to the design of Bit Torrent.


Superseeding (and whatever you would call it from the client point of view) allows for peer-to-peer broadcasting.


Why? It doesn't need to prevent concurrent uploading.


I'm using namespaced-openvpn to restrict transmission to one network, and you could limit the number of concurrently active torrents to one, effectively downloading them sequentially. Or do you want each file itself downloaded sequentially?


I would hazard they are asking for block-by-block sequential downloads from the start of the file to the end, so they can start watching a movie while it is still downloading and be assured that the first parts of the movie are present without gaps...

This goes against the torrent protocol's general design, in which peers swap whatever pieces they have without reference to file position, to maximise transmission within the swarm.


I run my torrent client in a container and have that container only go out via the VPN. Works very well without screwing with computer-wide network settings.


Set the http_proxy environment variable to a SOCKS proxy that goes through the VPN. Some VPN clients offer a built-in SOCKS proxy. That way, if the VPN is not connected, the proxy address is dead.


Not ragging on Transmission here, but qBittorrent has zero problems getting out of my network, and I didn't open any ports or use UPnP either; give it a try. Transmission just has issues on my network that I don't care to figure out or open ports for; qBittorrent is quite good at navigating through NAT.


Lately I've noticed that public trackers are behind Cloudflare. If you're on a VPN, you have a high likelihood of getting the Cloudflare captcha, and your BitTorrent client will be unable to scrape those trackers, bootstrap DHT, etc. because of the Cloudflare block.


If you don’t trust yourself to not make a mistake with firewall config, why do you trust yourself not to make one when setting up an allow list? One wrong digit or dot and you’ll be in the same place.

Start with a default policy of deny and work from there; if you're only creating rules that allow traffic explicitly, there's less room for error.
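A rough sketch of that shape in iptables (tun0 and port 1194 are placeholders for your VPN interface and the VPN's own control connection):

    iptables -P OUTPUT DROP                            # default: deny all outbound
    iptables -A OUTPUT -o lo -j ACCEPT                 # loopback
    iptables -A OUTPUT -p udp --dport 1194 -j ACCEPT   # let the VPN itself connect out
    iptables -A OUTPUT -o tun0 -j ACCEPT               # everything else only via the VPN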


Alternatively, qBittorrent has this built in.


bitport.io: cloud torrenting and done (after downloading at high speed from their site), for me at least.


I'm really excited to see support for torrents v2, if only because Transmission will stop choking on hybrid torrent files, which should lead to more adoption across the board.

I wish transmission supported mutable torrents (BEP46). It would allow completely re-thinking distribution: grab a torrent of your favorite distribution, and have it auto-update to the latest version.

Or grab a "super-torrent" that lists ISOs for all known distributions, select the ones you're interested in, and let it update... Of course, that can be used for file-sharing repositories too.

Maybe transmission is not interactive enough for such a use-case (you'd have to ask the user what they want to do when the torrent mutates, and possibly add simple scripting interfaces for more complex heuristics).
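For reference, BEP46 identifies a mutable torrent by an ed25519 public key instead of an infohash; the magnet form defined in the BEP is:

    magnet:?xs=urn:btpk:[ 64-hex-digit public key ]&s=[ optional salt ]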


> I wish transmission supported mutable torrents (BEP46). It would allow completely re-thinking distribution: grab a torrent of your favorite distribution, and have it auto-update to the latest version.

That sounds horrible. My own expectation is that if I have a torrent file/magnet link and I download from it, I always get the same content. This is why I commonly keep the magnet link/torrent file around even if I delete the content itself: if I want the same thing in the future, I just re-download it and I'm 100% sure it's exactly the same.

This change would break that expectation, and for what benefit? This could be implemented as an HTTP call instead that serves different torrent files, so you can just change whatever serves the torrent file in order to mutate what gets downloaded.


> This change would break that expectation, and for what benefit? This could be implemented as an HTTP call instead

The main benefit is not having to rely on a centralized HTTP server and repository, as well as the associated infrastructure: DNS, TLS, CA, etc. One could host a blog over torrents (more or less what IPFS is doing).

We already have RSS feeds for what you describe (though not supported natively by transmission).

> My own expectation is that if I have a torrent file/magnet link and I download from it, I always get the same content.

That wouldn't change significantly. The mutable torrent content is basically a .torrent that points to whatever the current version is; so just save that instead.

The good thing with mutable torrents (or mutable content in the DHT, as pointed out in another comment) is that instead of a content-addressing hash, a public key is stored in the DHT. Once you've got a torrent ("trust on first use" of sorts), you can verify that the new version is from the same uploader (or at least, one that knows the private key).

I would present it with a new set of checkboxes in the torrent client: "download new versions", then if checked "delete previous version after downloading updated content", and for more granularity "download new files" and "download updates to existing files". Of course, for ease of use it would be better to have a historical view. I think the main use cases would be to archive every intermediate version, or to just grab whatever is latest; there are infinite variations in between.

One thing I would like to stress, and that I didn't point out enough in my original comment, is that this goes extremely well with v2 torrents and their de-duplication feature: if the same file exists in multiple versions of the torrent, peers with any version can contribute to the swarm.

It's not as good for deduplication as rolling hashes could be, but you could host whole distribution repositories with mutable v2 torrents.


It doesn't unless your magnet does not contain an infohash and only contains the btpk.


First it needs to support BEP44 "Mutable DHT", which is arguably a much more useful feature than mutable torrents. It is also a much simpler feature.

http://www.bittorrent.org/beps/bep_0044.html


I agree that's a dependency, but I am not sure how much more useful it would be comparatively?


Pretty sure it gets you most of the way there by simply putting a mutable magnet URI in DHT. The rest is just a clever way of reusing already downloaded blocks.


This is a cool feature that I've idly wondered about in the past, but I didn't know it was actually spec'd/implemented in Bittorrent.

You could imagine hierarchically pushing the entirety of TPB or whatever into the DHT. And even each torrent having an `index.html` that renders a page containing links to the rest of the sub-torrents, so that the entire site is hosted in the DHT.


Right, and we saw some examples of SQLite+torrent in the past; those databases could be made more easily editable (although it would be nice to de-duplicate unchanged parts; not sure if SQLite DBs can be chunked).

Not necessarily sqlite, that said. It might be useful for openstreetmap tiles/db, or indeed repositories like library genesis, wikidata/wikipedia dumps, etc.


Oh that’s an interesting idea. I saw someone build SQLite over HTTP with the Range header: https://phiresky.github.io/blog/2021/hosting-sqlite-database...

So presumably in a similar manner as https://github.com/phiresky/sql.js-httpvfs you could map SQLite pages to leaf torrents too and get the chunking you are looking for.


When I first came to Linux I adopted Transmission. It seemed less featureful than others, but now I like its simplicity; it does everything it needs to and is also very reliable.

It is not widely known, but there actually is a transmission-cli for common torrent operations that may come in handy some time.


FYI, transmission-cli is considered deprecated by the Transmission developers. Using transmission-daemon together with transmission-remote is the recommended approach. Actually, transmission-cli uses transmission-daemon internally, and they share the same configuration. transmission-cli is still useful for one-off downloads, but I prefer the recommended way because I'd like to keep seeding older torrents for a while longer.
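The recommended flow looks like this (the RPC defaults to 127.0.0.1:9091; the torrent filename is a placeholder):

    transmission-daemon                      # start the daemon in the background
    transmission-remote -a example.torrent   # add a torrent
    transmission-remote -l                   # list torrents and their progress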


> transmission-cli is considered deprecated

Wah! Bad news! I like to kick off a download from an SSH shell. Now I have to find out what transmission-remote is.


I was greatly disappointed too when I found out about transmission-cli's deprecation, but I have since found and replaced all uses of it with stig[1] which, for me, does the same thing and more.

[1] https://github.com/rndusr/stig


You can still do that

   $ transmission-remote -a one.torrent two.torrent


You could also check out rtorrent: http://rakshasa.github.io/rtorrent/, but I remember reading that development has stalled on that project.


aria2 can download a torrent from a magnet link given as a CLI parameter.
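e.g. (the infohash is a placeholder):

    aria2c 'magnet:?xt=urn:btih:<infohash>'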


I'm not a user of transmission-cli in general, but transmission-create (part of transmission-cli) is indispensable for me.
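For the unfamiliar, transmission-create builds .torrent files; a minimal sketch with placeholder names:

    transmission-create -o image.torrent \
        -t udp://tracker.example.com/announce \
        disk-image.iso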


There's also the Transmission Remote GUI, which is a nice GUI for remotely controlling the Transmission daemon, on say, a NAS: https://github.com/transmission-remote-gui/transgui

It's great for keeping downloads going overnight without having to leave all the power-hungry devices powered on.


FWIW, qbittorrent also has a built-in web interface. It's also one of the most feature-rich BitTorrent clients.

https://github.com/qbittorrent/qBittorrent


And you can install it with just the web interface as qbittorrent-nox.


It's a great piece of software. I'm running a web remote v2.92 on an rpi with simple basic auth protection, and it has been working flawlessly for years now.


Back in my day, I was running Transmission on a WRT350N router…

It was quite an ordeal to set up before there were DD-WRT builds with it integrated.


Ran it on a NSLU2 in the past. Worked great. Brilliant piece of software.


The cli is also used by Fragments, a Gnome torrent app.


15.7 MB binary on macOS, in an age where it's usual to deliver a 400 MB Electron package. I kneel.


I wonder how much stuff it's including there. I use the Alpine package for Transmission and it's 1 MB: https://pkgs.alpinelinux.org/package/edge/community/x86/tran...


That 1 MB is only the daemon, not taking into account the dependencies. Although I'm pretty sure the Linux version is more lightweight in general, because the GTK dependency is very likely already installed.


Let me stay upright. 15 MB seems like a lot. What's in there? High-def country flags for Retina?

edit: ok, size is ~4.5 MB on my Debian, of which transmission-gtk accounts for 3.6. Maybe the graphics libs are bigger on Mac?


3.8 MB is an auto-update framework – https://sparkle-project.org

3.5 MB is for a QuickLook plugin, so that torrent files can be previewed in the Finder.

4.5 MB is UI 'stuff'. The Cocoa 'nibs' that define the application UI tend to be bigger than comparable GTK XML files.


It's a multi-architecture package - ARM and Intel.


As a Chinese user, I really need the anti-leeching capabilities, as too many people here use Xunlei, which only downloads but never uploads. Sad to see that the maintainers skip this again.


It’s probably impossible to track that with any meaningful precision. Historically it’s the (central) tracker that implements the bookkeeping.


And as the reported stats originate from the torrent client... They're able to report whatever they want.

I remember using some obscure tool before 2010 to intercept these packets and modify them. I was in my early teens back then and used it to over-report my upload statistics by something like 2x on some private trackers, so they wouldn't ban me.


It used to be easy to falsify that information, but no more. Most private trackers will ban you for that nowadays, unless you can get all the other clients to cheat with you using the same parameters. It's easy to cross-reference the information coming from N clients, so you'll be booted relatively quickly.


That sounds like a useful development to counter such abuse, but I don't think this feature helps your torrent client decide whether it should upload data to any given peer.


It’s quite easy to block Xunlei. First, they identify themselves in the connection. Second, no matter how much you upload to them, they always report zero progress.

The maintainers reject proposals to block by client ID, but they have never considered the second method.


Both methods can be circumvented quite easily and it will happen, if Transmission were to implement this. It's a cat and mouse game you can't win.


Ad blocking is a cat and mouse game. Anti censorship is a cat and mouse game. Anti fraud is a cat and mouse game. Anti spam is a cat and mouse game. I don’t see people giving up so easily on those problems.


I can't think of any way to solve this problem if a majority of clients in a swarm is malicious.


The swarm eventually gets abandoned as new users can't reach 100% completion?


Impossible to track with 100% accuracy, yes, but not completely meaningless to do some basic tracking. A torrent client could implement this on its own: keep track of peers it has seen and how much has been uploaded to vs. downloaded from each (effectively a per-peer ratio), and upload more slowly to peers it has never encountered than to those that have been uploading.


How would your client know what other clients have uploaded to other clients?


Obviously you wouldn't (the network is decentralized; you can't know other peers' communication), but you also don't care: peers A and B might exchange data in both directions, while between B and C the data may only flow one way.

The important part is to keep track of what's happening with your own connection to others, not what other peers are doing.


> The important part is to keep track of what's happening with your own connection to others, not what other peers are doing.

99%+ of the time any pair of peers has unidirectional traffic. The situation where two clients will have incomplete torrents and are exchanging data with each other is rare, like the first few hours after a brand new torrent hits the network.


In my country, no one will bother you if you download pirated content, but they will bother you if you seed it.


Switzerland.

The birthplace of BitThief.

https://github.com/pmoor/bitthief


I just read that Xunlei is the most popular BitTorrent client in China.

Does it not upload to non-Xunlei clients, or does it leech BitTorrent entirely? I can't imagine a pure leech could exist without harming BitTorrent.


I'm not sure. Some say that Xunlei does upload to other Xunlei clients, so they have to identify themselves in the connection (rather than try to emulate other clients and secretly leech the swarm).


You need to sign up for a private tracker to get that.


Write it on their GitHub, maybe? Also, I would be interested: how many people there use a VPN for torrents?


There are already GitHub issues like this. The maintainers simply close them.


As is their right, and arguably even their duty, if it is not in alignment with the project's philosophy or goals.

You can create a fork and implement the feature or pay someone to implement it for you. That's the wonderful benefit of free software.


So is my right to complain on HN.


If someone asks me to install a torrent client on a single machine, my favorite would be qBittorrent. At home, however, I use XigmaNAS' Transmission client extensively and operate it from other machines on the LAN using the Transgui interface (also available on Windows and macOS). This allows me to turn off everything but the NAS, which, with all RAID disks spun down and a TDP of 15 watts, makes for some good energy savings.

https://github.com/transmission-remote-gui/transgui


qBittorrent has unfortunately turned into a buggy mess and has no leadership. The GitHub repo has hundreds of known issues that are not addressed. No qBittorrent release has worked reliably on Windows or Mac for at least a year or two; it gets hung up scanning downloaded torrents, etc.


I ran into the scanning bug. It turned out to be because I was leeching and had a max upload of 1 kbps. I bumped that up to something more sensible and it started working again. This is an issue with most torrent clients, although qBittorrent is definitely the worst with it that I've encountered. Unsure if it's what you're running into, but figured I'd point it out since it took me weeks to troubleshoot.


I run 5+ instances of qBittorrent on my unraid server and have not experienced any of these issues, same on macOS.


This really bums me out to hear. I have used many torrent clients over the years: Azureus -> uTorrent -> Deluge -> (Transmission on Mac and Linux, although it didn't work great on Windows), until finally landing on qBittorrent and loving it.

I hope they figure out their maintainer issues.


qBittorrent on Linux has a crazy memory leak that makes it use more than 1 GB of memory very quickly, and possibly more. I had to stop using it for quite a few months. I redownloaded it recently and it wasn't fixed.


I found out that's just disk cache. Try launching some memory hungry programs and you'll see the cache released and qBittorrent running at around 150 MB again.


Didn't know about that, thanks for the heads up.


Seems to work OK on my Pi with the web interface.


> The project is much more responsive to bug reports and code submissions than it has been in the past.

Cool to see community listed as a feature up there with the tech stuff


If I die I want to be deprecated in the changelog


"He was deprecated doing what he loved... sniffle"


This and the code modernization are the biggest announcements.

That means future improvements are going to be 2x easier, and we'll see the product mature much more quickly in the future.


This release is really important, if only because it supports the BitTorrent v2 spec.

Transmission has a huge market share, particularly in open-source circles. Lack of support for the v2 spec has stopped many open-source projects from migrating their torrents to it, because Transmission clients wouldn't work.

Please support and use other open-source torrent clients, because Transmission's long-standing history of going years between updates and their significant market share simply is not healthy.

The first client to support the v2 spec shipped two years ago; the same year, the libtorrent library that some clients use added v2 support: https://torrentfreak.com/biglybt-is-the-first-torrent-client...


I think Transmission works great. What's the v2 killer feature?


The protocol should be able to find the same files in other torrents: you could be downloading a bunch of files (like a season of TV) but be able to find and download one file from another, better-seeded torrent (like a single-episode torrent). But I don't think any of the clients implement that yet.


That's great!

I suppose they removed BitTorrent's weirdness of allowing chunks to overlap multiple files. That was the original sin that caused all this.


There are lots of changes that are nice-to-have efficiency improvements.

But there are no new killer features. I would guess that the extra user headaches from having outdated software not work with new torrents will exceed the benefits of slightly faster downloads and slightly better availability.


SHA256


> Transmission's long-standing history of going years between updates and their significant market share simply is not healthy

Agreed that it's been a long time, but isn't this v4 release about to speed up development? New cleaner codebase, new contributors, ...


Part of my Ubuntu post install script is removing Transmission and installing Deluge.


Deluge showing your public IP in the UI is a big plus for me.


Ohh, I had an add-on on my Firefox status bar that did this. It's gone along with the status bar, sadly.


Still no support for webtorrents, making it impossible to use it to help lighten the load of my favorite PeerTube VODs :(


Seems like libtorrent's webtorrent support would be helpful for that.

That was done recently in https://github.com/arvidn/libtorrent/pull/4123, merged to master, but hasn't been released yet.

Per the master changelog, that should be soon, I guess... https://github.com/arvidn/libtorrent/blob/master/ChangeLog


I might be wrong, but AFAIK Transmission doesn't use libtorrent; it uses its own libtransmission.


That's true. I didn't express it clearly, but someone was proposing/suggesting libtorrent for the webtorrent part in a GitHub issue.


Could you explain what it is and why it would be nice to have?


Webtorrent is a modification to the bittorrent protocol to allow it to be used from a web browser with no plugins/extensions.

It's built on top of WebRTC.

Webtorrent and regular torrents can exist for the same file, but crucially, at least some small percentage of the people in the network have to support both webtorrent and regular torrents, to act as a kind of bridge between the two.

If this isn't the case, you might have a webtorrent user who has the file and a regular torrent user who wants the file, but the two can't connect to each other because they're talking different protocols, so the file won't be delivered.


This would be cool, but part of me thinks that most web users will just leech to watch movies in their browsers.


Simply seeding for the 1+ hour you are watching the movie is typically enough to keep a torrent swarm healthy, now that the majority of users have 10 Mbps+ upload speeds. In that time, you can easily upload more data than you downloaded, meaning you are a net benefit to the swarm.

In a general sense, any video content where the bitrate is lower than the typical user's upload bitrate could work for a webtorrent-only swarm of 'regular' users who aren't going to do anything special for network health.
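Back-of-the-envelope, with assumed numbers: a 90-minute film encoded at 5 Mbps is about 3.4 GB, while seeding at 10 Mbps for those same 90 minutes uploads about 6.8 GB, i.e. a ratio of roughly 2 just from watching.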


Even if you seed for a very short time, it speeds up the process. No need to consume their data.


Only if they seed more than they leech. It has little to do with time. If you take a year to seed more than you leeched, you're a positive contribution to the swarm.


IMHO the most honest comparison is to hosting the entire file yourself.


Problem is: they can't watch movies in their browser because they can only leech from people who are seeding from a webtorrent client.

Let's say I have a movie I want to let other people stream in their browsers. I am obligated to use some JavaScript-horror webtorrent-compatible client. If I already have a seedbox running Transmission, I simply cannot use it for this purpose, which is a shame.


So where would Transmission support come into play? In being able to recognize files that include both web and regular torrent? Or in being able to transparently talk to web clients from Transmission? Both?


> In being able to recognize files that include both web and regular torrent

The actual .torrent file is the same for both, so it already has support there.

> Or in being able to transparently talk to web clients from Transmission

This. Currently if all peers who have the data of a torrent are webtorrent clients, then transmission won't be able to download that file.

Most importantly, the webtorrent ecosystem is still small, and browser-based webtorrent-only clients suffer because they frequently can't get the file data they need, since all the other peers only talk the original torrent protocol.


Well, webtorrents are just regular torrents using WebRTC data channels (and WebSockets to the trackers) as a transport mechanism, since you can't have raw sockets from a browser sandbox.

If Transmission supported these as a transport mechanism, it would be able to communicate with torrent clients that run in the browser, making the potential number of peers much higher (both for swarms that are mostly in-browser and for those with a majority of traditional clients). It would be very useful for seeding PeerTube videos from Transmission, for instance, which was the quoted use case.


Webtorrent allows you to download torrents on a stock iPhone: you use a JavaScript interpreter from the App Store (which is allowed per the App Store guidelines; same for Python interpreters), then just run Webtorrent, which is pure JavaScript. That way I've downloaded a lot of things while out and about.


Wait, you're not allowed to run anything using the BitTorrent protocol on iOS? I never had the need myself, but I just assumed it would be allowed, because there is no reasonable explanation why it wouldn't be. Is the whole protocol banned from the App Store?


I just recently ranted about this exact thing in an Ask HN thread: https://news.ycombinator.com/item?id=32767277


Sadly yes


As soon as it lands in Transmission, I will start seeding my favourite PeerTube videos 24/7, and I hope others will do the same. Unless PeerTube videos are typically seeded like this, I'm sceptical of the platform.


Which clients do have support for webtorrent?



I meant torrent client programs, not libraries or websites.


I like how expecting real torrent clients to bend over for the WWW is in any way considered to be a reasonable stance.


Is it still single-threaded, with the UI choking when doing a blocking operation like moving a large torrent to a different hard drive? I didn't find any mention of parallel/threads/concurrent in the release notes.

I've been torrenting since the early 2000s and using torrent clients on Mac and Linux for over ten years, and there have been various issues in both Transmission and qBittorrent for as long as I can remember. I'm not sure if rTorrent is better, but lots of people seem to like it.

QBT has a tendency to break on upgrades, and then it tries to rename files and do re-checks and all sorts of messy stuff - maybe due to my configuration, since I have "append !qB" enabled for incomplete files - but I personally don't think a client should ever rename a file from 'video.mp4' to 'video.mp4.!qB' unless the checking is complete and it has already determined the file needs to be partially re-downloaded. Also, checking torrents has basically been a broken feature since v4.4, according to a bunch of people on the internet. I'm on the latest version, but going back to a working version could be a nightmare with hundreds of torrents on multiple hard drives.

I used to use Transmission exclusively but pretty much stopped using it because of the single-threaded limitation. Also, Transmission on linux didn't support categories or labels like the Mac GUI version did, which I required for organization.

I'm not sure how much goes into a torrent client, though I did read mandreyel's post on writing a BitTorrent engine in Rust: https://mandreyel.github.io/posts/rust-bittorrent-engine/ I'm not saying Rust is the answer, but I would welcome a new and improved BitTorrent client that meets my requirements, in any language, if it works better than the existing options.


Congrats! It would be interesting to hear more about the rewrite, 18% code reduction is significant. Have any of the contributors written about it somewhere?


It's a shame Transmission is not more popular. It's small, reliable and clean. More software should be like this nowadays.


Wow, that's an impressive changelog. Both user-facing features and getting rid of what sounds like lots of tech debt. Congratulations to the people who made this happen!


Now that's a great cleanup!


Quite an impressive effort. v3 was already very polished; I cannot wait to try v4.


There's a very stupid bug in Transmission: when you switch between a local and a remote daemon, the UI for downloading new torrents doesn't update the destination path. It will sometimes (being charitable; it happens all the time for me) pick up the destination path from the other daemon's settings, recreating the default Transmission download folder, or the mount path of an external HDD plugged into your Pi running the transmission daemon.

/media/pi/WD\ Elements on your local machine instead of ~/download/torrents.

I hope it's fixed.


Are you talking about transmission remote gui or something else?


The GTK client. You can switch between remote and local daemons, and it messes up the next torrent download because it uses the previously loaded config.


Been using Transmission for years; my instance's stats are up to 1,902 days active across 72 sessions. Really rock-solid software; excited for the new version.


Oh, Transmission has a Qt client. And I somehow always thought of Transmission as a GNOME application. That makes my migration to KDE easier.


Gotta say these are some gorgeous release notes.


Love Transmission, but I went to PicoTorrent a few years ago; even smaller footprint: https://github.com/picotorrent/picotorrent Sadly it’s Windows-only.


I stopped using clients on my computer, as my NAS has a built-in client that lets me download content straight to that device without using my computer as a middleman. Transmission was my go-to for a very long time, though!


NAS bittorrent clients are often based on transmission. For example, Synology's is.


There does seem to be a Docker image for it, though:

https://hub.docker.com/r/picotorrent/server

but it was last updated 2 years ago, so I'm not sure what's going on there



I don't know if it's noted on there somewhere, but it fails to run on my old High Sierra Mac I use for torrents.


After Netflix and other streaming services arrived, I stopped using Transmission. What's the current use of BitTorrent?


For example, buying a 4K HDR movie on iTunes does not allow you to even cache it locally (https://discussions.apple.com/thread/251320670); it is 100% streamed, meaning you have to have a very sizeable and stable internet connection to watch it uninterrupted. Downloading it from a side channel means you can buy it on your favourite platform and actually keep it on your computer.


> Downloading it from a side channel means you can buy it on your favourite platform and actually keep it on your computer.

That would give the distribution platform the idea that what they are doing is OK. In this case, just download it and don't pay for it if they are not actually providing you any service. Vote with your wallet.


> Vote with your wallet.

Just so you know (and please stop using that phrase), there is no way, scientifically or otherwise, to attribute the effect of non-sales to a feature, a service or anything else.

Your opinion on the matter just never shows up during product decisions, planning or analysis.

Your "vote" just doesn't exist.


So because there is no way of "scientifically or otherwise, to attribute the effect of non-sales to a feature, a service or anything else", I'm supposed to buy content on one platform and then download it on another, just so I can actually watch what I purchased?

Maybe the terminology is wrong, but I don't find the concept of "don't pay for what you cannot/won't use" wrong.


How are the producers reimbursed in your example?


"Voting with your wallet" means reimbursing producers who can provide content in the way I prefer.


Well after Netflix, Amazon Prime, HBO Max, Disney+ and dozens of other regional streaming services, I started using Transmission again.


This. I currently pay for Netflix, Disney Plus and Apple TV+. If I encounter something I still can't watch, I'm reasonably likely to fire up Transmission.


Or when the TV series you've been watching gets removed from Netflix from one day to the next.


I cannot speak about movies and shows, as I don't watch either; I can speak only about music.

For me, streaming services for music are a no-go. It is not about money; it is about control.

I prefer to listen only to albums (not playlists, tracks, sets, mixes, etc.), and I prefer to choose masterings & pressings. Yes, Spotify has the whole King Crimson catalog, but which masterings/remasterings among the many? I don't know; Spotify doesn't provide this information and can change material at any moment without notification. Also, all streaming services have stupid regional limitations (and I'm in the process of transitioning from a sanctioned country to a "normal" one; good luck using streaming services in this situation), and they could remove a whole catalog on a whim due to some problem with a label.

If I can buy music in an ownable format (which mostly means "if it is present on Bandcamp"), I buy it, no problem. If I cannot, I download it. Also, I still buy CDs at concerts. Again, it is not about money (I've spent much more on Bandcamp than several years of a Spotify subscription would cost); it is about owning the music I love in exactly the form I want.


I massively reduced my torrent use once I got proper high-speed internet. As it turns out, streaming is much more convenient, but the streaming world is not all candies and rainbows.

See, I do have Netflix, Amazon Prime, Disney+, etc., but they each have their own app, and even their combined coverage is not complete.

Do you know who has (almost) complete movie & series coverage? Fmovies. That's where I watch most of my stuff lately, because of convenience. The legal content providers screwed it up again; once more, they are not the most convenient way to watch movies and series.

Every now and then Fmovies won't have something, or I'll want to watch something immediately after its premiere and it takes a few more hours to be added to Fmovies. Sometimes I like something so much that I want to keep it and make sure I don't lose it for a re-watch, or a movie has such nice visuals that I need it in the best quality possible.

That's where Transmission comes in handy.


Not only that, but some services like Britbox purposefully gimp their systems.

Supports AppleTV, Android, Android TV, Roku and Fire TV in the USA.

AppleTV, Android, Roku and FireTV in the UK.

Why have they decided not to support my Shield TV in the UK? Who knows, but NowTV has the same limitation, so sometimes you have to find alternatives.


I like having true 4k media that I don't have to suffer through shitty streaming for, which often doesn't deliver the true bitrate of the media.

Also I can go to rarbg and find whatever I want, whenever I want, without ever needing to check whether this or that service has this or that particular movie. It's like how Netflix used to be, a single directory for all media, only even better due to the aforementioned constant and high bitrate.


Streaming services are increasingly going down the cable TV route and screwing customers over. The age of piracy is back.


Please, don't call copyright infringement 'piracy'. Piracy is a penal crime that involves violence, theft of property and is often accompanied by murders. Torrenting is just copying data.


I quite like embracing the word, personally. I suspect it dates back to pirate radio in the 1960s, which in parts of Europe actually did come from ships anchored just outside territorial waters.

Another interesting example of people embracing the pirate image was British submarines in WW1. The First Sea Lord of the day saw them as ungentlemanly and called for all German submarine crews to be hanged as pirates; this outburst birthed the tradition that a British submarine returning home after successfully sinking an enemy vessel flies the Jolly Roger! The last time this occurred was after the 1982 Falklands War; I believe HMS Conqueror to this day remains the only nuclear submarine to have engaged an enemy ship with torpedoes.


It's much older than that, from etymonline:

pirate (n.)

Meaning "one who takes another's work without permission" first recorded 1701; sense of "unlicensed radio broadcaster" (generally transmitting from a ship outside territorial waters) is from 1913.

pirate (v.)

"to rob on the high seas; commit piracy upon," 1570s, from pirate (n.). By 1706 as "appropriate and reproduce the literary or artistic work of another without right or permission; infringe on the copyright of another."

-- https://www.etymonline.com/word/pirate

Sadly it doesn't explain how the usage came about.


Embracing it is exactly what corporations that abuse copyright want you to do. To quote rms:

"The term "piracy" is used by record companies to demonize sharing and cooperation by equating them to kidnaping, murder and theft." [0]

(Better read it in entirety, if you haven't yet)

[0]: https://stallman.org/articles/end-war-on-sharing.html


Yes, but "The Copy Data Bay" just doesn't have the same ring to it.


It's better than the "copyright theft" slogan pushed by the media.


We had literal pirate parties as fairly legitimate political actors; I think the term has long been reappropriated.


You should probably learn the difference between 'literally' and 'figuratively'.

Literal pirate parties exist in Somalia and prey on nearby shipping lanes.


If your political party documents and registration are under the name "The Pirate Party," you're quite literally the pirate party.


And you plunder ships on the high seas?


"Literally" has been used figuratively for rhetorical purposes for a long time, my overly literal friend.


Even if the technical process is just "copying data", there is a whole world of legal and ethical issues around it.

Torrenting non-free products from unofficial sources is illegal.


This is the silliest argument ever. Helping slaves escape to Canada was once illegal. Society understood that such laws do more harm than good and changed them. The same will happen with copyright laws.


Kopimism is the practice of the religion of Kopimi.


Some TV series aren't available everywhere. As an example, I'm currently watching "Dark", a wonderful 2017 German series (think a grittier, darker Stranger Things) that was dubbed in English but still doesn't seem to be available in other parts of Europe.

There are also other, more legitimate uses of BitTorrent: I often use it to download Linux install images, so that the network load is distributed among users and I don't tax the servers excessively.


I remember Dark having a pretty large mainstream following as it was getting released. I was under the impression that it was funded by Netflix and as such would be available everywhere Netflix is available (and by extension I assumed Netflix was available everywhere in Europe).

Where in Europe is it not available on Netflix? Or is it a case of Netflix being unavailable?

> There are also other, more legitimate uses of BitTorrent: I often use it to download Linux install images, so that the network load is distributed among users and I don't tax the servers excessively.

Yeah, personally I use it to grab Path of Exile updates. The team releases a torrent early so that people with bad internet connections can grab it ahead of time; it works great.


That doesn’t make sense to me.

There are 200,000 movies, and Netflix and the others catalogue 2,000 of them. If you think of a movie to watch, there's literally a 1% chance that it will be available for streaming.

I have almost never watched a movie I sought out on the streamers. It's the other way round: the streamers offer me some movies, and if I'm interested I watch. But their catalogues are laughably small, completely incomparable to the music streaming catalogues, which are really vast.


Nitpick, but you are off by about one order of magnitude for the number of movies available on Netflix. It's closer to 20,000.


To not have to worry which streaming service currently streams the season of the show you want to watch.


Or at all. There are plenty of older TV shows and movies that you can't get your hands on anymore. Buying a second-hand DVD from eBay or Craigslist doesn't count.


I haven’t noticed any difference. If anything, high-definition releases are now easier to find, because they are ripped right off streaming services practically immediately.


For me, a lot of streaming services have simply stopped working in Firefox (Amazon Prime) or are limited to 720p (Netflix), even though I have the DRM stuff enabled in Firefox. So I use it to be able to stream my "Debian ISOs" in at least full HD.


Still use it for a lot of documentaries. Also handy for offline viewing on a laptop (Netflix does that, but only on a phone as far as I know).

Used it today for some archive.org content


I personally use it for big files, still my goto when downloading Debian ISOs.


Used it to download GIMP the other day.


I pay for Netflix, Apple TV, Prime Video and Disney+ and go to the cinema every couple of months, but I still steal stuff. Sometimes I find out it was available on one of these services, but it just happened that searching in qBittorrent was easier.

I am once again asking the entertainment industry to charge me literally any price they like per month to have all this in one place permanently.


With the use of Sonarr/Radarr/Bazarr, along with a NAS and a fast enough connection, the last advantage of using a streaming service (convenience) is gone.


Well, considering they killed the idea of having a portal to access content, convenience isn't actually that high for me.

Boxee was excellent for this; we used their services, and paid for them, and a standardised interface let me watch shows from different networks.

These days they would have to work with Plex, but they would never actually cooperate.


To remove the "other" part.


Can I bind it to my VPN's network interface? Last I checked this wasn't possible.


It's possible: set bind-address-ipv4 (and, depending on your needs, rpc-bind-address) in settings.json.
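i.e. something like this in settings.json (stop the daemon before editing, since Transmission rewrites the file on exit; 10.8.0.2 is a placeholder for your VPN interface's address):

    "bind-address-ipv4": "10.8.0.2",
    "rpc-bind-address": "127.0.0.1",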


Is there a (semi-) official docker image for this new beta? Would love to test it.


Still no sequential downloads :(


Why use this, when there's qBittorent?


Why wear a blue shirt, when there's yellow shirts available?


So no functional differences?


I would say Transmission has a simpler UI. Outside of the UI, it's also packaged differently: compared to qBittorrent, Transmission can be installed completely headlessly, without dependencies on Qt/GTK+X.

In terms of actual features, Transmission is more lightweight: it doesn't support super-seeding, broadcatching, or native SOCKS, and it doesn't have built-in search engine integration.


"qbittorrent-nox" is the name of the headless package.


It only provides a web UI, which is inferior to the standalone UI version.

If you want to go that route, you might as well use Deluge, which is built on the same libtorrent library and has a proper daemon-client architecture with fully featured native clients for all major platforms.


> I would say Transmission has a simpler UI

This is true, but for me it is a disadvantage. qBittorrent allows you to fine-tune a lot of settings. Not as many as Tixati, but still.


What settings would the average user find useful beyond adding a blocklist (which Transmission supports)? Genuinely curious as someone who used Transmission back in the day and appreciated the simplicity.


For some people, qBittorrent is unusable on macOS because of this bug: https://github.com/qbittorrent/qBittorrent/issues/14360 Transmission on the other hand works perfectly.


Why use qBittorrent, when there's this?


Does qBittorrent support a web UI with headless access?

That is the reason I use Transmission over anything else.


It does, actually, yes:

https://www.addictivetips.com/ubuntu-linux-tips/set-up-qbitt...

Mentioned on homepage, too:

https://www.qbittorrent.org/

> Remote control through Web user interface, written with AJAX

>> Nearly identical to the regular GUI


Transmission used to handle more simultaneous torrents. Looking at the changelog, seems like 4.0 should handle even more.


The qBittorrent UI is fugly.


My biggest problem with BT / Transmission is that they don't work well with DNS ad-blocking solutions, as the BT client will quickly swamp them and use up all my available DNS quota on torrents. I am wondering if anyone has a solution to that?


'dns quota'? Are you trying to use an abacus as your DNS server?

Torrents typically don't do more than a few thousand DNS lookups - which isn't much more than a few people browsing the web.

And most of those lookups are just for showing pretty flags and country codes in the UI, and aren't needed anyway.


I don't know, but 3.0-era Transmission did 200K+ DNS lookups with about 30 active torrents in the space of two to three weeks. I don't think I ever got to the bottom of it.


200k queries in two weeks? That's not a lot. During waking hours, with someone surfing the web, my Pi-hole reports about 1000 queries per 10 minutes; that's ~1.7/s.

Also, why would you have a DNS quota? I guess some DNS blocker service like NextDNS? Either pay them the money, put a local DNS cache in front of your torrent client (or use a Pi-hole with blocklists), or have the machine with the torrent client use another DNS server.


That's like 1 per second, which isn't really much. That's probably 1 per peer that connects or tries to connect to your machine.


A BT client doesn’t need to do DNS resolution for peers, since peers are bare IP addresses. It does need to resolve DNS for tracker domains when periodically announcing, and a public torrent can have a lot of trackers. Those add up if tracker domains have low TTL, I suppose.


Most BT clients do do a lookup of peer IP addresses. Many do it to show little flags in the UI saying which country a peer is from. Some also use the results as part of an algorithm to ban bad clients (if all the clients from the same ISP are sending bad data, ban the whole ISP).


Transmission (or at least the official clients I have personally used) doesn't show country of a peer. Plus you can't determine the country of an IP by DNS, unless your geoip provider happens to be DNS-based; what you need is a geoip database.


Country is normally determined by doing a reverse DNS lookup and then looking at the country code in the domain name. It's not very accurate, but you can do it for free, without relying on any third-party service or database.

Here is the libtorrent code that does it for every peer:

https://github.com/arvidn/libtorrent/blob/8786d17e59c90aee2c...
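Per peer, it's roughly the equivalent of (203.0.113.5 is a documentation-range placeholder, and the PTR name is made up):

    dig +short -x 203.0.113.5
    # a PTR like host5.isp.example.com.au -> the ".au" suffix suggests Australia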


Well, at least some clients use a geoip database, e.g. qBittorrent: https://github.com/qbittorrent/qBittorrent/blob/2ef059807afc...


I can't say I've noticed this, and I'm a little surprised DNS gets much use in BitTorrent at all. Is the client doing rDNS on peers or something?


Essentially, yes. It's basically looking up the PTR records. If you look at the UI, you'll see them listed in the peer list.


Probably just resolving tracker IPs.


I haven't used torrents in a long time. Does it have any use case other than piracy?


It's really good for distributing huge datasets, stuff that can take multiple days to download. It's also very good for using maximum available bandwidth efficiently, and have downloaders contribute bandwidth.

Linux ISOs are frequently pointed out as a use case. I've seen it used for distributing neural network weights recently (Stable Diffusion and others). Rainbow tables and password lists too: https://freerainbowtables.com/

I have used it to efficiently distribute big files (disk images) over hundreds of computers on LANs. Computers on local switches can exchange data with each other, and computers can come and go on the network.


OpenStreetMap is such a huge dataset, for example: https://planet.openstreetmap.org/


Yup, downloading stuff reliably over lousy internet connections (packet loss, high latency, intermittent uptime). HTTP really sucks with that.


Linux distros use it for distributing their installers.


LibreOffice as well and it's a joy to use because they tend to be very fast downloads.


Plenty! If you have largish binary assets of any kind that need to be distributed across multiple nodes, it's fantastic.

For instance, if you're in the infrastructure group in a company, and you need to distribute VM images around the place, it's great.


A long time ago I was a Transmission user; then they moved to C++ and it was over. I would use something else now.


Everyone talks about the application but this could actually be a very valid argument, if we shift focus to libtransmission.

Over 10 years ago I worked on a very custom BitTorrent-related application for an ISP. Roughly, it was a piece of software that acted like a caching proxy, but for torrents, intended to lower upstream congestion. Legal questions aside, it was a success - but, anyway, back to Transmission.

I needed a very custom client, and libtransmission was (in my personal opinion) hands-down the sanest and most powerful option. I liked the design of the API: how there are only a minimal number of hoops to "just" get a torrent running, and yet how one can gradually expand. Honestly, I don't remember any details beyond the overall impression (it was a very long time ago), but I liked it. And what I remember is that one of the features I valued was that it was written in C, because I wanted to use it from Python. C ABIs are generally significantly easier to work with from any language that has an FFI, compared to C++ ABIs.

CFFI has improved since then, of course, and I haven't looked into the v4 library at all, so maybe they have an interface without any C++ nuances... But in general, the point is that libraries written in C are typically easier to interface with.


I could understand it if it were coded in a way that forces you to install extra modules, like, say, PHP, but refusing to use something because it went from C to C++ just makes you sound like a zealot.


C++ zealots are no less bad than languageX zealots, mate.


How does that affect you as a user?


I am a developer-user (yes, developers are also users). I understand why C++ is toxic for humanity, so I switched to other clients, and I am still pushing to avoid it.


What did they use before?



