Good to see it's in active development. But the one feature I'm waiting for is a network whitelist. For example, it should never connect over my home or office network, only via an anonymous VPN.
(Edit: I know you can do this already with firewall rules, or containers, but I don't trust myself to not make a mistake. Ideally this should be part of the OS. You should be able to right-click somewhere in the window and say "use this network for this app".)
The other feature I'd like to see is sequential downloads, but the developers object on principle, so that is never going to come.
This is a plug-and-play solution bundling Transmission and the most common VPN configurations in a Docker container, ensuring that all torrent traffic goes via an active VPN connection. After configuring it, you just run "docker-compose up" and can then access Transmission via its web UI. It could be nice to have this built into the client, but using a container feels safer.
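If you'd rather skip compose, the docker run equivalent from my notes looks roughly like this; the variable names are from memory, so treat the project README as authoritative, and the provider/credentials here are placeholders:

    docker run -d --cap-add=NET_ADMIN \
        -v /your/storage/path:/data \
        -e OPENVPN_PROVIDER=PIA \
        -e OPENVPN_USERNAME=user \
        -e OPENVPN_PASSWORD=pass \
        -e LOCAL_NETWORK=192.168.0.0/16 \
        -p 9091:9091 \
        haugene/transmission-openvpn

NET_ADMIN is what lets the container create its tun device, and LOCAL_NETWORK keeps the web UI on 9091 reachable from your LAN while everything else rides the tunnel.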
This is a great container! Been using this for years. (I've contributed!)
The only downside is that @haugene has gone missing for some time now, so there are no firm releases. If you want new features/bug fixes, you have to pull dev, which isn't ideal. Otherwise the maintainers are doing a great job (shout out to @pkishino!)
For me, the real power of containers is that all the mistakes I could've made during the install live in ~a single file, instead of requiring a long forensic goose chase
I've got a home lab with Proxmox as the hypervisor. I'm running OpenWrt (x86) as my household router (in a QEMU VM) and I've got Transmission running in an LXC container. When the Transmission container boots up, it gets its IP address via DHCP (the server is in OpenWrt). I configured DHCP to allocate the same IP address to the container every time. You could also just use a static IP config in the container.
Within openwrt, I’ve installed WireGuard and OpenVPN for accessing different vpns.
Finally, again in openwrt, I installed the policy based routing package[1]. This package makes it super easy to route all traffic from the transmission container to the VPN network.
There are probably many ways you could do this, but this setup is great for me!
Tldr: use OpenWRT to create a static IP reservation for a dedicated torrent instance, then use source-based routing to send traffic over a VPN running on the router.
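If anyone wants to replicate the PBR part without OpenWrt, my understanding is it boils down to two commands on any Linux router (interface name and address made up here):

    # send everything originating from the torrent box through a routing
    # table whose only default route is the VPN tunnel
    ip route add default dev wg0 table 100
    ip rule add from 192.168.1.50 lookup 100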
Sounds pretty awesome. I've not dabbled with PBR on home equipment. We did use it at a previous employer to force all PCI traffic over a dedicated physical network between datacenters.
Agreed. Users shouldn't have to understand firewall rules, containers, VMs, or even VPNs in order to tunnel traffic for a specific app through a (more) trusted third party.
We need an open protocol for establishing tunnel/VPN connections. Apps implement the protocol, which lets you enter your tunnel provider and go through a quick OAuth flow to establish a tunnel. This would be a big win for VPN providers as well because people would be sending only specific traffic through them instead of everything.
Circa 1989 I remember people complaining that the NeXT computer (which eventually turned into the current Mac OS) was needlessly coddling users, and that anybody who wanted the power of Unix needed to know how to recompile their kernel. I also remember people objecting to MS Windows because people really should be using MS-DOS so that they understand what's going on.
The main point of computers is to make things easier for people. Most of us make our livings because computers are very good for that. If most users really had to deeply understand the technologies involved in solving their needs, a lot fewer of us would have jobs and much more time would be wasted by people who were just trying to get something done. So yes, let's absolutely coddle users.
Yes, I think we agree. That would fit within my definition of "things" in "make things easier". They can only do more/better with computers if the software is built to support them in that. If instead we insist that people focus on understanding the computer rather than doing the things they want to do, then they will do less and/or worse.
I would be fine with users doing less with their computers, if they in turn understood how they worked and why we all get angry here about things like: politicians likening cryptography to criminal activity, companies hoovering your data from every site you visit, etc.
I see. In what areas are you prepared to make the symmetrical deal? That you'll do less and get less while investing more time in understanding areas other people are less expert in?
There are millions of job openings for coders now. Illiteracy was not a good idea.
Also, YouTube just served me an un-skippable 12-minute ad 30 seconds into a video. It was the 3rd ad. The thought police are all over the web now. My Windows can't download photos from my iPhone. My Windows reboots when it feels like it. The glued-in phone battery life is garbage after 2 years. I must have purchased 30 chargers by now. My new laptop requires a Microsoft account. Google search is not working anymore. AdSense is driving up prices.
It's like trying to get nutrition from McDonald's. The drive-thru is convenient though. Very easy for users.
This isn't about coddling users, it's about not reinventing the wheel.
Sure you could spend hours reading pf man pages. But maybe it would be better use of time if one person figures out how to do it, writes code to do it automatically, and shares that with other people so they can spend more time watching all the movies they download instead of configuring their bit torrent client.
"Hey why are my torrent downloads running super slow?"
"Oh, well, your torrent client decided to automatically use your vpn because it saw one running and somebody decided that was the right behavior because they thought you were dumb."
Users have preferences. Users can read. Let them learn and sort it out. Don't ruin a simple program with baked-in assumptions about what is correct for users. Prefer simplicity.
Also don't assume using a vpn is more secure. That is a wrong assumption.
I don't have a horse in this race, but I think you're presuming simplicity where there is none.
A bittorrent client, even in CLI form, is not simple software at all. Have you ever taken a look at the myriad of available settings?
Most GUI clients have a sort of "wizard" that helps you pick acceptable connection parameters. This feature alone has a truckload of assumptions coded in. And that kind of feature exists for a reason: no-nonsense users. They're the majority. Just let me download my file, bro.
It also makes sense that the developer chooses parameters. They are probably the person with the most detailed knowledge of the problem, and therefore the one who knows best what the parameters should be.
It baffles my mind when developers just expect users to tweak low level settings. Sure, it's good if stuff is configurable, but it should work correctly out of the box. Like, you spent days or weeks working on this feature, don't expect your users to do the same...
Yes, I agree that your assumptions about how you think I think this should work are bad. :^)
But to be more specific with my feedback, it seems a bit silly that Transmission (I'm trying 4.0.0b1 right now) doesn't allow me to specify that it must use "VPN X" (as opposed to "VPN Y", which is also available) for downloads.
I understand that I could mess with namespaces or containers or VMs and probably make that work, but "use Bittorrent client with VPN" is more or less a universal use case and should be straightforward.
you don't even need the extra user: use a network namespace.
- ip netns add vpnonly # create an empty namespace
- ip netns exec vpnonly wg-quick up ... # connect to your VPN
later, launch transmission inside this namespace:
- ip netns exec vpnonly transmission-gtk # (or transmission-daemon)
has the nice property that as long as you do that exec step right (or even half right), the failure mode is no connectivity rather than accidentally sending traffic in the clear.
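For completeness: the pattern documented on the WireGuard site creates the interface in the default namespace first, so its encrypted UDP socket keeps its connectivity, then moves it into the empty one. A sketch, with made-up addresses:

    ip netns add vpnonly
    ip link add wg0 type wireguard
    wg setconf wg0 /etc/wireguard/vpn.conf   # peers/endpoint only; no Address= lines
    ip link set wg0 netns vpnonly            # move the interface; its socket stays behind
    ip -n vpnonly addr add 10.2.0.2/32 dev wg0
    ip -n vpnonly link set lo up
    ip -n vpnonly link set wg0 up
    ip -n vpnonly route add default dev wg0  # the namespace's only way out is the tunnel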
It already has the ability to specify an IP address to bind to. I use an OpenVPN up script to stop any running instance, replace the bind address with the current VPN interface address and start an instance.
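Roughly like this, I'd imagine; OpenVPN passes the local tunnel address to up scripts as $4, and the sed assumes the stock settings.json layout (the script name and paths are hypothetical, and the config needs script-security 2):

    #!/bin/sh
    # referenced from the OpenVPN config as: up /etc/openvpn/rebind-transmission.sh
    systemctl stop transmission-daemon
    # rewrite the bind address to the address just assigned on the tun interface
    sed -i "s/\"bind-address-ipv4\":.*/\"bind-address-ipv4\": \"$4\",/" \
        /etc/transmission-daemon/settings.json
    systemctl start transmission-daemon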
Depends on what torrenting is about for you. For me, it is about watching shows that I can't officially watch where I live yet. Once a week, I hit up a site, click a magnet link, and a couple of minutes later I can watch my favorite show. (The cost of the VPN is incidentally about the same as a streaming subscription. I think for this reason it is tolerated by the powers that be.)
I don't expect torrents to be available for a long time. Only very few people keep the torrent entry in their client alive for long, even though I keep around all the files. This is as opposed to eMule and older systems, where you can download any file in the user's media folder.
I'm using namespaced-openvpn to restrict transmission to one network, and you could limit the number of concurrently active files to one, effectively downloading them sequentially. Or do you want to download each file sequentially?
I would hazard they are asking for block-sequential downloads from the start of a file to the end, so they can start watching a movie while it is still downloading and be assured that the first parts of the movie are present without gaps ...
This goes against the swarm design of the torrent protocol, where every peer swaps whatever pieces it has so far without reference to file position, in order to maximise transfer within the swarm.
I run my torrent client in a container and have that container only go out the vpn. Works very well without screwing with computer wide network settings.
Set the http_proxy environment variable to a SOCKS proxy that goes through the VPN. Some VPN clients offer a SOCKS proxy built in. That way, if the VPN is not connected, the proxy address is dead.
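Something like this, though be aware not every program honors these variables (the address is a placeholder):

    # proxy-aware programs will route through the VPN client's SOCKS listener;
    # if the VPN drops, nothing answers on the port and traffic just fails
    export ALL_PROXY=socks5://127.0.0.1:1080
    export http_proxy=socks5://127.0.0.1:1080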
Not ragging on Transmission here, but qBittorrent has zero problems getting out of my network, and I didn't open up any ports or use UPnP either; give it a try. Transmission just has issues on my network that I don't care to figure out or open up ports for; qBit is quite good at navigating through NAT.
Lately I've noticed that public trackers are behind Cloudflare. If you're on a VPN, you have a high likelihood of getting the Cloudflare captcha. Your bittorrent client will be unable to scrape those trackers, or bootstrap DHT, etc because of the cloudflare block.
If you don’t trust yourself to not make a mistake with firewall config, why do you trust yourself not to make one when setting up an allow list? One wrong digit or dot and you’ll be in the same place.
Start with a default policy of deny and work from there, if you’re only creating rules for allowing traffic explicitly then there’s less possibility for error.
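On Linux the classic shape of that is an owner match plus a default deny for just the daemon's user (debian-transmission is the Debian packaging convention; adjust for your setup):

    # let the daemon's traffic out via the VPN interface only; drop anything else it emits
    iptables -A OUTPUT -o tun0 -m owner --uid-owner debian-transmission -j ACCEPT
    iptables -A OUTPUT -m owner --uid-owner debian-transmission -j DROP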
I'm really excited to see support for torrents v2.
If only because transmission will stop choking on hybrid torrent files, which should lead to more adoption across the board.
I wish transmission supported mutable torrents (BEP46). It would allow completely re-thinking distribution: grab a torrent of your favorite distribution, and have it auto-update to the latest version.
Or grab a "super-torrent" that lists isos for all known distributions, select the ones you're interested in, and let it update... Of course, that can be used for file-sharing repositories too.
Maybe transmission is not interactive enough for such a use-case (you'd have to ask the user what they want to do when the torrent mutates, and possibly add simple scripting interfaces for more complex heuristics).
> I wish transmission supported mutable torrents (BEP46). It would allow completely re-thinking distribution: grab a torrent of your favorite distribution, and have it auto-update to the latest version.
That sounds horrible. My own expectation is that if I have a torrent file/magnet link and I download from it, I always get the same content. This is why I commonly keep the magnet link/torrent file around even if I delete the content itself, so if I want the same thing in the future, I just re-download it and I'm 100% sure it's exactly the same.
This change would break that expectation, and for what benefit? This could be implemented as an HTTP call instead, one that serves different torrent files, so you can just change whatever serves the torrent file in order to mutate what gets downloaded.
> This change would break that expectation, and for what benefit? This could be implemented as an HTTP call instead
The main benefit is not having to rely on a centralized HTTP server and repository, as well as the associated infrastructure: DNS, TLS, CA, etc. One could host a blog over torrents (more or less what IPFS is doing).
We already have RSS feeds for what you describe (though not supported natively by transmission).
> My own expectation is that if I have a torrent file/magnet link and I download from it, I always get the same content.
That wouldn't change significantly. The mutable torrent content is basically a .torrent that points to whatever the current version is; so just save that instead.
The good thing with mutable torrents (or mutable content in the DHT, as pointed out in another comment) is that instead of a hash (content addressing), a public key is looked up in the DHT. Once you've got a torrent ("trust on first use" of sorts), you can verify that the new version is from the same uploader (or at least, one that knows the private key).
I would present it with a new set of checkboxes in the torrent client: "download new versions", then if checked "delete previous version after downloading updated content" and for more granularity "download new files", "download updates to existing files". Of course, for ease of use it would be better to have a historical view. I think the main use-cases would be to archive every intermediate version, or to just grab whatever is latest. There's infinite variations in-between.
One thing I would like to stress, and that I didn't point out enough in my original comment is that this goes extremely well with v2 torrents and their de-duplication feature: if the same file exists in multiple versions of the torrent, peers with any version can contribute to the swarm.
It's not as good for deduplication as rolling hashes could be, but you could host whole distribution repositories with mutable v2 torrents.
Pretty sure it gets you most of the way there by simply putting a mutable magnet URI in DHT. The rest is just a clever way of reusing already downloaded blocks.
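If I'm reading BEP 46 right, the mutable magnet is addressed by a public key rather than an infohash, roughly:

    magnet:?xs=urn:btpk:<64-hex-char ed25519 public key>&s=<optional hex salt>

The DHT then holds a signed, sequence-numbered pointer (per BEP 44) to whatever the current infohash is, which is what makes the "reuse already downloaded blocks" part just bookkeeping.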
This is a cool feature that I've idly wondered about in the past, but I didn't know it was actually spec'd/implemented in Bittorrent.
You could imagine hierarchically pushing the entirety of TPB or whatever into the DHT. Each torrent could even have an `index.html` that renders a page containing links to the rest of the sub-torrents, so that the entire site is hosted in the DHT.
Right, and we saw some examples of Sqlite+torrent in the past; those databases could be made editable more easily (although it would be nice to de-duplicate unchanged parts, not sure if sqlite DBs can be chunked).
Not necessarily sqlite, that said. It might be useful for openstreetmap tiles/db, or indeed repositories like library genesis, wikidata/wikipedia dumps, etc.
So presumably in a similar manner as https://github.com/phiresky/sql.js-httpvfs you could map SQLite pages to leaf torrents too and get the chunking you are looking for.
When I first came to Linux I adopted Transmission. It seemed less featureful than others, but now I like its simplicity: it does everything it needs to and is also very reliable.
It is not widely known, but there actually is a transmission-cli for common torrent operations that may come in handy some time.
FYI, transmission-cli is considered deprecated by the Transmission developers. Using both transmission-daemon and transmission-remote is the recommended approach. Actually, transmission-cli uses transmission-daemon internally and they share the same configuration. transmission-cli is still useful for one-off downloads, but I still prefer the recommended way because I'd like to seed older torrents some more time.
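For anyone who hasn't tried the recommended way, it's honestly just:

    transmission-daemon                      # starts the daemon (RPC/web UI on 9091)
    transmission-remote -a example.torrent   # add a torrent
    transmission-remote -l                   # list torrents and progress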
I was greatly disappointed too when I found out about transmission-cli's deprecation, but I have since found and replaced all uses of it with stig[1] which, for me, does the same thing and more.
It's a great piece of software. I'm running a web remote v2.92 on an rpi with simple basic auth protection, and it has been working flawlessly for years now.
That 1 MB is only for the daemon and doesn't take the dependencies into account. Although I'm pretty sure the Linux version is more lightweight in general, because the GTK dependency is very likely to already be installed.
As a Chinese user, I really need the anti-leeching capabilities, as too many people here use Xunlei, which only downloads but never uploads. Sad to see that the maintainers skipped this again.
And since the reported stats originate from the torrent client, they're able to report whatever they want.
I remember using some obscure tool before 2010 to intercept these packets and update them.
I was in my early teens back then and used it to overreport my upload statistics by something like x2 on some private trackers, so they wouldn't ban me.
It used to be easy to falsify that information, but no more. Most private trackers would easily ban you for that nowadays, unless you can get all the other clients to also cheat with you using the same parameters. It's easy to cross-reference the information coming from N clients, so you'll be booted relatively quickly.
That sounds like a useful development to counter such abuse, but I don't think this feature helps your torrent client decide whether it should upload data to any given peer.
It’s quite easy to block Xunlei. First, they identify themselves in the connection. Second, no matter how much you upload to them, they always report zero progress.
The maintainers reject the proposal to block by client ID. But they never consider the second method.
Ad blocking is a cat and mouse game. Anti censorship is a cat and mouse game. Anti fraud is a cat and mouse game. Anti spam is a cat and mouse game. I don’t see people giving up so easily on those problems.
Impossible to track with 100% accuracy, yes. But not completely meaningless to do some basic tracking. A torrent client could implement some tracking on its own, keeping a record of seen clients and how much has been uploaded to vs downloaded from them (effectively a ratio per client), and upload more slowly to clients you've never encountered than to ones that have been uploading.
Obviously you wouldn't (the network is decentralized, you can't know other peers communication) but also you don't care (peer A <> B might be able to upload/download, but peer B <> C can only download in one direction).
The important part is to keep track of what's happening with your own connection to others, not what other peers are doing.
> The important part is to keep track of what's happening with your own connection to others, not what other peers are doing.
99%+ of the time any pair of peers has unidirectional traffic. The situation where two clients will have incomplete torrents and are exchanging data with each other is rare, like the first few hours after a brand new torrent hits the network.
I just read that Xunlei is the most popular BitTorrent client in China.
Does it not upload to non-Xunlei clients, or does it leech BitTorrent entirely? I can't imagine a pure leech could exist without harming BitTorrent.
I'm not sure. Some say that Xunlei does upload to other Xunlei clients, so they have to identify themselves in the connection (rather than try to emulate other clients and secretly leech the swarm).
If someone asks me to install a Torrent client on a single machine, my favorite would be QBittorrent, however at home I use XigmaNAS' Transmission client extensively, and operate it from other machines in the LAN using the Transgui interface (also available on Windows and MacOS). This allows me to turn off everything but the NAS, which with all RAID disks spun down and a TDP of 15 Watts makes for some good energy savings.
qBittorrent unfortunately turned into a buggy mess and has no leadership. The GitHub has hundreds of known issues that are not addressed. qBittorrent hasn't worked reliably on Windows or Mac for at least a year or two. It gets hung up scanning downloaded torrents, etc.
I ran into the scanning bug. Turned out to be because I was leeching and had a max upload of 1kbps. I bumped that up to something more sensible and it started working again. This is an issue with most Torrent clients, although QBittorrent is definitely the worst with it that I've encountered. Unsure if it's what you're running into, but figured I'd point it out since it took me weeks to troubleshoot.
This really bums me out to hear. I have used many torrent clients over the years, Azureus -> uTorrent -> Deluge -> (Transmission on Mac and Linux, although it didn't work great on Windows), until finally landing on qBittorrent and loving it.
qBittorrent on Linux has a crazy memory leak that makes it use more than 1 GB of memory very quickly. I had to stop using it for quite a few months. I redownloaded it recently and it wasn't fixed.
I found out that's just disk cache. Try launching some memory hungry programs and you'll see the cache released and qBittorrent running at around 150 MB again.
This release is really important, if only because it supports the BitTorrent v2 spec.
Transmission has a huge market share, particularly in open source circles. Lack of support for the V2 spec has stopped many open source projects from migrating their torrents to the V2 spec, because transmission clients won't work.
Please support and use other open-source torrent clients, because Transmission's long-standing history of going years between updates and their significant market share simply is not healthy.
The protocol should be able to find the same files in other torrents - you could be downloading a bunch of files (like a season of TV), but be able to find and download one file from another torrent that is better seeded (like a single-episode torrent). But I don't think any of the clients implement that yet.
There are lots of changes that are nice-to-have efficiency improvements.
But there are no new killer features. I would guess that the extra user headaches from having outdated software not work with new torrents will exceed the benefits of slightly faster downloads and slightly better availability.
Webtorrent is a modification to the bittorrent protocol to allow it to be used from a web browser with no plugins/extensions.
It's built on top of WebRTC.
Webtorrent and regular torrents can exist for the same file, but crucially, at least some small percentage of the people in the network have to support both webtorrent and regular torrent to act as a kind of bridge between the two.
If this isn't the case, you might have a webtorrent user who has the file, and a regular torrent user who wants the file, but the two people can't connect to each other because they're talking different protocols, so the file won't be delivered.
Simply seeding for the 1+ hour you are watching the movie is typically enough to keep a torrent swarm healthy now that the majority of users have 10 Mbps+upload speeds. In that time, you can easily upload more data than you downloaded, meaning you are a net benefit for the swarm.
In a general sense, any video content where the bitrate is lower than the typical user's upload bitrate could work for a webtorrent-only swarm of 'regular' users who aren't going to do anything special for network health.
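Back-of-the-envelope on the "upload more than you downloaded" claim, with the movie size and watch time as assumptions:

    # 10 Mbps sustained for a ~100-minute watch:
    echo $(( 10 * 6000 / 8 ))   # 7500 MB (~7.5 GB) uploaded, vs a typical 2-5 GB download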
Only if they seed more than they leech. It has little to do with time. If you take a year to seed more than you leeched, you're a positive contribution to the swarm.
Problem is: they can't watch movies in their browser because they can only leech from people who are seeding from a webtorrent client.
Let's say I have a movie I want to let other people stream with their browser. I am obligated to use some javascript horror webtorrent-compatible client. If I already have a seedbox running transmission I simply cannot use that for this purpose, which is a shame
So where would transmission support come into play? In being able to recognize files that include both web and regular torrent? Or in being able to transparently talk to web clients from transmission? Both?
> In being able to recognize files that include both web and regular torrent
The actual .torrent file is the same for both, so it already has support there.
> Or in being able to transparently talk to web clients from transmission
This. Currently if all peers who have the data of a torrent are webtorrent clients, then transmission won't be able to download that file.
Most importantly, the webtorrent ecosystem is still small, and browser-based webtorrent-only clients suffer because frequently they can't get the file data they need because all the other peers only talk the original-torrent protocol.
Well, webtorrents are just regular torrents using websockets as a transport mechanism, since you can't just have raw sockets from a browser sandbox.
If it supported these as a transport mechanism, transmission would be able to communicate with torrent clients that run in the browser, making the potential number of peers much higher (both for swarms that are mostly in-browser, and for those that have a majority of traditional clients). It would be very useful for seeding peertube videos from transmission, for instance, which was the quoted use-case.
Webtorrent allows you to download torrents on a stock iPhone: you use a JavaScript interpreter from the App Store (which is allowed per the App Store guidelines, same for Python interpreters), then just run Webtorrent, which works in JavaScript only. That way I downloaded a lot of things while out and about.
Wait, you're not allowed to run anything using the BitTorrent protocol on iOS? I never had the need myself, but I just assumed it would be allowed, because there is no reasonable explanation you wouldn't. Like the whole protocol is banned from the AppStore?
As soon as it lands in transmission, I will start seeding my favourite peertube videos 24/7, and I hope others will do the same. Unless peertube videos are typically being seeded like this, I'm sceptical of the platform.
Is it still single-threaded and the UI chokes when doing a blocking operation like moving a large torrent to a different hard drive? I didn't find any mention of parallel/threads/concurrent in the release notes.
I've been torrenting since the early 2000s and using torrent clients on Mac and Linux for over ten years, and there have been various issues in both Transmission and qBittorrent for as long as I can remember. I'm not sure if rTorrent is better, but lots of people seem to like it.
QBT has a tendency to break on upgrades, and then it tries to rename files and do re-checks and all sorts of messy stuff - maybe due to my configuration, since I have "append .!qB to incomplete files" enabled - but I personally don't think a client should ever rename a file from 'video.mp4' -> 'video.mp4.!qB' unless the checking is complete and it has already determined the file needs to be partially re-downloaded. Also, checking torrents is basically a broken feature since v4.4 - according to a bunch of people on the internet. I'm on the latest version, but going back to a working version could be a nightmare with hundreds of torrents on multiple hard drives.
I used to use Transmission exclusively but pretty much stopped using it because of the single-threaded limitation. Also, Transmission on linux didn't support categories or labels like the Mac GUI version did, which I required for organization.
I'm not sure how much goes into a torrent client, though I did read mandreyel's post on writing a BitTorrent engine in Rust: https://mandreyel.github.io/posts/rust-bittorrent-engine/. I'm not saying Rust is the answer, but I would welcome a new and improved BitTorrent client meeting my requirements in any language, if it works better than the existing options.
Congrats! It would be interesting to hear more about the rewrite, 18% code reduction is significant. Have any of the contributors written about it somewhere?
Wow that's an impressive changelog. Both user facing features and getting rid of what sound like lots of tech debt. Congratulations to the people who made this happen!
There's a very stupid bug in Transmission: when you switch between a local and a remote daemon, the UI for adding new torrents doesn't update the destination path, and it will sometimes (being charitable; it happens all the time for me) pick up the destination path from the other daemon's settings, thus recreating the default Transmission download folder, or the mounted path of an external HDD plugged into your Pi running the transmission daemon.
E.g. /media/pi/WD\ Elements on your user@local machine instead of ~/download/torrents.
The GTK client. You can switch between remote and local daemons and it messes up the next torrent download, because it uses the previously loaded config.
Been using Transmission for years, my instance's stats is up to 1,902 days active across 72 sessions. Really rock-solid software, excited for the new version.
I stopped using clients on my computer as my NAS has a built in client that allows me to quickly download content straight to that device without using my computer as a middleman. Transmission was my go to for a very long time though!
For example, buying a 4K HDR movie on iTunes does not allow you to even cache it locally (https://discussions.apple.com/thread/251320670), it is 100% streamed meaning you have to have a very sizeable and stable internet connection to watch it uninterrupted. Downloading it from a side channel means you can buy it on your favourite platform and actually keep it on your computer.
> Downloading it from a side channel means you can buy it on your favourite platform and actually keep it on your computer.
That would give the distribution platform the idea that what they are doing is OK. In this case, just download it and don't pay for it, if they are not actually providing you any service. Vote with your wallet.
Just so you know (and stop using that phrase), there is no way, scientifically or otherwise, to attribute the effect of non-sales to a feature, a service, or anything else.
Your opinion on the matter just never shows up during product decisions, planning or analysis.
So because there is no way of "scientifically or otherwise, to attribute the effect of non-sales to a feature, a service or anything else", I'm supposed to buy content on one platform and then download it on another, just so I can actually watch what I purchased?
Maybe the terminology is wrong, but I don't find the concept of "don't pay for what you cannot/won't use" wrong.
This. I currently pay for Netflix, Disney Plus and Apple TV+. If I encounter something I still can't watch, I'm reasonably likely to fire up Transmission.
I can't speak about movies and shows, as I don't watch either; I can only speak about music.
For me, streaming services for music are a no-go. It is not about money, it is about control.
I prefer to listen only to albums (not playlists, tracks, sets, mixes, etc), and I prefer to choose masterings & presses. Yes, Spotify has the whole King Crimson catalog, but which masterings/remasterings among the many? I don't know; Spotify doesn't provide this information and can change material at any moment without notification. Also, all streaming services have stupid regional limitations (and I'm in the process of transitioning from a sanctioned country to a "normal" one; good luck using streaming services in this situation), and they could remove a whole catalog on a whim due to some problem with a label.
If I can buy music in an ownable format (which mostly means "if it is present on Bandcamp"), I buy it, no problem. If I cannot - I download it. Also, I still buy CDs at concerts. Again, it is not about money (I've spent much more on Bandcamp than several years of a Spotify subscription would cost), it is about owning the music I love in exactly the form I want.
I massively cut back on torrenting once I got proper high-speed internet. As it turns out, streaming is much more convenient, but the streaming world is not all candy and rainbows.
See, I do have Netflix, Amazon Prime, Disney+ etc. but they each have their own app and even their combined coverage is not complete.
Do you know who has (almost) complete movie & series coverage? Fmovies. That's where I watch most of my stuff lately, because of convenience. The legal content providers screwed it up again; once more they are not the most convenient way to watch movies and series.
Every now and then Fmovies won't have something, or I'd like to watch something immediately after its premiere and it takes a few more hours to be added to Fmovies. Sometimes I really like something and want to keep it, to make sure I don't lose it for a re-watch, or some movie has such nice visuals that I need it in the best quality possible.
I like having true 4k media that I don't have to suffer through shitty streaming for, which often doesn't deliver the true bitrate of the media.
Also I can go to rarbg and find whatever I want, whenever I want, without ever needing to check whether this or that service has this or that particular movie. It's like how Netflix used to be, a single directory for all media, only even better due to the aforementioned constant and high bitrate.
Please, don't call copyright infringement 'piracy'. Piracy is a penal crime that involves violence, theft of property and is often accompanied by murders. Torrenting is just copying data.
I quite like embracing the word personally, I suspect it dates back to pirate radio in the 1960s which in parts of Europe actually did come from ships anchored just outside of territorial waters.
Another interesting example of people embracing the pirate image was British submarines in WW1. The First Sea Lord of the day saw them as ungentlemanly and called for all German submarine crews to be hanged as pirates; this outburst birthed the tradition that a British submarine returning home after successfully sinking an enemy vessel flies the Jolly Roger! The last time this occurred was after the 1982 Falklands War; I believe HMS Conqueror to this day remains the only nuclear submarine to have engaged an enemy ship with torpedoes.
Meaning "one who takes another's work without permission" first recorded 1701; sense of "unlicensed radio broadcaster" (generally transmitting from a ship outside territorial waters) is from 1913.
pirate (v.)
"to rob on the high seas; commit piracy upon," 1570s, from pirate (n.). By 1706 as "appropriate and reproduce the literary or artistic work of another without right or permission; infringe on the copyright of another."
This is the silliest argument ever. Helping slaves escape to Canada was once illegal. The society understood that such laws do more harm than good and changed them. Same will happen with copyright laws.
Some TV series aren't available everywhere. As an example, I'm currently watching "Dark", a wonderful 2017 German series (think about a grittier, darker, Stranger Things) that was dubbed in English but still doesn't seem to be available in other parts of Europe.
There are also other more legit usages of BitTorrent: I use it often to download Linux install images so that the network load is distributed among users and I do not tax excessively the servers.
I remember Dark having a pretty large mainstream following as it was getting released. I was under the impression that it was funded by Netflix and as such it would be available everywhere that Netflix is available (and by extension assumed Netflix was available everywhere in Europe).
Where in Europe is it not available on Netflix? Or is it the case of Netflix being unavailable?
> There are also other more legit usages of BitTorrent: I use it often to download Linux install images so that the network load is distributed among users and I do not tax excessively the servers.
Yeah, personally I use it to grab Path of Exile updates. The team releases a torrent ahead of time so that people with bad internet connections can grab it early; it works great.
There are 200,000 movies, and Netflix and others catalogue 2000 of them. If you think of a movie to watch, there's literally a 1% chance that it will be available for streaming.
I have almost never watched a movie I sought out on the streamers. It's the other way round: the streamers offer me some movies, and if I'm interested I watch. But their catalogues are laughably small. Completely incomparable to the music streaming catalogues, which are really vast.
Or at all. There's plenty of older TV shows and movies that you can't get your hands on anymore. Buying a second hand DVD from ebay or Craigslist doesn't count.
I haven’t noticed any difference. If anything high definition releases are now easier to find because they are ripped right off streaming services practically immediately.
For me, a lot of streaming services have simply stopped working (Amazon Prime) in Firefox, or are limited to 720p (Netflix), even though I have the DRM stuff enabled in Firefox. So I use it to be able to stream my "Debian ISOs" in at least full HD.
I pay for Netflix, Apple TV, Prime Video and Disney+ and go to the cinema every couple of months, but I still steal stuff. Sometimes I find out it was available on one of these services but it just happened that the search in QBittorrent was easier.
I am once again asking the entertainment industry to charge me literally any price they like per month to have all this in one place permanently.
With the use of Sonarr/Radarr/Bazarr, along with a NAS and a fast enough connection, the last advantage of using a streaming service (convenience) is gone.
I would say Transmission has the simpler UI. Outside of UI, it's also packaged differently: compared to qBittorrent, Transmission can be installed completely headlessly, without dependencies on Qt/GTK+X.
In terms of actual features, Transmission is more lightweight. It doesn't support super-seeding, broadcatching, or native SOCKS, and doesn't have built-in search engine integration.
It only provides a web UI, which is inferior to the standalone UI version.
If you want to go that route you might as well use Deluge, which is built on the same libtorrent library, has a proper daemon-client architecture with fully featured native clients for all major platforms.
What settings would the average user find useful beyond adding a blocklist (which Transmission supports)? Genuinely curious as someone who used Transmission back in the day and appreciated the simplicity.
My biggest problem with BT / Transmission is that they don't work well with DNS ad-blocking solutions, as the BT client will quickly swamp and use up all the available DNS quota with torrent lookups. I am wondering if anyone has a solution to that?
I don't know, but Transmission 3.0-era does 200K+ DNS lookups with about 30 active torrents in the space of two to three weeks. I don't think I ever got to the bottom of it.
200k queries in two weeks? That's not a lot. For waking hours and with someone surfing the web, my pihole reports about 1000/10m. That's 1.6/s.
Also, why would you have a DNS quota? I guess some DNS blocker service like NextDNS? Either pay them the money, put a local cache on your torrent client (or use a pihole with blocklists), or have the machine with the torrent client use another DNS server.
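The local cache option really is tiny if you go the dnsmasq route (the upstream address is whatever your router/resolver is):

    # /etc/dnsmasq.conf -- cache answers locally so repeated tracker lookups never leave the box
    cache-size=10000
    server=192.168.1.1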
A BT client doesn’t need to do DNS resolution for peers, since peers are bare IP addresses. It does need to resolve DNS for tracker domains when periodically announcing, and a public torrent can have a lot of trackers. Those add up if tracker domains have low TTL, I suppose.
Most BT clients do do a lookup of IP addresses. Many do it to provide little flags in the UI to say which country a user is from. Some also use the results as part of an algorithm to ban bad clients (if all clients from the same ISP are sending bad data, ban the whole ISP).
Transmission (or at least the official clients I have personally used) doesn't show country of a peer. Plus you can't determine the country of an IP by DNS, unless your geoip provider happens to be DNS-based; what you need is a geoip database.
Country is normally determined by doing a reverse DNS lookup, and then looking at the country code in the domain name. It's not very accurate, but you can do it for free without relying on any third party service or database.
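You can reproduce the trick from a shell; anything whose PTR record lacks a ccTLD just comes back unknown (the address here is an example):

    # reverse-resolve the peer, then read the country code off the end of the hostname
    dig +short -x 203.0.113.42   # e.g. cpe-42.wellington.example.nz. -> "nz"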
Here is the libtorrent code that does it for every peer:
It's really good for distributing huge datasets, stuff that can take multiple days to download.
It's also very good for using maximum available bandwidth efficiently, and have downloaders contribute bandwidth.
Linux isos are frequently pointed out as a use-case. I've seen it used for distributing neural network weights recently (stable diffusion and others). Rainbow tables and password lists too: https://freerainbowtables.com/
I have used it to efficiently distribute big files (disk images) over hundreds of computers in LANs. Computers on local switches can exchange data with each other, and computers can come and go on the network.
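Transmission itself ships the tooling for this; making the image's torrent is a one-liner (the tracker URL is a placeholder for whatever the LAN runs):

    transmission-create -o image.torrent -t udp://tracker.lan:6969 image.img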
Everyone talks about the application but this could actually be a very valid argument, if we shift focus to libtransmission.
Over 10 years ago I worked on a very custom BitTorrent-related application for an ISP. Roughly, it was a piece of software that acted like a caching proxy, but for torrents, intended to lower upstream congestion. Legal questions aside, it was a success - but, anyway, back to Transmission.
I needed a very custom client, and libtransmission was (in my personal opinion) hands-down the sanest and most powerful option. I liked the design of the API, how there are only a minimum number of hoops to "just" get a torrent running, and yet how one can gradually expand. Honestly I don't remember any details beyond the overall impressions - it was a very long time ago - but I liked it. And what I remember is that one of the features I valued was that it was written in C - because I wanted to use it from Python. I mean, C ABIs are generally significantly easier to work with in any language that has an FFI, compared to C++ ABIs.
CFFI has improved since then, of course, and I haven't looked into the v4 library at all, so maybe they have an interface without any C++ nuances... But in general, the point is that libraries written in C are typically easier to interface with.
I could understand if it was coded in a way that forces you to install extra modules (like, say, PHP), but dropping something because it went from C to C++ just makes you sound like a zealot.
I am a developer-user (yes, developers are also users). I understand why C++ is toxic for humanity, so I went and used other clients, and I am still pushing to avoid it.