> It was also raised that Snap packages also aren't currently included as part of Ubuntu source ISO builds anyways so these source ISOs are incomplete and have been so for years.
To be completely honest, I was done with Ubuntu after they hardcore started pushing Snap packages. My package manager should never be randomly killing apps. Ubuntu has become the Windows of the Linux world.
> We'll see if they pull the trigger for the current Ubuntu 24.04 LTS cycle on doing away with these rather unnecessary source ISOs.
I still use them for flashing USBs - a lot of installs are offline, especially when the networking is some new or non-standard driver. Honestly Ubuntu restricting itself like this is a good thing, Canonical need to be kicked off of their perch.
> I still use them for flashing USBs - a lot of installs are offline, especially when the networking is some new or non-standard driver. Honestly Ubuntu restricting itself like this is a good thing, Canonical need to be kicked off of their perch.
These are not the installer ISOs, they're ISOs containing the _sources_ for all packages in Ubuntu.
> These are not the installer ISOs, they're ISOs containing the _sources_ for all packages in Ubuntu.
I'm aware, you can install from source [1] [2]. My application is embedded devices that cannot connect to the internet that need to run blazingly fast on 'minimal' hardware for a long time. Ubuntu was chosen because we needed support for things like ROS and Nvidia.
The problem is that we would take a source image occasionally and then archive it on our infrastructure, so it looks like one download but actually services a significant number of machines.
As I mentioned previously, snap is just outright hostile. It's a massive shame for Linux to be forced to use it.
You can easily get the source packages for the likely small set of packages you actually need for your embedded devices instead of downloading some 30GB of ISOs of everything, most of which you don’t need. You’re basically talking about an internal mirror, and you don’t need ISOs to set up a mirror.
Also, installing from source on a fleet of embedded devices makes no sense whatsoever, the software should be cross compiled on some more powerful machine. So it’s still not clear how source ISOs are helping you “flash USBs”.
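A minimal sketch of what the parent is describing: pulling individual source packages with apt instead of whole source ISOs. Assumes an Ubuntu host; `coreutils` is just an example package.

```shell
# Enable source repositories (Ubuntu ships the deb-src lines commented out by default):
sudo sed -i 's/^# deb-src/deb-src/' /etc/apt/sources.list
sudo apt-get update

# Fetch the full source package (upstream tarball plus Ubuntu packaging) for one package:
apt-get source coreutils

# Or just download the source artifacts without unpacking them:
apt-get source --download-only coreutils
```

For a fleet, you would run this once against an internal mirror and archive the results, rather than mirroring every source package in the distribution.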
I'm aware that I'm making the case for my niche use-case, and that's going to be controversial for whatever reason. I could sit here and explain exactly why certain decisions have been made, but ultimately the point is that the ISOs that are/were provided were important for us.
> I still use them for flashing USBs - a lot of installs are offline, especially when the networking is some new or non-standard driver. Honestly Ubuntu restricting itself like this is a good thing, Canonical need to be kicked off of their perch.
A source ISO is just the source code, right? How many people install ubuntu from source?
I'm not a fan of Ubuntu, but this sounds a lot worse than it is. This is just about discontinuing the ISOs that contain source code. You can certainly disagree with them doing this, but my initial reaction based on the headline was that this was similar to the Red Hat Enterprise Linux announcement a few months ago, when it's significantly more benign.
You can still rebuild RHEL from CentOS Stream sources if you know what package versions to use, it's just not all in one convenient bundle anymore. Likewise, Ubuntu seems to be getting rid of their convenient bundle.
The source ISOs aren't a convenient bundle though. Nobody uses them; everyone uses Ubuntu's sources that are published online instead, including all historical versions. Like: what build would you be trying to reproduce, anyway? The original build published with a release, or the build that you are actually using that includes recent security updates?
But RHEL (starting from version 9) is built from CentOS Stream (apart from the separate trademarks which have to be swapped). CentOS Stream sources are RHEL sources.
Not exactly. RHEL is a point in time snapshot of CentOS Stream. There's no longer exact ABI parity between CentOS and RHEL like there used to be in the past.
The ABI for CentOS Stream can move forward while RHEL stays frozen in time at time of cutting the snapshot. Compiling/building a binary on CentOS Stream at a RHEL release + 1 may mean the binary does not run on that RHEL release.
So unless you know the exact timestamp that RHEL cut their release from CentOS Stream you no longer have the guarantee that it matches RHEL.
Whereas before CentOS WAS an exact copy of RHEL and if I compiled a binary on CentOS 8 it would work without issues on RHEL 8.
I'm only six months into the "Linux world", and I don't understand what Ubuntu's competitive advantage is. I've only used Debian and Ubuntu (and Parrot OS), and I don't see anything missing in the former (that can't be added manually).
A lot of the responses you're getting mention stuff that is only relevant if you're running a server or maybe on corporate desktops. If this is just your desktop machine for your personal use... the key thing to understand is that Ubuntu used to put far and away the most effort into polishing desktop scenarios, and they acquired a huge user base that way.
That user base didn't make them any money, plus a bunch of derivative distros showed up to compete with them for it. So these days they've pivoted to a server focus and it's still an OK desktop distro with one of the largest user bases, but it's not head and shoulders above the competition like it was 10 years ago.
Personally I've been running Ubuntu on the desktop for over a decade, but the main reason I haven't moved to Mint is... emmhh.... I have better things to do than reinstall my OS? Next time I need to do it, that may be all she wrote for Ubuntu.
I'm not surprised that's the case. For probably 15 years Ubuntu was the Linux distro for a variety of reasons, most simply because it was the most user friendly. Desktop Linux in the old days was a mess of different layers from different groups which didn't quite mesh together. I recall using some distros over a decade ago where you had three or four different Settings GUIs to look through to figure out how to customize something. Ubuntu, while not perfect, was far ahead of its peers. If you go on Steam and look at system requirements for Linux games, they still list the recommended Ubuntu version, that's how widespread it was.
Now though? The rest caught up. Even in the last five years things have become incredibly painless across most desktops and environments. Ubuntu still has the big (for Linux) brand recognition, but on the desktop isn't anything special anymore, and if anything has begun alienating its users with forced snap packages, Amazon bloatware (although I think that was removed finally) and forced system upgrades.
As someone that has used Linux for 20+ years that is not my experience. Rather the UI experience goes in waves. Sometimes it's easier and sometimes it's more difficult. There were polished desktops even before Ubuntu existed. Everything is in constant motion, as paradigms come and go, and new hardware needs new drivers. Usually the worst experience is right between paradigms, where you need to use some things from the old way and other things from the new way.
Their advantage is the amount of work they've put into making Linux usable on the desktop, before the community decided to tear into them like it always does with its biggest OSS contributors.
As always, people have forgotten just how much of an improvement Ubuntu was on other distros in stability, UI and feature set.
Ubuntu uses more up-to-date packages than Debian and has more frequent releases. Canonical provides a whole host of various services, so it depends on what you want from a Linux distribution. If you value stability over new features, then Debian is a better choice.
Ubuntu LTS releases come at the same two-year cadence as Debian Stable. There are also the non-LTS Ubuntu releases, but those are supported for only about nine months (and the installation files are deleted from the main mirrors soon after support ends). That can be a problem if you don't upgrade to the new version fast enough: you get stuck with a system that can neither be updated to a future version (LTS or non-LTS) nor downgraded to a previous LTS, so you must reinstall everything from scratch. This happened to a colleague of mine; it was partially their fault for not updating, but Debian's release process doesn't let this happen.
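As a hedged aside: a dead interim release can often still be rescued, because Canonical moves end-of-life releases to old-releases.ubuntu.com. This is a sketch, not guaranteed to work for every release, and it assumes the standard archive URLs in /etc/apt/sources.list:

```shell
# Repoint apt at the EOL archive:
sudo sed -i -e 's|archive.ubuntu.com/ubuntu|old-releases.ubuntu.com/ubuntu|g' \
            -e 's|security.ubuntu.com/ubuntu|old-releases.ubuntu.com/ubuntu|g' \
            /etc/apt/sources.list

# Bring the system current within its release, then try the upgrade path,
# one release step at a time:
sudo apt-get update && sudo apt-get dist-upgrade
sudo do-release-upgrade
```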
My experience is: don't run Canonical-developed software (as in, it originated from Canonical). It almost always breaks, gets dropped, or they decide to screw around with the licensing. I think early on they tried to copy Fedora (e.g. having the latest packages by pulling from Debian unstable, six-month releases), but now they're going for the enterprise (and trying to copy RHEL), and they lack Red Hat's culture to pull it off.
They have more frequent LTS releases than RHEL, many software vendors release packages for Ubuntu or offer package repositories, and they haven't taken away their free product or turned it into something incompatible with their enterprise offering. Everything "value add" Canonical tries is a failure or negative though (snap most recently)
The desktop in Ubuntu is a bit more polished than in Debian; this has nothing to do with system features or stability, it's just some more convenience for new users, and nothing that can't be put on par by changing some settings or installing standard packages. On the business side, the only advantage of Ubuntu over Debian is that Canonical, the company behind Ubuntu, is a company, while Debian is a community. Companies usually want to talk with other companies rather than with communities that have no business structure. They are of course wrong, but by the time they realize it, it's usually too late.
I switched to Ubuntu back in 2008, and have used it on and off since then. When it first arrived, it was IMO significantly better than Debian for end users:
Debian was notoriously out of date. Debian Stable mostly contained packages that were several years out of date, and even Debian Testing often didn't have e.g. recent drivers and browser versions in its repo. Debian Unstable, on the other hand, was too unstable due to being rolling release. My subjective experience was that things like getting MP3 codecs to work or nVidia drivers installed was also a huge hassle. Ubuntu was based on Debian, but ensured that the latest versions of all these things either just worked, or could be easily enabled.
Secondly, I remember Debian as being largely a "do it yourself" option, not so different from what an Arch user today might be after. You used a curses-based installer, selected what packages to install manually, customized everything from the terminal, and so on. Ubuntu offered a cohesive desktop experience - it gave you a pre-configured Gnome desktop with all apps and drivers an everyday user might need, and slapped on a (subjectively) aesthetic and coherent theme. The installer itself was also a friendly GUI that even non-techie friends could use. If you wanted to install it, you didn't even need to know how to flash an ISO, you just put your address on the Ubuntu website and they literally mailed you a CD containing the installer - and nearly anyone could install Linux using it.
I think "peak Ubuntu" was around 2012, after that they became too large and things started going downhill. They split the Gnome user base by creating Unity instead of contributing to Gnome Shell, they split the post-X11 development by pushing Mir instead of contributing to Wayland (I think related to the failed "Ubuntu Phone"), they added Amazon ads (with privacy concerns) in the default desktop experience, then they started pushing Snaps, etc. It's been largely downhill since then, but they had already become the "default Linux distribution" targeted by companies like e.g. Steam, so they kept chugging along.
Personally, I still think Ubuntu works quite well if you want something Debian-like that is release-based (not rolling release) but updated more frequently. In that case, though, I prefer installing Ubuntu Server edition and adding my own packages via apt: you get all the compatibility of running a mainstream distro, but you avoid most of the clutter in the default desktop installation. There are also many forks, like Pop!_OS and Mint, that can be used instead if you want something that continues in the original spirit of Ubuntu.
I've been using Linux for almost a year and have had a similar experience.
From what I've read, Ubuntu was easier to install in the past, but today (at least with Debian 12 Bookworm) Debian is just as easy to install and use, IMO. I've actually had fewer problems with Debian than I did when I was using Ubuntu.
In the past, Ubuntu had a little pop-up when you booted with Nvidia graphics that installed the correct drivers, and it just worked. Debian required you to install them manually, with all the fun that involves.
All the other distros have caught up, so there is less to sell Ubuntu on, but it was one of the first. They also had their One Hundred Paper Cuts project, where they spent a lot of time fixing everyone else's bugs and submitting patches.
> I don't understand what Ubuntu's competitive advantage is.
Historically, Linux has always had more distros than the open source community (and the closed-source-but-targeting-linux community I guess) can support to a high standard.
This has meant there is always one leading distro, which everyone packages for and documents for and guides newbie users towards. For a long time this was Red Hat Linux; they decided not to do that any more right about the time Ubuntu came out.
Ubuntu has held onto that position up to this day.
Is it still true? I recently discovered Debian's live installer, and it seems quite apt. Now that Debian installers ship with non-free firmware, that difference is gone too.
Their advantage is that they have the largest desktop user base. Access to those users or being the de-facto default desktop base, can be turned into income.
They built up their user base many years ago by providing some level of resources and coordination of desktop usability on top of Debian's unstable release, thus becoming something of the default target for application developers who wanted to support Linux. Debian never had the polish that Ubuntu did, with its catchy animal-based nicknames and six-month release cycle; Ubuntu built everything on top of the Debian community and just provided a bit more testing and marketing. Ubuntu shipped its first release, Warty Warthog, almost 20 years ago, and I switched to it as my primary machine around that time. I have since moved to Pop!_OS by System76, a desktop-focused distribution built on top of Ubuntu, created when System76's engineers decided Ubuntu was straying from supporting desktop Linux.
I use Ubuntu on all of my work and personal machines. The reason I like it is simple: if you search "$PROBLEM ubuntu" you're much more likely to get a useful result than if you search "$PROBLEM $OTHER_DISTRO."
I used Debian for several years before moving to Ubuntu. The reasons for that move are twofold: First, I bought an Nvidia graphics card. Second, Ubuntu has newer packages. Sometimes I need a very recent release of something, and Ubuntu is far more likely to have that. I also like Snap as a failover from apt: the only time I use snap is when I run into a situation like this:
> shit, this package is broken and there's about 200 steps I'd need to adapt from ArchWiki to fix it. You know what? I'll just snap it.
I am strongly considering going back to Debian, though. I run a kinda weird setup on my personal machines (I start with Ubuntu Server, then hand-install graphics drivers, xorg, i3wm, st, etc etc) and a lot of Snap packages seem to assume I'm using GNOME even though I'm not.
I stopped dealing with anything to do with Canonical the moment I saw their job ads require you to put in your high school GPA. Yes, even if you're a senior+ engineer.
These questions don't get placed on their job ads by chance and they aren't done in a vacuum; there is some form of approval at the leadership level.
If they ask that question, then they make a judgement based on the answer. If you're a company making such a judgement on such a question for such a position, you're a ridiculous company.
By HR. HR decided to add them. They make a judgement based on how well you did in education, which they take to imply how smart you are.
But as I said, HR added them, and refusing to use the tech because of how HR works is a bit ridiculous. "I don't use your product because I was going to apply for a job and your questions were stupid" is weird as hell, since their product isn't built by HR.
I don't know the strict offline use case for such ISOs, so put me in the "sounds good to me" group. As a 15+ year Ubuntu user on the desktop, on Raspberry Pis, and with Ubuntu as my preferred image everywhere I run cloud servers, I would rather resources be spent on the normal online channels.
Once upon a time, in a galaxy far away, the sources of coding voodoo demanded distribution. But broadband internet was rare and you would die of old age waiting on dialup internet. So it was that the preferred method of source distribution and acquisition was CD-ROMs (and later DVDs) driven down the highway in a station wagon.
Ubuntu (like Debian) has always had online source repos available; they are in no way restricting access to sources here, just deprecating a legacy channel for delivering them. See the `apt-get source` subcommand.
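A sketch of what that workflow looks like, assuming a deb-src line is already enabled in /etc/apt/sources.list; `hello` is just an example package:

```shell
# Fetch and unpack the sources for a package:
apt-get source hello
# This downloads the .dsc control file, the upstream tarball, and the
# Ubuntu/Debian packaging changes, then unpacks them into hello-<version>/.

# Inspect the package version recorded in the control file:
grep '^Version:' hello_*.dsc

# Rebuild the binary package from those sources (build dependencies first):
sudo apt-get build-dep hello
cd hello-*/ && dpkg-buildpackage -us -uc
```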
It also made sense when you could reasonably expect to be given Ubuntu on a disc: you would be within your rights to ask for the source files in the same format, assuming you didn't have internet access at all. It was a matter of license compliance.
I'm not even sure that would have been a useful distribution method for the first Ubuntu release, in 2004. Waiting to receive a posted CD would take longer than downloading even KDE or the kernel's source code.
A posted CD in Germany took a day to arrive (when sent from Germany, and there were people who'd send it locally), downloading the source code for a few large packages could easily take a week on dial-up.
In 1998 maybe, though 1 hour at even 48kbit/s is 20 megabytes, which is a decent-sized program's source. (The Linux kernel source in 2003 was about 25MB, compressed.)
What's so special about 2004? My parents finally got on DSL in 2008, but until then we — and many friends I knew — were still on dial-up. (Using the good old web.de smart surfer tool, if anyone still remembers that)
And remember, if you wanted to customize something you'd usually need not just one program's source, but also the libraries. Downloading all of Qt and most of KDE to customize something and build it from source took a loooong time.
I was using dial-up up to 2008, included. It could push 10 megabytes an hour at best, and even that only late at night. An hour of dial-up was on the order of 0.05-0.1% of a typical monthly salary back then.
So even if you were willing to spend 10% of your monthly income (which is impractical, but this is an extreme example), it could buy you maybe 1-2 gigabytes of data. Or you could just, you know, order a (source or binary) CD for free or almost free.
You could often get much cheaper connections, during odd hours in the middle of the night. But then you'd have to chunk the transfer over days or weeks.
I think a more common case than postage was downloading the ISO on a fast office connection and then burning a CD-R to carry home; this was the era when CD-Rs were the widely available, convenient large-capacity portable media. Though that was already waning: it was around that time that multi-gigabyte flash drives started appearing.
That is probably the best reason these CDs were made — legal requirement.
GPL version 3 relaxed that requirement: you can distribute binaries however you like and satisfy the source obligation by directing people to a network server (HTTP/FTP). GPLv2 generally required source through the same channel as the binaries, or a written offer to provide it.
Monthly computer magazines included CDs with all kinds of software (open source, shareware, freeware, older releases of paid software) and linux distributions. Different software each month, Ubuntu maybe in the issue after an Ubuntu release.
That is significant. I've got to admire the maintainers stepping up to the plate for this.
It does look like so many changes have gradually been made to other things while this most fundamental boot device was neglected, so a bit of catching up is now needed to eliminate new technical debt in areas where the long-standing floppy code may have performed defect-free in the past.
I wonder how changes like this would affect bootability, or accessibility of non-floppies like HDD, USB, SSD when formatted floppy-style without partitioning?
How many different boot devices are there anyway? Would a loss of one forever reduce the number of choices unthinkingly?
I always thought floppy performance was engineered to be highly reliable before moving on to more complex filesystems and hard drives in the 1990's.
Why? Have you ever used the source ISOs? They don't seem to be a very practical way of working with the sources, as opposed to downloading the source packages from the online repositories.
I don't like what Ubuntu have become but this does not look like a good reason for trashing it.
At a previous company, we developed our own custom OS on top of Ubuntu [1], and patched a lot of components. Getting the sources was the first step in building our final product, so at first glance this closes the door on doing that (easily).
I can't remember precisely what is on the source ISOs, but if you are building a custom version of Debian you likely want the upstream package source plus the Debian changes to that (from things like dgit if the package maintainer uses it). Since these source ISOs didn't contain things like the snap packages you'd need other things anyway to start making a downstream distribution.
> The Ubuntu sources will remain available and this isn't anything about closing off that, but rather whether the merits and ongoing maintenance burden is worth it for assembling source ISOs.
This is a poor reason to ditch Ubuntu, as almost no-one uses the source ISOs. The sources are still available online, just not as however many ISOs are needed (9?).
I've been using Ubuntu at home and work for many years as it's at a convenient point between bleeding edge (like Arch) and slow and stable (Debian), plus due to the large user base there's a lot of information on fixing issues and configuration instructions.
However, I'm not a fan of snaps, as they seem to cause more issues than they solve. (One of the reasons they're binning the source ISOs is that the ISOs don't currently include source for the snap packages, so rather than spend time fixing that for a download very few people use, they're just stopping producing them.) Since they moved Firefox to snap, I've had a few instances where Firefox wouldn't start. There are also issues with how Firefox interacts with the OS, so some extensions don't work properly. But probably the most annoying part is that you have to close a snap package for it to be updated. I generally have Signal and Firefox running all the time, and now I keep getting messages telling me to close them so the snap can update; it reminds me of running Windows.
TL;DR: ditch Ubuntu because of snaps, not because they're stopping the source ISOs that you never used anyway.
I'm currently sticking with using the Snap packages so that I can evaluate how well they work and the level of issues, but if they annoy me enough then I might well move back to standard packages.
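For the "close the app so the snap can update" nagging: newer snapd versions can hold refreshes. A hedged sketch, assuming snapd 2.58 or later for the per-snap `--hold` flag:

```shell
# Pause updates for one snap for a fixed window, or indefinitely:
sudo snap refresh --hold=72h firefox
sudo snap refresh --hold firefox
sudo snap refresh --unhold firefox

# Older snapd: defer all refreshes system-wide until a given timestamp:
sudo snap set system refresh.hold="$(date --iso-8601=seconds -d '+30 days')"
```

This only postpones updates rather than disabling them, which is probably the right trade-off for a browser.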
Unfortunately all Linux distributions seem to have issues for me. I just want to install something and have it work.
Ubuntu doesn't like my 4 TB drives, and Nvidia driver support is horrible on Fedora. Fewer people using Linux makes it harder and harder to find helpful tutorials and guides. Kind of sad.
Definitely not pointing fingers (I'd take Linus's stance of giving Nvidia the middle finger).
Just from the point of a consumer trying to get work done on my machine and deliver things in a reasonable time frame.
I expect to be able to flash a USB, plug it into my computer and have things work. The default settings for installing Ubuntu do not let you use 4tb drives without custom partitions, not ideal.
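The 4 TB limit is almost certainly the partition table: MBR addresses at most 2^32 sectors of 512 bytes, i.e. 2 TiB, so larger drives need GPT. A minimal manual layout as a sketch; /dev/sdX is a placeholder for the actual device, and this destroys everything on it:

```shell
# GPT label, an EFI system partition, and one big root partition:
sudo parted --script /dev/sdX \
  mklabel gpt \
  mkpart ESP fat32 1MiB 513MiB \
  set 1 esp on \
  mkpart root ext4 513MiB 100%
```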
And RPM Fusion straight up doesn't work with Fedora for me; I had to boot into a CLI and spent a few days trying to get the closed-source drivers working, with no success.
I've generally had success with the RPM Fusion instructions when installing Nvidia drivers.
One part that is strange is that the kmod gets built in the background, and you don't really know if it is still building except by watching the process/logs.
If you reboot immediately after install, sometimes you'll break it. Not a great experience by any means.
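A sketch of the sequence that has worked for me, following the RPM Fusion instructions; the polling helper at the end is my own addition to avoid the reboot-too-early trap the parent describes:

```shell
# Enable RPM Fusion (free + nonfree) and install the NVIDIA kmod:
sudo dnf install \
  https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
  https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
sudo dnf install akmod-nvidia

# The kernel module builds in the background; rebooting before it finishes
# leaves you without a driver. Poll until modinfo can see it:
wait_for() { until "$@" >/dev/null 2>&1; do sleep 5; done; }
wait_for modinfo -F version nvidia
```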
I just installed ubuntu on a computer with 1 2TB NVME drive and 2 4TB SSDs in it and had absolutely 0 problems. One ext4 partition for each of the 4TB SSDs.