Why would you recompile X11? Back in the '90s and early '00s, I compiled kernels to make them lean and to enable functionality that wasn't in the default kernels. But I never saw anyone recompile X11 outside Gentoo and other source-based distributions.
I remember Gentoo Linux being quite the rage in the early 2000s (at least in my office). Compiling everything and getting your system up and running was a badge of honor, I guess.
I remember booting from a Knoppix live CD and using that to install Gentoo so that I could use a web browser, IRC client, and GAIM to keep in contact with friends while the full-day process of the stage1 install worked on my old, slow computer. I remember not including GNOME or KDE in my ebuild flags so that the build took less time, and then using WindowMaker as an X11 window manager because it took less time to compile and ran faster than trying to run GNOME or KDE on that old machine (PII 400, 96 MB RAM, 8 MB ATI onboard video back around 2003 or so).
Gentoo taught me so much. When their documentation went through that weird phase where things went missing, I dropped off and my Linux knowledge declined. I stopped using it and lost track of what's trendy nowadays.
Compiz times, with the cube desktop and kernels compiling overnight on my Pentium 4, kept me away from making out with girls many times.
For quite a while, if you wanted to learn how a Linux system really operated, you'd build a Gentoo system.
Eventually, you'd get tired of all the options and switch to something more stable, especially for servers. I have some fond memories of Gentoo and emerge and compiling all of my software just so. Sadly, it was never very stable... and not really through any fault of its own. Really, the customization you could do was great... but there was always one more thing to tweak, one more knob to turn...
Badge of honor -- yes. I'd almost call it a requirement for someone to work through once or twice.
It wasn't watching the compiler output... it was choosing the components. You'd need A, B, C, etc., and for each category there was often more than one choice. You had to choose which syslogger you'd use, for example. With RedHat or SuSE or other distributions, those choices were already made, and you might not have otherwise known what options were available.
Imagine starting out with Linux today and not knowing that systemd isn't the only option for an init system. (Regardless of whether or not you like it, it's helpful to know what alternatives exist).
In the end, with Gentoo, when you had your config set, yes, you'd get hours of compiler messages. And if you were lucky, none of them would be errors.
But you'd also know how the system worked. Honestly, it was also about control. With Gentoo, you could configure the system exactly as you wanted, down to the compiler flags. How many other systems let you really do that? Instead of targeting a well-known arch (ex: i686), Gentoo let you set your compiler flags for the entire system to match your exact CPU. The upside was that it was your system. The downside was that it was your system and if/when it broke, you'd have to figure it out. If your goal is to learn how to use Linux, that's also a feature. If your goal is to have a stable server, not so much.
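Concretely, most of that system-wide control was a handful of lines in /etc/make.conf; something like this (values from memory, purely illustrative rather than recommended):

    # /etc/make.conf -- illustrative values, not a recommendation
    CFLAGS="-O2 -march=pentium3 -pipe"   # tune generated code for your exact CPU
    CXXFLAGS="${CFLAGS}"
    MAKEOPTS="-j2"                       # number of parallel build jobs

Every package you emerged afterwards would be built with those flags.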
Like the original parent commenter, I was playing with Gentoo back in the early 2000s, so things have probably changed a lot since. But I definitely learned a lot back then.
Configure output isn't compiler output, so I guess we've got to the bottom of this mystery.
> With Gentoo, you could configure the system exactly as you wanted, down to the compiler flags.
This doesn't give you the control you think it does. Most of the customization you refer to is basically adding a USE flag to make.conf and running "emerge whatever" again, which I maintain isn't much of a teaching tool.
You end up learning about Gentoo, not software, compilers, operating systems, or computers.
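To be concrete, the whole loop was roughly this (the flags are placeholders, and I'm writing the modern spelling):

    # /etc/make.conf
    USE="X alsa -kde -gnome"     # turn optional features on or off globally

    # then rebuild whatever the changed flags affect
    emerge --ask --update --deep --newuse @world   # plain "world" on the portage of that era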
> Instead of targeting a well-known arch (ex: i686), Gentoo let you set your compiler flags for the entire system to match your exact CPU.
This is a good example. So many people ended up thinking (incorrectly!) that this benefited them in some meaningful way, and thought that they were learning and customizing, when they still don't know, even a decade on, that '-O2' and '-O3 -march=native' are for almost all tasks indistinguishable from a performance standpoint, or how to even go about measuring the difference.
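For what it's worth, measuring it is not hard; a rough sketch (the source file and workload here are made up, substitute something you actually run):

    # build the same program twice with different flags
    gcc -O2               -o bench_o2     bench.c
    gcc -O3 -march=native -o bench_native bench.c

    # time a fixed, representative workload with each binary
    time ./bench_o2     < input.dat > /dev/null
    time ./bench_native < input.dat > /dev/null

Run each a few times; for most workloads the difference disappears into the noise.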
Gentoo is like a show on the History Channel: it sure feels like you're learning!
Compiling and installing large amounts of system software, a la `emerge world` or `make buildworld`, is great exposure to many system components. `make menuconfig` introduces one to various features of the Linux kernel, and yes, even a humble `./configure` illustrates how the software in question depends on libraries and hardware. I wouldn't casually dismiss the educational value of these experiences, nor the curiosity of those partaking. They're certainly more expository than the digests displayed in a `docker pull`.
I built a Gentoo system once or twice, and I learned a lot that I otherwise wouldn't. Even just following the directions forced me to go to parts of the system I otherwise wouldn't have.
Now I use Macs on the desktop and Linux on the server.
I've had my fair share of X11 builds at different work places.
The typical use case is when you have to work on some Linux dev box that doesn't have X11 at all (or only a fairly old one), and the distribution is either too old to get a newer one, or you're simply not root.
In these cases, the simplest (though annoying) solution is to rebuild X11 and a window manager from source on the box as an unprivileged user.
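The recipe is just the usual autotools routine pointed at your home directory; roughly this (paths are arbitrary, and a modern modular X.Org means repeating it per module):

    # build and install into $HOME instead of /usr -- no root needed
    ./configure --prefix="$HOME/opt/x11"
    make
    make install

    # then point your session at the private install
    export PATH="$HOME/opt/x11/bin:$PATH"
    export LD_LIBRARY_PATH="$HOME/opt/x11/lib:$LD_LIBRARY_PATH"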
Given that OP mentioned he was doing his studies, I guess he was required to work on some old boxes and wanted a decent, modern environment.
I remember recompiling X11 around the time freedesktop was getting started, because features like XRender, XFT, etc. were coming online and I didn't want to wait for my distro to update. Having decent fonts was that good.
I certainly remember rebuilding X to get it to work with a new graphics card. Normally it took a little investigation to figure out the changes I'd need to make to the card-identification code, and sometimes a few other small changes.
I'm guessing that at some point in the distant past the distribution you were running didn't build X with the options you needed. That wasn't normal 18 years ago, and it certainly isn't now.
A Linux Mint install normally consists of a friendly GUI installer followed by installing common software from an app store interface. It's friendlier than installing Windows.
When's the last time you installed Windows? I installed Windows 10 about three weeks ago, and I:
- Plugged a USB key & Ethernet cable into my PC
- clicked through a handful of GUI options
- Made a coffee
And when I returned (~15 minutes, I didn't time it), it had installed Windows, done the post-install reboot crap, and was ready for me to install my own software. Out of the box I had internet connectivity, power management, and semi-modern graphics drivers (< 3 months old), and was ready to rock.
I had a fresh install of Windows from six months ago that committed unrecoverable suicide a few weeks ago after an update. The filesystem was fine, but it was no longer able to boot, and absolutely nothing had changed save for the update.
Going through the recovery tools built into Windows was pretty easy, but nothing worked, including "refreshing" the OS, which is basically just a reinstall that keeps your files, and ultimately I had to start from scratch.
When I did the reinstall I decided to switch from legacy to UEFI boot and enabled that. The Windows installer tried to get me to link my account to my imaginary Microsoft account, and opting out of that is designed to be confusing. Then it tried to get me to enable invasive telemetry with promises of functionality I didn't care about. It wouldn't have worked anyway, because without changing another peripherally related option in my motherboard's settings menu, Windows consistently malfunctioned when networking was enabled, in a way that was not an issue under Linux. An hour later, after thinking I had somehow created the Windows install USB with the wrong option, I figured out what was wrong and finally had Windows working again.
In the course of six months, Windows had to be installed twice, took over two hours of my time in total, and tried to trick me into tying my ability to use my own computer to Microsoft's permission and into giving up my privacy.
The only reason I bother to keep it around is that it's still easier to game under Windows. Might as well call it XboxOS, because it's surely unsuitable for any other use.
I had a non-standard monitor (mid-'90s) which would not work with xf86config out of the box. I spent a nice summer trying various settings, and it was such an aha moment when it worked.
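For anyone who missed that era, "various settings" mostly meant hand-writing modelines in XF86Config from the numbers in the monitor's manual; a standard VESA entry looked roughly like this:

    Section "Monitor"
        Identifier   "MyMonitor"
        HorizSync    30.0-60.0     # kHz, from the monitor's manual
        VertRefresh  50.0-75.0     # Hz
        # name        clock  horizontal timings      vertical timings   flags
        Modeline "1024x768" 65.00  1024 1048 1184 1344  768 771 777 806  -hsync -vsync
    EndSection

The documentation of the day warned that an out-of-range modeline could damage some monitors, which made getting it right feel like an achievement.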
I have much gratitude for how much I learned. Apple has made great money off my desire to never do that again.