I have to admit I'm surprised to find that anyone is still trying to make desktop Linux work for normal people, i.e. non-programmers who aren't free-software fanatics. I think to myself, don't they realize that it's pointless, that the year of the Linux desktop will never come, that Linux will always be the distant third-place desktop OS?
More constructively, I wonder if recent Linux distros and desktop environments actually run better on 10-year-old hardware than Windows 7. My guess is that GNOME 3 and Unity require roughly as much CPU power and RAM as Windows 7. If that's the case, then maybe kids like the one in this story would be better served by a donated Windows 7 system builder license to go with that old hardware. Then they wouldn't be frustrated at a thousand points by someone's well-intentioned but misguided choice of OS.
> I have to admit I'm surprised to find that anyone is still trying to make desktop Linux work for normal people, i.e. non-programmers who aren't free-software fanatics.
As far as day-to-day use goes, modern desktop Linux environments are just as usable by end users as Windows.
The main impediments to adoption tend to be:
1. Software that they need that only runs on Windows or OS X
2. Syncing to devices like iPods or iPhones
3. Lack of easily accessible support infrastructure for when things go wrong, major version upgrades, etc. You can generally take a Windows machine to Best Buy or some other local computer store, or a Mac to an Apple Store, and they can deal with it, but there's little accessible support for the complicated cases on Linux unless you have a friend or relative who's into Linux.
4. Lack of support by IT departments.
So if you have a situation like in the original story, where someone is willing to provide that support voluntarily, and you just need a home computer with internet access and no specialized software, a modern Linux desktop will do just fine.
> My guess is that GNOME 3 and Unity require roughly as much CPU power and RAM as Windows 7.
Lightweight desktops like LXDE and Xfce run fine on older hardware and are much lighter than Windows 7. If you read the article, you'd see that it was running LXDE, not GNOME or Unity.
I have found NetworkManager to be no more difficult to use than setting up networking in Windows. For simple cases, it tends to just work; for more complicated cases with multiple interfaces, it takes some fiddling but can be made to work, and the amount of fiddling is pretty similar to what Windows needs. I've found OS X to require less fiddling for the simpler and moderately complex cases, but it has problems of its own once you get to the more complicated ones.
It always "just works" at first. The problems start creeping in after a few weeks or months, when NetworkManager inexplicably vomits all over itself with dropped connections, unexplained failures to connect to wireless networks, and duplicate wireless entries (among many, many other things) when the underlying components (namely, wpa_supplicant) work perfectly fine on their own on the same hardware.
Are you trying to imply that other OSes don't face issues like these? A simple Google search for "wifi broken after upgrade [insert OS]" will get you thousands of results.
I'd say I had worse wifi issues with Windows than Linux.
This isn't a "oh no, I upgraded and it broke". This is a case of "oh no, it simply stopped working properly without any perceptible cause".
Yes, I'm fully aware how broken Windows' wifi implementation is. NetworkManager, however, tends to make the Windows equivalent look wonderful in comparison.
And yes, every operating system has problems. That's not my point. My point is that we (the free-software-loving community) can do better than the current state of NetworkManager. It's the one pain point that seems to be pervasive across GNU/Linux distributions, and pretending that "oh, every operating system's wireless stack is broken, so we shouldn't be worried" is delusional. Even if NetworkManager is "just as bad" as the others, why shouldn't we be striving to make something better?
Well, I've never had any of the problems you're experiencing on Linux in the past three years. The only issue I've had was a driver not being included by default. I've had issues on OS X and Windows, though, and they put billions of dollars into their operating systems. So if they can't get it right, maybe the problems go beyond the code and into the actual hardware?
Maybe you're just lucky. Or perhaps I'm just unlucky.
Whatever the case, I know it's not the hardware; wpa_supplicant (the underlying software that allows NetworkManager to function) has (in my experience) generally worked much better than NetworkManager, and OpenBSD's networking stack is also pretty flawless in my experience. So it's something about NetworkManager specifically that needs to be fixed or replaced.
And note that my criticisms about networking are not about Linux as a whole; on the contrary, they are strictly about NetworkManager. You probably understand this already, but your comment seems to be phrased in defense of GNU/Linux as a whole, as if I consider the failings of NetworkManager I've experienced to be the fault of the kernel or somesuch - which is very much not the case - so I figure it's worth clarifying just in case.
I fear that at least one of the big DEs has contracted a severe case of developer paternalism.
In essence the developers involved are attempting to encode every use case they can think of into the program logic, and anything outside of that gets met with a "why would you want to do that?!".
I've been putting off learning it for years and years and years. Finally decided if I'm going to keep doing this as a career I'd better figure out how to use the new tools.
I have a new CentOS 7 install on a laptop at home.
I spent over an hour trying to get the wifi to simply connect to the network. Much of that was me trying to learn the NetworkManager (nmcli) syntax, which is nigh-on impossible to decipher. The help is useless, and apparently the syntax changed drastically from 0.9 to 1.0; not knowing either version meant that 90% of the hits I found on Google were utterly wrong and used deprecated flags.
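For reference, the syntax that current (1.0-era) nmcli expects looks roughly like this; the device name and SSID below are placeholders:

    # list visible networks (the wifi device shows up in `nmcli device`, e.g. as wlp3s0)
    nmcli device wifi list
    # connect; NetworkManager creates and saves the connection profile itself
    nmcli device wifi connect "MySSID" password "mypassphrase"
    # confirm
    nmcli connection show --active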
I finally broke down and tried to use nmtui to solve my nmcli woes, but it only made things worse.
The installer was nice enough to create an ifcfg-<myssid>, but when I try to ifup that interface, the system complains there is no DEVICE for it. So I added a DEVICE= line corresponding to my wireless adapter (which nmcli sees), but then the system complains that my wireless device (WHICH NMCLI SEES!) does not exist.
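For context, a minimal wireless ifcfg file on RHEL/CentOS looks roughly like this sketch (every value below is a placeholder, not my actual config):

    # /etc/sysconfig/network-scripts/ifcfg-myssid
    TYPE=Wireless
    DEVICE=wlp3s0
    ESSID=myssid
    MODE=Managed
    BOOTPROTO=dhcp
    ONBOOT=yes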
I believe I have traced the problem to a bug report I found on Red Hat's bug tracker -- I needed NetworkManager-Wireless installed in order for it to use my wireless adapter. Why lspci sees my adapter, the OS names it, the firmware is installed and up to date, and nmcli even sees the adapter as a wifi adapter, yet it still can't use it, is beyond me. nmcli wireless scans constantly failed (which is what led me to the bug report). The bug report says the issue is fixed and that better guidance about needing NetworkManager-Wireless is provided as of <some version>, but that's probably wrong, since I have the latest CentOS 7.1 (unless the bug report is really recent).
Regardless, after adding the EPEL repo and installing the wireless-utils package, which provides iwlist, I was able to scan for wireless networks.
NetworkManager was still unable to scan, so I did a network restart, hoping I could solve the issue without rebooting the whole box. Sadly, I lost my wired connection to the system at that point, so I'll have to wait until I get home to play with it some more.
Oh, and by the way -- even though my wired interface is set to "ONBOOT=true", it did not, in fact, come up on boot, because the cable was disconnected. After I connected the cable, it took me 5+ minutes to realize I had to restart the network service to get the connection to function, which is not how it worked pre-NetworkManager.
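For anyone curious, that restart amounts to something like this (the interface name is a placeholder):

    # kick the legacy network service...
    systemctl restart network
    # ...or ask NetworkManager directly to bring the device up
    nmcli device connect enp0s25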
Suffice it to say, the software is massively, massively sub-par, and far less user-friendly than the good ol' "editing text files" way. I suspect I am soon to have many more similar headaches as I dive head first into the world of systemd (another thing I've been avoiding).
Fortunately, we get to do really exciting things now that NetworkManager has taken over. I can CD to things, and set properties and methods, and all kinds of shit that makes me feel like I'm using Fujitsu's deplorable ALOM:
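(For illustration only - this is not the original snippet - a session in nmcli's interactive connection editor goes roughly like this; the connection name and address are placeholders:)

    $ nmcli connection edit "Wired connection 1"
    nmcli> goto ipv4
    nmcli ipv4> set method manual
    nmcli ipv4> set addresses 192.168.1.10/24
    nmcli ipv4> back
    nmcli> save
    nmcli> quit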
It's a buggy pile of shit, for starters. It constantly drops network connections, frequently floods your network history with duplicate entries for no apparent reason, and often mysteriously fails to connect to a wireless network even though other networks work fine (and other devices - even other NetworkManager-using ones - can connect to the network in question without issue)... that sort of thing. I know the underlying utilities (namely wpa_supplicant) aren't the issue, since wpa_supplicant is - in my experience - much more robust when configured manually, with far fewer problems, though at the expense of having to edit wpa_supplicant.conf to manage wireless networks, which is a bit of a pain (but perhaps worth it to avoid dealing with NetworkManager).
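For reference, the manual setup I mean is essentially this (SSID, passphrase, and interface name are placeholders):

    # append a network block for the SSID to the config file
    wpa_passphrase "MySSID" "mypassphrase" >> /etc/wpa_supplicant.conf
    # start wpa_supplicant in the background on the wireless interface,
    # then grab a DHCP lease
    wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf
    dhclient wlan0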
If someone figures out a way to port Android's network management system (I'm not sure exactly what it is off the top of my head, but I know it uses wpa_supplicant and that it's not NetworkManager) to general-purpose GNU/Linux distros, that would be a godsend.
OK I must be really lucky. I find NetworkManager just works. It does ask for my admin password when it doesn't need it but I find that if I just ignore that prompt then everything is fine.
The more common issue with passwords is that it incessantly prompts for the user's password or for the wireless network passphrase due to some strange inability to talk to whatever keyring daemon is in use (and thus an inability to retrieve network passwords). I usually work around that by disabling the keyring, either globally or by simply preventing NetworkManager from using it.
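One way to do the latter (a rough sketch; the connection name is a placeholder) is to have NetworkManager store the passphrase itself, system-wide, so the keyring daemon is never consulted:

    # psk-flags 0 = the secret is system-owned (stored by NetworkManager),
    # not agent-owned (stored in the user's keyring)
    nmcli connection modify "MySSID" 802-11-wireless-security.psk-flags 0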
It tries to bite off a lot more than it can chew, and fails horribly for anything that isn't a simple wifi config.
1. Makes fixed wireless connections less reliable. I plugged an LTE modem into a server for a backup link, and network mangler set it as a default route and munged up resolv.conf. My mistake for not removing it when I installed the machine, but that's hardly an endorsement.
2. Makes complex config out of reach. On my laptop, I'd like wlan0/eth0 to be bridged by default so that I can easily switch existing connections to a wired connection by plugging in. But I'm not going to fight network mangler just to get this feature.
3. Furthermore, lots of times I'll want to set up an ad-hoc config on the wired interface (say, configuring a new router), for which it's easier to kill -9 the mangler and set things up manually, especially if I want to keep my Internet connection (a rough sketch of this follows after this list).
4. The GUI is the preferred way to configure and interact with it, which is completely un-Unix. It's one of the few things I have to do manually when reinstalling a machine.
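As promised above, a rough sketch of the manual ad-hoc approach for item 3 (the interface name and addresses are placeholders, and the first line assumes a reasonably recent NetworkManager):

    # tell the mangler to leave the wired interface alone, so wifi keeps working
    nmcli device set eth0 managed no
    # configure the interface by hand to talk to a factory-default router
    ip addr add 192.168.1.2/24 dev eth0
    ip link set eth0 up
    # the router's admin page is then typically at 192.168.1.1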
I played with Arch a while back, and wicd felt much simpler and more straightforward. Network Mangler feels like a hack someone came up with to set up a simple wifi connection, which then got adopted into a bigger role than it can handle.
At least as of version 12.04, Ubuntu [1] automagically detects my wireless HP LaserJet printer (albeit with some slightly reduced functionality, IIRC). My wife's Windows 7 laptop struggles with it: the install process was more involved, and the printer sometimes refuses to wake up when she sends a print job to it.
[1] The modern desktop-oriented distro I'm more familiar with. I'm sure other user-friendly distros handle it similarly well.
At the very most, it might involve having to navigate the CUPS web interface (which, admittedly, is a pile of shit; thanks, Apple) in order to get a printer set up. It still rarely involves having to install drivers (though some vendors are better than others; Brother and HP tend to be the most Linux-compatible, while Canon and Konica Minolta have tended to require driver installation on some distros).
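For reference, that web interface lives on the local machine itself:

    # CUPS listens on localhost port 631; printers are added under /admin
    xdg-open http://localhost:631/admin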
What installation? At the most, you'll be asked to install packages from HP and you click okay and you're done. On Windows, it's still likely that you'll need to download drivers from a website.
> I wonder if recent Linux distros and desktop environments actually run better on 10-year-old hardware than Windows 7
One of my main machines right now is an old HP workstation laptop that came out when Vista did. I can't run recent versions of Windows due to driver issues (random blue screens apparently due to the wifi driver; I only had a key for Windows 8.1 and the drivers were incompatible), but I can run Ubuntu perfectly fine out of the box with no special configuration required. In my opinion that is the biggest advantage to running old hardware with Linux - as long as you're not using oddball hardware there's a great chance it will run better than it would with Windows.
Linux distros like Mint are basically indistinguishable from Windows to sufficiently casual users. All they ever do is open up the word processor or the browser.
How many parents buy their kids expensive MacBooks because "they need a good computer for school", when the truth is they just bought them a $1300-$2500 Facebook machine?
The only people I've heard complain about Linux were in the middle range of computer skills: not so casual that they didn't care, but not so skilled that they could easily grasp Linux.
I find this demographic the hardest to deal with. Even on a platform they're familiar with, they're usually the first to complain about something being different (e.g. some UI elements moved around), yet they don't know how to fix it and won't even bother with a simple search to figure out the problem.
For some users, the year of Linux on the desktop has already arrived (barring some problems like the ones lambda mentioned).
Now, I won't use myself as an example, because even though Linux fulfills almost every need (gaming, watching videos, programming, browsing the web, printing, burning DVDs), I'm aware I'm a power user. So it's probably not useful to mention I haven't used a Windows computer at home in more than a decade.
Therefore, I'll use my mom as an example instead: save for a few exceptions, I could simply replace her laptop's Windows install with a user-friendly Linux distro (such as Ubuntu) and she wouldn't even notice. Maybe I'd have to tweak it a bit, install some useful software, and explain what some of the icons mean -- which is exactly what I did when I first installed Windows on her laptop. Honestly, the biggest hurdle would probably be explaining to her the differences between LibreOffice and MS Office.
I'm not a free software fanatic. I'm dual booting elementary OS with Windows. On my Windows installation, I have Office 365, Visual Studio, Microsoft SQL Server, etc. I also have 1TB of space on OneDrive. Everything completely legal of course.
But, I haven't booted to Windows since mid-April (and even then my Windows adventure lasted for like one hour or less, I just got bored and wanted to play one game I haven't played in a long time). elementary just gets the job done and it looks pretty damn sexy while doing it.
My laptop is currently running at 59 degrees Celsius. It's been up for 22 hours now and has been running Firefox, Thunderbird, Spotify, the Atom text editor, the MEGA client, a terminal emulator and a PDF viewer ever since I booted it. Of course, I wasn't using it the whole time, but it kept running while I did other things. I can't even compare that to Windows 8.1, which jumps to 75 degrees as soon as I boot into it.
Plus, elementary looks 10x better, and I don't have to search the web to find something like a freaking C compiler or an SSH client. I don't have to worry about the registry, and I don't have to worry about junkware being bundled with some legit program. I don't have to click the freaking Next button 15 times to install something. All I need to do is enter one command and my password.
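For example (elementary is Ubuntu-based, so these package names apply):

    # a C toolchain and an SSH client in one go
    sudo apt-get install build-essential openssh-client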
You know how you make Linux work? Never install pirated versions of any software on someone else's computer. When you start talking about how much they'd have to pay to use Windows and Office legally, they'll change their mind and use whatever you put on there.
That too, of course. But on our own machines we get to decide, and if you're like me, every computer in your house runs a flavour of Linux and you don't even think about pirated software or software licenses most of the time.
I love it when it gets to the part where they ask me how to download a YouTube video as an mp3 and I introduce them to youtube-dl, the terminal and a bit of .bash_aliases magic to make the youtube-dl command a bit easier. Their face: priceless =)
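The "magic" is just something along these lines (the alias name is whatever you like):

    # in ~/.bash_aliases: grab the audio and convert it to mp3
    # (needs ffmpeg or avconv installed for the conversion step)
    alias yt-mp3='youtube-dl --extract-audio --audio-format mp3'
    # usage: yt-mp3 "https://www.youtube.com/watch?v=..."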
I know I do; I've bought every version of Slackware since I started using it a couple years ago (which, granted, has been, like, three versions, but still). I've also bought a few OpenBSD CD sets, though I've slacked off a bit lately.
> ...normal people, i.e. non-programmers who aren't free-software fanatics.
Those people are not replacing their PCs. They are buying mobile devices. PC sales have been in a multi-year decline, and PCs sell at about one-fourth to one-third the rate of mobile devices.
> More constructively, I wonder if recent Linux distros and desktop environments actually run better on 10-year-old hardware than Windows 7.
This depends entirely on the desktop environment / window manager.
The machine described in the article uses Xfce. As a real-world example of its low requirements, I very recently set up an old Dell laptop running openSUSE 13.2 with Xfce; said laptop has a single-core processor and somewhere between 512 and 1024 megabytes of RAM (I forget the exact number). It runs like a champ, while a "modern" version of Windows would - in my experience - choke even with far more resources at its disposal. I've also run Xfce-based setups on even weaker hardware, though at that point you start running into more noticeable performance issues.
You can go even lighter than this, of course, by switching to a plain window manager instead of a full-fledged desktop environment. Tiny Core Linux, for example, uses FLWM (a lightweight FLTK-based window manager); combined with its use of busybox instead of the GNU userland, it's able to run on as little as an i486 with 42MB of RAM. That's the extreme end, but going a bit bigger, I've had good experiences with Openbox (and similarly lightweight window managers) on a full GNU/Linux system running on machines with Pentium II processors and less than 256MB of RAM.
Heavier desktop environments like GNOME, Unity, and KDE certainly have a bigger performance impact, especially due to their heavy reliance on compositing effects. However, in most circumstances, a system with 1-2GB of RAM and a mid-2000s single- or dual-core processor will handle them fine. In the case of KDE, you can also turn off desktop effects, reducing the computational overhead even further.
In the Windows world, you can generally get a system into the same realm by turning off Aero (in the case of Windows 7). I'm not sure what the customization options are for Windows 8/8.1/10, but I expect the option for a "Windows Classic" look and feel still exists and would buy back some performance. However, in my experience, Windows is quite prone to slowing down over time; even with routine maintenance (defragging non-solid-state drives, running CCleaner, etc.), performance typically degrades over several months to a year after installation. On newer hardware this isn't a huge problem, but on older hardware it becomes a significant obstacle to usability that GNU/Linux and BSD systems don't typically have.
> My guess is that GNOME 3 and Unity require roughly as much CPU power and RAM as Windows 7
I have a seven-year-old Dell laptop that wouldn't run Vista with anything like acceptable performance. I installed Debian with KDE, and performance is absolutely fine.
The only tricky bit (as other comments have pointed out) was the wireless device driver, which I had to download from the Broadcom website.
I think Windows 7 generally runs better than Vista, and IIRC Windows 8 should use even less RAM than 7. My Dell Inspiron Mini 10 ran Windows 8 smoothly; it just couldn't run Metro apps because of the low screen resolution (there's a minimum required).