
If you want to "win", you must focus on your achievements. Dwelling on where you're being screwed keeps you being screwed.

In the last 20 years, Linux has taken over many domains. First servers: the big win that made Linux impossible to ignore. Ten years later the major shift into userland came via mobile: half a billion Android devices have been sold, not to mention embedded systems. Meanwhile, Linux has also become the way to build supercomputers: the world's fastest supercomputers these days are essentially huge racks of Tesla GPUs running Linux.

While I do realize that secure boot can make Linux's life harder, I also realize that Linux per se isn't going anywhere, nor is it under attack by Microsoft; rather, general-purpose computers are running out of momentum. More and more devices ship with a preinstalled system, such as Android or iOS, and the traditional desktop/laptop paradigm of buying hardware and installing whatever you want on it will continue to drift toward the margins.

I predict that eventually programmers will be the only ones whose tasks are varied enough to require a more complex interface, something like what I'm using now. Most other people can manage by touching a screen to use applications that offer a set of predefined capabilities, occasionally hooking up a keyboard.

The era of immutable operating systems with applications installable from a prefiltered app store is probably what actually works for most people. It's us programmers who long for the days when hardware was bought and software was written, because we know the lowest layer must always be there. For us, that was the very definition of a computer: programming hardware was all there ever was to computing. But most people don't need that.

Most people are perfectly happy with an appliance. They don't want to install updates or manage their system; they just press the power button and immediately continue from where they left off last time. They want an application, so they tap it and it gets installed, without clicking through a dozen pages of a setup wizard or typing aptitude install commands. And appliances such as tablets, phones, and touchscreen laptops deliver a much better experience for non-programmers.
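For contrast, a sketch of the terminal route the appliance user never sees (the package name is purely illustrative):

    # hypothetical install from the terminal; "vlc" is just an example package
    sudo aptitude install vlc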

What we as programmers see as the bad part is that programming will be pushed further away from end users. We all know how we started out playing games and ended up writing our own, because our computers allowed, and to an extent suggested, that. We would want to preserve that heritage for future whizkids and future programmers. But that's still a perspective bias: most people so far never "found it" even when their computer would have allowed it. It's just us for whom "finding it" was the revelation, and we know we would have "found it" anyway, somehow.

Maybe in ten years they'll be selling a limited stock of programmer's computers that are fully programmable. There's definitely a market for those, because you can't write the nice appliances without a real programmer's computer. Maybe my grandkids will receive one of my old programmer's workstations from my workplace and use that deprecated hardware to teach themselves programming, finding it a thousand times more interesting than playing games on their phones, much like my predecessors hooked themselves up with cheap nighttime computing time on a mainframe at their father's workplace and abused those cycles to write games to entertain themselves.




What a trash comment. No, the walled-garden application computer is not the right way to go for non-programmers. It gives tremendous monopolistic power to corporations - fuck that.

We need to get all the kinks out of Linux, make plug and play work seamlessly, and improve usability. Part of that means removing the dependency on the terminal (who uses that piece of shit anyway?) and the need to edit config text files to fix driver issues. Linux can still reach #1 in non-programmer desktop usage.
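To make concrete what "config text files" means here: a minimal, hypothetical xorg.conf fragment of the sort desktop users have historically had to hand-edit to force a particular graphics driver (the file path, identifier, and driver name are only examples):

    # contents of a hypothetical /etc/X11/xorg.conf.d/20-gpu.conf
    Section "Device"
        Identifier "Card0"
        Driver "nouveau"    # illustrative: forces the nouveau driver
    EndSection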


> What a trash comment.

Actually, it's not. You just disagree with it. Sometimes when you disagree with someone's conclusions, it's hard to see the merit of their reasoning.

> We need to...

There are two complications to this. One is that the "we" of Linux development has never had, and never will have, the kind of coherence it takes to appeal to an extremely large audience. (Even calling it "Linux" isn't universally agreed upon.) A great many people are working on Linux or making things from Linux, but they are not all going in the same direction. This is both a strength and a weakness.

The second is not so much a separate reason as an example of a major weakness of this style of development: it is a demonstrably inefficient way of addressing the needs of certain kinds of users. The list of things you say need fixing on the Linux desktop, which is quite typical and also quite incomplete, has remained constant for more than a decade as year after "year of the Linux desktop" passed by. Although the situation has improved, "we" have not moved on to a better class of problems, because most of the people working on them want to use the terminal, or at least don't mind editing text files for configuration, and correspondingly are not very good at addressing the needs of people very much unlike them.


While I agree that closed "walled gardens" are a bad idea, the traditional desktop/laptop PC is facing competition, and yason's comment isn't trash.

Combining both of your comments, I think the future is clear: the only truly free, general-purpose computing machines will run Linux, and they will _look_ like appliances, simply because of all the locked-down, walled-in appliances Linux will be competing with.

It won't hurt that this will make Linux easier to use. GNOME 3 is an example of how not to do it: give users what they want; don't take it away.


Will a programmer's computer necessarily have different hardware?

It strikes me that with better virtualisation technology you could have your "appliance" device and your dev box sitting side by side on the same piece of hardware (or accessed remotely). The question is more one of access: will the dev environment be something you can just install like everything else on your device, or will you need some complex jailbreak, or to sign an agreement with the manufacturer, to install it?
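As a rough sketch of that idea with today's tools, assuming a stock QEMU/KVM setup and a pre-built disk image named dev.img (both are placeholders, not anything from the comment above):

    # boot a self-contained dev box alongside the host "appliance" OS
    qemu-system-x86_64 -enable-kvm -m 4096 -smp 4 -hda dev.img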

I'm not sure to what extent Linux has "taken over servers", though. It certainly seems true for HTTP servers (the most visible to the internet), but if you go into businesses from small to large you find an awful lot of Windows Servers doing things like Active Directory, Exchange, and file/print sharing. In fact, almost all of the "IT service provider" companies in my area are Windows-only shops.


> I'm not sure to what extent Linux has "taken over servers", though

15-20 years ago, a big UNIX box would be installed for a lot of these functions. More often than not, that box is now a Linux box. Linux didn't so much push Microsoft out of the datacenter as replace UNIX.


The two things keeping MS in the datacenter are Active Directory and MS SQL Server.

There were recent discussions about AD here on HN.

SQL Server is a great piece of software, better than Oracle and MySQL; its only fault is that it runs on Windows.


Since virtualized I/O is slower, and since developers often pay a premium for faster compile times, I don't think virtualized development environments are very productive unless a dev can't get a bare-metal box for the target OS environment.
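A crude way to test that claim on your own workload, assuming a project that builds with make: time the same clean build on bare metal and again inside the VM, then compare.

    # run once on bare metal, once inside the VM, and compare wall time
    make clean
    time make -j"$(nproc)"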





