A Return to Linux on the Workstation (cuddletech.com)
93 points by Danieru on Sept 20, 2012 | 37 comments



Even a Joyent tech admits Ubuntu is a more pleasant experience. Thought this was interesting, but his points aren't isolated to the desktop experience. Pretty much all the reasons he switched to Linux affect the building and maintenance of servers too. I think it just boils down to community support.


I will read the article later, when I have more time, but many geeks who only know Linux and maybe BSD have no idea how painful commercial UNIX systems can be.

Several of our customers have HP-UX systems that look like plain System V systems straight out of the 70's.


That was my first response to Linux when I was first trying it out, back in the mid-1990s.

At the userland level especially, it didn't suck, and (with the GNU userland, window managers, etc.) in fact sucked radically less than the stock commercial Unices I'd been using at the time (Sun, HP, AT&T, Data General, BSD).

The situation's only gotten much, much better.

I'll occasionally find myself in situations where I'm connecting to commercial Unix boxes (there was a semi-recent shop where a fair number of staff still ran CDE desktops), and, really, it's painful. Doable, but painful.


Sure, HP-UX and friends were always terrible, but in 1995 I actually fell in love with Unix thanks to IRIX. IRIX really had a nice UI, easy-to-use management interfaces, and powerful multimedia capabilities. Only a fresh install brought me back to the stone age (particularly the partitioning part; the "inst" package manager was actually decent).


Commercial systems operate under constraints that Linux didn't have until recently, such as needing to maintain backwards compatibility. This went all the way from not changing the interface for people who used workstations for highly skilled work but were not "geeks" (e.g. CAD guys), to sticking with old shells (e.g. Sun stuck with ksh long after Linux had bash). Witness the furore over the GNOME team changing things a couple of weeks ago...


The first task for the first real job I ever had was to get gcc 2.something running on an HP-UX 9 system. I still have nightmares.


Does HP-UX still want you to use the newgrp command all the time?


After using Arch Linux for a year I am back on Ubuntu 12.10. Ubuntu 12.10 is absolute bliss; it is a completely hassle-free desktop environment. If Ubuntu had the support of hardware and software developers, it would definitely be ready for some serious competition.

On the other hand, I am not so happy with the Linux kernel, which gets a new release every three months yet still performs terribly when it comes to power management. There is a new power regression bug every few months.


Weird, I'm on Arch and it's no more hassle than Ubuntu. Do you use a custom WM/etc?


I have tried several WMs, from dwm and Awesome to Xfce 4 and GNOME. There are things the Arch Linux community doesn't want to do as part of its philosophy. I am not saying that is wrong; I just prefer an already-set-up distribution for now.


The beta version, no?


Yes, I am running 12.10 Beta 1.


I installed Ubuntu Linux 4+ years ago, basically when its multimedia capabilities started to work out of a fresh installation. I have never felt any need to go back to OS X/Windows since then.


Once Steam comes to Linux, I don't see many reasons why a lot of gamers won't switch over. Many of the people I know who use Windows to develop in Python/Ruby/etc. either use a VM or SSH into a Linux box, simply because they want to game. I think many people will not want to pay the Microsoft tax, especially for Windows 8.


Because most games on Steam will still only run on Windows. Until game developers take up the charge (which Steam on Linux might encourage), it's just a window (no pun intended) into what could be.


The only games that will be on GNU/Linux, at least at first, are Valve games and crappy indie Flash games. While it's the first step, most developers aren't on board. Yet.


Most of the Humble Indie Bundle games seem to be available on Linux, and I wouldn't call them crappy or Flash. But yeah, the big titles rarely work. Valve's work on porting their Source engine to Linux will certainly help this.


Every Humble Bundle game has been available on both Steam and Linux.


Ahem, on the desktop. And the switch was from OSX.

I don't much care either way - I use OSX on my workstations and Ubuntu on my servers - but the title seems overhyped for such a simple article.


No, the switch was from Solaris on his workstation - not OS X.


Whenever I had a tough Solaris x86 problem, a web search would usually turn up information from Cuddletech.


But in the end I didn't understand, because he wrote, "In the end, 3 days later I had been issued a replacement MacBook Pro which I got just as Mountain Lion released." So he went back, right?


No. The laptop is irrelevant.

The article is about him realizing that his workstation shouldn't run a dead Solaris installation anymore, and more generally that he doesn't need to replicate a server on his (workstation) desktop.

A broken laptop (running OS X, not that it is relevant to the meat of the article) just led to the switch, which wasn't reverted as far as we can tell.

Yeah, the laptop (with OS X, incidentally) was replaced.


Right. He, like most of the tech world, is using a Mac on his desktop.


Proof of that?


Haha, I clicked the article because of the hyped title... I was expecting something like "Why Solaris sucks".


SmartOS looks pretty interesting (the thing he works on as mentioned in TFA). Do people have opinions about it?


The headline is cheap clickbait.


You appear to assume that obtaining clicks from HN readers through subterfuge is the author's goal.

I don't think that's borne out by the content of that post or of other posts. It's more likely that he's just writing for his regular audience, for whom the title expresses exactly what's happened: he's changed the operating system on "the Workstation" to Linux.


I should apologize: I submitted the article with a different headline, "Joyent sysadmin switches to linux". I contemplated using Ben's full name but thought better of assuming everyone knew who he was. My title was an attempt to summarize the article, which I will avoid doing in the future.

For those who do not follow his blog, Ben was a major OpenSolaris proponent and even served on the council before the Oracle buyout. I never thought I'd see the day he replaced a Solaris machine with Linux.

Now I should also mention that Ben thought Sun got distracted by focusing on OpenSolaris as an OS for developer workstations. Thus I think it would be inaccurate to call the OpenSolaris forks dead; I'm sure he still prefers Solaris on servers.


I actually prefer your headline because I think it gives context to the post. Joyent has been promoting Solaris in a world where everyone else appears to be using Linux.


Ben Rockwood has had his blog since forever; he is a serious author. I doubt it was just for clicks.


So... what's a 'workstation' these days? Every desktop machine has the 3M: (at least) a megapixel display, (at least) a megabyte of RAM, and (at least) a 1 MIPS processor, and that's in addition to little things like graphics cards and Ethernet (and/or WiFi) hardware, both of which were defining features at one point.

http://en.wikipedia.org/wiki/3M_computer

Does the definition of workstation come down to what the computer's used for at this point?


I'm torn about whether this is a good or a silly question. On one hand it's an interesting nitpick on the change of labels/language; on the other hand it feels like Gordon E. Moore should poke you a bit with a stick or something.

In other words: Labels change, especially in IT. Duh?


That doesn't answer my question, which is: What has the label changed to?


I'd look at it like this: A workstation is what you use when computation power and/or memory bandwidth is a limiting factor of your work. This holds true for:

- developers of software with long compilation time

- 3d animators / professional video editors

- scientists that want to run simulations

- engineers that want to run simulations

For these people, a computer is never fast enough and they get a tangible benefit for every speedup - which is why they always have the best performing gear. And the (multicore) performance of the best gear is always [CONSUMERGRADE_PC * X].

I'd set X = 3, and then you have yourself a workstation (for example, a 16-core Xeon vs. a 4-core i5).

For that reason I don't think it really makes sense to define a workstation by an absolute number of gigaflops/hertz/bytes. It's just the cutting edge of desktop computing that's still feasible for work (e.g. CPUs overclocked to 8 GHz with liquid nitrogen cooling don't count, and neither do machines putting out, say, more than 2 kW of heat - that's when you need a computing cluster).
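
To make the rule of thumb concrete, here is a minimal sketch; X = 3 is the multiplier from above, but the throughput figures and the is_workstation helper are purely illustrative assumptions, not benchmarks:

    # Rough sketch of the "workstation ~ X times a consumer-grade PC" idea.
    # X = 3 is the multiplier suggested above; the GFLOPS numbers below are
    # hypothetical, chosen only to show how the comparison works.
    CONSUMER_GRADE_GFLOPS = 100.0  # assumed multicore throughput of a 4-core i5
    X = 3                          # suggested multiplier

    def is_workstation(multicore_gflops: float) -> bool:
        # A machine counts as a workstation if it beats the consumer
        # baseline by at least a factor of X.
        return multicore_gflops >= X * CONSUMER_GRADE_GFLOPS

    print(is_workstation(350.0))   # e.g. a 16-core Xeon box -> True
    print(is_workstation(120.0))   # a slightly faster consumer desktop -> False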


> Labels change, especially in IT. Duh?

No, not unless you really just want to be rude.



