
Hot desking is the new thing

I visited a place that was a giant room of ~100 people working. No one had their own computer; instead you just sat down at whatever computer was there and started working. The guy running this insanity said they re-image all the machines all the time, taking settings from one of the machines that has been in use. His theory was that over time everyone's preferences would merge.

It was interesting to see and a little frightening. Right off the bat the noise level would have turned me off. I asked how hard it was to hire and he said interviews are held in the room with him or another senior person pair coding with the candidate.




At my university, 15 years ago, we had network profiles. You could log in to any machine in the university and your home directory would be your own. That worked with dual-boot Windows and Linux, so you'd get the same files on either platform.

The machines were pretty much stateless. When you booted a machine you could choose between Windows, Linux, or resetting the whole machine. The third option was what you did when the machine acted funny: it would wipe the hard disk and reinstall both OSes from a network image. The whole process was completely automatic and was done in 20 minutes.

So it was real hot desking, and it worked great. You sat at any computer, and it became your computer. Way before Google started talking about stateless computers and their Chromebook.


That works pretty well on Linux systems, where you can simply remote mount (usually NFS) your home directory.
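On Linux you can even check from a script whether a directory such as /home really is NFS-mounted, by parsing the mount table. A minimal sketch (the parser takes the mount-table text as a string so it's easy to test; on a live system you'd pass it the contents of /proc/mounts):

```python
def nfs_mount_points(mounts_text):
    """Return mount points whose filesystem type is NFS (nfs or nfs4).

    Each line of /proc/mounts has the form:
        device mount_point fstype options dump pass
    """
    points = []
    for line in mounts_text.splitlines():
        fields = line.split()
        if len(fields) >= 3 and fields[2] in ("nfs", "nfs4"):
            points.append(fields[1])
    return points
```

On a real machine you'd call it as `nfs_mount_points(open("/proc/mounts").read())` and look for "/home" in the result.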

For Windows, what's actually happening is that your user profile is getting copied to the system. Which is why logging off takes so long -- the profile is getting copied back to the server.

Do this on an underprovisioned and busy network, or worse, one on which work cycles are highly synchronized (e.g., students, in standard class blocks, over the course of a day), and where account profiles can grow without limit (at one point I had tools to ID and prune large profiles), and things go all to hell.
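The kind of tool mentioned above is simple to sketch: walk a roaming-profile share and flag any per-user profile over a size limit. This is a hypothetical illustration, not the original tool; the share layout (one directory per user) and the limit are assumptions you'd adjust for your own server:

```python
import os

def profile_size_bytes(path):
    """Total size of all regular files under one profile directory."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # file vanished or unreadable; skip it
    return total

def oversized_profiles(share_root, limit_bytes):
    """Yield (user, size) pairs for profiles exceeding the limit,
    assuming one subdirectory per user under share_root."""
    for user in sorted(os.listdir(share_root)):
        path = os.path.join(share_root, user)
        if os.path.isdir(path):
            size = profile_size_bytes(path)
            if size > limit_bytes:
                yield user, size
```

Run nightly against the profile share, this gives you the "ID large profiles" half; pruning them is the political half.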

The Linux / Unix model actually can be quite useful, and it isn't too dissimilar from my own initial experience: console logins to the campus Unix network from dumb serial terminals (precisely zero local state).

Sun Microsystems did some work with this (in conjunction with their own hotdesking workplace experiments) as well.

The downside is when you're doing highly compute- or data-intensive work, in which case the amount of information transferred across even NFS links becomes problematic, and/or you need to provision some really beefy servers. At that point you likely want some sort of shared batch compute resource. Again, more easily accomplished under Linux/Unix than other platforms.


I've configured a number of student labs exactly as eloisant explained. In Windows we've used folder redirection, which has options to disable the "offline files" type features so that it doesn't do any copying of profile files to the local drive.


I'll admit to 1) avoiding Windows admin work to the maximal extent possible and 2) wow, it's been ten years since I've had to do any.

So my information may be somewhat dated. Still generally harder to do this under Windows than Linux.


We had this for meeting rooms at a previous job. It worked quite poorly, since it took a long time to pull in your profile. Meeting runners learned to get to the thirty-minute meeting ten minutes early so that there was some hope of being able to use the computer for part of the meeting. Also, the Windows profiles we were using didn't include applications, so there was the fun of each person who needed, say, Skype, or Chrome, downloading it onto each computer that they hadn't previously used it on.


The university I went to used a similar setup, and pulling the profile took no time.

I don't know what's wrong with corporate environments that Windows profiles take so long to roam. That seems to be true of all of them. Also, corporate IT has an extremely irrational aversion to simply installing the software on all the computers once and for all. Even when it's free software, or something that everybody uses. That also repeats everywhere.


I remember from the old days when I used to deal with this. It was most often a case of people putting a lot of big files on their Windows desktops.

Certain people refused to put their big files on the network share because they said it took too long to load, and then complained that their roaming profiles took too long to load because they had those big files on their desktop.


The fact that MS platform management separates user capabilities (profiles) from platform capabilities (applications, installed and managed per-host) results in some particularly painful characteristics.

This is where the ability to have automatically configured (Puppet / Chef / Ansible / CFEngine) clusters of servers for 'Nix hosts, or an NFS-mounted /usr, is so much more powerful.


My university still does this, it works perfectly on the Linux computers. I'm not sure how well it works with Windows machines since I don't use them more than I have to, but you at least get access to your home directory.


UT Austin, nearly 20 years ago. NFS and NIS and a locally-written authentication system meant you had the same home directory on any Unix machine you sat down at. Since your configuration was all in your home directory, personal configuration was also instant. A skilled and somewhat crazy sysadmin team meant that (almost) all of the programs that were available on one system were supported on all of them, so you had the same environment on any of about six different OS/hardware setups.


That's not what this was. I was specifically told that over time their settings would all merge together.


Due to RSI I have my mouse set to the lowest possible sensitivity. I'd like to see my settings merge with his.


My only stipulation would be to have my own keyboard/mouse, and my own chair. And some bleach wipes for the desk to take care of cold and flu season.


My university did this as well. I also worked on the software that handled the configuration management for these machines too. Really awesome setup. http://www.labnet.ca/MainPage/index.html


That sounds remarkably like my school; New Mexico Tech perchance? I was a UC, then SysProg at the TCC for a while.


Systems like this were very common in universities, especially in the late 80s/early 90s. Thin client computing (or simulated thin client) went out of fashion after mid-late 90s implementations of similar things on top of Windows were massively disastrous.


This is a great idea! You could apply genetic algorithms to the process. Pick the most popular and productive computers and re-image from those, eventually identifying the optimal configuration. Popularity of a shared resource could have nothing to do with location, hardware specs, or other external factors. Likewise, productivity is only going to have a minuscule correlation with the actual person behind the computer as opposed to the configuration of the operating system. Forget individualized tooling, single licensing, and security. We want to encourage cross-pollination of skills. If the accounting package is available to everyone, well, they might just learn some accounting. Win-win for everyone.

Hell, you might as well leave some of the viruses on there. Haven't some viruses integrated with the human genome? We want the same concepts to apply to our IT and office layout. Perhaps one of those viruses is closing security vulnerabilities and keeping other viruses out.


Some years back I walked into a startup for an interview.

First impression: no lobby, no receptionist. I had to interrupt someone at a "desk" to announce myself.

Second impression: the "desks" were in fact folding Costco tables, arranged in ranks of around 10 and about 6-8 files through a cavernous space, edge-to-edge.

Third impression: site had some exceptionally poor security practices (and is among the more notable password disclosure case histories), which constituted a considerable part of my own interview content. The expressed interest in changing procedures was near nil.

I phoned the recruiter as I walked out (early) telling her that this was my new reference point for the worst interview experience ever.

No, I wasn't in the least interested.


There is no middle ground between my desktop settings and the crazy cat lady.


That is inspired. I use VsVim and remap certain keyboard chords that I rely heavily on, whereas the rest of my office have very vanilla setups. Days where my settings are the baseline would be glorious.





