Why I use old hardware (drewdevault.com)
436 points by ddevault on Jan 23, 2019 | 471 comments



I remember getting in an argument a few years ago during a budgeting meeting at a job, where the prospect of upgrading our two-year-old Macbook Pros came up. This company was a startup that wasn't doing particularly well with money, and I said that you don't need a super-fast laptop to program...especially since this job was Node.js based and none of the work we were doing was processing-heavy.

This started something of an argument with my peers, who claimed they needed a top-of-the-line MacBook to do everything because you can't program on anything slower. Management ended up caving and buying the new laptops.

I stand by my point on this though; as a proof of concept, I "lived" on an ODroid XU4 for a month a while ago, doing all my programming and everything on there. I was happy to get my big laptop back when I was done with this experiment, but I never felt like weaker hardware impaired my programming ability.


This only makes sense if you pretend developer time is free and low morale has no effect on productivity.

10 minutes of lost productivity a day due to inadequate hardware means you are paying at least $4k per year per dev for that shit pile of a laptop. Usually it costs more than 10 minutes a day too.
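
A rough back-of-envelope in TypeScript of where that ~$4k comes from; the $100/hour fully loaded cost and 250 working days are assumptions, not figures from the comment:

    // Back-of-envelope version of the claim above. The $100/hour fully loaded
    // cost and 250 working days are assumptions for illustration.
    const minutesLostPerDay = 10;
    const workingDaysPerYear = 250;  // assumption
    const loadedCostPerHour = 100;   // USD; assumption: salary + benefits + overhead

    const hoursLostPerYear = (minutesLostPerDay / 60) * workingDaysPerYear; // ~41.7 h
    const costPerYear = hoursLostPerYear * loadedCostPerHour;               // ~$4,167

    console.log(`~$${Math.round(costPerYear)} per dev per year`);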

The hit to morale when you tell a professional they can't pick their tools is even worse. Now they are spending at least 1 hour a day looking for a new job, because devs are good enough at math to know that if you can't afford a new laptop, you've got six months of runway, tops.

If you believe employee happiness has any correlation to productivity you always buy them whatever hardware makes them happy. It is a small price to pay.


> 10 minutes of lost productivity a day

and yet not every developer not using a tiling window manager is fired on the spot for wasting company time.

"productivity" (or what people think of as productive activities) is overrated. Shiny new hardware makes employees, especially technically oriented ones, feel important and appreciated. That's about the gist of it. Nothing wrong with that, but let's not blow up egos with claims of "productivity".


Productivity is one of those things that I think should be motivated with reward for increasing it instead of punishment for decreasing it. I love the nice feeling I get when I use my own custom window manager to shave another 3 seconds off a thing I commonly do. It's an amazing feeling, it makes me feel like Tony Stark, building my own personalized JARVIS, a program that automatically does exactly what I would have done manually. That's a big part of why I built my window manager and why I want to share it with people, because I want them to feel that same excitement and joy of directly improving your own quality of life, even in a tiny but very real way. I would open source it and give it away for free if I could do that and still keep the lights on.


My reaction to customizations that shave off seconds is: "so what, it'll be blown away the next time the tech stack changes." I do automate, but there's a subtle difference in goals.

If I automate my personal toolset, I follow the same procedure I use around automation anywhere else: don't start off doing it to save time, do it to increase reliability. I will write small scripts, sometimes one-liner scripts, sometimes largish hundreds-of-lines scripts. But the outcome I am aiming for is that I have a procedure that is documented and puts all the configuration in a place where I can see it, so that when the situation changes, it is fixable. A productivity boost is a frequent byproduct of successfully automating, but it's usually a side component to "reliable and documented". The boost is perceived as reduced friction and increased conceptual integrity: fewer things to check or to accidentally get out of sync, and thus less stress involved.

Focusing on UI being both shiny and fast is likewise often missing the point - which is the case when discussing new hardware. There are order-of-magnitude thresholds for human attention that are important to keep in mind, but hitting a lower attention threshold usually doesn't solve a productivity problem. It creates a case of "wrong solution faster", drawing the user into a fast but unplanned and reactive feedback loop.

See for example the case of writers who like the Alphasmart, a device that makes doing editing so tedious that you don't do it, you just write the draft and edit later.


I work with .NET, and that used to mean you had to be on a Windows® computer. At a place I used to work, I had an HP EliteBook Windows 7 laptop with an i7 processor, 8GB RAM, and a spinning hard disk. That by itself is not the problem. The problem is there was "asset management" software installed (I assume by default) that was overly active, which, combined with an antivirus doing "real-time protection", meant a Subversion checkout could take a long time. This definitely degrades employee morale, I think.


I would fire explorer/finder etc. programmers on the spot


What does that mean? You would fire anyone who opens up Finder ever? As opposed to what? Doing everything from the command line? This sounds ridiculous without clarification.


Alternatively you could watch them and learn some new tricks.


>10 minutes of lost productivity a day due to inadequate hardware

What on earth are you doing that a 2 year old mac is inadequate for?

Yeah, there is a point where the hardware is a problem. I'm working on a 5 year old, mid range PC and I don't think an upgrade would really change any of my day to day work. Maybe IntelliJ would be a little faster and compile times would be a fraction faster, but I doubt I'd notice it.

I have a 10 year old PC at home and the only pull to upgrade is gaming, but I'd rather not sink time into that (I get enough time sitting behind a screen at work) so I hold back on spending money on it too.

Maybe the developers' happiness does drop if you give them older hardware, but I don't think that's based on realistic changes in the performance of that hardware.


> Maybe IntelliJ would be a little faster and compile times would be a fraction faster, but I doubt I'd notice it.

Just because you don't notice doesn't mean that it's not there. The argument is that it's insane to pay developers north of $100k per year + benefits and then not invest $4k every 3 years to buy them fast hardware.


No, the argument is that it's insane to pay developers north of $100k per year + benefits and then not invest $4k every 3 years to buy them marginally faster hardware.

And I don't think that argument is particularly convincing. Typing this from my 3.5 year old work MacBook Pro.


> marginally faster hardware

Not sure what your experience originates from. It obviously also depends on what exactly you do with your computer. A mid-2015 MacBook Pro to a mid-2018 MacBook Pro got a 50% improvement for compute workloads according to Geekbench [1].

I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.

Not sure what marginally faster hardware means exactly for you, but for us it's definitely been significant, not marginal.

YMMV, but if you do the math of saving 10 minutes/day * $200/h * 200 days, that's > $6,000 per year, and it becomes pretty hard to argue economically against investing in faster tooling of some sort.
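
Flipping the same figures around as a sanity check, and borrowing the $4k-every-3-years number from upthread (all rough estimates, not hard data), the upgrade breaks even at about 2 minutes saved per day:

    // Break-even sketch: minutes/day a faster machine must save to pay for itself.
    // $200/h and 200 days are the figures above; the $4,000 price and 3-year
    // life are the estimates from earlier in the thread.
    const upgradeCost = 4000;        // USD
    const usefulLifeYears = 3;
    const ratePerHour = 200;         // USD
    const workingDaysPerYear = 200;

    const costPerWorkingDay = upgradeCost / (usefulLifeYears * workingDaysPerYear); // ~$6.67
    const breakEvenMinutesPerDay = (costPerWorkingDay / ratePerHour) * 60;          // ~2.0

    console.log(`break-even at ~${breakEvenMinutesPerDay.toFixed(1)} minutes/day`);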

Typing this from a 2.5yr old Macbook Pro.

[1] https://browser.geekbench.com/mac-benchmarks


If you're dealing with something intensive then upgrading makes a lot of sense. If you've got large compilation times in your pipeline or if you're doing machine learning and need to throw loads of hardware at a problem I totally get that. I'm sure there are plenty of other situations that justify this too.

But if you're like me where most of that happens off on the build machines there is very little impact in upgrading your hardware.

A 50% improvement on compute workloads probably wouldn't be noticeable on the setup I run. Outside of compiling I don't think I push a single core much above 30%.

I guess it really comes down to what you're doing.


> Mid 2015 Macbook Pro to Mid 2018 Macbook Pro got a 50% improvement for compute workloads according to Geekbench

If this is enough to make a huge difference then you should be running the workload on a low-end or better server instead of a MacBook. You'll get much more performance for a fraction of the cost and won't have to pay for the things you don't need, like a new battery, screen, etc. that come attached to the CPU you need.

> I work on largish (win/mac/linux/ios/android) C++ code bases and compile and test run times are definitely an issue. Switching to newer computers every few years definitely boosted productivity for me personally (and for my co-workers as well). We mostly saw that compile times halved for each generation jump (3 years) as a combination of IO, memory and CPU performance increases.

Have you exhausted all other avenues there? Do you have distributed builds? Is Everything componentized? Do you compile to RAM disks?

For that matter, why a macbook? Why not a high end gaming laptop with more CPU, RAM, GPU resources?


YMMV indeed. Most devs aren't working with large C++ codebases with slow compile times and making $400k working only 40 weeks a year.


This conversation reminds me of that conversation from a day or two ago, when a writer argued that, if you're trying to hire senior engineers, you need to woo them, rather than expecting them to woo you.

In high-end job markets, hardware is so cheap compared to salary, etc. that there is no reason not to get the top stuff. If you don't, that may impinge on productivity, and in addition it will send a very negative signal.


Honestly this doesn't just go for high-end jobs. If you're a coffee shop and refuse to shell out for some nice stools for employees to take a rest on then you're just not using math.

All labour in the US costs far more than a decent chair or a nice ergonomic keyboard and mouse, a hands free headset... All of these things are peanuts compared to the costs of employing someone.


I keep telling my employer this, and similar arguments, but no matter what, they still won't buy me one of these:

https://www.mwelab.com/en/


Ah, I’m sure that’s because it would be hard to fit in the office, not because of the price.

It’s actually cheap enough that you could buy it yourself if you wanted to. It certainly went into my bookmarks, just in case I ever actually need a literal battlestation.


Why does that page require javascript to be enabled before you can see anything? ;(


Buy it yourself. "Don't ask for permission, ask for forgiveness."


I know we're not meant to say people didn't read the article. But did you click the link to the chair?

I think it was a joke.


Yeah I clicked the link. I was also joking, saying he should buy it with his own funds, naturally the business wouldn't reimburse him :P


Yup. I really appreciate working a place where my employer understands this. Any tools, equipment, technicians we need to bring in...anything...we just say and we get it. The difference between this and my last job where we used equipment until it broke down...and then had to keep using it...where even a pair of earplugs seemed to be a source of stress...it's like night and day.

I honestly don't think I could work for a company that cheaps out on equipment and supplies anymore. The difference it makes every day to the quality of not only the work I do, but the working conditions for me and the people I work with is worth it.


I seriously think there are power games being played by employers who give their labor terrible working conditions when the marginal improvement would have such a tremendous morale boost. Perhaps out of worldly impotence, such an employer feels satisfied in seeing the toil of someone subservient.


Basically anything is cheaper than salary. It’s why I don’t understand companies that do things like take away the free coffee to ‘save money’.

If I can provide free coffee to the whole office on my own salary alone, maybe that’s not a great way to save money.



And yet, people insist on using heavy, slow development tools and SDKs, which make much more of an impact on productivity and iteration times than a 2 year difference in hardware.


> Usually it costs more than 10 minutes a day too.

Anything that slows you down or irritates you eats up more minutes than just the raw time you're waiting too.

Say something takes 1 second to compile vs 6 seconds on slower hardware and you're having to recompile ten times to fix a bug. The raw waiting time might only be 10 seconds vs 60 seconds but that extra pause of having to wait every time to see if you've fixed bug might annoy you enough to drain significantly more productivity out of you.

You're best keeping your developers happy so they enjoy plowing through the work instead of hating it.


Once it takes more than 10 seconds, many people are likely to alt-tab to ”fun” - and often take quite a while to tab back


You won't get a 6x performance improvement from replacing a two year old laptop. Laptop performance improvements have been tailing off for a while.


The OP mentioned a two year old computer. Do things compile 6 times slower than on today's computer? I'd guess a new machine would save at most 1/6th of the time, i.e. compiles would take 5/6ths of the time...


It's a hypothetical example to illustrate the point that "anything that slows you down or irritates you eats up more minutes than just the raw time you're waiting".

For the sake of a few thousand dollars for a laptop that can be used for several years, even a 1% increase in efficiency is very likely worth it.


If you think developers churn out code at maximum productivity throughout the day and a 10m improvement in compile times per day will reap you benefits in a year, you're sorely mistaken. Unless that improvement lowers the compile time below a certain threshold where the dev will say "fuck it, I'll read some HN while it builds" you're probably not gaining anything.


Ah but then that dev comes across an insightful article, that changes how they think about programming, and becomes a better programmer.

What we should really be doing is spending all our time 'researching' on the internet. Increasing our value to our companies. That's what I tell my boss anyway.


The original argument I read was that slowdowns or waits past a certain time affect mental flow. The flow is basically you being in the zone coding optimally. Interruptions like phones can jolt a person out of flow. So can long wait times.

If person is in flow, you want them cranking out as much output as they can with no slowdowns. That was an advantage of Smalltalk and LISP. Any language with REPL or IDE supporting something like it will let one do this.


I think there is more to it than pure compile time. It can affect responsiveness enough that it becomes disturbing.

From there it can mean using fewer plugins, disabling part of the static code checks, doing less full debugging etc.

I remember how activating the debugging flags wouldn’t work on some projects as the machine couldn’t handle it memory wise.

All of these have a compounding effect that is non trivial.


If you are using Node as the original poster mentioned, what is a slightly faster computer doing for you? You’re not spending time waiting on a compiler.


Exactly. They cannot break out the "my code's compiling" argument. Lost morale over a 2 year old laptop?? Wow, they should work in education. Try a 10 year old laptop and no hope of ever buying a new "high end" machine. Try a shiny new $500 Dell. I think some new programmers are a bit out of touch on hardware...


> Wow, they should work in education. Try a 10 year old laptop and no hope of ever buying a new "high end" machine.

That it is worse in another field is not a very good argument?


Yes, because developers who are whining about having to use a two year old laptop because their company is trying to save money seem more like a bunch of entitled brats.


In the end, what matters is that you keep your employees happy. Whether or not they are entitled brats (they are...) is irrelevant if you need those employees.


If you're bringing in 3x your cost (as you should as an employee), the cost of a laptop is practically nil for the company. Nickel-and-diming you is worse for morale and retention.


Anyone who will quit a job for something so minute as having a two year old laptop is someone that is probably so immature that they couldn’t handle the normal rigors that come from being a professional.


You may think so. When you see certain coworkers get new machines every 2 years but you don't, for example, it builds a power structure where there wasn't one before. If you feel less valued as a result, I don't think it means you can't handle the "normal rigors" of being a professional. It means, if you can find somewhere else where you feel more valued, then more power to you.


Again if you can’t handle the “power structure” of not getting a laptop every two years, you will never be able to handle navigating life in corporate America.


If you're bringing in 3x your cost, you're getting ripped off and need to start raking those numbers back in.


Unlikely with a smart employer. Your cost to them is 2x your tc. They need a profit, that's where the last x comes from.


What would you say is a good ratio then?


>seems more like a bunch of entitled brats.

I'd say anyone that drops the money on a 4 year CS education of any rigor, survives it, survives and succeeds at the fairly rigorous interview cycles required to get a job in SWE these days... is absolutely entitled to requesting top of the line hardware to perform their duties.

I'd say your argument makes it "seem" like it's coming from envious, curmudgeonly luddites.

If individuals of other vocations feel that these developers are "entitled brats", perhaps they should switch careers? I can't imagine a teacher went in to education seriously expecting to get provisioned the latest rMBP as a perk?

Are we assuming all jobs are equally dependent on high-end hardware to provide the best RoI for time spent?


Because you think CS is the hardest, most rigorous major?


Where did I say or even imply that?

I simply said it was hard and rigorous. Not the most.


Why do you assume all employed SWEs have 4 year CS degrees? Or any degree at all?


That's fair. Then they've managed to educate themselves, not exactly a non-trivial feat either.


Teaching yourself to program is not rocket science either. I was writing 65C02 and x86 assembly in 6th grade in the 80s. I got a C.S. degree because that's what I was supposed to do.


It's not "another field", it's literally almost every field other than computing, as outside tech people either don't have money for top hardware, or don't have interest in buying it.

Insofar as it makes developers more empathetic towards users, it's a good point to make.


If you were paid the same at your job as in tech, and you knew the direct value you add, then you'd understand that the cost of the tools you prefer is a literal drop in the bucket and not worth serious discussion.


Wow seeing that I’ve been “in tech” developing professionally for over 20 years and a hobbyist programmer in assembly since 12, I think I’ve earned my geek cred...


It's actually you that is out of touch. I could reiterate the arguments already well expressed but perhaps I should just suggest you read the entire rest of the thread.

A teacher presumably spends most hours actually interacting with students and uses the computer for preparation and paperwork, something your 10 year old Dell is probably well suited for during the minority of the time they spend on it.

Your dev spends most of their time on their machine and, even if they don't have to build software in a compiled language, may still be running a heavy development environment, several browsers to test the result, virtual machines, etc. etc. etc.

To drive the point home, let's consider the cost of a 5% decrease in productivity due to using an inferior machine.

If a teacher is contracted to work 185 days, or 37 work weeks in a year, and earns $60k, the teacher earns about $32/hour at 50-hour weeks.

If the teacher spends 10 hours per week on the computer, the cost is no more than $32 * 37 * 10 * 0.05 = $592.

If your software developer earns $100k over 50 weeks and works 50 hours per week, almost all of which is spent using the computer, then the cost is $40 per hour x 40 hours on the computer x 50 weeks x 0.05 = $4,000.
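
The same 5% calculation written out, using only the numbers above:

    // Cost of a given productivity loss; every figure comes from the
    // two scenarios above, nothing new assumed.
    function costOfSlowness(hourlyRate: number, computerHoursPerWeek: number,
                            weeksPerYear: number, lossFraction: number): number {
      return hourlyRate * computerHoursPerWeek * weeksPerYear * lossFraction;
    }

    const teacherCost = costOfSlowness(32, 10, 37, 0.05);   // ≈ $592
    const developerCost = costOfSlowness(40, 40, 50, 0.05); // ≈ $4,000

    console.log({ teacherCost, developerCost });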

This doesn't account for actual costs incurred by having to hire more developers because management is too incompetent to retain them by buying them nice tools.


Classrooms often use interactive whiteboards driven by PCs that can be quite slow to log in to the teacher's Active Directory profile, as this involves pulling data over a network of variable speed. There can also be issues with software upgrades deciding to start at random times. The PC will need to be logged off and logged on 8 times a day...

Teachers don't talk to themselves in classrooms, so time loss can affect a small percentage of up to 180 student-hours per day, or 900 student-hours per teaching week per teacher. A typical '8 form entry' secondary school in the UK will have around 100 teachers plus admin / heads of subject / 'leadership'. The school year is around 38 weeks.
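
A rough sketch of the school-wide scale, using the estimates above plus an assumed 2 minutes of waiting per log-on (that last figure is an assumption for illustration, not a measurement):

    // School-wide time exposed to slow log-ins, from the estimates above.
    const teachers = 100;             // "8 form entry" secondary school estimate
    const weeksPerYear = 38;
    const teachingDaysPerWeek = 5;
    const logOnsPerDay = 8;
    const minutesLostPerLogOn = 2;    // assumption

    const minutesLostPerYear =
      teachers * weeksPerYear * teachingDaysPerWeek * logOnsPerDay * minutesLostPerLogOn;

    console.log(`${minutesLostPerYear} teacher-minutes/year`); // 304,000 ≈ 5,000 hours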

I sometimes think something like ChromeOS, but able to run IW software and just stay booted, would be better. An appliance.


You need to factor in the time it would take for teachers to move all of their resources over to another format. Some teachers have decades worth of work that they teach with.


The argument you are constructing is that we should invest more money in classroom computers not less in developers.


Yup! Or at least 'appliance' style end points


Editor will be snappier, Docker will start faster, git merges will be quick, etc

it all adds up.


Vim is pretty snappy regardless of hardware!


I do not and have never used Vim, if I started at your company would you prefer to have me waste three or four days learning it or just pay an extra six hundred dollars on hardware?

Just for your knowledge, your answer probably differs from the one I'd get from whoever does the accounting at your company, and their answer is the right one.


> waste three or four days learning it

If you're a proficient IDE user, learning and setting up Vim to a level comparable to a top-notch IDE (Visual Studio, IntelliJ) would take more than three or four days. Three or four weeks (or even months) sounds more realistic, in order to get efficient at:

* code navigation
* code refactoring
* debugging
* tool integration
* workspace/project management

Don't let anyone tell you otherwise, I'm a Vim user and the people that say it will take just days or a few weeks have just forgotten how long it took them to ramp up. Or they're fooling themselves that their Vim does everything a capable IDE does on strong hardware.


I agree, but I wanted to clearly underestimate, because three or four days is already enough cost-wise and it's hard to argue with. I think two weeks is on the optimistic side, but I think real proficiency would be a slow process and likely wouldn't pay off for about a quarter, though you'd be productive in other ways during that time.


At a previous company, I never told the new college grads that there were options other than vi, so unless they were enterprising enough to figure out alternatives by themselves (this was before widespread availability of Linux and the web), they were forced to learn it.

And then I'd tell them about alternatives after they were proficient with vi. Not one ever switched away from vi.

Did that cost us in initial productivity? Probably. But it's such a minor thing when it comes to NCGs.


Good thing I'm not in charge of these things at my company; I'd probably look for people who use hardware properly rather than just relying on Apple to make their POS GUI IDEs run... Most of my colleagues run IDEs, but not the ones who actually get stuff done.


> three or four days learning it

Now this is being ambitious :)


> I do not and have never used Vim

I do wonder, what editor do you use when you're sshed into a remote machine?


I generally use nano for quick and dirty things, but prefer to push/pull changes. The environment I'm working in doesn't necessitate editing local files on a remote very often.


For me, my ~4 year old MacBook Air (Yeah, not really a top end dev machine) has started to struggle with more than 15 or 20 browser tabs open. I regularly spend _way_ more time researching than compiling, so it's starting to annoy me enough to think about pushing for an upgrade.

(I put the blame half on the browser vendors, and half on modern cloud web apps. My tabs usually include Gmail, Jira, Confluence, Trello, and Slack. Even doing nothing, between them they'll sometime have the fans spinning...)


Bundling + dev servers place a heavy load. I'm actually looking into a way I can avoid using bundling for production but still get some kind of "hot javascript file reload" in the browser.


Getting fast feedback from your tests.


As a counterpoint, tests are your application, and thus should run at the exact specs of your average consumer. And you don't get to compensate for the test suite itself either, unless you're 100% sure the average customer is only going to be running your app.

This is also a good argument for running tests on a separate dedicated machine


I like running tests on a separate dedicated machine and enjoy a build environment that verifies each commit with a full test suite... but being able to run unit tests for a file on each save of a file is something that can save you some time pretty trivially.

I also disagree strongly about needing to run tests on the exact specs of your average consumer, most of us aren't writing software for a small set of hardware configurations so determining those average specs is likely not possible - but if you're working in an embedded platform or with specialized hardware I do agree that you absolutely do need to run tests on the actual hardware regularly, I'd still argue that those tests should be run on a dedicated machine and should be in addition to tests that verify the code is obeying its contract.


> should run at the exact specs of your average consumer.

Kinda expensive, having a set of multiple dedicated servers (or VMs) running on God knows how many dozen cores and hundreds of GB of RAM just to run the tests that my laptop runs fine by itself, all in the name of running the "exact specs" that my stuff is going to be run on.

You're describing system testing, which takes place extremely late in the product cycle.


This argument is particularly badly formed. Tests are your application, but you are testing for correctness, not speed. In theory your tests could exercise, as quickly as possible, as many unique operations as a customer could perform in hours of normal use. You would never want this to take hours.


That pesky Windows 10 won't run on an MSP430, unfortunately...


Do you have incremental testing so that only what's changed is tested? If not that would be a better investment than new hardware, IME most shops don't have incremental testing.
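
A minimal sketch of what incremental testing could look like for a Node/TypeScript project, running only the tests that correspond to files changed since the last commit; the src/-to-test/ naming convention and Jest are assumptions for illustration, not anyone's actual setup:

    // Run only the tests that map to files changed since the last commit.
    // The src/ -> test/ naming convention and Jest are assumptions here.
    import { execSync } from "node:child_process";
    import { existsSync } from "node:fs";

    const changed = execSync("git diff --name-only HEAD", { encoding: "utf8" })
      .split("\n")
      .filter((f) => f.startsWith("src/") && f.endsWith(".ts"));

    // Map each changed source file to its conventional test file.
    const tests = changed
      .map((f) => f.replace(/^src\//, "test/").replace(/\.ts$/, ".test.ts"))
      .filter((t) => existsSync(t));

    if (tests.length > 0) {
      execSync(`npx jest ${tests.join(" ")}`, { stdio: "inherit" });
    } else {
      console.log("No matching tests for changed files; skipping.");
    }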


For real. Compilers are lightning-fast compared to running even a subset of most test suites.


What if you're using TypeScript, creating a few packages, and bundling them?


then you probably only need a 486 or so? programs can manipulate text quite effectively using ancient cpus.


Clearly you have never used TypeScript, created a few packages, and bundled them.


Sounds like better tools are in order. Golang produces results instantaneously for example.


Almost no present software will run acceptably on a 486. Since machines orders of magnitude faster are available at Walmart for $300, the question of what software would is mostly academic.


In 1999 we were 'upgrading' to Evergreen 586s in K12 for a lucky few boys and girls. Good times and a lot of nothing going on. PIO4 and PATA with PCI buses clocked at 33 MHZ.

Can you imagine really trying to run 20+ year old tech in developer space? 8 - 10 MB/s throughput and 70 bogomips?


I think it's easy for a lot of people to criticize your mention of morale and many have, but the cost ratio of tools to labour is pretty extreme. Getting a new laptop every week is stupid but if your hardware is on the fritz your company should be willing to replace it pronto, having downtime without a machine while yours is in the shop is a reckless waste of company resources.

My favorite example here is from an old CS job my wife had in connection with HP; this shop was such a grind house that they refused to buy a new chair that would be compatible with her back issues. Refusing to buy an employee a $120 chair, at a loss of perhaps 10-20% of their productivity, just doesn't make sense mathematically - ditto for any company that has people working on computers that refuses to shell out for good keyboards and mice. These pieces of hardware are so cheap that having a lengthy discussion about why you're not going to get them is probably costing your company more money than just snapping them up.


HP somehow became the epitome of crappy enterprise bean counting. They lost all the talent that had any options available, and the results are readily visible.


I do not bother to ask for new mice or keyboards any more. I just bring those myself.


I'm just never quite sure the gain in productivity and happy developers outweighs the cost of shipping a software product that requires hardware that costs 4-digit US$ figures to run smoothly (which seems to be the case for most everything produced by startups nowadays).

I for my part would prefer if, for instance, the Slack developers were confined to machines on which their current product runs as badly as it does on mine, even if they feel so miserable and underappreciated as a consequence that their uplifting have-a-nice-day loading messages get replaced by passive-aggressive snark or something.


I agree with the first point: wasted time will quickly stack up with old/cheap tools. I don't buy your second argument about morale. The vast majority of professionals don't get to pick their tools; they are handed work and tools based on what is available and cost-effective. If these professionals' flexibility does not include programming on a 2 year old laptop vs a brand new one, wow, that's weak grit. If your employees' morale is destroyed by not working on the latest gadget, why is that? What else about the company is deficient enough to degrade their morale that far?


As a professional I know what is cost effective for me better than anyone else. For example I know that I'm always bumping up against my 16 gig memory limit, but I have plenty of CPU. I know that I get less eye fatigue on a 4k monitor - so I picked a smaller 4k monitor that was cheaper than the monster lower resolution wrap around thing my colleague prefers.

I know that I'll be significantly more cost effective with 32 gigs of ram because I don't need to spend time killing processes and rebooting VMs after half a day of work.

I know what keyboard and mouse is still comfortable for me after working 8 hours, etc.

I know I'll be more productive on a macbook not because I'm an apple fan boy. I hate apple because they've done more to kill open source than any other company - even MS. I'm a linux fan boy. But I need to use various tools that don't run on linux. I could "tough it out" on a cheaper windows machine, but it wouldn't be cost effective. I would be less productive.

A professional knows what is cost effective and spends their hardware budget wisely to optimize their productivity. They don't rip off the company for the latest shiny gadget. It is silly to trust a dev to essentially run a multi-million dollar company, but not to pick out a computer.


The hit to morale comes from management dismissing a serious cost-benefit analysis with phrases like "weak grit" and "latest gadget".


The literal difference between good and poor management. Being objective vs. subjective.


Do you make your employees only listen to one genre of music while working? Do you only allow specific food for lunch? Why do you just care about this specific point in how they work?


> 10 minutes of lost productivity a day due to inadequate hardware means you are paying at least $4k per year per dev for that shit pile of a laptop.

This only holds true if your developers are 100% efficient programming every second the machine is running. But let's face it. The first hour of the day is most certainly not the most productive (10 minute boot? fine, I'll make coffee meanwhile). You could easily schedule a meeting, like a stand-up, while the machines fire up, if those 10 minutes really would be necessary.


Mathematically flawed: there is no reason to suspect you can subtract the time spent waiting for the computer from time already wasted, whereas in reality the inefficiency from poor hardware is distributed throughout the day, including productive periods.

You would actually multiply a percentage of inefficiency x hours worked.

Also honestly human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity because intellectual work is fundamentally dissimilar from assembling widgets OR managerial work.

Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.


> Also honestly human beings doing intellectual work can't just do something else for 10 minutes and lose zero productivity because intellectual work is fundamentally dissimilar from assembling widgets OR managerial work.

Yup, I'm talking about the beginning of the work day. Nothing productive would be interrupted.

> Consider reading http://www.paulgraham.com/makersschedule.html because you fundamentally don't understand how people work and can't be a good manager without at least attempting to understand.

I have no idea how you got to your conclusions about me by reading my comment, but thanks for judging. Regarding this piece by PG, did you read it? Because it actually supports my claim to schedule a meeting at the beginning of the day, while machines would boot, in order to save the precious time between machine startup and the beginning of productive work.


To reiterate, the prior poster claimed that 10 minutes of lost productivity could cost more than the dev's desired computer.

You said that this calculation is erroneous because the developer could easily make coffee or have a meeting while his machine boots up and thereby recover that lost time.

This is a very puzzling suggestion. Slow machines aren't merely slow to start; they are slow to complete user operations while the user is sitting at the machine awaiting the result. The time cost is the sum of 1000 small delays throughout the day. You can't productively fill the extra 30 seconds you spent waiting, 20 times a day, with a meeting, for example.

In fact acceptable hardware/software wakes from sleep in a second or cold boots in 30-90 seconds. Boot up isn't really the problem.

You said

>Because it actually supports my claim, to schedule a meeting in the beginning of the day

What the actual article says

>Several times a week I set aside a chunk of time to meet founders we've funded. These chunks of time are at the end of my working day, and I wrote a signup program that ensures all the appointments within a given set of office hours are clustered at the end. Because they come at the end of my day these meetings are never an interruption.

PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.

I suggest you read it instead of skimming it.


> PG actually suggested meeting at the end of the productive day to avoid breaking up productive working time.

I suggest you understand it instead of mindlessly quoting it. It's clear PG wants no interruptions for productive work time but if a meeting is scheduled before productive work time begins nothing gets interrupted.


It's less clear how you can deal with a slow computer by making coffee or holding a meeting. Did you think slow computers take 10 minutes to boot up but run real fast after?


I read it; almost everything written applies mostly to the IT industry. It does NOT operate anywhere near like that in any warehouse, full-bore semiconductor manufacturing, or even fast food job I've ever had.

What may apply to one person or industry absolutely does not apply to them all.

And to boot - I'm the GM of a very, very large solar company.


It is more like: this test takes just long enough that I can jump over to Hacker News for a few minutes...


Plus the opportunity cost when developers talk, and the good ones decide between a job with servers or VMs in the cloud and a MacBook Pro to use them, or a shitty, out of date, slower-than-Christmas Windows desktop rendered on a VDI appliance connected to a Citrix server in the next state. I mean... hypothetically, of course.


i hate this kind of taylorist argument. a dev's limiting factor is always energy, not time. and i cannot imagine any good devs i know truly caring that they aren't on the latest and greatest. i guess i just wouldn't work somewhere people care about that kind of shit, so maybe i'm biased.


> dev's limiting factor is always energy, not time

If this is true, I would expect that investing in hardware that most effectively gets out of a dev's way to have an even higher return on investment than is suggested by time and productivity arguments. The emotional toll of dealing with the dev equivalent of paper cuts should not be under-appreciated.


As long as it doesn’t get in their way, I think the previous statement is indeed true.

Devs don’t care whether their laptop is 2 years old or not, they care that their compilation takes 5 minutes when it could be 2.


I used to do in-house development at a shop that had a great policy on that front: Developer machines were always a generation behind the workstations of people who would be using the software they write.

And a strong culture of belief that, if a developer couldn't get something working well enough in that kind of environment, then that should be taken as a reflection on basically anything but the developer workstation. Inefficient code, excessive memory usage, just plain trying to do too much, bloat, whatever.

But I realize that's a hard thing to sell. A lot of developers don't really appreciate being reminded who works for who.


This is a really poorly thought out policy masquerading as deep understanding. It's, if I can coin a phrase, "Yoda talk". Basically, incoherent ideas aren't improved by being couched as hard-earned wisdom.

The tools being used to create a piece of software are often fundamentally different than those used on the other end.

This means that machines should be provisioned in accordance with the needs of the actual tools running on them.

Developers, in addition to running different tools, may need to iterate quickly in order to test functionality. It may be acceptable that an app started once at the beginning of the day takes 3 seconds to start, but if you deliberately handicap developers' machines and it takes 7 seconds each time the developer is trying to fix a bug in the space of a few minutes, instead of 1 second on a faster machine, you may have damaged your own productivity for no reason.

This has nothing to do with the idea of who works for whom. Incidentally, this statement sounds entirely passive aggressive. Ultimately I'm pretty sure all of you work for whoever is your ultimate customer and are invested in helping each other do effective work; it sounds like the management team was entirely unclear on how to do this. Is that shop even in business?


Not only is the shop in business, it's one of the most profitable places I've had the pleasure of working at. The policy was actually handed down by the CTO, who started in the company as an entry level developer. Some money was incidentally saved, but, at least to hear him talk about it, it was more about getting people's incentive structures in line: If you want to discourage developers from doing something, the most straightforward way is to make it painful for them to do it. If you want to make creating performance problems painful, you accomplish that much more effectively by making it apparent on their workstations, where their nose will be rubbed in it constantly until they do something about it. Slow performance on an external test environment is much easier to ignore, because people typically don't bother testing there until they think they're basically done with their work, at which point their incentive structures are nudging them toward ignoring any potential problems.

Contrary to some of the criticism I'm seeing here, it wasn't actually hated by the development team, either. The hypothetical "can't get any work done because compiling takes a bazillionty days" scenarios that people are pulling out of thin air here simply didn't happen. At a shop where developers were expected to think carefully about performance and know how to achieve it, they tended to do exactly that.

Someone who was busy making excuses about how they weren't very productive because they only got a 3.4 GHz CPU while the analysts got a 3.6 GHz CPU probably wouldn't have lasted long there.


> At a shop where developers were expected to think carefully about performance and know how to achieve it, they tended to do exactly that.

That works as long as you have time to actually achieve it; remove that time, and even on the worst hardware you'll see shitty implementations.


"Yoda talk" is a very nice phrase, I hope it catches on.

Dogfooding is sometimes a good idea, and of course testing on a range of setups is important. I suspect there is a problem with people testing on older software but not trying any older hardware (especially for web apps), which using old machines could have partially avoided.

But the idea that development should inherently be able to happen on older hardware than the product will run on is arbitrary and ridiculous. At best, that creates pointless pressure to rely on hardware-friendly tools, which could mean anything from not leaving open lots of relevant Chrome tabs to pushing developers to use vim instead of a Jetbrains IDE. (Nothing wrong with vim, obviously, but "we intentionally hobbled our hardware" is a weird reason to choose an environment.)

At worst, it fundamentally impedes development work. For an extreme case: Xcode isn't really optional for iOS development, and merely opening it seriously taxes brand new Macbooks - applying this theory to iOS developers might leave them basically unable to work. Even outside that special case, there are still plenty of computer-intensive development tasks that are way outside user experience. Just from personal experience: emulating a mobile phone, running a local test server for code that will eventually land on AWS, running a fuzzer or even a static analysis tool.

Even if we grant the merit of that trite "remember who you work for" line, sticking to old hardware doesn't seem to follow at all. We wouldn't go around telling graphic designers that if they work for a customer with an old iMac G3, they're not allowed to have a computer that can run Photoshop. Heck, are there any professions where we assume that building a thing should be done exclusively via the same tools the customer will employ once it's finished?


My company’s mobile app is made in react native.

As a senior dev., I still use an iPhone SE, but the mobile dev team all have the latest iPhones.

The app looks horrible on the SE, and some touch areas are blocked by overlapping text or images.

It is basically unusable on a supported device.


I worked in a place where dev workstations were far beyond the end users', and when devs tested (mostly local, not multiple systems across networks), there was no basis in the customers' reality.

Endless performance problems were masked in development, and then very noisily evident when it reached customers. Performance testing was largely reactive and far removed from the creators of some terrible code choices, so they'd tend to shrug ("Works fine here") until you analyzed the hell out of it.

Now, the code was a big C++ environment, and compilation speed was a problem, but maybe a means of testing in a throttled state would have prevented a lot of grief much, much earlier.


This reminds me of Facebook's "2G Tuesdays" (https://www.theverge.com/2015/10/28/9625062/facebook-2g-tues...), where they throttle their internet speed to match that of emerging countries. If you are frustrated while waiting for a page to load, then your users are going to be as well.

This is a really nice way of working and you can see the results when browsing https://mbasic.facebook.com: even at 2G speeds, even with photos, pages load fast. No unnecessary background process trying to do something, all buttons are there to be clicked on already. A really smooth experience.


Developer tools often have much higher requirements than the resulting product. Not everyone is using vim.


Not a valid excuse. The end users have their own productivity apps that eat up their system resources, too.


In the minds of developers, the end users shouldn't be running any apps other than the app that the developer is working on.


Correct; it seems much better to create a test bed environment emulating this, rather than artificially constraining a person's own development environment due to some poorly thought out policy which is actually likely just a way to save money.


That's a complete non sequitur. The grandparent comment is not talking about how resource intensive background apps affect the performance of the product, it's talking about how the development tools themselves may not run smoothly and efficiently on shitty hardware if they are resource intensive.


We had complex IDEs with drag'n'drop tools, syntax highlighting and IntelliSense that were happy to run on hardware not much better than a 486.

Developer tools are only resource hungry today because their developers aren't dogfooding.


They are resource intensive because they prioritize feature completeness over being sleek and lightweight. That's the correct thing to prioritize, because extra RAM is dirt cheap compared to the benefit of higher developer productivity.


That's the point: they were already feature complete back when they were less bloated. VS2017 offers very little over VS6.0 or VB6; I think their min requirements were 128MB of RAM, and it was a lot more responsive too. Similar for Eclipse.

Developer productivity hasn't improved since, even gone backwards in some ways.


That assumes you're not using some monstrosity of an IDE that requires bleeding edge hardware to run, COUGH Oracle Forms COUGH.

We had to buy cutting edge PCs, spend >£2k on extra RAM and £4k on 20 inch CRT monitors to develop - the application would run on a 133MHz machine.

If you had less than 2 Blue screens per hour that was a good hour - This was early 90's btw.


I think you're mis-remembering your time period. 20" monitors weren't widely available until the mid-late 1990s; nor were P133 machines (P133 release was in June 1995; so for it to be considered mid-low end, late 90s would be more likely).

Where I worked in the early 1990s (1992-ish), us developers fought over who would have the color terminals (Wyse 370/380) and who would have the monochrome ones (Wyse 50/60). I was low-man on the pole so I always got stuck with the low-spec terminal (though I still have an affinity today to amber VT220 text characters).

Until I "wowed" the owner of the company (small shop) showing how to do windows, dialogs, etc in a terminal using a virtual buffer and some other "tricks". Then I was given one of the color terminals, to jazz our interface (which was mostly numbered menus and basic flat screens).

At one point, I played around with the 16 color Tektronix graphics mode you could pop into with the right escape sequence; I think I made a very slow Mandelbrot set generator (never showed that to him; our app was already pushing our dev hardware, which was an IBM RS/6000 desktop workstation running AIX we all shared via 9600 bps serial connections)...


It was '93 and the gear we had was bleeding edge for the time - unlike the average developer, who was on 100MHz or 133 at best.

The system was SIROS, the system that helped manage the UK's SMDS network, so we had budget for it.

The 20 inch monitors were awesome for playing Doom late at night.


The 90s and 2000s were a pretty different story. These days a 5 year old i7 with 16GB of RAM is pretty damn fast. Maybe it's a different story if you're running Windows with a big IDE.


This may be workable for a certain subset of projects, but programmers often have much more on their system than the end user. End users don't need a bloated IDE, an SQL server, an HTTP server, etc all running at the same time. Trying to run all of these programs on an old computer is of zero benefit to the process. Better to give programmers a new machine with remote desktop access to a slower computer/virtual machine that they can use to test out their software.


You could easily argue the opposite as well. Developers don't need an IDE, a SQL server, an HTTP server, etc running on their device at all. The choice is to use a bloated IDE that most people only use a small fraction of the features for. The servers could all run on a dev server and compile/test cycles can be done on similar servers.

Mind you I don't necessarily agree with all of this. Well except the IDE part, Vim and Emacs are tools that more people need to learn.


> The servers could all run on a dev server and compile/test cycles can be done on similar servers.

In every case I've had a dev db running on a shared test server, that DB has been woefully underspecced for the purpose and often in a datacenter with 300ms latency from the office over the company VPN.

While production instances are in the same datacenter as the production DB with 5ms latency.


You're using emacs as an example of non-bloated? Eight Megs And Constantly Swapping?


Ya you're right. I should have used something more light weight like Atom.


I understand it's a whole 0.1% of physical memory for the program you're spending most of your time in. Better reduce that to 0.06% quick.


Unless you are literally building the dev tools you are using that doesn't make any sense. That shop lost tons of money on wasted dev cycles. You spend much more time building the app than running the app.

They should have bought everyone a second low spec machine to test on, and let them use proper dev machines for building the software.

I guess if it is a shop where the management feels they need to remind developers suffering through that for 8+ hours a day "who works for who", that was probably the least terrible part of working there.


"But I realize that's a hard thing to sell. A lot of developers don't really appreciate being reminded who works for who."

"Yep, we put those developers in their place."

"Hey, why are they leaving???"

"Oh, you mean we need developers more than they need us?"


The idea you're trying to advance here is just plain silly.

You need your doctor more than your doctor needs you. That doesn't change the fact that your doctor is doing work for you, and not the other way around. Same for lawyers, plumbers, electricians, architects, and anyone else working in any number of other skilled professions.


Do you also engage in silly power plays to put your doctor in his/her place and remind them that they are working for you? Maybe you can insist that they use a 10 year old stethoscope or you else you'll take your business elsewhere.


Developers can switch jobs fairly easily. It's a seller's market. Companies that don't understand this are going to wonder why they have a hard time retaining talent.


That’s great, until you have to run Xcode and interface builder.


I see...so your shop with a great policy seems not to have learned about these things called test environments eh?


It's a staple of 'developers' to have fancy Macbooks while essentially just needing a shitton of RAM these days.

Programmers don't need bleeding edge, just a lot of RAM.


Some of them don't – but some of them do, and that improved performance can offer a noticeable productivity improvement.

I recently upgraded from a 2015 Macbook pro to a new i9 one, and right now I'm working on a computer vision pipeline for some data – an embarrassingly parallel task. It takes about 15 minutes to run a process which would have previously taken about an hour. This is a direct improvement to my development experience (trust me!)

But there are a bunch of different reasons. Modern stacks can be annoyingly under-optimised; a large Webpack app with live reloading can be irritatingly slow even on recent machines. Fast SSDs are useful for people working with large datasets. Better GPUs can mean better UI performance and more screen space.

In short, remember that just because you don't need the hardware, doesn't mean that others don't! :)


But why do you need that on a laptop? Just run it on a server.

If you work for an employer with deep pockets, I guess, sure, why not? Otherwise a workstation you can remote connect to (if you work from home or travel) is probably good enough.


Nothing about your post adds up.

2015 MB i7 to i9 MB doesn't increase anything times 4.

How long have you been working on that task and waited an hour? The wasted labour cost might have bought you a dedicated compiling rack 2 weeks in.

How long could you have rented cloud resources to bring that task down to close to instant for the cost of an i9 MB?

I am just curious tbh. I have various tasks like yours hobby-wise. But the last thing I'd encourage my laptop to do is compile/render/analyse a problem that takes more than 60 seconds. Beware: if you fully utilise a laptop like you describe, shouldn't it become useless for everything else while doing that?

So many questions...


> Nothing about your post adds up.

That's great, but I'm literally sitting at my desk doing it just now, so I can assure you it's not just made up!

> 2015 MB i7 to i9 MB doesn't increase anything times 4.

i5 2015 13" MBP to i9 2018 15" MBP. 3x the cores, higher IPC, higher frequency, faster memory, faster SSD. It adds up, and for this class of process a 4x improvement is totally reasonable.

> How long have you been working on that task and waited an hour? The wasted labour cost might have bought you a dedicated compiling rack 2 weeks in.

I don't know how the hell you work, but I don't just kick off a process and let it run while I sit still at my desk waiting for it to complete :) It just involves working to a slightly different rhythm, and the ability to iterate a bit faster makes that nicer for me, at minimal cost.

Anyway… wasn't the point that "developers don't need faster machines?" I think buying a "dedicated compiling rack" would count!

> How long could you have rented cloud resources to bring that task down to close to instant for the cost of an i9 MB?

No idea, but in the long term, more than it costs to buy a new development machine. Plus this way I don't have to fanny around with copying assets back and forward, remoting in to visualise things, setting up a server or whatever. And the new laptop makes everything a little bit faster and more enjoyable to use. The price of the machine is pretty marginal for a tool I use in excess of 40 hours a week.

> Beware: if you fully utilise a laptop like you describe, shouldn't it become useless for everything else while doing that?

Nah, it's fine generally. Everything's just a bit slower until it's done.


Faster is usually better, but it's worth pointing out: if a task took a whole day, you wouldn't burn the day twiddling your thumbs - you'd context switch and do something else. You'd save long compilation steps for late in the day so that you could come back tomorrow, not wasting time. For many people, 15 minutes is not worth switching but a 1 hour wait time is.


Recently got access to a new HW platform with a 256-core CPU, 512MB of L3, and 512GB/1TB of RAM. The speed at which that thing compiles the Linux kernel and runs certain AI tests is amazing. One can try out different experiments so much faster.

I used to work at a different company that had server farms for compiling. Even with the servers, compiling the 26GB software package took 4 hours, and at high noon when everyone was using them I have seen compile jobs last 8+ hours. There were a few times when, by the time the compile was done, I had completely context-switched out and forgotten what I needed to debug.


I did two dissertations on realtime outdoors computer vision stuff (detecting landing targets for quadcopters and shadow classification), with a really high-end (at the time) laptop with 16GB memory, 8 cores @ like 3.5GHz turbo, a 512GB SSD and a GTX970 GPU which was okay at running tensorflow et al. Not only was it a very capable workstation which I could leave crunching data on overnight, both dissertations involved field robotics (quite literally, doing stuff in fields with drones :)), and being able to use the same machine wherever was a godsend.

No matter where I was, be it at home, in a computer lab, presenting my work to peers or my supervisor, or outdoors in a field, I had the same tools, same large dataset of raw h.264 video streams, same hardware, same everything, without needing to rely on streaming data to and from some server on the internet, or worry about keeping my work and software in sync across multiple machines. I could tweak my algorithm parameters in-field and I could continuously compile my huge 100-page latex sources for my dissertations from.. the beach :)


I think that's definitely one of the good use-cases of a portable machine. Before buying a new one, I did do the maths on using a more powerful desktop instead and just keeping my older laptop for travel – but like you say, it means constantly thinking about what data and capabilities you have available at any time. And coincidentally I was also working last week on real-time computer vision for robots, on a remote customer site, so it was nice to have my machine with me :)


I myself will take a small computer and a fast connection any day. I don't think money is the biggest issue here, just what is practical.


Yah. Or simply imagine the benefits of a cheap, upgradeable workstation instead of trying to do a long build on a computer squeezed into a half-inch-high package: more cores, more RAM, big old GPU, wired network, the list goes on.


Sure, a desktop machine would be less expensive than a laptop of equivalent performance. But I work mostly from home, work in the office maybe one day a week, and often visit customer sites. The ability to just pick up and go with a relatively powerful machine is pretty attractive.

I think it's always important here to realise that people have different use cases and priorities for their equipment, and that there are no right or wrong answers. Some people are happy to fork out for a portable; others don't require that, and would rather have a fixed workstation with more power. Some developers are totally fine with 10-year-old kit, and some can benefit from newer stuff. I'm sure everybody evaluates their circumstances and comes to suitable conclusions for themselves!


The newer macbooks also have significantly better I/O performance, which together with the faster CPUs and more cores might add up to a 4x difference.

It's a pity they are otherwise crappy for my work (I need a reliable keyboard, I do not want a "touch bar", I do want USB-A ports).


I've installed small amounts of RAM into machines and have them speed up more than an order of magnitude. The speedup was because it stopped swapping. Modern computers have other similar bottlenecks that can show massive improvements for what seem like small changes.


That's a bit of a knee point though.

If your machine doesn't have quite enough RAM, it will swap and likely be unusable.

Once you have just enough it will fly, and adding more RAM will make little difference.

I can't imagine the OP's aged stack is so low on RAM that he puts up with swapping.
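
A quick way to tell which side of that knee you're on, assuming a Linux box, is to watch swap activity for a minute while you work:

    # Sample memory/swap activity once per second.
    # Sustained non-zero values in the si/so (swap-in/swap-out) columns
    # mean you're past the knee and more RAM will help; zeros mean it won't.
    vmstat 1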


As I said, there are lots of other knees. For example, a slightly increased L1, L2 or L3 cache size can have a similar effect.


Uhh, in 2015 an i7 had 4 cores, with a boost of around 3.4.

In 2018 a mobile i9 had 6 cores/12 threads, with a boost of around 4.8 (so let's be reasonable, thermals might let us keep 3.6).

For a super optimized parallel load you could totally see more than 2x speed up.

But yeah... why do that on your laptop?


If it takes 15min locally then you should do it locally. My rule of thumb.. if it takes > 1 hr locally it should be done elsewhere.


> Uhh, in 2015 an i7 had 4 cores, with a boost of around 3.4

Not on a 13" MBP, it didn't. Source: there's one on my desk.


And on a non-portable computer you could probably do it even faster. Maybe you need a laptop, but there are tons of people using laptops who don't need to.


I think that's pretty much what I said, right?

Not everybody needs a laptop, or a super powerful development machine. But if you want to process some stuff nice and quickly, while not being tied to any physical location, it's a totally reasonable solution.


The wonderful thing about a laptop is that if I want to go sit by a window for an hour while I'm working, I can just stand up and move. A desktop chains me to my desk, deep within the heart of a dimly lit cubicle farm.


The funniest thing about this is probably that RAM prices have gone up in the last 2-3 years. I built my current computer in 2016, and if I had to buy the same RAM today, I'd have to pay more. So for the same money, "newer" is in fact worse. Prices have come down from their peak since, but they're still above the historic low.


More important than price is trust.

The Core 2 Duo/Quad on LGA775 is the last revision of the Intel Management Engine (ME) that can be completely removed. The me_cleaner script recently added the ability to clean this platform.

A Q9650 quad-core CPU is the best performance that can be reached on this board. I have two and I have run me_cleaner on both. I do my finances on the one that runs OpenBSD.
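
For anyone tempted to try the same, the rough shape of the process looks like this; treat it as a sketch rather than a guide, since the right flashing method (internal vs. an external SPI programmer) depends entirely on your board:

    # Dump the firmware, neutralise the ME, write it back.
    # Keep a copy of the original dump somewhere safe before flashing.
    flashrom -p internal -r original_dump.bin
    python me_cleaner.py original_dump.bin   # works on the dumped image
    flashrom -p internal -w original_dump.bin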


I bow to your paranoia. Although I have often wondered if retro computers will become more valuable in the future as the final bastion of electronic privacy.


If your tinfoil hat isn't too thick, you can take a modern computer and set it up behind a hardware firewall, or connect over serial...


And then monitor the tons of weird ass encrypted traffic coming in and out of it, hoping that those requests to AWS are some program getting updates and not the contents of your L2.

Firewalls are great, and you should have one, but you have to constantly watch and understand everything coming in and out of it. Works ok as a full time job, terrible when you have other things to do.


If this is too much trouble, you can connect over serial, like 25+ years ago. Transfer files with zmodem from an internet-connected host. A Raspberry Pi with a serial port would be perfect for this.
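
Concretely, with lrzsz installed on both ends it's roughly this (device name, baud rate and terminal program will vary):

    # On the offline machine, attach to the Pi's serial console:
    minicom -D /dev/ttyUSB0 -b 115200

    # Inside that session (i.e. on the internet-connected Pi), send a file:
    sz -b some-file.tar.gz
    # minicom can catch the zmodem transfer automatically,
    # or you can receive it explicitly with `rz -b`.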


I have a rack full of machines like this. Dual sockets with 8GB ram each. Any affordable upgrade would be a downgrade. I'm looking at at least $200 per U to upgrade from hardware that is essentially worthless. I literally got the CPUs for $2 each. There's something really funny going on with prices after core 2.


How much is your power bill?


Kindof a lot, but I tell myself that servers are an efficient winter heat source.


Thank you for this information, this is going to be my next project.


Don't forget the tinfoil in your bunker.


Can your board take an X9770?


These are the boards that I patched:

https://github.com/corna/me_cleaner/issues/233


Even that depends on the dev environment.

If you're using a lot of virtual machines, sure. If you're working on something that's inherently RAM-hungry like video processing, sure. If you're working against a local database server, maybe. If you're using something like IntelliJ or ReSharper (but not necessarily Visual Studio), then, <sigh>, yeah, I guess.

If you're doing Web development, OTOH, it's probably better if you don't have much RAM. If front-end, because a significant percentage of your users will be on low-RAM devices like smartphones and Chromebooks, and you shouldn't need any more RAM than they do. If back-end, because the production servers will (hopefully) be handling a lot more load than what you're doing in testing, so if you get too comfortable treating RAM as a plentiful resource in development, that's a recipe for scalability problems.


Our Scala teams disagree.

I do mostly Go, and yeah, there it doesn't matter as much, but you should see some of the build times on these services. There isn't even a big scary monolith left in our architecture (besides a frontend in node).

Oh, and let me not forget minikube.


SSDs really help a lot too. An SSD and a reasonable chunk of RAM (whether reasonable is 8 or 16 gigs depends on what you're doing; I've never needed more than 16) are what matter; anything else is not so important for normal development. A fast SSD definitely helps, because without one, waiting on disk I/O is typically the bottleneck.

Hell, I did a lot of (hobby, not work) development on an eee PC netbook a few years ago. Besides the discomfort that a small screen and sub-normal keyboard brings, it was a reasonable experience. I didn't have to run slack then though.

Obviously it depends on what you're doing though. Processing data? Running a ton of tools all at once? Differing workflows and requirements have different hardware needs.


I think the existence of SSDs is why modern OSs are borderline unusable on HDDs. It is kind of ridiculous how much passive I/O there is, and how ridiculously long it takes to launch or open things.


You're probably right. Applications definitely are a lot more lax on resource consumption.

I've sat down and enumerated the things I do daily on my computer versus what I did back in ~2000. Besides streaming (YouTube, Netflix and Spotify, basically), it's pretty uncommon that I do anything that I couldn't or didn't do back then. The performance of these things now seems about the same, anecdotally. Maybe it was a little slower then, but not so much slower compared to the hardware performance difference. It makes me sad.

Also, as I said in other comments, I once developed hobby projects on a netbook. I of course ran a minimal system and didn't run resource hogs like slack. I'm currently using a cheap laptop running a minimal system (but unfortunately I do need to use slack, chrome, docker for work). They run fine.

I think developers or companies have decided that developer time is more important than performance and resource use of their products, punting the cost of that onto their customers. If my work didn't require it, I would definitely switch to more resource conscious tools and I'd be a bit happier. Oh well.


I would agree, and I hate that attitude in developers. Technology is a force multiplier, when you write something slow you're multiplying that slowness and all the time it wastes across the lives of your entire user base over the lifetime of the product.


I think browsers is what regressed the most.

I used to be able to open 3 Twitch streams in 720/1080p back in 2010 on a dual core Athlon X2 5000.

These days, if I open 2 HD streams in Firefox on a 4.2 Ghz quad-core 4670k, the system essentially freezes - the browser becomes unusable and both streams lag. Thank god for streamlink that lets you bypass the browser interface and view the streams in VLC or MPV.

Not to mention Electron-based apps using 600-800MB for a chat program with some light jpeg usage.

Developers' blatant disregard for users' system resources is insane. It seems that regardless of hardware improvements, the modern developer only targets "acceptable" levels of performance, leaving power users frustrated.


Ubuntu seems fine on my older laptops, but Windows 10 is a different animal. I had an otherwise fast laptop that was almost unusable because Windows was always thrashing the slow spinning HDD. It could take 10 minutes or more after waking up from a sleep before the HDD light turned off (and even then off means a constant flicker). Apparently it's related to telemetry data Windows was constantly collecting for some reason. I eventually replaced the HDD with a SSD and the difference is night and day.


I'll admit that Windows is a bigger offender in this regard, but Linux Desktop has become quite bloated too.


I had a really old laptop with a 2.4GHz C2D, 2GB of RAM and a 4600RPM (I think) 60GB HDD; I had to switch it to Cinnamon, but after that it worked alright. It could even keep a few tabs open in Firefox. It really only had trouble when it ran out of memory and started swapping.

It doesn't seem as easy to switch Windows to a lighter desktop environment.


Actually it kind of is, you just go to an older version of Windows. A surprisingly large amount of software will still work, or at the very least have alternatives that do. XP is really really snappy on modern hardware.

Granted, using XP today is not recommended for other reasons.


Can you even activate a copy of XP these days?


It's a trade off. Your OS had to play all kinds of games to manage those 150 or so IOPS you had. When you have 10,000 to a million IOPS you can drastically simplify the operating system.


RAM? maybe for Visual Studio/Xcode/Android dev. But for any nodejs/ruby/php development, a 10 year old machine with 1GB RAM is perfect.


There are two things that cannot be true at the same time.

"Ram is cheap"

and:

"The maximum ram you can get is 8-16G"

Developers who are making electron applications (not that electron is the primary cause, it's just a correlation I see often) constantly consider that "ram is cheap" for their users and thus do not pay attention to memory consumption like they should.

This means that you're right in a way; vim is good enough for most people and a decent terminal emulator is going to cost you much less than 100mb of ram.

However; slack is consuming 2GiB on my machine, Skype for business is using 500MiB, Outlook @ 600MiB. I use safari and not chrome (safari is @800MiB with too many tabs open) but if I were using chrome I could be using many multiples of gigabytes.

If everyone thinks their program is worth 1GiB of RAM or more, then your machine becomes very limited in what can be backgrounded. You might as well run Android/iOS if you're running with 1GB of RAM in today's application/web ecosystem.

The thing is; I am quite conservative with memory usage and I'm still quite sure I wouldn't be able to work on anything with 8GiB or less.

( I mean, I just checked and I'm using 15G of memory: https://i.imgur.com/xd6eB91.png )


Electron is kind of the cause. You can program lean in Electron (see VSCode), but realistically the people that program JavaScript/TypeScript are usually not your CS pros. It's a language that everyone can get into easily, attracting very subpar developers as a result.

So it's not Electron, but it's what Electron enables.


Electron has its place, which is allowing web developers to make 'native' apps without learning new technology. When you are downloading any free software built on Electron, realize that it likely started as a labor of love that the person decided to open source or make available for free.

That said, large companies using Electron is what I don't get. You can't expect me to believe Slack doesn't have the resources to create an actual app for their product.


It's not that they don't have the resources, but if you want to create something that works the same on Mac, Windows and Linux it might even be a good decision for a bigger company.

Also they might have started in electron and now it's hard to move away from it.


Sure, there's a very low barrier to entry, but node has a big hurdle in the middle.

You can relate if you ever had to setup a JS taskrunner/webpack from scratch. If you don't do it regularly, expect to spend 2-3 hours going through documentation and lots of outdated StackOverflow posts.

JS is easy as long as you stay within the constraints of whatever scaffolding you're using, if you need anything outside of that or need to upgrade your stack you need to know a lot of little things. "Best practices" move fast in JS, and projects die fast too.


What I'm trying to say is that the low barrier to entry causes a lot of bad software engineering. Of course tooling issues can be hard to solve, and you can write beautiful programs in every language. It's just far more likely that if you pick a JS developer at random, they won't know much about memory allocation, which algorithm to use, profiling, or clean code in general.


Even VS Code is slower and uses much more memory than native apps.


True, on my machine it uses 150MB of my RAM. It's a lot compared to native apps, but it's really at the point where it's not that bad in the grand scheme of things.

Our company's customized Eclipse, on the other hand, hogs 8GB of RAM when doing nothing, needs 2 minutes to start, and is sluggish in general. It's of course written in Java, which is much closer to native than Electron, but they still wasted resources everywhere they could.

You can't run a compile on an 8GB RAM machine; it will start swapping to disk.

Sublime doesn't have all the features, notepad ++ neither, and I just don't like vim style editing.

That leaves vscode.


And here I am, using nmh and exmh under CWM. With vimb + a host file, or plain Links+. IRSSI + Bitlbee for the rest. XMP to play some inspiring tunes. I only use 1.20G of RAM while using cached I/O . That minus the cache, is 185MB, according to vmstat.


You’re lucky. I could run so lean if I didn’t need to read emails using outlook and use Skype for business + slack for communicating with colleagues.

I’ve tried replacing slack with the wee-slack plugin for weechat but, while it works it’s far from a fully working solution and I still need to spin up a slack client sometimes.


Can't you use Thunderbird + Lightning? The RAM usage may be less.


The actual act of writing code, maybe, but:

1) 4GB—let alone 1GB—is already cramped with just Slack and any one of your usual bloated shitware issue trackers/wanky PM toyboxes (Jira, Asana) open in a tab or two, plus the usual few tabs of DDG/Google, Stack Overflow, some docs, et c. That's before any actual code-writing tools enter the picture. The basic suite of tools to just be working at all in almost any role is just barely not-painful to use on 4GB. Worst case you're in an agency and have all your tools, plus several duplicates for the same purpose for a client or three, and so on, all open. Yes, it's because all these tools are terrible and eat like 20x the RAM they have any right to and even that's generous, but I still have to use them.

2) Better hope you don't need any design tools at all. (Sketch, say) if you're trying to get by on 4GB or less for everything, unless you like only having one thing open at a time or dealing with UI sluggishness I guess.

3) Docker(-compose) or minikube or whatever? Service dependencies, local mock services? Running test suites? Without a strong CPU and 16GB you'll see slowdown.

4) A fan of any of the fancier webmail clients, like recent GMails or Inbox or Outlook or whatever? I'm not and just keep the Basic HTML version of Gmail open because its full-page loads are faster than the "speedy" AJAX garbage on those, but if you are into that sort of thing take a look at their memory use some time.

FWIW I think almost all the tools surrounding and supporting development these days are god-awful resource hogs that somehow still manage not to be very good and think 1GB absolutely should be enough memory to get by doing node/ruby/php dev, but I still have to work with that junk, and 8GB's the bare minimum to do that without hitting swap constantly, IME, and even with that you've gotta be careful. 16GB's much more comfortable, especially if you sometimes have to do things other that just Web dev.


I have 8GB here, under Debian. I can easily run docker (with a rails server), firefox (discord, slack, facebook, youtube, online radio + outlook webmail, all at the same time), with many Emacs windows.

I think I could manage with less RAM (I frequently code on my Chromebook with 2GB of RAM). My theory is that RAM usage expands to fill the amount available: https://en.wikipedia.org/wiki/Parkinson%27s_law


IIRC at my last employer an Asana tab + Slack ate ~1.5GB all on their own, and Asana was so slow to load that one hated to close it.

Jira's not as bad as Asana but depending on the set-up it can be pretty close. Then there's Invision, et c which are much lighter than those but still pretty damn heavy, if you're trying to get by on 4GB or less. And/or maybe you've got Outlook and Teams and all that. And that's just the communication & collab tools, not even any of the stuff to produce actual work output. Temporarily having to use a 4GB machine with that kind of workflow is why I'm now permanently on Basic HTML for Gmail—it loads fast enough I can close it, and uses so little memory there's no reason to. I couldn't spare the 300+MB for Inbox or whatever with all that other junk open, and besides, Basic HTML's much faster.


No, it's that developers write their code on 16GB of RAM Macbooks running the minimal amount of software required.

If developers aren't seeing performance issues on their own machines, it's because you'll hardly find a developer running on 2GB of RAM.


But for any nodejs/ruby/php development, a 10 year old machine with 1GB RAM is perfect.

I don't think I'd enjoy the experience of developing on that machine!


A lot of shops have large, swollen Linux VMs to run your database, web server and interpreter in, for consistency with the production environment.

If I wasn't running on a Mac at work I could comfortably use containers without the overhead of another VM.


The new Docker that uses the Mac's built-in hypervisor works pretty well.


I didn't know about that, I'll give it a look over.


You could always install Linux. I personally would hate running a VM regularly for my job. It just adds complexity without much benefit.


The benefit is the VM can precisely match production, and every project you work on can have a different VM with a different set of software.


It gets worse than that: I'm still on spinning rust, although I do have 21 gigs of RAM so I can keep my VMs running.

Looking forward to putting an inexpensive linux workstation together when this Mac eventually bites the dust.


For a small app on a box running Linux, _maybe_. Any other case? That hasn't been true for years.

The large Rails app I work on can easily eat up 500MB of RAM. The webpack dev server I'm running to compile the Angular frontend for said Rails app takes another 500MB. Visual Studio Code (easily the best TypeScript editor) is using over a gig of RAM all by itself.


lol, I'm sorry, but no, a 1GB machine will not cut it even for a hobby, never mind in a real business environment.

A good IDE like PhpStorm is going to use 1-2GB.

A browser will consume 1GB+. Want multiple browsers for testing? Add another few gigs.

Most devs now use a VM or container with PHP, MySQL, etc running inside it. Add at least 1GB, maybe 2GB.

You probably need Slack or HipChat running. 0.5GB to 1GB.

A mail client. 256MB+.

Your company probably has Office 365, so you'll need Outlook. 1GB+.


Eh, 1G is pretty limited. I'd want at least 4G. Yes, 1G is workable (I could do most webdev on a Raspberry Pi), but a desktop environment and a browser running one tab uses half of that, and each tab would increase that usage.

The 4G isn't for building or running code, it's for tons of browser tabs open with documentation. I routinely have >10 tabs open, and that's quite likely to cause swapping on 1G RAM.

Sure, I could slim everything down by running a tiling window manager instead of a desktop environment or configuring my browser to unload older tabs (or close and reopen with bookmarks), but that takes extra time, and if time is money, that money is better spent on a few extra gigs of RAM.


As a counterpoint I currently have over 130 tabs open and my machine isn't breaking a sweat. The secret is leaving Javascript disabled by default and selectively enabling it on a per-tab basis only when needed. This strategy works great if the main thing you're looking at is documentation.


I found out last year that 1GB RAM is not enough for TypeScript development (whose compiler runs on nodejs). I was building a fairly small TypeScript project on a Linux instance with 1GB and no GUI and no other processes hogging RAM. I was consistently having builds fail with errors about half of the time. Finally figured out that it was running out of RAM and if I turned off source map generation the errors went away. I didn't really need source maps in this case, as I was not developing on this machine, but it certainly would be a problem if I was trying to do development.
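
For reference, the switch in question is just a compiler option; assuming a standard tsc setup, a tsconfig.json along these lines is all it takes:

    {
      "compilerOptions": {
        // Skipping source map generation keeps tsc's memory footprint down;
        // fine on a build box where you won't be debugging the emitted JS.
        "sourceMap": false
      }
    }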

Sure, there is a lot of development that can still be done with 1GB RAM, but you're going to run into limitations and it is far from "perfect"


And people who only close browser tabs every six months


As long as you don't have to run a bloated electron app or anything similar at least...

I have a relatively minimalist Linux environment by today's standards (I use Emacs, Firefox with about 10 tabs, the Signal client, a few terminals and a lightweight WM) and I already use 1.5GB of RAM at the moment. Firefox alone uses 408M of RES, the Signal app almost as much (!) while my Emacs with dozens of buffers uses "only" 130MB. How times change. The rest is used by all sorts of system processes and minor daemons.

So basically I'd be fine with 1GB if it wasn't for the bloat that's modern web-based development. These days I'd say that 4GB is a more reasonable baseline.


Also not strictly true - this is enough for some version of PHP development.

The last PHP app I was developing ran the same way the whole company's infrastructure was developed and run: one Puppetized Vagrant VM for the code and one for MySQL, and sometimes a few more for other services.

Sure, even if you just want to abstract the stuff away from your host machine you could reconfigure stuff to run on one VM - but that's again diverging from production. And in the grand scheme of things we were working on several different vm types a lot more than only on this PHP app...


> 10 year old machine with 1GB RAM is perfect

No, a machine with 1GB of RAM is not perfect, especially if you are a frontend developer coding in Node. And even if you are not, you might need a browser for Stack Overflow, and you might need Docker or Vagrant for virtualization.

And, you may also want an Electron-based editor, such as Atom or VS Code.

Even 4GB is hardly ideal. 8GB is probably where it starts getting comfortable.


I propose we give every front end dev at _most_ 4GB RAM for awhile and see if web bloat stops inflating.


There are far better ways to address this particular problem. Introduce web performance budgets in frontend projects; test frontend projects on specific target devices on which you need them to perform well; perhaps create dedicated web performance teams that will work on the tooling and testing for better performance.

A development machine does not need to have the constraints of a testing machine; that's counterproductive.

In any case, "web bloat" is relative to the target users and target devices. It's one thing to target cheap Android phones in India; it's quite another to target laptops of a SF-based startup. In the second case, web bloat is negligible.


> In any case, "web bloat" is relative to the target users and target devices.

What about, relative to how many resources should be required for whatever tasks the software needs to perform?

Just because I have a lot of ram doesn't mean I want Slack to use it all. I'd rather give that space to the OS for file caching and such.


Yes please! Very few companies do this type of dogfooding.


Nope. The project can still have crazy dependencies that need to be compiled (Thrift, etc.).

Every morning every Dev in our shop no matter the project is going to run "mvn clean compile -U"

Sure you can split the large projects up. Just get some time from your PM.. next quarter.... :)


So you’re not doing anything with Docker or ML, or a compiled language, and you have no unit tests?


I haven't done web development lately. Why do unit tests mean higher requirements on the hardware?


At a guess - the various popular test runners are slow and bloaty.


Depending on who you talk to, there are behavioral breakpoints at one, three, and seven minutes.

If you have a few thousand tests at a couple milliseconds apiece (really really easy to do even in a midsized project) you’re getting above that 1 minute range. Shaving 40% off with a faster computer stops people from task switching while testing.

Task switching pretty much doubles your test cycle because you never switch back at the precise moment the tests are done (we are developers. We always underestimate everything by a factor of 2).


Can Chrome or Firefox run on less than 1 GB of RAM nowadays?


FF Quantum runs quite well on a 1GB system. You can go even lower than that, but you'll be hitting swap.


Firefox can.


While I generally agree with that - more RAM is better - I was doing Android OS builds a few years ago for a wearable project, and using the -j make arg reduced build time significantly.
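
For anyone who hasn't used it, the usual incantation is just to match the job count to the core count:

    # Run as many parallel compile jobs as there are CPU cores.
    make -j"$(nproc)"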


Don't forget the SSD.


The SSD is actually needed. My employer encrypts the drives, and it's too slow without the SSD.


What encryption are you using? LUKS on HDD is imperceptible for me.


Sounds like you never worked on a monstrosity that requires 9 different docker processes running. Java, Scala, Postgres, damn.

Even on a top of the line macbook pro that shit chugged.

And the laptop would get HOT.


I think that's just the Docker for Mac client (and Hypervisor.framework). I think the disk I/O performance was bad; the last time I used it was for Node.js/Postgres about two years ago. I was able to really speed up the integration test suite by disabling fsync for Postgres.

If you were running native Linux it should be a lot faster.
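
If anyone wants to try the same trick, it's a couple of lines in postgresql.conf; only for throwaway test databases, since these settings trade crash safety for speed:

    # postgresql.conf for a disposable test database only --
    # all three trade durability for much faster writes.
    fsync = off
    synchronous_commit = off
    full_page_writes = off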


That's what servers are for.


100


Furthermore, macOS is seriously impaired compared to Linux.


Even Windows has a better UI. On a Mac you can't even alt-tab between two terminal windows. Ridiculous.


What's ridiculous is condemning something based on your own ignorance. The keyboard shortcut is Cmd-Backtick ("Cycle Through Windows") and it works in all macOS applications.

There's simply a more sharply defined distinction between applications and windows. Cmd-Tab is for application switching.


You happen to be talking to someone who graduated in UX design, and from my high horse I try not to fall back on personal taste when writing UI criticism :)

If the user wants to move to a terminal window they have open in the background, which key combination do they press? Alt+Tab or Alt+Backtick? They have to stop and actively think about which window is currently highlighted. Is it the browser, or another terminal? This completely kills the action fluency of a keyboard shortcut.


> is it the browser? or another terminal?

This is generally a non-issue, but even if you are getting tripped up consistently, the OS literally tells you at all times what application currently has focus


People have already given you the answer, but macOS also supports tabs in nearly all the stock apps, so CMD+{1,2,...} is an even nicer way to work with multiple terminal windows, if you're not a fan of tmux. I don't think any stock Windows applications allow tabs.


Of course you can; it's just another hotkey.


lol try pressing command instead of alt :Z


cmd ~


I recently upgraded from a 2013 MBP to a 2018 MBP. The time to do a simple development recompile of our big monolithic web application dropped from ~1 minute to ~10 seconds. This is a process that is done about 20 times every day.

Over the course of the year, this will save ~72 hours in development time. Of course, it's not a strict comparison because I would always do other things while waiting for it to compile, but it's still a massive boon to productivity.

I understand that we want to live in this perfect world where all software is designed to run on 2008 era laptops. That's not the world we live in. It's Fantasy. I implore everyone to keep fighting for it, but Reality is what matters to businesses, and the reality of most businesses is that software is insanely complex, poorly designed, and it still generates revenue; more often than not enough revenue to afford the best machines to support running it.


>Over the course of the year, this will save ~72 hours in development time. Of course, it's not a strict comparison because I would always do other things while waiting for it to compile

That, in my experience, actually makes it worse. As soon as you start doing other things you multitask, forget where you were at, lose the zone, and lose much more developer time than that 1 minute.


Yes! And I think the limit is about 6 or 7 seconds for me.

SQUIRREL!


I would think that the performance difference between a 2013 and 2018 MBP would be significantly larger than the performance difference between a 2013 and 2015 MBP.

Having said that, MBP to MBP comparisons are not always apples to apples (sorry about the pun). You need to compare MBPs with CPUs from the same family as well.


And nobody suggested buying desktop workstations?

A laptop is generally roughly half the speed of a decent desktop. And from experience, even if mobility is sometimes useful, it's hardly the norm. Personally my laptop is used maybe 5% of the time at most away from my work desk.

Cheap laptop for emails, meetings and occasional ssh into prod + powerful workstation would be a better option for me, and I'm guessing I'm not an exception. And this option is cheaper than a high-end Macbook pro.


Some things are helpful -- large clear display, good keyboard

That's why I prefer older equipment, especially laptops which have proper keys and a matte screen.


I started using mechanical keyboards made for gaming, and I've never been happier with a piece of hardware in my life. It's like typing on butter instead of jamming your fingers into concrete. From what I remember, really old keyboards were all mechanical; sometimes older really is higher quality.


The IBM keyboards used for the PC, AT, and PS/2 used a "buckling spring" mechanism that is not strictly mechanical, as the later models actuate a membrane.

I am typing this on a Model M that is dated "22JJUL88" - I do all of my important typing on these keyboards.

https://en.wikipedia.org/wiki/Model_M_keyboard


I’m fairly certain that Model F and M keyboards are considered mechanical.

It’s not a precise category. But I’d even go so far as to say they are the type specimens for mechanical keyboards.

Yes, most products in the category use Cherry-type switches, but Topres are definitely considered mechanicals and they combine a rubber dome and a spring.


Literally all keyboards are mechanical, what’s the alternative? Organic?


The term is used figuratively, not literally.

Buckling spring keyboards count; rubber-dome and scissor switches don't. There's no overarching principle here; if there is one, it's how they feel under your fingers.


Optical or touch.


The only thing that prevents me from switching to a mechanical keyboard is that I care about my coworkers' ability to concentrate.


Cherry MX Browns are reasonably quiet. My local Micro Center has a tester keyboard with a variety of switches that you can try. If you look outside the limited options there, you can find huge selections of mechanical switches that are only slightly louder than membranes. You’re looking for linear or tactile switches and you can find a pretty complete list at https://deskthority.net/wiki/Main_Page in the “Keyboard Switches” section. You can even mod louder switches with O-rings to dampen the noise.


I agree, MX Browns are quiet enough not to get noticed.

I guess it's a positive side effect of open spaces: there is so much noise around from phone calls and co-workers discussing that it doesn't make any difference.

More seriously, I've a Ducky keyboard with MX Browns (and it's also a weird one with MX Blues (the noisier ones) for the arrows and page UP/DOWN), and I never got any remark for that.

The only story I've heard of colleagues complaining about keystroke noise was about a friend who types really heavily (to the point of cracking the keycaps), and even in that case, rubber O-rings did the trick to dampen the noise enough.


Then get some medium stiffness linear switches. Near silent unless you bottom out, and the added stiffness helps with that.


> sometimes older really is higher quality

You can apply this successfully to espresso machines, coffee grinders, furniture and many tools (gardening, woodworking etc). The difficult part is finding them as often you need them immediately.


For me it's stereo receivers/turntables and cast iron pans. Vintage Le Creusets look incredible and they feel so good to handle.


> proper keys

Yes, please give me a full keyboard, and don't put things in weird places. My current machine has "Fn" where "CTRL" should be and it's driving me crazy. Also, the "PgUp" and "PgDown" are directly above left and right keys and I inevitably hit them when I'm jamming the keyboard with my meathooks. I don't need a number pad, but if you're going to give me nonstandard buttons, put em somewhere I can't hit em while I'm trying to do actual work.


In many laptops with Fn and Ctrl swapped there is often a setting in the BIOS to switch their positions. I know Thinkpads have this ability, and I feel pretty certain other brands can do this as well.


Not this Asus I have. I'm ready to cut some traces and solder jumper wires to rearrange this POS.


You can remap any keypress that reaches Windows by editing the registry, as described here: https://www.experts-exchange.com/articles/2155/Keyboard-Rema... (2011, but works with Windows 10)

Unfortunately many "Fn" keys are handled purely in hardware and Windows can't see them. But it might be worth a look, if you haven't already tried it.

AutoHotKey is also useful, especially if you want more complex hotkeys.


And xmodmap is the de-facto solution for X11-based Unix machines. Rebind your keys in an ~/.xmodmaprc that is run at startx by your ~/.xinitrc.
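
As an example, the classic Caps Lock/Control swap looks like this in ~/.xmodmaprc (Fn itself usually never reaches X, as noted above, but anything X does see can be remapped the same way):

    ! ~/.xmodmaprc -- swap Caps Lock and Left Control
    remove Lock    = Caps_Lock
    remove Control = Control_L
    keysym Control_L = Caps_Lock
    keysym Caps_Lock = Control_L
    add Lock    = Caps_Lock
    add Control = Control_L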


One of my old laptops had the function keys swapped (so you had to hold down Fn to hit the F key) and had mapped Sleep to F5, which I’m used to using to both start debugging and reload a web page.

The amount of times I put my laptop to sleep by accident was infuriating, and it was a work laptop so I couldn’t get into the bios to swap the keys back!


> My current machine has "Fn" where "CTRL" should be

You mean directly to the left of the A?


I.. uh... no? Why, why would you think that?


Well, my laptop isn't a Teletype.


I miss laptop mice that had actual buttons and didn't get in the way when typing. Laptop ergonomics are terrible these days.


My laptop at home is a Dell Latitude E6450 (I think that's the one); it's the last one where they had a real keyboard and not that stupid island shit, and it also has a trackpoint and a normal trackpad with actual buttons. After that generation, they finally jumped on the stupid mushy island key bandwagon, so I don't know how to upgrade from this machine. I had one of the newer Latitudes at my last job and it was nearly unusable because of that shitty keyboard. Honestly, WTF is wrong with everyone these days?

Luckily, this laptop can play x265 1080p video just fine and has 16GB of memory, but 4k is a no-go.


Best laptop keyboard I ever used was on a Compaq SLT/386 I owned. Keyboard could be detached from the laptop (ok, today it would be considered a "luggable", but back then, it was a nice machine), and it had a coiled cord that plugged into the computer underneath where the keyboard sat.

You could unplug it from the computer (it was a mini-din PS/2 style plug). The keyboard itself wasn't mechanical (not a buckling spring or similar system), but it did have full-travel keys and a nice feel for typing on.

I got mine used and had to build a custom battery pack from old cell phone batteries, which made me have to remove 2 meg of RAM (to fit the larger custom battery pack), leaving me with 6 meg instead of 8. I had Caldera OpenDOS installed on it:

http://www.deltasoft.com/opendos.htm

http://esca.atomki.hu/paradise/dos/opendos-en.html

https://en.wikipedia.org/wiki/DR-DOS

...with Monkey Linux installed on top of that (Monkey is a distro that used DOS for the underlying file system - you could even share data easily):

http://projectdevolve.tripod.com/ (downloads don't work)

http://www.ipt.ntnu.no/~knutb/linux486/download/monkey/monke...

I'm honestly not sure where or if you can still get a copy of that distro - I should look into it; maybe I should host my copy somewhere...


I can't speak to laptop keyboards that far back in time, but for a very long time, Thinkpads and Dell Latitudes had the best keyboards for laptops (obviously they weren't going to compare to a Model M or other mechanical keyboard).

But even these have gone away, to be replaced by the shitty island keys.


I use a Lenovo T480 for work and disabled the trackpad so I just use the nipple + hardware buttons when I'm not using a mouse. When I need a new personal laptop I'll probably buy the same model.

https://www.lenovo.com/us/en/laptops/thinkpad/thinkpad-t-ser...


Give me a nipple any day.

I can use a MacBook trackpad, it's OK for consuming, but the PC manufacturers saw Macs and copied them. Poorly.


I remember getting to my first job, where they had Pentium 4-based machines (the last stepping that supported x64) with exactly 4GB of RAM. They were dog slow, but it also meant we were able to see and address performance problems early because they were more pronounced. I remember taking one bug from an hour to run down to less than a second.


I kind of wish I were in that boat right now. I made a highly praised window manager[0] for macOS that some people[1] have reported performance issues with. But I can't reproduce the issue, possibly because I have 16GB of RAM and don't use it all. Maybe I don't have enough Electron apps running and should install Slack and others (I run Slack in Safari when working with clients); not joking, that literally seems to be the main environment difference.

[0] https://sephware.com/autumn/

[1] http://brettterpstra.com


Have you tried debugging it in a throttled VM or with valgrind?


I feel the same way, honestly. I do wonder if the programming environment has a huge impact on this, though. Did your colleagues use an IDE of some sort? If you're a vim or emacs user, the editor doesn't require much. If you're compiling a ton of C++, then maybe a faster processor would help.


I think one of them used Webstorm, I was using NeoVim, and the rest were using SublimeText.

I feel like if we were doing something that required C or C++ (e.g. video processing or data-training), then they might have had a point about wanting to upgrade, but we were doing a lot of fairly typical Node.js REST stuff, something that could fairly easily run on a Raspberry Pi.


From my experience, IDEs benefit the most from more RAM.


I don't always need a powerful machine but I sure want a powerful machine. I consider it part of my compensation. Working with a slower machine or a smaller screen is the same as working with an uncomfortable chair or lower pay.


A two year old machine, assuming it was top of the line at the time of purchase, should be plenty fine for almost anything today though.


In some organizations, replacing top of the line notebooks frequently is less about productivity than employee morale. Sometimes that cost is worth it, sometimes it is not.


10 year old laptops are still portable supercomputers. There is zero reason for them to be slow. The problem is bloated code that expects to use excessive amounts of system resources.


I feel that slower cheaper laptops with good battery life coupled with a compiling desktop/cluster you can ssh into is the best trade-off.


Plus, when that cheap laptop breaks or gets stolen, I can easily replace it. A $100 x200 in my bag and a $2000 dev server at home/work is much less of a risk than a $3000 MacBook Pro in my bag. It's nice to have all the power you need in your hands but it's also nice not putting all of your eggs in one basket.


I've prob been on the other side of that argument. I usually advocate for allowing developers to pick their laptops rather than coming up with a one size fits all solution.

This started when my company began buying Skylake HPs for all new hires because they were "cheaper". They were only cheaper compared to current-gen MBPs. As a result we were stuck with TN panels and 8GB of RAM. I would rather have a $400 Chromebook with an IPS display at that point, easily 1/2 to 1/3 of the price.


One of the great beauties of working with scripting languages, they work just fine on lighter laptops. The MacBook Air is a great machine for the node/python/php programmer on the go.


Running an old T440 Thinkpad to do my work programming. Mostly doing data analysis. Never ran into an issue with any kind of speed or capacity, and I love the keyboard.

I have a gaming laptop as well that I don't bother to carry around. It's just not necessary unless you're doing GPU programming or model training. Even then, I'd rather just work with a cloud instance.

Devs, you're supposed to know how to make a computer meet your needs! Don't outsource it to someone else. Even 5+ year old computers run pretty damn quick if you use a lightweight linux distro.


Most devs don't want to deal with hardware restrictions. It's a lot easier to just get a good general all-rounder instead of coding on a dinosaur. Many lightweight distros barely have any meaningful functionality and usually require interacting with the console to get even basic things usable. And quite frankly, even though you can do more with a console, it's a lot easier to remember how to do things with a GUI than without one.

Older hardware generally means older, unsupported, insecure drivers as well.

Also, I'm confused as to why you claim that you shouldn't outsource to someone else, yet you're fine with working in the cloud...


No idea how you'd run into hardware restrictions; T440 is from 2013, and I have yet to run into a driver issue dual booting Debian and Windows.

I don't understand the bit about the console; I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI. Maybe an argument to keep general users on Windows, not really an argument against devs running a linux distro.

I was suggesting you should understand what's running on your machine and why, and if you do, that $2k mac isn't doing anything for you that a machine worth less than a quarter of that will. Whether or not you have a top of the line machine, there's still reasons to reach for an AWS instance with a powerful GPU attached.


> No idea how you'd run into hardware restrictions;

Hardware can become unsupported when you update the OS. It's happened to me with wireless cards when running FreeBSD on an old EeePC. Even when using Xfce, having a GUI for wpa_supplicant was much simpler than remembering and writing a bunch of scripts to set up all the crap involved in getting it working. Not everyone uses Debian.
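
For context, the non-GUI path is roughly one config file plus two commands; not hard, but exactly the kind of thing you forget between installs. A rough sketch, with the interface name and paths being system-dependent:

    # /etc/wpa_supplicant.conf -- one WPA2-PSK network
    network={
        ssid="home-network"
        psk="correct horse battery staple"
    }

    # then, roughly:
    wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant.conf
    dhclient wlan0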

> I don't understand the bit about the console; I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI

Doing a few clicks in a GUI can often result in very complex command executions in the CLI, sometimes across multiple processes. It can be confusing what's going on, especially with redirecting I/O and if you have to do something different, it often requires editing multiple arguments depending on what you want to do.

This is good if you want to script a common task that's repeatable and changes infrequently, but frequent changes in a GUI are much faster and you don't have to worry about copy/paste errors or spelling errors.

And if consoles were so much faster, why does everything evolve into a GUI at some point?

> I was suggesting you should understand what's running on your machine and why

There are hundreds of processes running on the machine at any given time. I would guess that most people don't know, or aren't even aware of, what each process does and when it runs in any given state of the machine.

The point is, with a $2k mac (which I would never get by the way), there's easy room for expansion.


> I haven't met a dev who doesn't find terminals worlds quicker than hunting and pecking in a GUI.

There's a billion Windows users out there, do you think there's 0 developers amongst them? :)


> And quite frankly, even though you can do more with a console, its a lot easier to remember how to do things with a GUI then without one.

I don't know about this. Whenever I do something unfamiliar/complicated on the command line I copy every command I used to a text file for later reference. Repeating these actions in the future is as easy as copy and paste. If I figure out how to do something in a GUI and I don't have to do it regularly I will almost certainly find myself flailing and clicking around randomly when I have to do it again in 10 months.


I'd say it depends on what you're programming, but for most programmers they indeed don't need that much.

If you force your devs to use inefficient tools it might be necessary though. Like our customized Eclipse that needs 8GB RAM. It starts page swapping if you have only 8GB.

Other than that I'd say you need good keyboards and good screens. The processing power usually doesn't make you better.


At a previous gig in the financial world, we had 2013 vintage MB Airs with maxed specs cranking out Scala for a card processing platform.

This was right as the term microservice was becoming better known. We were building a very de-coupled, stateless design, passing messages among services.

The MB Air was fine. We did a lot in the cloud as far as testing. Write code -> push -> get test results

These days so much of the work is “fill out yml files”. I see little value in having this 2017 MacBook Pro 15 with maxed specs. Having anything more than Firefox and my editor on the laptop seems useless.

Others workflows may vary. But for devops, security, and a great many common roles, anything above a mid-spec MB Pro 13 feels like overkill


If you're running Docker and minikube locally on a Mac (both DevOps tools), then it can be important.


In my experience hardware is the cheapest part of hiring staff, and I've found that providing the best equipment the employee wants is a cheap way of aiding retention. Penny pinching is pretty stupid in the medium to long term.


I remember that third party developers for BeOS had machines (BeBoxes, PowerPC 603) with 16MB (not GB) of RAM, but internally Be engineers had to use 8MB systems. It did seem to have positive consequences at the time...


Causing the race between bloated environments and ultra-performant hardware just to carry text editing.

Many times I've observed that people running programs on older hardware noticed stupid design non-decisions. Fast ~= blind.


I have an iMac and a 2015 MacBook Pro; I wanted a Unix-like OS. My youngest sons use the iMac and MacBook essentially for games. I have a much older HP 600 notebook with OpenBSD. This perfectly fits my needs: Emacs, TeX, a browser and some R. No need for Bluetooth either. I guess I paid the premium with the Macs for the build quality and the screens. I never cared about any of their apps and brewed what I needed (OpenBSD provides everything I need, except pandoc as yet). YMMV.


Only argument I really buy when it comes to needing bleeding edge hardware is if you're doing something like game development where it's reasonable you might be compiling, testing, and running a big IDE.

This whole "productivity increase for saving 10 minutes" is a little silly, people take breaks, get coffee use the bathroom, etc. Most folks I know doing processing heavy stuff do it all on a remote server.

With a few exceptions, I think it really comes down to people want nice toys.


If you're going to be developing in Visual Studio Code, run Spotify and Slack, as well as a few other Electron / Chromium apps then I can understand why someone would want top of the line computers.


On the one hand I agree. I personally don't need the latest spec to be productive. On the other hand, computers are so cheap relative to a developer's salary that you could almost buy a new one every month.


Why not upgrade to more powerful desktop PCs? Unless you're developing native Mac software, you don't "need" to use Macs to develop on.


As well, you don't need the overhead price of a laptop, especially as a startup. At my job, we're nowhere near a startup, but only well-established, well-trusted employees get a laptop.


I 100% agree until we start talking about compile times. If you're a Chromium dev, you can't use an ODroid.


I really dislike generalizations.

If you can use an old computer for development — great, I'm happy for you. But please don't assume everyone can, or wants to. For my type of work, no CPU has enough single-core performance (I need quick bursts). And since programming is what I do several hours every day, it is quite an important part of my life, so I'm not willing to torture myself on old hardware just for the sake of it.


I see no such generalization or assumption in the grandparent comment.


My main take-away from this article is something that I've always thought was key: when you're developing for other people, use the same hardware as them.

I was at a fairly large financial firm in ~2004 that was rolling out a system to a large number of users. These users had impulse control issues, and were not allowed to use mice or keyboards (they would be torn off and thrown across the room). So the screens they used were touch screens. Inventory was shocked when I asked for one of the touchscreens to use as my second monitor.

When launch time came around my parts of the UI were the only ones usable. Everyone else had built tiny tap targets that were easy to hit with a mouse, but impossible to fat-finger.


> ... rolling out a system to a large number of users. These users had impulse control issues, and were not allowed to use mice or keyboards (they would be torn off and thrown across the room).

Trading desk, I assume?


On the floor at an open-outcry exchange.


Trading desks have amazing keyboards!


There is a similar line of thinking for audio engineers who do a final mix/master of music before it is sent off to radio, streaming platforms, etc.

After spending hundreds or thousands of dollars on expensive sound systems and professional speakers, engineers realized that the average listener hears their music through inexpensive mediums: a cheap pair of headphones, a small Bluetooth speaker, or even laptop speakers. They rarely listen through mediums where the full audio spectrum is present, so in order to get a mix that translates well across all mediums, engineers do their final mixes and tweaks on devices like a pair of Apple EarPods or, a more famous example, the Yamaha NS-10 [0].

The line of reasoning being: if you can get it to sound good on a basic speaker, it will sound great on high end systems, which is a win-win!

[0] - https://www.soundonsound.com/reviews/yamaha-ns10-story


Just to clarify, they do the final mix and mastering of each piece on a variety of speakers. But right - the closer they get to the finished product the less they care how it sounds through expensive equipment.


This is also done in the toy industry--a sound effect may sound fine through good studio monitors, but will be screechy and unpleasant when played through a 10-cent speaker with the crappiest amplifier money can buy, so it is important to listen on the actual hardware.


We need to fight software bloat.

Old computers and hardware are awesome for several reasons, especially if you really enjoy tinkering with computers!

- Linux/Un*x compatibility is usually pretty good, and those OSes are good for tinkering and producing new stuff. You can reuse your old boxes for staging servers and test beds.

- Old hardware forces you to learn more about how the hardware works and how to optimize your toolset. No, you don't need that insane number of daemons that run automatically if you install a vanilla distribution like Ubuntu. It is a nice stance against this ongoing trend of consumerism and disposable hardware.

We are discarding, rebuilding, and tearing apart all our hardware every 2-3 years just for the sake of having that new instruction corrected in microcode, or for having a conversation piece. It's bollocks.

We now have some insanely fast consumer processors which give us the power to run and simulate real-time graphics, audio synthesis, AR, etc.... but following the same trend, software is becoming more and more inefficient.

Some software solutions now require several hundred megs of binaries, local web servers, DLLs, and whatnot to solve the simplest of use cases. We have text editors and music players that need a full Node.js server instance and a separate build of Chromium running just to do things we used to get by with on a tenth of the RAM and processing power. Thank god we still have some survivors like foobar2000 and Notepad++.

The effect of that inefficiency on us is that we get to do the same things we did 10 or 20 years ago (study, read, code, listen to music?), only using way more cycles for nothing.

Aside from the very few people running insanely complex simulations (which, at scale, are going to run on clusters or AWS anyway), most power users don't need that multi-CPU rig with 128 gigs of RAM.

I have two old laptops now, and one of them is a pretty usable 2010 MacBook with 3 gigs of RAM. It ran Mac OS X dog slow, but with Xubuntu it's still pretty usable. And when Xfce starts becoming too bloated, this box will let me study tiling window managers and more lightweight, customizable distros like Arch. The net effect is me studying more, which is always a win.


It's really shocking how much Linux Desktop has let itself go over the years. I remember when you could recommend someone install Linux on an old laptop to get it responsive again, but nowadays that advice isn't true anymore.

https://www.youtube.com/watch?v=7kvT40umKL8


I recommend that people grab Caldera OpenLinux 2.3 from the Archive[0] and try it in 86Box[1] on an emulated Pentium 166MHz with 32MB of RAM and a 2GB (or so) HDD. Do an installation that includes everything. Note that you only need the first CD (the one named "OpenLinux 2.3 CD.iso"); the rest are source code packages and some commercial stuff (which most people probably never saw - the first CD was the one distributed via magazines).

IMO this single 650MB CD is one of the most complete distributions in terms of what it includes and what you can do with it. It includes a complete desktop environment (KDE 1), development tools (with an IDE), graphics editors, productivity tools, typesetting tools (TeX, LaTeX, Groff), databases, clients for mail, IRC, the web, etc., media players, a bunch of games and - most importantly - documentation for pretty much everything, accessible right away from the panel at the bottom (...although ironically, while there is also a "getting started with Linux" document that describes the basics, for some reason it is not linked and not even compiled... you need to already know the console and the DocBook tools to compile and read it :-P).

And on that machine it is still very fast and usable. I tried it a while ago and, after playing with it for a bit, I started reading some of the included material and soon forgot I was in an emulated environment, since everything was responsive. And this is, by the way, exactly how I remember it - this was the first Linux distribution I ever used on my PC in 1999 (a 200MHz Pentium MMX with 32MB of RAM).

(As a side note: yes, this is the same Caldera that renamed itself to SCO and filed the infamous lawsuits, but this distro was made before that time.)

[0] https://archive.org/details/OpenLinux2.3CD

[1] https://github.com/86Box/86Box


Thanks for drowning the remainder of my Wednesday in nostalgia. Caldera OpenLinux was my first distro. I got its first CD bundled with some random book about Linux in 1999. The book came from a store that sold used books, so I was surprised to find the CD still nestled in there.

The book turned out to be garbage, but the CD changed my life. I still keep it at my desk 20 years later.


My first distro was Monkey Linux running on Caldera OpenDOS - circa 1994-5-ish. On a 386 laptop with 6 MB.

My first real distro was TurboLinux 2.0 - IIRC, that was on a Zenith 486 laptop with 8 MB.

Then I moved to RedHat 5.2 (don't recall the machine, some old desktop tower I think - maybe a P90 w/ 16 MB but I am not certain of that - might've been my AMD 586/133 box at the time).


A default install of Slackware from a 2.4GB .iso will give you a similar environment that is reasonably functional on an X200 like the original author's. I actually use an X220, which is slightly more powerful but in the same ballpark.

x4 inflation over 20 years?


Slackware is the closest I know of, but only because there isn't anything better. Beyond that it isn't really comparable, since Slackware is just a collection of software thrown together and it has a ton of redundant software, a large part of which is "bloated". The reason I suggested trying out COL is that it provided an integrated system (it is probably the only Linux system I've used where things felt like they were meant to work with each other - and it's kind of sad that this was the first distro I ever used) with little redundant functionality (beyond practical matters - it has both KDE's web browser, not yet called Konqueror at the time, and Netscape, since in 1999 the latter was pretty much needed for browsing the web on Linux) yet with a lot of functionality provided in a small footprint. The full install is a little over 900MB - in comparison, Slackware's full (and only supported) install is a bit over 9GB (so it is more of a 10x inflation :-P), with a couple of different desktop environments, a ton of window managers, and a bunch of duplicated things.

But as I wrote, Slackware is the closest, which is why I keep an ISO of the latest DVD on my external HDD to throw into a VM whenever I want to try something on Linux. Also (and this is only tangentially related), I like that it still provides "full" distribution media (in comparison, Debian has become so overloaded with packages that to download installation media for archival/offline use you need to build it yourself, and even then it spans several Blu-ray discs).

FWIW I do not think it is impossible to make something similar, but it would require a lot of work, especially on the desktop environment front.


Hey, that's only 7% per year!


I think the issue is just that the popular Linux Desktop environments have now moved to GPU acceleration to match the Windows and MacOS counterparts, simply because there's enough user demand for these features. However, lightweight desktop environments which don't rely on the GPU, like Xfce, MATE, and LXDE, absolutely still run on older hardware and can make it zip right along.

The unsolvable problem for older hardware unfortunately is the web browser; there's nothing the operating system can do to make websites less complex, and more and more sites are becoming ridiculously bloated with scripts and media which will lag badly even on modern PCs. Still, ignoring the web browser element, Linux as an OS is highly flexible, and it can be made to run well on just about anything. That's still one of its core strengths.


> However, lightweight desktop environments which don't rely on the GPU, like Xfce, MATE, and LXDE, absolutely still run on older hardware and can make it zip right along.

This is simply not true. It isn't just about the GPU, it's about everything accruing bloat.


Yep. It's amazing how the early netbooks went from new hotness to piece of rubbish simply because Linux left them behind.

I pulled out my old eeePC 700 a few weeks ago and couldn't find a single current distro that would install on it. So it's stuck with the one that came with it and the old Firefox browser, both of which are probably full of security holes.


> couldn't find a single current distro that would install on it

Not even Debian? Mind you, I'm not aware of any other distro that's both current and has 32-bit support... And the Asus eeePC 700 is a potato by modern standards. (I mean, 2GB SSD? At least it can easily be made to boot from an external SD card.) But even the bulk of netbook-class hardware is well supported, AFAICT.


I have an eeePC 900a (?) - it came with, IIRC, a 32 GB SSD and 1 GB of RAM. I ended up putting 2 GB in it, and a 128 GB SSD, which made it much more comfortable (for all that's worth). Still a dog of a CPU - but for what it is, it ain't too bad. Somewhere, if I am ever brave enough to delve into doing it, I have a camera upgrade for it. I also think I got a GPS upgrade that I need to install (can't remember; there were a couple of pieces the next model up had that I didn't get at the time - and the parts could be bought on eBay not too long ago, so I grabbed 'em).


Xubuntu has a 32-bit version still, and should run well...


Sadly, no. Crashed on installation with a message about letting the developers know that the installer crashed.

But it at least got farther than Q4OS, Haiku, and others.


Presently there's a bug in Haiku's Intel video driver on EEE PCs. If you enable VESA video in the bootloader, it should work just fine; I know plenty of people who say that Haiku is the best OS for netbooks. :)


Current versions of antiX or Q4OS should run OK on it.


Thanks for the suggestions. Just finished trying antiX, and it's a no-go. It doesn't see the internal drive. Trying Q4OS now...


You might have more luck with the text installer in antiX - run "cli-installer" in the terminal.


I don't have any issues with Debian and lightweight environments such as LXDE and Xfce, let alone something even lighter like i3wm. Even MATE is pretty good. The biggest issue with old hardware is having enough RAM for the modern web, but FF Quantum makes things quite tolerable even on as little as 1GB, at least for now.


I'm on an x201 running i3wm. All good.


We have different definitions of "old". I'm running Windows 10 on an i3 right now for work, compared to your supposedly old i5 with the same RAM.


That ain't old. My TRS-80 Model 100 - that's old.

My Altair - older still.

I have a chunk of core memory with 3mm toroids and Bakelite framing; that's almost positively ancient in computer terms...


i3wm is a window manager, not a processor.


The x201 is a thinkpad with an i5.


The x201 is a Thinkpad sold with an i3, an i5, or an i7.


Many years back I had a peer who insisted on subpar hardware for all his non-prod environments. He’d also insist that his team meet any production performance goals on said subpar hardware. The end result was that things would fly when they finally got to production.


I call this the Demosthenes performance hack.


Imagine how much better off the whole planet would be if we recycled computers for use by other people, kept and repaired computers, and didn't accept the idea that rendering a web page should take half a gigabyte.


I have a shelf of old parts including a Pentium 4 ATX board with working CPU and some DDR2 I found. I have a dozen or so hard drives that still pass all their SMART readings (even if most have old age and wear level warnings). I have a few 10+ year old monitors that look like spoiled eggs with how inaccurate their colors are but aren't broken.

I keep it all around as an "if someone's board dies, at least it works" kind of thing. I have some old GPUs too that still work, including some 8800s, an X1800, and a GT 520.

If they broke, it wouldn't be worth my time to fix them, but they are still working hardware, even if they aren't the latest tech.


I once saved my bacon with an "old" (I don't really consider it old) 8 MB S3 PCI video card I had lying around.

I had bought an AMD chipset motherboard that assumed (in the BIOS) that you were going to use a CPU with onboard video (ie - an AMD "APU"). I was setting things up, and my CPU was an AMD without onboard video, because I figured the motherboard, like all the others I'd ever set up, would switch automagically to the PCIe video slot, which had my old NVidia GPU in it.

Nope. POST beeped that I didn't have any video output. So - I could either buy another CPU - or another mobo. Neither was an option. But I thought - maybe I could plug in a PCI video card and it would recognize that...?

Dug through my various boxes-o-junk and found one. Dropped it in, and it worked - long enough for me to switch the BIOS video option over, after which I could use my GPU.

I was kinda shocked at the BIOS, though - it was my first time using something "modern" - and it was a pretty GUI with a mouse and everything (then I started learning about UEFI which the mobo supported - a lot of things had changed since my last system, which was a Core2Duo board).


I wonder at which point the power efficiency of these old machines outweighs the environmental impact of tossing them and buying a modern machine.


The oldest machine I own which theoretically works (I've never plugged it in, because I don't want to start a fire) is my Altair 8800.

It has a linear power supply with huge caps (one is about the size of a small soda can) - which is why I've never powered it up, because a fire could be the literal result (the whole system needs a complete refurbishment to get it into operating condition).

Even so, I doubt it consumes more than 100 or so watts, though a chunk of that would be dumped as heat, of course. The entire system is probably far under the general TDP of today's desktop systems (maybe just the GPU alone).


While I love those old Thinkpads (and I have one), my main issue with these posts is that nobody talks about battery life.

If I work on a laptop, I don't want to have to hunt for a power socket; I need something that works on battery for at least half a day.


My x120 still gets more than 6 hours on its original 9-cell battery. Sway works great on it -- heavyweight DEs long since became unusable. The only problem it's developed is that if I pick it up by the bottom of the keyboard, it locks up. Some loose internal wiring, no doubt. Have to remember to pick it up by the hinge end.


This is one of the users for whom I optimize my software, folks :)


Thank you!


I generally upgrade when the battery can't keep up any more (or the damn thing falls apart, like my old X201). My 1st gen Thinkpad X1 was put out to pasture when the battery would only last about 2 hours and the (proprietary) ssd started going bad. Upgraded to a T460 with an i5 and love it. The hot-swappable battery is great in my book, although I don't understand why they don't have a separate charging dock for them.


The battery is hot swappable? Meaning the laptop stays powered on while you change the battery out? That is cool.


Yes. Many ThinkPads had a small internal battery to power it when you were swapping the main battery.


Heh - my TRS-80 Model 100 has that feature:

A small NiCad battery (which I'll be swapping for a newer supercapacitor soon) keeps the memory contents available when you swap out the four AA batteries if they run low (you get several days of runtime off them).


That is awesome, I had no idea. Dumb question: do you know what the feature is called - just "hot-swappable" or something - for when I'm shopping?


“ThinkPad Internal Battery” works on Google.


When it comes to batteries, no one talks about performance. If I yank the power cable, my battery lasts maybe 1-2 hours, but the performance drops to half or even less. So I just avoid it.


X220 with an IPS screen and a 9-cell battery. It lasts 9 hours if you're just using SSH. You can get one for $300-$400.


I have a similar experience. However, when I start compiling, the battery drains in 2-3 hours, simply because compilation takes so much time...


I do that stuff on AWS. I try and treat my laptop as a terminal.


Oooh, I like that idea! I'm gonna try to figure that out today.


Or if you have some spare basement space, pick up an old Dell PowerEdge or some HP equivalent from eBay and slap ESXi on it. Works well for me, and I use my PowerEdge T410 for some other home services. 12 cores, 24 threads and 64GB DDR3 ECC RAM goes a pretty long way.


Plus keeps your house warm in the winter :)


If you’re an Emacs user, TRAMP makes running compilers on an external system feel almost completely transparent. New shells automatically open up on the remote machine and it feels almost local if your network latency is low enough.


How much is that AWS instance?


I fire up a suitable instance when I need it rather than leave one running. I have an AMI ready to roll with tools on it.

Average about $7-25 a month depending on what I’m doing.
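
For anyone wondering what that looks like in practice, here's a rough boto3 sketch of the fire-up-then-terminate pattern (my guess at one way to do it, not necessarily the parent's setup - the AMI ID, instance type, and key name are placeholders):

    # Launch a throwaway build box from a pre-baked AMI, then kill it when done
    # so you only pay for the hours you actually use.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical AMI with your toolchain baked in
        InstanceType="c5.2xlarge",        # something beefy, but only while it's needed
        KeyName="dev-key",                # hypothetical SSH key pair
        MinCount=1,
        MaxCount=1,
    )
    instance_id = resp["Instances"][0]["InstanceId"]

    # Wait for it to come up, do the build over SSH, then tear it down.
    ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
    # ... build ...
    ec2.terminate_instances(InstanceIds=[instance_id])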


Yes, but that machine is a whole lot newer than an old X200. It's from 2012; the X200 is from 2008.


Lenovo sells new original batteries for old ThinkPads. I have an X230 with a new battery and it lasts >8 hours.


You can also get NOS batteries on eBay for virtually nothing. Recently got a 68+ for my T440 for £39 delivered.

Be careful though, as anything over 4 years old can be DOA: the batteries slowly self-discharge and the BMS in the battery itself disconnects them permanently for safety. This applies to the ones Lenovo sells as well. I found this out when I bought the one above, but the seller sent another out free of charge. I dismantled the dead one, pulled the 18650s out, charged them up standalone, and they were fine. Free cells!


The big problem with NOS batteries is that sometimes it's a gamble! I bought one off eBay for my X230 and I had to exchange it.

But the seller was cool enough to exchange it (the replacement gets 4-5 hours), along with a smaller one which gets me around 2 hours more.


Yes, it is a gamble, but as I said, the same applies to Lenovo. The manufacturing date on one "new" battery I got from them was already 4 years in the past.


I use a 9-cell battery on my X200, and it lasts around 4-5 hours. So for me it's perfectly fine.


My former boss (front end web) had a similar rationale for forcing the team to use old MacBook airs.

I think it's reasonable if you have cloud everything. We did not. We had to deploy directly from our machines, which basically made the laptop unusable for 12 minutes on every deploy. Also, Sketch + Slack + Chrome + the dev environment basically meant swap was the normal state of affairs (random 10-second hangs all day, every day). But at least I could talk to other people and decrease their productivity while I was waiting to be able to work.

On the plus side I was very motivated to optimize our build system.


This is a _really_ good point the author makes. I'm definitely guilty of falling into the trap of wanting new hardware when, realistically, I could do all of my work on a 10-year-old machine.

This concept also applies to website performance. I test my projects on physical iOS and Android devices, but they're both relatively new. I should start taking lower-end devices into account, especially on the Android side, as JS is notoriously much slower there.


> In reality, this high-end hardware isn’t really necessary for most applications outside of video encoding, machine learning, and a few other domains.

Like browsing the modern web.


Yes, this is what drove me to abandon the netbook I'd been using for years -- everything was fast enough except any sort of web browsing.


It's better if you disable JavaScript by default and gradually change your habits to avoid websites which work poorly.


You end up cutting out a huge part of the internet then.


Yep, and nothing of value is lost.


I'm fine with slower hardware, and agree in principle with what the article is saying, but not poor displays. The reason I pick a MacBook over a Thinkpad is because the screen quality is cash money compared to the 1080p on the stock T series. I just can't do 1080p anymore.

Also, you could solve this problem by just using a VM. If it really bothers you to have so much RAM and CPU, just spin up a 1401 in VirtualBox and use that for all your dev needs.


I agree with you on display quality. I picked up a ThinkPad X220 for not much money, but I swapped an IPS panel into it because the TN panel it came with was really, really awful. I don't even hate all TN panels - the one on my old MacBook Air was quite decent.

When you start getting up toward MBP prices, you can find ThinkPads with decent displays. Even some of the older W series came equipped with a 3k matte IPS, which I found quite nice. I prefer the W series keyboards over the new MBP keyboards as well, but I found the trackpad on my W540 to be ridiculously awful. It's not exactly cheap to buy one of these machines on eBay, but they aren't crazily expensive either.


My current desktop I bought with my own money is from 2012. I selected my components based on low wattage, not based on highest specs, because the computer previous to that, a Xeon server, used 200 watts idle and it cost me at least $50/month in electricity.

I've been using this computer since then, with absolutely no reason to upgrade. The only thing I would need more horsepower for these days is ripping/transcoding video faster or editing video faster. I ran into issues playing 4K video and thought I was processor-limited, but it turned out my codec was also from 2012, and when I upgraded that, everything was fine.

Even at work, I'm still on a 2015 Macbook Pro because I refuse to upgrade based on those fucking Touchbars.


I co-founded Sonalytic and served as CTO until we got acquired. We developed AI for music information retrieval. Clearly an area where every developer needs the newest hardware. Or do they?

The entire time, I worked from a 2010 MacBook Pro. For really resource-hungry stuff, we had a few dedicated servers with 64 cores, 512GB of RAM, and Nvidia GPUs - i.e., nothing you'd ever get in a laptop.

Because my Macbook wasn’t particularly fast, we ended up avoiding all kinds of bloat. We kept things simple, which actually helped us move really fast, and used the resources on the big machines for scientific computing.

Worked out well for us :-).


While my hardware isn't that old or that low-powered, I strongly agree with the sentiment that software developers are notorious for under-appreciating the importance of performance. One of our luminaries once opined about optimization in such a way that some of us incorrectly think performance optimization is something we should de-prioritize. Knuth's commentary has value, but he is not a deity and his advice left us with an ambiguous notion that you must know when to optimize and, for many pieces of software, optimization ends up never happening or happening far too late, yielding a bad user experience. His advice should be framed not as when to optimize, but how: profile, focus on low-hanging fruit that yields the maximum performance benefits first, and only as needed dig into the lesser bottlenecks.
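
To make the "profile first" point concrete, here's a minimal Python sketch of that workflow (cProfile is just one example profiler, and work() is a stand-in for whatever code you suspect is slow):

    # Measure before optimizing: profile a run, then read the report sorted by
    # cumulative time so the biggest bottlenecks surface first.
    import cProfile
    import pstats

    def work():
        # Stand-in for the real code path being measured.
        return sum(i * i for i in range(1_000_000))

    cProfile.run("work()", "profile.out")
    stats = pstats.Stats("profile.out")
    stats.sort_stats("cumulative").print_stats(10)  # only the top 10 hotspots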

Despite this confusion about when and how to focus on performance, many of us instinctively know software is, in the large, too slow: Microsoft Office, a slew of web sites and web apps, music players, most Electron apps, etc. As developers, we sometimes get clever and use animation to hide slowness (or, pathologically, to add slowness by mistake). As the OP mentions, some of these apps end up with "good enough" performance on high-end hardware, but that "good enough" isn't in fact good enough.


My opinion:

The article implies the hardware is fine and that it's your workflow that's too heavy. I want a machine that fits my workflow, not the other way around.

Even though the article implies that there isn't much difference, the difference in productivity can be HUGE. Especially for workflows that depend on crappy^H^H^H^H^H^Hdemanding applications.

From the perspective of a company:

The cost of laptops relative to salaries is minimal - salaries are usually close to six figures a year, while a very good laptop costs $2-3k and can last multiple years. The productivity increase will most probably cover the investment (not only in terms of direct return, but in business value: time to market, hitting a tight deadline, etc.).

From the perspective of a freelancer:

You are selling your time. A laptop that costs $2~3k will free some time (time saved) and be used for years. Time saved = FREE MONEY.

The only reason I see not to upgrade a laptop/desktop used for work is if its performance is already close enough to that of a new one. BONUS REASON: newer versions have keyboards that self-destruct.


Likewise, I use a now 8 year old Dell XPS laptop. With any reasonably high quality hardware, maintenance is not hard to do. I upgraded the screen and RAM a few years ago, and that has precluded me from needing anything newer. I've changed the thermal paste and blew the dust out of it. The only things I have consistently had to replace are the charger and batteries. Both are <= $30 on Ebay.

I think replacing a laptop that's 4 years old or less is pretty wasteful, especially with how little performance demands have changed in that time. We should treat computers more like we treat cars: knowing they can't last forever, but still a long-term investment that requires some maintenance. Just like with a car, if you can't perform the maintenance yourself, take it to someone who can.


Nice to see I'm not the only one still using an X200, it's still a pretty decent machine with an SSD, 4GiB RAM and coreboot. I fully agree with Drew.


I have a "consumer grade" personal laptop from 2013, I use a company issued MacBook Pro and just recently was still using a HP EliteBook 840 G2 issued by my former workplace.

I checked the first two on the same TypeScript project: the MBP takes 10s, while my laptop takes twice as long.

It does indeed feel kinda slower overall, but it never occurred to me that it was by that much. I mean - it's still fine for my after-hours stuff.

Meanwhile the HP, even though it's in the middle in terms of age here, was a pain to work with, because the fans on its i5-5200U would spin furiously at any hint of actual work. Performance was equally bad.

I guess the takeaway here is that while hardware doesn't matter that much, it has to be good hardware to begin with.


Do both of your non-MacBook laptops have old spindle hard drives? I find that is much of the difference between “slower” and “faster” computers. SSDs have come down in price significantly, so it’s worth the upgrade if you use the computers on a regular basis.


All of those mentioned have had SSDs from the get-go, but with the exception of the MBP they're connected via SATA III, so they never reached their full potential in this regard.

My first-ever laptop (2010) is still running and has a magnetic hard drive. I can't believe I went through half of college on that thing.


My trusty T400 died about three years ago, and I occasionally need a laptop to work remotely doing web development and server maintenance. I also had a X41 tablet which wasn't really being used outside of being another old ThinkPad in my collection. I started using the 1.6Ghz single core X41, running Linux and Windows 7 dual boot, to work remotely and really had no issues doing what I needed to do. I did need to have a bit of patience at times, but it worked.

This September I did finally get a new laptop, a Lenovo X1 Tablet, but I still like to fire up the 13 year old thing and use it to test my own websites, just to make sure they are still usable on slow systems.


TLDR: Old Thinkpads are powerful and rugged, but the screens are terrible.

A wistful story:

I used a Thinkpad X201 (2010 model) as my primary laptop for 6 years. I called it "lappy" (yes, a Homestar Runner reference). That thing took a serious beating. I worked on solar power system monitoring and control, which involved a lot of field work to debug problems. Lappy was exposed to a lot of dirt, dust, moisture, heat, cold, grease, and spilled coffee. Yet, after I upgraded the HD to an SSD, it was the fastest build machine on our little software team.

It did require repairs, but Thinkpads are designed for repairability. The most fragile part is the power connector, which will break if the laptop falls and lands on the connector while attached to the power cord. This repair requires taking EVERY part out, but I followed the instructions and was able to replace the connector.

After taking a break from Lappy for a few years and working on a Macbook with a retina screen, I have to say the biggest deficiency of the Thinkpad was the screen. When I read about using an 11 year old Thinkpad as a primary machine, this is the first thing I think of. After using a modern laptop display I can't go back to that washed out, fuzzy display on the X201.

2 years ago I took it on a "last mission" when I was building an installation at Burning Man, which is probably the most hostile field environment I've worked in. It had started to shut down randomly, but it kept running reliably enough for my week-long dusty hack-a-thon.

[edited to fix some typos]


I don't plan on giving up my ThinkPad X201t any time soon. It's probably slower than an X200, as the ULV CPU throttles a bit, but I've got 8 gigs of RAM, a solid-state disk, a real keyboard, a 16:10 VA display, and no trackpad.

If I fancy more graphics performance I've always got the option of an ExpressCard PCIe dock.

I find that this setup gives me a good yardstick for the kind of hardware an average end user might have and it pushes me to make better choices when writing software.


> This laptop is a great piece of hardware. 100% of the hardware is supported by the upstream Linux kernel, including the usual offenders like WiFi and Bluetooth. Niche operating systems like 9front and Minix work great, too. Even coreboot works!

If it's a Thinkpad X200 like he says, it could even support Libreboot:

https://libreboot.org/docs/hardware/x200.html


Yes, but you can also use Coreboot without blobs on this particular machine, so it doesn't really matter. Libreboot is just a Coreboot fork that is only compatible with machines usable without blobs. And they deliver the firmware as a prebuilt binary for easier installation, whereas Coreboot is basically just a git repo.


Over the last thirty years, I have bought exactly one brand-new device - an Asus Eee, which I still have around somewhere but which, ironically, I never really used. Everything else has been other people's cast-offs, including my beloved ThinkPad T440p, on which I'm typing this. I am forever baffled by the quality and newness of the hardware everybody simply wants to get rid of.

Believe me, I ship lean and speedy code.


What are some features that you would look for in a laptop so that you could still be using it in 10 years if you bought it new today?


- User-upgradable/serviceable parts, notably the battery & HDD

- All hardware supported by open source drivers in the upstream Linux kernel

- Durable frame

- Standard ports which have already been shown to stand the test of time (USB, 2.5mm headphone jack, and HDMI)

- x86_64 or RISC-V architecture


At least a 1080p IPS screen and a decent keyboard (no chiclets) with plenty of travel (I don't care how thick it makes the laptop).


Guess what? Electron is everywhere and everything. Electron is taking over - it's raining hamburgers... RUN.

Seriously though, I guarantee it would nuke old hardware into orbit. It's unfortunately not feasible IMHO due to modern software glut. More power to Drew, but many people are forced into using Skype, Slack, Discord :/


I've got an interesting comparison here: I have a secondhand ThinkPad X220, 10 years old. This machine serves me very well today on Linux and KDE. I did give it an SSD and 16GB of RAM. On the other hand, most of the time it is forced to the lowest CPU speed to get some extra battery life.

My previous employer gave us recent laptops. But they had a virus scanner bogging them down to unusable slowness, tons and tons of agents for licensing, network security, proxies, etc., draconian group policies eating away at speed, and SCCM scanning the whole hard drive on a regular basis, which of course woke the virus scanner for some bonus slowdown. I dreaded developing on that beast, and coworkers tended to copy code to their personal laptops just to get some work done.

Now on paper, that work machine was a lot better than my thinkpad. In reality however...


However your ancient Thinkpad is now a hole in your network security...

Most virus scanners are security theater IMHO, so you're probably not much different than those other laptops. At the very least full disk scans are pretty unnecessary after the first if you're running in the default active protection mode. Anything the active protection misses your full disk scan is unlikely to find.


Why would it be? It runs a fully patched Debian with ports closed, etc...

Software-wise, there is no difference from a laptop from today.


This makes me question my desire to replace my 4-year-old i7 laptop, which has an SSD and 16GB of memory. To be fair, the performance is fine. But it's a 17" screen and it has a full HDD as well. It weighs a lot and traveling with it is a bitch. I guess I could get an older, used slim-and-light system just for travel...?


I like my X200s too, but Lenovo were so miserly with the TFT panels they specced. Absolute bottom of the barrel stuff in terms of colour and viewing angles.

Something that's cool about the X200 design is that they didn't bother putting a token trackpad on it. It's got the TrackPoint and that's it!


At least on the X200 (and I presume the X200s as well) you can put in a good AFFS screen [LCD HV121WX4-120].


KDE development in particular seems to give zero figs about users with hardware limitations. It seems to me all sorts of apps are constantly building indexes and scanning my system, which might be fine on a fancy modern desktop, but not on the curmudgeonly junk I always end up using.


That's my one gripe about KDE. I'd like an easy way to disable these indices, since as far as I know I've never used any of them.


I've been using an x201 for programming and music production (and everything else) since 2014 and don't see myself making a change anytime soon. (Though I recently picked up an IBM server from around the same era for less than the cost of a new cellphone, and now I have what feels like a supercomputer to do offline audio renders and such that are possible on the x201 but take a long time...)

I do enjoy the peace of mind that if I drop my laptop in a lake, the cost of replacement is somewhere in the $100 range and dropping every year. But I agree with the author of this article that it's also a nice way to (without really having to think about it) keep the software I write reined in and usable for people who don't have the latest multi-thousand-dollar hardware.


Unless you’re doing music production in a non-standard way, you would benefit greatly from a better computer.

DAWs and plugins eat a lot of ram and processing power.


Regarding the 11-yr-old laptop I would be interested to know not only the compile time but also the storage required, e.g. size of src tree plus any tmp space required, etc.

I like to compile kernels using a memory-backed filesystem, without any persistent storage mounted. This helps with speed.

For example, with BSD, I only need about 200 MB RAM to produce a more-than-adequate kernel (about 17 MB).

One of the benefits of doing "development" on old hardware, IMO, is that the programs I write on the old machines are guaranteed to be very fast on the new machines. I can move these programs back and forth between machines without worrying about performance.

If I only used new machines to write programs, e.g., small, relatively simple ones, then I could not be sure that they would run well on the older machines.


Just built a NAS from an Asus M3A78-MC, a Phenom II X4 920, 8GB of DDR2-800, and an old case. My desktop PC is only one generation newer on the motherboard (a Gigabyte 990FX-UD3) and two generations newer on the processor (an 8350), with a 560 Ti and 16GB of DDR3. The NAS cost $75 for all the parts listed, bought in the last month ($17 CPU, $38 mobo, $20 memory). The desktop was probably $400 for the parts when they were new ($135 mobo, $230 CPU, $50 memory). Right now, going just a few years back gets you gobs of processing power for very little. But of course it's all relative; in another 10 years these DDR2-based systems might seem quite slow.


> I showed him how it could cold boot to a productive sway desktop in <30 seconds

I'm currently using a new cheapo HP laptop; it cost €600 (substantially less than the 2k+ MacBook Pros I'd been using prior). I run Manjaro Linux with i3 as the window manager. I'm blown away by how fast it boots: from power-on to login prompt in about 3 seconds (the bootloader is set up to boot right away) and from login screen to a usable i3 desktop in about a second. I've never had anything boot so fast before. Not that boot speed is meaningful for actual usage performance, but anyway.


In the old days it made a lot of sense to have the fastest computer available and multiple monitors. It was a huge boon to productivity and short-sighted to ignore.

However, computers these days are fast enough, have SSDs, and enough memory for 95%† of tasks, for at least a decade. For everything else there are beefy servers and the cloud. As the poster says, there are advantages to old hardware. Like useable laptop keyboards with key travel.

A second monitor would probably be a better investment if you don't have one already. I recommend portrait orientation to avoid scrolling.


I wonder... do the makers of macOS use old Macs? Do the makers of Windows use old IBM PC clones? You can learn to empathize with your users by testing on a range of hardware that simulates their environment. Your own development hardware doesn't need to be handicapped for this to happen.

This is all very context specific. If I am a developer working on a web app, and my shop uses several heavy but feature-rich IDEs, chances are I need pretty decent hardware. If I'm programming Nintendo game cartridges, I'd probably be OK with simpler hardware.


Imagine how much smaller/faster Chromium would be (or any of Google's big web apps) if the primary developers didn't get to use supercomputers to build and test on every day.


The X200 is an exception, I think. I can get along with an X200 because of the incredible build quality, keyboard, upgradeability, etc. But any other laptop from 2008? Probably not.


Energy efficiency gains mean that usually a new computer can pay for itself after a certain number of years or months, in terms of electricity savings.

And fast hardware is key for performance in certain careers, such as software engineering.

Even for a working class casual user, it would make a lot of sense to upgrade an old power hogging computer to a new netbook. It would be better in every way, and still likely save money in the long run. Not to mention time.


This is so close to my experience it's eerie, down to the X-series thinkpad and KDE. I downright dread each major OS upgrade and the KDE that comes with it - they only ever get slower and more unreliable. I will almost certainly make the jump for the next debian release.

I don't know if it goes without saying, but the single thing that makes "old" hardware bearable these days is an SSD. Without one, the pain can be unbearable.


and it can play 1080p video in real-time

I'm skeptical of this claim, given that my X220 (Sandy Bridge) has trouble decoding 1080p with any medium/high-quality playback options. Enabling debanding algorithms, for example, which is the default in mpv, causes dropped frames with high-bitrate 1080p x264 under both software and hardware decoding, and dropped frames in general when decoding 1080p HEVC (software only, since the iGPU doesn't support HEVC).


For my laptop(s) I tend to go with the best I can afford for multi-purpose development (because everything I do is HPC-oriented or virtualized with a need for SMP VMs), but I still have a Dell Latitude E6400 that provides directory, DNS, and monitoring services for the in-house testbed. When it dies I have a Celeron road warrior from Walmart ready to take its place. Fit purpose to function to hardware.


I think the answer to this question has to do with his previously featured blog post: "I'm going to work full-time on free software" :)


Haha no doubt!


I doubt it's possible to watch YouTube in a browser on Linux on a 10-year-old system. That's the only factor that forced me to upgrade my old PC: I just couldn't watch the damn YouTube videos. Everything else worked perfectly fine: Windows 7 32-bit upgraded to Windows 10, the latest Ubuntu with the GNOME desktop. And thanks to an SSD it cold-booted quickly enough as well.


Now I am definitely switching to an old computer. I can’t wait to get streaming video and Youtube out of my professional life. :)

(I find video and talking to be in general an amazingly “low bandwidth” way to ingest information, even watching/listening at 1.5-2x speed.)

I do listen to podcasts/NPR while doing the dishes or other mindless housework though.


What I do on my netbook is use VLC to play YouTube videos, since it supports hardware video decoding.


I try to get the best computer money can buy: the quality of life is better on better hardware, I can always configure software-based restrictions if I need to create performant software (network throttling, cgroups, etc.), and there are things you can do with new hardware that you can't reasonably do with old (like training ML models). It keeps your options open.
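
For the curious, the cgroups part of that can be as small as this on a cgroup-v2 system (a sketch only, assuming root, a unified hierarchy at /sys/fs/cgroup, and the cpu controller enabled for child groups; the group name is made up):

    # Cap the current process (and anything it spawns) at roughly 25% of one
    # core, to approximate a slower machine while developing.
    import os

    CGROUP = "/sys/fs/cgroup/slow-dev"  # hypothetical group name
    os.makedirs(CGROUP, exist_ok=True)

    # cpu.max is "<quota> <period>" in microseconds: 25000/100000 ~= a quarter core.
    with open(os.path.join(CGROUP, "cpu.max"), "w") as f:
        f.write("25000 100000")

    # Move this process into the group; child processes inherit the limit.
    with open(os.path.join(CGROUP, "cgroup.procs"), "w") as f:
        f.write(str(os.getpid()))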


Here I was hoping for actual "old hardware"...sigh.

I'm in the process of refurbishing a TRS-80 Model 100, possibly with some custom upgrades (an ESP8266 WiFi-to-serial dongle would be interesting).

Days of runtime from a handful of AA batteries.

True, I won't be able to do anything modern with it - but that's kinda the point.


You might appreciate one of my older blog posts:

https://drewdevault.com/2016/03/22/Integrating-a-VT220-into-...


I use similar-class laptops. My only issue is that they get hot when compiling too much or playing HD video (or even non-HD, actually).

Maybe it's a thermal paste issue, but anyway, that's my main issue with them: if they could run at full speed without reaching 90°C they'd be near perfect.


My current employer gave me an HP with a TN panel. That has made viewing code quite frustrating. Average memory and an average CPU are fine, especially when SSHing into a build server. IPS displays have really changed how I use laptops, and switching back to TN shows it.


I've been developing in Go/Node/Python on a headless 1-CPU, 512MB, 20GB Vultr instance. tmux, vim, w3m, and WeeChat. I use mosh and Firefox from a laptop/desktop/phone/tablet and have never had any issues. It does prevent me from writing Rust, though.


Here's my reason for never wanting to use old hardware. Admittedly, I don't need the latest and greatest, but I want something reasonable. Several years ago I was optimizing a DB query that was a real hog in the development environment. Two days later, after trying everything I could, I essentially gave up and checked in the meager optimizations I had squeezed out of it. The query got moved to QA and ran completely fine. Why? Because the company had a mindset that developers should program on slower hardware to "make it more efficient," and none of them thought through that the runtime optimizers might actually perform differently when given enough memory and processor power to run effectively.

Older hardware within certain performance bounds is fine, but it's often a waste of everyone's time to use it just to try and save a dollar here or there.


I have the same problem at the moment: I am working with a 4.5-year-old MacBook Pro and asking myself whether replacing it would actually increase my productivity or would just be the luxury of a shiny new laptop.


I'd add "working on a monolith java app, along with running Node, IntelliJ and occasionally a VM with Gitlab in it" to the list of domains that require enthusiast grade specs. Oh, and slack.


> In reality, this high-end hardware isn’t really necessary for most applications outside of video encoding, machine learning, and a few other domains.

This should say "really isn't" instead of "isn't really".

In practice, most Microsoft software will still run slow even on high-end hardware. I have a high-end gaming laptop that cost 1500 dollars, and my download folder still takes 10+ seconds to load. Even at work, I regularly find myself waiting because Visual Studio hangs (and this is on a Xeon Phi with 12 gigs of RAM). Meanwhile, Xfce runs along snappily on my $200 ThinkPad that is more than a decade old. Newer systems just seem to focus on improving throughput instead of improving latency. I am so fed up with unresponsive systems!


Do you have an SSD?


I have the same problem with Windows 10 on my XPS 13 and it has an SSD. File browsing in certain folders makes the machine nearly unusable. Also, windows updates fill the drive and require periodic manual cleanup. My other Windows machine doesn't have this problem.


Interesting. I have an X1 Extreme and even it feels slow at times with Windows 10; however, my much older Win10 desktop is much more responsive.

I am not sure why laptops suck in this regard. Maybe drivers.


Have you upgraded the specs on the X200 at all, say to an IPS display, or are you still using the stock one?

I personally use an x220 for very similar reasons (upgraded to the Surface Pro 3, hated it, and went back).


I use an X2xx too and agree it's awesome if you just run i3 or something. The major downside, which is not mentioned, is the 1366x768 max resolution. Ugh.


Bigger, better monitors are important accessories for the productive developer. Integrated graphics can't power those monitors. Most laptops sport only integrated graphics. Therefore most laptops are insufficient for the productive developer nowadays. There are a few scenarios where the developer's hardware should certainly be better than the user's, and screen space is one of them. Old hardware is not an excuse to stick to 1080p. Every developer should have moved to the much more spacious 1440p or 4K by now.


As a reasonably productive developer who has used both 4K and 1080p displays regularly, 4K has done very little for my productivity. It just looks good.


If you're going to be staring at a screen for 8 hours a day, you should expect it to look good.


Okay, but it doesn't make me more productive.


How is sway for a daily driver?


Naturally I have a severe bias when answering this question, but I think it's quite good. If you give it a shot, be sure to build the latest version from master or one of the recent betas - don't use 0.15.2.

https://repology.org/metapackage/sway/versions


I love the perspective of the author. Beautifully tuned pragmatism and humility.


I used to inherit computers for free from friends and relatives who were "upgrading" their Windows systems and turn them into servers or workstations at my company. I hadn't bought hardware in many years as I was getting off the ground.

Nowadays, I guess most people have laptops or use mobile cause I haven't gotten one of those in quite a while.


> can compile the Linux kernel from scratch in 20 minutes

So compile times for large projects are very long; that's actually kind of bad for software developers.

> it can play 1080p video in real-time

Yep, until he tries to watch 1080p60 on YouTube, or gets video files encoded with 10-bit x264 and a large number of reference frames (which happens more often than you'd think, btw).


Well, how often do you recompile the whole project from scratch? Incremental builds are a thing.


Happens quite often with large C++ projects and some quirky includes. Change one header, recompile the whole thing.

For comparison, rebuilding the Linux kernel with make defconfig takes about 3-4 minutes tops on my laptop; rebuilding one of our work projects takes at least 15.


I don't think this points to a problem with the workflow, but rather to a problem with the codebase. If your codebase is so big and complicated and interlinked that it requires frequently recompiling large swaths of it - it may be poorly designed.


And ccache!


It is too buggy and has too many false hits to be used in large projects.



