I don't know if it should be considered cheating to use an emulator, since emulators have a very different CPU instruction timing profile than real hardware. Here's a notable result from before, with real hardware running XP down at 8 MHz:
Interestingly, the Pentium Overdrive mentioned in your linked article will actually run relatively recent OSes - the latest release of Slackware, for instance, should still run on it.
I have the 83 MHz version in my old 486 (though it's really not a 486 anymore - ha ha), and got Slackware booting on it out of sheer curiosity a few years ago. What surprised me is that, modern software aside, 14.x wasn't much slower than a period-accurate Linux: Slackware 14 (2012) and Mandrake 7 (2000) booted at more or less the same speed.
That's a really interesting observation. I've sort of noticed that C++ being slow to compile doesn't necessarily improve a whole lot even on newer hardware, so I wonder if there's a different OS that runs on both 90s and 2010s hardware and would show a bigger gain on the more modern machines.
That reminds me of when I had my first computer as a teenager, an old 386. My buddy said it would be impossible to install Windows 95 on it, so I promptly proved him wrong! Great memory; it's good to know that I could've gone further.
I was disappointed when I read that the CPU was virtual; I thought this would be about real hardware. Still, I appreciate the notes about stripping down the OS.
Lol. This gets rediscovered every few years. Pretty funny.
The slowest I've run any modern OS on an x86 chip would probably be a 60 MHz Pentium 1.
I used to daily drive Ubuntu (I think 5.04?) with Beryl and Compiz on a Pentium 75 with 512 MB of RAM (awesome Socket 5 board) and a PCI Nvidia 6200. The offloaded composition etc. helped a lot. It was a very snappy box, all things considered. It helped that I also ran an IDE SSD on it.
> The offloaded composition etc helped a lot. Was a very snappy box all things considered
This has started to bother me recently, performance has gone such a long way but some things are still so damn slow for no good reason.
The input lag article gets posted on HN yearly. Something I just recently encountered: why does Windows still take tens of minutes to install updates on a multi-GB/s SSD? Why do email clients lag with just 100,000 emails? It's a bit sad, I find.
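A back-of-the-envelope check (the average message size and SSD throughput are assumptions, not measurements) shows how little raw IO that mail store actually involves:

```python
# Rough sanity check: how long should reading a whole mail store take?
emails = 100_000
avg_kb = 50                          # assumed average message size in KB
total_gb = emails * avg_kb / 1e6     # ~5 GB of mail on disk
ssd_gb_per_s = 3.0                   # assumed sequential read speed
seconds = total_gb / ssd_gb_per_s    # ~1.7 s to stream the entire store
print(f"{total_gb:.1f} GB, readable in {seconds:.1f} s")
```

Even with generous assumptions, the disk could deliver every message in a couple of seconds; the lag has to come from somewhere else.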
Right? A lot of people will start going on about how modern OSes “do more” blah blah but a lot of that is complete hokum.
Imo it’s the endless layers of abstraction and frameworks on top of frameworks getting in the way. It’s hard to get a modern OS to respond to input in under, what, 80ms?
I'm not a programmer or a systems designer for a living (I dabble in this stuff, though), so I can't really speak to a lot of it, but it's really annoying how modern stuff doesn't seem to prioritize interface performance, both input and output, over everything else like you'd think it should. At least for consumer end-user systems.
The overall perceived performance of my systems has sorta gotten back in line with my systems from the 90s with the advent of fast SSDs. My NeXTs, BeOS systems, SGIs etc were all, perceptually, snappy (OK, the SGIs were hit or miss at times lol). And that’s on low RAM and slow HDDs. But they ran smaller, tighter programs for the most part.
My PCs, running windows especially, seemed to get slower and slower after NT4 (and maybe Windows 2000) from my perspective as all the programs I used ballooned in overall size and switched to “modern” frameworks and cross-platform designs.
Finally started to get decent SSDs and the performance started getting back to how I either remembered older systems or could objectively compare against the ones I still had/have.
You can sorta improve things in a lot of systems by manually changing CPU and IO priority for various system processes (and other tweaks to system schedulers etc). Like on Windows I use Prio to reset CPU affinity and priority for various processes on reboots. The dwm process I always have set to real-time which helps a lot in responsiveness for Windows.
On android, I run some scripts I wrote to renice the IO and CPU priority for several processes related to the UI.
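For what it's worth, the knob those scripts turn is the Unix nice value. A minimal Python sketch of the mechanism; note that an unprivileged process can only deprioritize itself (raise its niceness), so the boosting direction the scripts use requires root:

```python
import os

# Query and adjust this process's own nice value (Unix only).
# os.nice(0) just reads the current niceness; a positive increment
# deprioritizes us (no privileges needed). Lowering niceness, which
# is what UI-boosting scripts do, requires elevated rights.
before = os.nice(0)
after = os.nice(5)   # deprioritize ourselves by 5 (clamped at 19)
print(before, after)
```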
I just feel like this shouldn’t be a thing in 2022.
BeOS was insanely multithreaded and prioritized the UI seemingly above all else. It was always snappy :(
225 processes running on my Windows 11 PC at startup; back with Windows XP I got this down to 14, but I no longer even try. Microsoft should plant a couple of trees for every copy of Windows 11 installed.
Slow OS updates drive me bonkers on macOS, too. What could it possibly be doing for 40 minutes to upgrade a minor version on a new M1 mac with 32GB RAM and an SSD?
Back in the day I had W2K Pro running (very slowly) on a 66 MHz 486. XP was only 18 months or so behind that, and the system requirements probably weren't much different. Wouldn't have wanted to use it for anything, though.
I did some Java development on Windows NT 3.51 and a 486 (33 MHz?) back in 1996. Worked fine. That was considered a fast machine at the time. I used UltraEdit and IBM's Jikes incremental compiler. We were building some applet. Builds took a few seconds with Jikes (as opposed to using the javac that came with the JDK). There was no Ant, Maven, or Gradle wasting more time. I had a little batch file that set up the classpath just right and then called the compiler.
That setup was actually faster than building kotlin projects on my current mac M1 laptop. Of course it's doing more stuff these days but build times seem a constant through the ages. Better hardware just leads to more complicated tooling.
Something is off in your story, possibly the year (or you lived in a poor place). The 486 DX33 was released in 1990 and was lower midrange in 1994; by 1996 it was more or less outdated. I had that problem myself, a slow 486 in 1996 ;)
I had a 33MHz 386 in 1998 while my friends were playing quake on their early pentiums. It was hard to get DOS games at that time, being stuck on civI and wolfenstein…
Basically free, my first upgrade a year or two earlier was a 486DX2-66 CPU + board "acquired" from an open container in front of a recycling operation.
I think it's the difference between getting stabbed and shot. Not a lot of love for either. But I can work with both and gradle seems less hassle these days. Especially if you are doing Kotlin.
Reminds me of the time I accidentally set my CPU voltage way too low in AMD PBO.
I thought it was in millivolts, but it was in microvolts IIRC. So I entered 1100 as the base and it came out to 0.1 V.
It actually POSTed, but the CPU (2900x) was capped at 500 MHz and total package power was around 15 W. It took about 3 minutes to boot, and that's when I realized something was terribly wrong.
- Every pointer is 8 bytes, and a lot of integers are also 8 bytes, and a ton of structs have padding here and there to the 64-bit boundary.
- All graphics are rendered on huge bitmaps which are later composited using advanced hardware, using double or sometimes triple buffering. This allows for nice effects like translucency, shadows, blurring, etc, while requiring simpler logic and producing basically no visual glitches.
- While we're at it, just one screenful at 4K / 32 bpp is slightly less than 32 megabytes.
- A bunch of fonts are pre-rasterized for the high resolution (effectively 3x the horizontal resolution for subpixel smoothing), including fancy emoji fonts.
- A ton of services start by default, from system-wide spell-checking to weather and news reporting. The OS's GUI shell strives to be rich and fancy, with nice large previews of images, documents, etc, and also voice output and input.
- But all this is nothing compared to the resource needs of an HTML5 browser with a couple dozen graphically intense tabs running SPAs like Gmail or Slack, with all the state, and all the JIT'ed JS.
There's plenty to spend RAM on, compared to my fancy Linux desktop from 1999 which happily lived on 64 megabytes of RAM. Opening ten browser tabs was out of the question: tabs mostly did not exist yet :)
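The framebuffer arithmetic above checks out; a quick sketch:

```python
# One 4K screenful at 32 bpp, and what double/triple buffering costs.
width, height, bytes_per_px = 3840, 2160, 4
buffer_mib = width * height * bytes_per_px / 2**20
print(f"single buffer: {buffer_mib:.1f} MiB")        # ~31.6 MiB
print(f"triple buffered: {3 * buffer_mib:.1f} MiB")  # ~94.9 MiB
```

And that is just the final composited output; every window typically carries its own full-size backing bitmap on top of that.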
Thanks. That was one of the topics we touched on discussing Flash [1] a few days ago. The GPU has no idea what a vector is. I wonder how much memory could be saved if we could somehow "compute" all 2D fonts and graphics.
From entry-level computing to every appliance. If we want to move baseline memory from 8 GB to 16 GB, even with plain DDR (discounting LPDDR prices) that is at least a $24 increase in BOM cost, or $48 of BOM cost in total. This is worse for something like a Pi, a NAS, a router, or smart appliances, where ASPs are much lower. Because of the increase in memory usage, they will either have to raise their prices (which they don't), let it eat into their margins (which they won't), or find $3 of BOM cost cutting elsewhere, likely from the SoC.
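The arithmetic behind those figures, assuming roughly $3/GB for commodity DDR (the per-GB price is my assumption):

```python
price_per_gb = 3.0                   # assumed commodity DDR price, USD/GB
base_cost = 8 * price_per_gb         # $24 BOM for the current 8 GB baseline
upgrade_cost = 16 * price_per_gb     # $48 BOM at 16 GB
increase = upgrade_cost - base_cost  # the $24 jump a vendor must absorb
print(base_cost, upgrade_cost, increase)
```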
If you want to experience similar masochism, or torture someone else, set the maximum-memory value very low in msconfig's boot options. It can take the victim ages to figure out.
It's impressive to see the persistence and dedication of people who undertake projects like building miniature versions of historical computers. These projects require a lot of time, effort, and attention to detail, and the results speak for themselves.
It is not truly a 5 MHz CPU. No Pentium-class CPU has ever run at 5 MHz, and the Pentium has tons of instructions, ones Windows requires to function, that the actual original 5 MHz CPUs lacked.
28 Minutes to boot? That's roughly what will be the norm in another decade ;)
It's insane how much faster CPUs have become and how slow an average machine boots to full functionality. The days of 'instant on, instant off' are long behind us.
The amount of persistence behind these projects never fails to amaze me, see also the guy that made a miniature version of the Cray-1:
I threw a solid-state drive in my Turbo XT and load times even for MS-DOS 3 aren't good. Loading up Windows 3 is terrible. WordStar loads okay if you're not using any TSRs to enhance functionality. You want to load Civilization? There's a wait.
If we move on to Win98 or 2K on, say, a Pentium III with a solid-state disk, everything becomes somewhat snappy. The SSD is the main gain. Take it away and everything is abysmal.
I love ancient hardware, but if you go back to relive using those machines… it just ain’t what we remember. Our expectations have changed.
The 8-bit cartridge-based computers really are instant-on. I think that may be what the OP is referring to. I don't think any system with a hard drive or SDRAM can even get to a boot loader in the same amount of time, regardless of the latency of the boot medium. It really was a different era.
They also only loaded BASIC instantly. Want to load from tape or floppy and you’re back to waiting. Same thing if you run your BASIC program. Instant on in that case was restricted to the ROM BASIC and cartridges.
> It's insane how much faster CPUs have become and how slow an average machine boots to full functionality. The days of 'instant on, instant off' are long behind us.
Both my Mac and custom built PC boot very quickly. Faster than my PCs in the past. I'm not even doing anything special, just grabbing a generic motherboard, NVMe drive, and installing Windows.
Powering off is a rare event, though. Sleep functionality has been rock solid in both OS X and Windows 10 for me in recent times.
> Both my Mac and custom built PC boot very quickly.
It would be interesting to quantify this across many machines / OSs. Say from cold boot to starting a browser and loading a benchmark webpage (to ensure the network is up and running).
My girlfriend's Windows 10 machine on spinning disks takes much less than that for the desktop to appear, but I don't think her desktop is really usable until way beyond 25 minutes.
My work laptop on Windows 10 boots in a few seconds, but then it takes a minute or two for Microsoft Teams to show up, and that is with SSDs. In comparison, the same laptop on Linux takes a little more time to boot to the desktop, but I can quickly get a browser window with Teams / Outlook and open other apps, so while it appears slower to boot it is usable at roughly the same time. There are a lot of shady tricks to make Windows appear to load faster than it really does.
What matters is not the time for the login screen or desktop to appear but the time until you can be productive on it.
I guess you are joking, but that could really be something thought up by someone in marketing. Like the switch from ads about the technical features of software (~1980s) to ads displaying boxes (~1990s).
I have exactly one windows box in this house and I avoid turning it on but it is a requirement to operate a particular piece of hardware. It takes several minutes before it has completely booted, it runs Windows 7 on what is a perfectly fine machine for that OS.
And many others besides. Windows can be such a cycle hog. Probably if I installed 64G of RAM and an SSD I could cut it down to something more decent, but I don't use that machine often enough to offset the expense (and never mind driver/installation media hassles, it isn't broken so don't fix it!).
Other machines take on average about a minute or so but they are running Linux, running Windows on them would easily double that.
My daily driver is a T540p (mostly to conserve power) and it works well enough. Not exactly a speed demon in use, but maxed out on RAM (16 GB), with a 2.5 GHz i7 and a fast SSD, it doesn't perform nearly as well as my 20-year-old self would have expected it to.
OS and browser will happily eat all of that, and I can't use swap because of the audio drop-outs that would cause. If there was a way to shoehorn in more memory I would take it.
> It takes several minutes before it has completely booted, it runs Windows 7 on what is a perfectly fine machine for that OS.
> Probably if I installed 64G of RAM and an SSD...
You're complaining about boot times, but using an OS that has been deprecated for years and using a mechanical HDD?
SSDs have been the norm for a long, long time. Complaining about boot times of deprecated operating systems on outdated hardware that virtually nobody uses in new builds is very misleading.
I hope you can see the irony of complaining about modern boot times by benchmarking things against a completely outdated system.
That machine shipped with Windows 7 and a mechanical HDD. It has 4G of RAM which should be plenty for the purpose.
As for 'new builds', yes, SSDs are faster. But the speed gain from an SSD is negated to some degree by the amount of bloat present in a typical OS and if not for that you shouldn't be able to tell the difference between 'sleep' and 'cold boot', in fact 'cold boot' should be faster (but it really isn't).
As for the complaint: take it with a grain of humor. It was mostly a contrast between booting, say, my 1980s BBC Micro (you couldn't move your hand from the power switch faster than the prompt appeared) and the times that we have come to accept. SSDs have cut that time to something more acceptable, but I fully expect that advantage to be eroded again over the years, because that's what happens with software: it always becomes larger and slower, never smaller and faster.
On identical hardware a newer release of any OS is invariably slower than whatever came before.
One of the features of Windows 8 was decreased boot times. I don't think its the only OS to do that either. SSDs make a massive difference even for Windows 7 too.
Interesting; if I ever decide to risk the stability of that box I may try an upgrade. To give you an idea of what it does: this box is the interface to my Tascam mix deck, and the FireWire support is something that you really don't want to mess with if you don't have to. It runs a very large number of audio channels through that FireWire interface, and even the slightest change to that configuration risks a never-ending nightmare of drop-outs and crashes, usually when you really don't need them.
With Firewire having been deprecated and this mix deck and the windows machine joined at the hip I take the nuisance for granted and just work my way around it. I was lucky to find a card and driver that worked, I must have tried more than 10 different combinations before I found one that worked reliably. Adding an SSD may well upset that balance and would require a complete re-install anyway and that risks ending up with an unstable or unusable system, so that's why that route is closed, though, maybe if I were to copy the image of the current HD to an SSD I could make it work without upsetting things too much.
I think you could probably just clone the HDD to an SSD using a Linux live CD and "pv" or whatever and have no issues. I've done that for Windows installs in the past with very little trouble. Only issue I had was expanding the partition on the new drive with the recovery partition in the way.
Yes, I see that as one possible avenue. The possible issue with that approach is that the SSD is able to interrupt the OS at a much higher rate than an HDD, which may well increase the throughput of the system as a whole but cause latency issues substantial enough to cause drop-outs.
These cards run with very small output buffers to reduce their latency and any kind of delay will deplete the buffer resulting in a buffer underrun condition and that manifests as a drop-out in the audio stream.
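To put rough numbers on that, here's a sketch with assumed (but typical) settings for a low-latency audio interface; the sample rate and buffer size are illustrative, not taken from this particular rig:

```python
# How much scheduling headroom a small audio buffer actually allows.
sample_rate = 48_000   # Hz, assumed typical interface rate
buffer_frames = 64     # assumed small low-latency buffer
buffer_ms = buffer_frames / sample_rate * 1000
print(f"{buffer_ms:.2f} ms of headroom per buffer")
# Any stall between refills longer than this causes an audible drop-out.
```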
Another alternative is to get rid of the whole kaboodle and do it all in software, that's definitely a possibility with present day gear but I'm kind of partial to this old rig, it sounds great and is very comfortable to work with, much nicer than any pure software solution that I've seen so far.
> Yes, I see that as one possible avenue. The possible issue with that approach is that the SSD is able to interrupt the OS at a much higher rate than an HDD, which may well increase the throughput of the system as a whole but cause latency issues substantial enough to cause drop outs.
If this is a thing, you are already in burn-it-down territory with your hardware.
I've been doing audio for a pretty long time, and I struggle to conceive of a system so marginally-fit-for-purpose-but-still-extant where drive interrupts will threaten audio timings.
Indeed, it is pretty dicey the way it is set up. The machine that came with the deck had the software installed, a broken firewire card, no installation media. Getting it to work was incredibly tedious and I don't doubt that it is a very fine line between success and failure but work it does, no registered drop outs across many, many hours of working with all audio channels live.
If it wasn't for the firewire requirement this would be a very easy problem to solve. I don't understand why that standard is now so blacklisted that it isn't even supported any more by more modern incarnations of windows.
I just timed the boot speed for my Windows 11 desktop. Once with a regular shutdown, once with fast startup disabled (which was introduced with Windows 8). 10-12 seconds to get to the login screen (after typing my BitLocker password). 7 seconds to login using face authentication. I do have very modern specs (Ryzen 9 5900X, 32 GB RAM, Samsung SSD 970 EVO Plus) but my suspicions are that your boot speed could be dramatically reduced by just upgrading to an SSD.
Here are metrics from Tom's Hardware comparing an HDD to two types of SSDs (SATA and NVMe) [1] on a Windows 10 machine. The HDD had a boot time of 42.9 seconds, the SATA SSD a time of 17.2, and the NVMe SSD a time of 16.1 seconds.
With the advantage of fast startup the HDD machine was able to boot in under a minute, but the SATA SSD still cut the machine's boot time by roughly 60%.
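A quick check on those figures:

```python
# Boot times from the cited Tom's Hardware comparison, in seconds.
hdd_s, sata_s, nvme_s = 42.9, 17.2, 16.1
sata_cut = 1 - sata_s / hdd_s   # fraction shaved off by the SATA SSD
nvme_cut = 1 - nvme_s / hdd_s   # fraction shaved off by the NVMe SSD
print(f"SATA cut: {sata_cut:.0%}, NVMe cut: {nvme_cut:.0%}")
```

Interestingly, almost all of the gain comes from leaving the spinning disk behind; NVMe adds only a couple of points over SATA for booting.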
I installed Windows 7 on an old 1.66 GHz Core Duo Mac Mini (circa 2006) with 1.5 GB of RAM and a slow 60 GB 5200 rpm hard drive and gave it to my mom as a secondary computer. It doesn't take several minutes to boot. I last played with it about a year ago. That thing won't die.
I don’t boot anymore. Haven’t for years. I just lift the lid and there it is. But I do remember when it felt like 28 mins.
I think “booting” might become a concept that goes the way of “download.” That is, it will eventually be unknowable to those who didn’t grow up with it. Computers are just on, and files are just accessed.
Funny enough, reboots are becoming a thing with Windows 10/11. I really dislike the way my Windows box now patches and reboots whenever it feels like it. Without fail, whatever workflow I had paused for the night gets... just wrecked. It was so close to right with Windows 7.
I don't mind reboots, since I sleep at least once a day. The main issue is the inability to restore state. macOS has reopened everything I had open before a reboot for years now, and I'm not sure why Windows still struggles with this.
Are you running some kind of an OS that doesn't need reboots for any of its updates? Are you using some kind of live patching?
I (somewhat) rarely do a full reboot either, and Linux on my laptop may go for a couple of months without a reboot, but I don't really trust that updates to the desktop environment or even lower-level libraries or things like systemd would be in effect without restarting most or all of the system. (Not to mention the kernel, of course, but kernel vulnerabilities that are easily exploitable remotely would probably create a ruckus.)
I guess you could also just trust that there aren't remotely exploitable vulnerabilities in those components, but I honestly don't have the energy to keep track.
By that point rebooting occasionally just becomes simpler.
That's an interesting viewpoint, yes, you are probably right. But 'download' is still a thing even if streaming services are more and more common. And with the frequency of OS updates booting is also still a thing unless you never patch your systems.
Machines these days boot and become usable faster than anything I remember, and I've been using computers for 40 years. Yeah, maybe an IBM PC booting DOS 1 or 2 booted faster, but I don't recall, and didn't use either version very long.
I picked up a mini pc for my son a few weeks ago. €439 for a Ryzen 5600H, 32GB DDR4 RAM and a 500GB NVMe SSD. It comes with Windows 11 Pro and boots in ~10 seconds.
Fedora runs great on it as well, booting in around 15 seconds, with the exception of suspend making the system unstable but that isn't uncommon with Linux sadly.
For the price and form factor these mini PCs are great. I just hope it lasts more than a few months :D
My current Linux laptop (System76 darter pro 8, Ubuntu jammy, 12th Gen Intel(R) Core(TM) i5-1240P, UEFI firmware) boots to a login screen pretty snappily, less than a minute, and from there it's up to me how quickly I type my password. I get a usable system very shortly thereafter because Window Maker is pretty lean. Getting a system back from a suspend-to-disk is pretty instantaneous.
http://winhistory.de/more/386/xpmini_eng.htm
Also, it looks like he started with something resembling WinPE, which is already stripped-down compared to a regular install.