>> My computer is also not some old rubbish. It has an Intel Core2Duo CPU with 4 Gb RAM. It was top of the line in 2009.
Having a top of the line PC in 2009 does not disqualify it from being 'old rubbish' in 2023. A Core2Duo is ancient at this point.
>> The truth is I have never been hacked once in the eight years I have had this computer.
Or something has been compromised or exploited and you've had no idea because it's not something obvious like opening your CD-ROM drive automatically.
>> My belief is that computer security is highly overrated. With a decent firewall, which every router has these days, only your own activity can expose you to attack from hostile actors
This probably explains the author's mindset: an outdated belief that a firewall is the only thing you need to stay safe on the modern Internet.
Even worse, the author's successor computer is a Windows 7 PC, which is already out of support.
I have a 2008 iMac that I use every day—more than my i7 laptop. This iMac is also a Core2Duo.
Why? Well, I really like the screen and it happens to be set up in the nicest spot I have in my home to work. In particular, it is the best place to video conference (Zoom, Teams, or GoToMeeting). The real answer though is that when I sit down at it to get things done, I rarely run into a limitation that makes me want to move something more powerful to that spot. I am lazy, and it works. The machine is fast and responsive. Mostly I just forget that it is old. When I am thinking about it, I guess I also get a kick out of it. So, it adds a bit of fun.
I do email and office work (documents, spreadsheets, presentations) and browse the web of course. I author and run a few containers (Podman) to verify stuff engineering is creating for a key project. I am using it to teach myself Kubernetes and a bit of DevOps. A lot of the “heavy processing” I do happens “in the cloud” anyway. Obviously I am not gaming or editing hi-res video. That said, I do a fair bit of audio processing on this machine. It is my “work from home” computer.
I originally put Arch Linux on this machine to “get through a few days” when my laptop got damaged. “A few days” has been a couple of years now because it has worked so well. MacOS was really slow on the few apps I tried. Worse, like the author, I found that none of the modern software I want to use worked. With Arch though, everything is up to the minute. For browsers, I have Firefox, Microsoft Edge, and Slimjet (all the very latest versions). All my compilers are right up to date (Clang, GCC). I am even running .NET 7 on this machine. As Arch is a rolling release and I update every day, I have the very latest security patches for everything.
I am not advocating we all use old computers. I guess my point is that, for most of my day to day professional life, a machine much like the one the author had has been just fine. It surprised me too.
Right there with you, with an old i5 iMac from 2009 as my "daily driver," also because it is set up in the main room with an incredible view, and I love its 27" non-Retina display.
I use the 2009 iMac more than my 2023 M2 Pro Mini, although the latter machine is IMPRESSIVE (to say the least).
The irony of the author's statement is that the Core 2 Duo was in no way top of the line in 2009.
Core 2 Quad had already been out for a couple of years, and in 2009 Intel already had the Nehalem based i7 with quad cores and 8 threads, which moved the goalposts so much that it became a staple CPU for the next decade till Ryzen launched.
I still have Core 2 Quad and Duo chips kicking around the house, but at around 17 years old they're too long in the tooth to be remotely useful at anything. Even a Raspberry Pi would be better.
A 4 year old Android flagship has more performance than those, let alone performance per watt, where they're just not great to fire up at EU energy prices. At 65W-95W they make the most insane gas guzzling machines you could ever (not) want.
Unless you get your energy for free, it's best to sunset and scrap them for something more power efficient.
> in 2009 Intel already had the Nehalem based i7 with quad cores and 8 threads, which moved the goalposts so much that it became a staple CPU for the next decade till Ryzen launched
Used one myself until 2018, was a great piece of tech.
That's the thing. I'm an IT professional. I've had a computer since I was 6 (30 years ago). A couple of decades ago, I could be reasonably confident that my PC was safe.
Today? I take above average precautions, but I would never claim that my system is not pwned on some level. That feels like arrogance and hubris (as well as a little bit of "I played Russian roulette several times and I'm fine! Its dangers are overblown" :)
Speaking of the CD-ROM drive, Windows XP is vulnerable to malicious CD-ROM discs thanks to AutoRun [1] and can be compromised just by inserting a CD, from which Windows XP will happily execute code automatically. Firewalls don't protect against that.
That's why everyone and their dog held Shift when inserting any disk. Moreover, CD-ROM autorun is nothing at all compared to the real issue of autorun on USB flash drives. That's how viruses really spread among commoners, and specific tools were invented to fight it. That you haven't mentioned it, and focused on something that has existed since Windows 95, when non-factory-made (i.e. burned) CDs were rare, probably means you've had zero experience with any of that.
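For anyone who never had to deal with it: AutoRun was driven by a plain autorun.inf file in the root of the disc or stick, something like the sketch below (setup.exe is just a placeholder name here). With AutoRun enabled, XP would launch the open= target as soon as the media was mounted, which is exactly why the Shift trick and the later USB autorun worms worked the way they did.

    ; autorun.inf in the root of the CD or flash drive (illustrative only)
    [autorun]
    open=setup.exe
    icon=setup.exe,0
    label=Totally Legit Software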
To be fair, there's a lot of outdated hardware and software running behind the scenes right now where the mitigation is precisely to "firewall" it, but I feel like the sense in which that's used colloquially by people who do this for a living and the way that word is used by this guy are miles apart.
In very broad terms the idea should be to profile what the outdated thing is actually needed for and restrict all possible requests to this narrow allowlist and put it behind a gateway that handles the actual connections from the rest of the network and rewrites the requests it forwards.
This guy is talking about arbitrary usage by a human which is far more risky and all he thinks is needed is to block off some unused ports and protocols.
>Or something has been compromised or exploited and you've had no idea because it's not something obvious like opening your CD-ROM drive automatically.
This is addressed two paragraphs after the one you've quoted.
> This is addressed two paragraphs after the one you've quoted.
And it's equally an incomplete assessment based on the same "This is fine" attitude. The computer could be participating in a botnet dedicated to bringing down hospital networks or stealing their sensitive medical data, and the author would never know.
Does using Windows 11 and keeping up with the latest updates as soon as they’re available prevent that from happening, compared to the author’s setup? Nothing would help if you click on the wrong download link for your favourite software in either case, and a seemingly innocent PDF got Linus Tech Tips hacked nevertheless. At least, by not keeping up to date you’re not opening yourself up to new bugs.
You're comparing using an obsolete version full of known exploits in the wild, which may be hacked by automated scripts, with adopting a bleeding edge system which may or may not have undisclosed vulnerabilities potentially known only to a few elite researchers? Try again.
Also, you're missing the obvious solution: using a long-term support system with a robust audit process and well timed security releases. The author's setup will NOT be safer than that.
This is not a sensible or technically sound thing to do. It is purely irrational. Win7 is already stuck on Chrome 110/111(?).
You are not gaining much by doing this, and are assuming significant risk. Even if specific portions can be quantified away, it's the unknown unknowns that should scare you.
Try a Linux distro again maybe? With some non-default themes and Wine, it really should make a better WinXP than WinXP.
Or at least use something like https://github.com/kirb/LegacyUpdate or the pirated ISOs that get updates from the Windows Embedded POSReady channel -- Win7 POSReady is supported until late 2024, and XP was until 2019.
There are safe ways to do this, but I worry the author may not be aware of them.
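For reference, the POSReady trick on 32-bit XP was nothing more than a one-value registry change, roughly the .reg sketched below; it only convinces Windows Update to offer the POSReady 2009 patches, it doesn't make XP safe by any stretch.

    Windows Registry Editor Version 5.00

    ; Tell Windows Update this machine is a POSReady 2009 device
    [HKEY_LOCAL_MACHINE\SYSTEM\WPA\PosReady]
    "Installed"=dword:00000001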
The author said that their experience with Linux was slow (Debian and Lubuntu were examples) and that they had no intention of using Linux because of those experiences.
Personally, I don't know what they're talking about; I've always had a smoother/faster (albeit sometimes a bit clunky/glitchy) experience with Linux than with Windows. Even when I dual booted Win7 and Ubuntu, Linux was always faster. Currently, I'm using XFCE with Arch on a 6 year old laptop. Programs open very fast, I get regular software updates, and I've been running it for work for four years.
Just seen this one on my ancient Thinkpad. Mesa-21.3.5 is fine, mesa-23.0.1 fails to pick up the graphics card and drops to llvmpipe software rendering.
It has to be said that llvmpipe seems usable even with Firefox, but not as good as reverting to mesa-21.3.5 so accelerated graphics actually works.
(17 years is a good run and I have my eye on a nice clean X201)
Do we need this comment parroted every single time someone mentions Linux on this forum? I get it, it's too complicated for you, can we move on instead of rehashing this point over and over again?
This past week we have had two major Linux discussions, and in each thread a semi flame war starts because someone had to voice yet another "this is why Linux is so hard for nan to use", as if it were a novel or constructive observation. It's not.
> Do we need this comment parroted every single time someone mentions Linux on this forum? I get it, it's too complicated for you, can we move on instead of rehashing this point over and over again?
Yes, if it is a reply to a problem where someone "Tried linux and it was slow, so didn't use". If system configuration debugging were in their taste, they would have done so. So it's the "Just configure parameters <x>" which is the pointless rehashing.
I mean, we're talking about a fourteen year old computer. With a couple of command-line options you can be supported on a modern, updated OS, which is more than you can say for any other major OS.
Once I tried to use a newish Nvidia GPU on an old CPU. It was the pandemic and parts were hard to come by.
The minute Windows Update would automatically load the Nvidia driver in the background, the screen would go fully black with no way around it.
Searching online, the combination was unsupported by Nvidia and there was nothing you could do.
Linux with nouveau worked flawlessly. No kernel parameter, just boot using a live usb and everything works.
Yes, sometimes you can hit a weird hardware issue and need to revert a firmware blob or add a kernel parameter to disable something. These are rare occurrences, not the norm.
Every platform has its quirks. It's just not true that windows or macos is perfect, you are simply used to all the weird quirks.
Amen. I bought a printer and trying to make it work with Ubuntu is just hell.
Ubuntu is great, but it would be nice if there was some Ubuntu+ addon subscription service where I could pay to make the bullshit go away. I've got other stuff to do than still trying to get peripherals to work, in 2023.
The problem with Linux is that people buy random hardware without any research and expect it to support Linux. Try installing MacOS on a random laptop or Windows on M1 Mac and get the same result.
There should be an official compatibility list for laptops of all price ranges, and whoever buys something not on the list needs to deal with issues themselves.
The problem with this argument is that the exact same crowd of people is saying
"just use linux instead of windows!"
at the same time as
"you can't expect linux to work on everything! do your research!"
So which one is it? Is Linux an OS that you can just replace Windows with straight away, or is it not? Most people don't care about the underlying reasoning why their printer doesn't work on linux - they just know it would have worked on windows fine.
Are you sure it's the exact same crowd? There are many Linux users who are quite fine with suggesting it as an option, while proposing a more rational approach to transitioning. Heck, part of the reason for live media is to ensure that everything works before taking the dive.
We're talking about a 21 year old OS as an alternative. Back then, Windows would also require significant manual configuration to get it running. Windows may have become more "automagic" these days, but at least in the Windows XP era a common part of installing Windows was going to the local library to look at guides and manuals for setting up less common hardware.
Even something as simple as setting up a sound card with recent drivers required going to weird, slow-loading taiwanese websites with no english text to get the current drivers right from the manufacturer of the sound chip.
Even worse, with XP you had no GPU accelerated desktop at all. Sure, lower latency, but the PC noticeably struggled even moving windows if something happened in the background.
I never once went to the library or struggled to install XP, nor have I ever encountered anyone who did. It quite literally pretty much just worked. Maybe it comes down to hardware choice?
Then you pretty much only saw the tail end of XP, at the beginning it was very much a struggle. Especially with old network adapters and SCSI devices being problematic, as well as many old 9x drivers not running on XP anymore (after all, XP was the first consumer Windows on NT).
I was using XP prior to public release, and used every version of it, including Media Center edition. Like I said, and like you stated in a very roundabout way whilst dismissing what I had to say, it comes down to choice of hardware. For the average Joe, there weren't problems, they were building machines with current-gen hardware and eschewing yesteryear hardware too. Anybody who had problems just bought new hardware, they didn't muck around at the library. Perhaps libraries where you are provide better information, but libraries in the UK at the time were pretty much the last place you would go for technical documentation.
> Anybody who had problems just bought new hardware
And if you do so, Linux works just fine as well. But you were comparing Windows XP to people trying to install linux on old, specialty hardware bought for use with Windows, so we’ll have to keep the same circumstances as well.
I don't think that this sort of attitude will result in an environment that's encouraging for more people to use Linux distros. A better approach might be going straight into suggesting whatever information helped you in the past.
Not all of those are always up to date, though, so some digging around might be needed. Many will just give up or not even try, if they're faced with a dismissive attitude. Dialogue around what is the most helpful, accurate and up to date guide would be better!
> I don't think that this sort of attitude will result in an environment that's encouraging for more people to use Linux distros.
The Linux crowd seems not to have made up its mind about what it wants to reach as a product for end users, or even whether it wants end users at all rather than just another cool kid to hang out with on IRC and dig inside the OS. Some say: I want Linux to be used by everyone! Others say: it works for me and I don't care about the rest [of the losers who cannot read hex dumps, ha ha ha]. Somehow the success of Chromebooks is ignored and not learnt from.
Thus, without a clear goal, mission, product vision and a focused team of product managers, it keeps being amorphous [as I see it]. There is definitely something to learn from the WSL project made by Microsoft: they found the need, and they did it. You may even see it as a cathedral vs. bazaar issue.
The only distantly focused effort I'm aware of here is Canonical/Ubuntu (I'm not sure about RH/SUSE efforts): they are working on MDM with Intune, they have at least some telemetry (so they are not totally blind to real user cases), they have a Pro edition, and they even cooperated to be the first WSL distro, which naturally pays back in brand awareness, common approaches and so on.
You can either have a system where somebody else is deciding on your behalf how your system is going to be configured, or you can have a system where it's up to you to opt-in to the stuff you want. You can't have both.
I think he's talking about perceived UI speed. Things like moving the mouse, starting an application or dragging a window respond fast on Windows because of how the system is set up (mouse cursor practically has top priority and graphics are hardware accelerated). This is especially noticeable on slower hardware and can cause a machine to feel 'snappier'.
How old is old? I use Debian on an 11-year old cheap laptop, and it runs fine. No GNOME, of course (it's the performance equivalent of Windows on pretty much any hardware).
> This is not a sensible or technically sound thing to do. It is purely irrational.
It is not purely irrational; in fact, the article spells out exactly why the author does so: as a user, he finds the Windows XP interface much better than that of any other OS.
I can't say I fully agree personally, but the core of the problem, and really the reason I don't see any of this as irrational, is that people do get used to the interfaces of the OSes and programs they use, and updates pretty much always tend to come with changes to those interfaces that are seen as a degradation.
Though I do disagree with his assessment of Linux: while perhaps whatever he tried was slower than Windows XP (I remember reading that Lubuntu doesn't focus on low end systems anymore), Linux can be much faster. And it is also the only mainstream OS where you can have the latest and greatest underlying kernel and libraries without being forced to change your desktop interface (though depending on your choices, some DEs can take more work than others).
From the mainstream stuff, XFCE might be close after some configuration (especially if you install some of the more Windows-like themes) but MATE/GNOME2 isn't really that XP-like. Trinity (KDE 3.x fork) would be closer though.
As mentioned in another reply IceWM would also be very close - at least compared to classic Windows - but that only gives you the taskbar and window themes.
This is a perfect example of modern day superstition, in no way different from the superstitions of stereotypical hairy cavemen sitting around the fire. You react to a program release that is a mere three months old, with a perfectly capable layout engine that has gathered an enormous amount of functionality over who knows how many man-centuries across itself and its library dependencies, as if you were told to drive an ox-cart. The talk about “safety” has long become a creed repeated thoughtlessly again and again. There's an evident manipulation in making people fear hackers and viruses so much, when those who make the most money on controlling users' systems are in fact the ones who offer “better safety”.
Updates for the sake of updates is not a passive, neutral status quo. It benefits certain participants (both the ones on top of the power hierarchy capable of rotating the hamster wheel faster and faster, and the ones on the bottom, who, in accordance with this or that fashion, slap together something that is not supposed to work at all unless you update everything) while others pay the price. And it's the path of least resistance that people choose.
In the end, it's just marketing. Adding a couple of commas here and there to make the whole thing crumble, which allows you to announce a “newer better version with support for new technology, etc.”, is way easier than explaining what you change, and why, to the inquiring public. The IT public, as we can see, is pathetically neutered, and acts like careless kids in a free-for-all amusement park.
Also, the author can totally get Linux with the leanest DE he can still stomach, and have all the fresh software. Core 2 Duo with 4 GB of memory is still a decent system, especially with an SSD. However, his problem is NOT the software, his problem is that the websites he uses stop working.
I honestly have no idea whether you are supporting what was said with that link, or objecting to it.
Anyway, I was talking about being inquiring. When you look at that list, surely, some questions do arise. “Why should I worry about all of that to read three paragraphs of text on some webpage?” “Which actions have made it so?” “What should be done to fix it?” Right?
I am objecting to what you are saying. It's terrible advice and an obtuse vision of the complexity of software nowadays, especially web browsers.
Yes, a 3 month old release of Chrome is not suited to day-to-day use. I linked all the known and published CVEs on Chrome, but if you want to nitpick you could "just" check those which start with 2023-*.
OK. Shouldn't we ask questions about real or perceived “complexity of software nowadays”?
Why does displaying a piece of information that would fit onto a single 80×25 text mode screen absolutely require exposing to a third party a potentially (and, as mentioned, effectively) vulnerable WebHID functionality (which is non-standard, and seemingly only exists to make ChromeOS a less mediocre operating system), various WebGL libraries and wrappers, ever-growing Javascript and CSS engines, and thousands of other entities? Someone who grants the whole internet access to local service ports by not using a firewall is considered a fool, but at the same time “non-foolish” “security conscious” people start their browsers, and see no problem in all the services embedded in them.
Isn't relying on a constant (and never ending) stream of updates from white knights in the holy castle in the manner you describe just a subscription model without a defined price?
Who controls the Web? Is controlling the web client enough for that? What benefits the endless rat race might give to them?
How come there's a hidden dependence on corporate products and their support cycles even in the process of using seemingly “open” technologies, say, for government sites and services? Is “I have no idea, my code absolutely requires latest libraries” a valid excuse?
Can mindless acceptance and circular finger-pointing between web developers, library authors, browser developers, and users solve these problems? What needs to be done?
My assumption is they like the UX of Windows XP better than later versions of Windows. I sympathize, because I think that flat software UI design has made computers less fun to use, but I just use theming software at the expense of the stability of my explorer.exe
Writing security updates costs money, and as the software gets older, the cost generally goes up too. Is it wrong for them to decide, after a reasonable period, to stop spending that money on outdated versions? What happens then when a small subset of customers says “we need those updates so much, we’ll pay extra for them”? Is it wrong to take their money and spend it on writing the updates they request, with a reasonable margin on top? But once you are selling updates to that small subset of customers, if you start giving the very same updates away for free to everyone else, those customers are going to ask for their money back. I don’t see any “psychopathy” here, just rational business decisions.
This doesn’t just happen for operating systems. Many vendors of on-premise proprietary software do the same: databases, middleware, ERP suites, etc. Most of them have a special program for customers who want to stay on really old versions, in which they pay extra $$$ for updates compared to customers willing to move to a more current one. If you accept proprietary software as ethical, I can’t see how this practice can reasonably be accused of being unethical.
The problem with the newer options is that they are not paid strictly with money anymore. Windows 10/11 use a ton of telemetry you can't turn off.
I use them anyway but only for gaming to limit my exposure.
> Many vendors of on-premise proprietary software do the same: databases, middleware, ERP suites, etc. Most of them have a special program for customers who want to stay on really old versions
You can't really compare enterprise class software with something that is sold directly to consumers.
> The problem with the newer options is that they are not paid strictly with money anymore. Windows 10/11 use a ton of telemetry you can't turn off.
Enterprise, Server and Education licenses support "Diagnostic Data Off" telemetry level. I think they'll sell Server to anyone, and even though Enterprise is only sold to "businesses", it doesn't cost much to set up a corporation, and now you are a business customer.
Not saying I'm a fan of Microsoft's telemetry policies, but it is not like people are forced to buy Windows; especially nowadays, there are other options: macOS, Linux, Chromebooks. Don't have to worry about telemetry on my Linux box, and Apple is a much more privacy-friendly company than Microsoft. If people don't like Microsoft's policies, they can vote with their feet.
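(For anyone who wants to check it themselves: the "Diagnostic Data Off" level corresponds to the usual DataCollection policy, e.g. via a .reg like the sketch below. Note that Home and Pro treat 0 the same as 1 (Required), so it only fully applies on Enterprise, Education and Server editions.)

    Windows Registry Editor Version 5.00

    ; 0 = Security / "Diagnostic data off"; honored only on Enterprise/Education/Server
    [HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
    "AllowTelemetry"=dword:00000000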
> You can't really compare enterprise class software with something that is sold directly to consumers.
Microsoft no longer supports Windows 7 for consumers, only for certain enterprise use cases. So in the context in which it is still supported, it is enterprise class software, and other enterprise class software is a valid comparison.
> Apple is a much more privacy-friendly company than Microsoft.
I'm not in agreement here. Apple likes to paint themselves as more privacy-friendly, yes. Whether they actually are is another story.
They have much better PR, which keeps them from doing stupid skullduggery like unwanted news and "special offers" and "payment plans" in Edge. But a Mac calls home a lot even when you turn telemetry "off".
I'll try the enterprise version (I have MSDN), but last time I checked it I am pretty sure it gave the same options.
I used to work for Oracle. I'm still friends with people who I worked with at Oracle. Many of my current colleagues used to work for Oracle. My current manager, and his manager too, used to work for Oracle.
Did we always agree with the decisions of Oracle executive management? Nope. Of course, so long as you work there, you can't say that publicly. But, even though there's quite a few things that Oracle did while I was there that I strongly disagreed with – I can't see the problem with charging extra for supporting really old versions. It is just common sense, and it is part of the package the customer signed up to at the start – the collateral given to the customer as part of every deal explained it. Many other vendors do it too. A consumer might legitimately claim they didn't understand what they'd signed up to, but that's not believable when a billion dollar company signs a million dollar deal.
Oracle has over 100,000 employees today. I have no idea how many ex-employees there are – I've met so many over the years, we are everywhere – but I guess it must be well over 1 million by now.
You are deliberately putting in extra effort to make your own product less secure. Often with consequences that will primarily be felt by people far down the impact chain, who probably didn't even know that the choice was being made.
The extended update programs are/were only available for corporate users as it's a pay-to-subsidize-availability program that costs more per year than the original OS license. While you may disagree if that's good or not jumping straight to labeling it psychopathic behavior seems more like an argument which came from a conclusion than the other way around.
I think a lot of companies agree with you more or less — well, companies don’t have personalities or psychologies, so psychopathy is not really possible. But charging for security updates is too much of a reputation risk.
Of course because they are profit driven entities and they can’t charge for security patches, they just stop writing them.
Companies are not unaccountable entities. They're headed by people who make and carry out these decisions. And withholding security updates as a means of planned obsolescence is absolutely a consciously directed strategy. In many ways companies act as too much of a shield for bad behavior. When a "company" does something awful, it's not the company but the executive leadership doing it.
"My belief is that computer security is highly overrated. [...] This can actually be quantifiable. [...] In 2021 Project Zero registered 10 0-days that concerned Windows. [...] None affected Windows XP."
Clever! Security by running an OS which none of the exploit authors are targeting anymore.
Sure, in the tech industry we all know the best practices and more or less try to follow them, but in general I've found that I often have hardware that keeps working long past the corporate overlords' desire to support it:
- a windows 2000 laptop that I only stopped using (a while) after win2k stopped getting security updates, and the linux i put on it didn't work very well.
- a flip phone that was gloriously good at doing the only two things it knew how to do (calls and sms) that I had to stop using because it literally stopped having compatible towers to talk to.
- a 2013 macbook with a keyboard that still works, but it can't get the latest OS updates anymore, much like OP's XP machine.
I feel like there's a willful blindness -- it's very profitable for companies to keep everyone upgrading all the time, but it creates a lot of trash and there's really no need. There's a huge swath of the population who doesn't mind having 20 year old appliances or cars, and who views their digital tools as just another appliance to be replaced as infrequently as possible.
Those of us in tech need to consider the moral imperative of slowing the trash cycle and the frugality enabled by letting things work for decades. And we need to stop pretending there isn't a user base out there who will insist on using things for decades even if we tell them it's not advisable.
> a flip phone that was gloriously good at doing the only two things it knew how to do (calls and sms) that I had to stop using because it stopped having compatible towers to talk to.
That was the situation that finally prompted me to buy an iPhone.
Does it matter? An attacker has to know about a vulnerability to use it. Unless there's some reason that security researchers would be more focused on vulnerabilities in new systems than attackers are, or something like that.
This totally ignores the concept of "zero-days": vulnerabilities that are known only to bad actors, or known but not yet patched.
This is very common, and these vulnerabilities are sold for a lot of money and the buyers make sure to protect their investment as long as possible. Of course that also means they go exclusively for high-value targets which means end users are most likely fine (see Pegasus as an example). But that's not the same as being actually secure.
Either Windows XP has no vulnerabilities left and we should all immediately start using it or it has lingering vulnerabilities and they simply aren't noticed by more recent projects like Google Project Zero.
> This totally ignores the concept of "zero-days": vulnerabilities that are known only to bad actors, or known but not yet patched.
I'm not ignoring them, I'm just assuming that vulnerabilities discovered by bad actors will follow more or less the same distribution as those found by security researchers.
I would assume that security researchers and most attackers go after what the general public has. But I wonder, are there still a bunch of old industrial/medical machines running XP?
Now I have the mental image of some intelligence guy working on a “get into our geopolitical rivals’ grid in case there’s a war and we need to cause a blackout” discovering the author of the post.
Maybe he will meet up with some friends in “assorted fraud” industry at the local after work watering hole and they can nostalgically steal the post author’s bitcoins together. Hacking an XP machine might trigger a midlife crisis though.
I'd be amazed if there aren't still a number of Win2k or older machines in industrial settings. Until a few weeks ago I was using an Imagesetter at work which had a Windows Server 2003 machine running its RIP software. (Apparently it's just about possible to massage the custom PCI card's driver into working on Windows 7 - but it's easier just to use the intended OS.)
Just the other day I found myself capturing a Vista machine into a VM just to preserve some perpetually-licensed software that we can't ever install again because the activation server's gone away.
There is a good chance that all or most of the exploits found in modern windows actually work on Windows XP, but the reporters did not test for it as no one uses XP.
Someone looking to hack the last remaining XP machines would just scroll down the list of new exploits and test them out.
From my experience, there is a lot of software that will not build for platforms older than 7/2008 due to missing APIs. Visual Studio stopped supporting XP compilation a while ago. You can still use Clang or the like for simple programs today, but Chromium is probably too complicated to port at this point.
I thought I saw it in Visual Studio 2019 but I'm misremembering - Visual Studio 2017 was the last to support building for Windows XP by the looks of it.
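For what it's worth, XP targeting in VS2017 goes through the v141_xp platform toolset (an optional "Windows XP support for C++" install component); in a .vcxproj it's roughly the fragment below, shown here only as an illustration.

    <!-- .vcxproj fragment: build against the XP-compatible VS2017 toolset -->
    <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
      <PlatformToolset>v141_xp</PlatformToolset>
    </PropertyGroup>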
There is no greater security risk in using an old OS over time, as hackers stop focusing on the old stuff because the attack surface is reduced simply by the number of victims being so low.
As for why the mouse is fast in Windows: the interrupts are handled by the kernel in a very special way, achieving almost mind-reading levels of speed even over the bloated USB stack.
I suspect it to be a nightmare to maintain as USB upgrades happen but who knows.
I used Windows 7 until my SSD corrupted itself pretty much five years to the day after I installed it on the 400 GB Intel drive. It started with games crashing, and then after one such crash (black screen, not blue screen) it just refused to boot because some important file was unreadable.
Choosing your SSD will be the most important choice you make in the future. As I said previously, I have 2x Windows 7 and 2x Linux machines on older drives that have lasted 10+ years running 24/7. So something is wrong with modern SSDs, no doubt about it.
About sites deprecating browsers: you might have better luck with really old browsers nowadays. My bank refused to work on a 3 month old Firefox, but a 6 year old Firefox (installed to run Java Applets, which BTW were WAAAAY better than .js + WASM, and no, they were not insecure like all the naysayers pretend) ran fine with some layout glitches.
Yeah but it's not like they are all guaranteed to fail. I still use an OCZ Agility 3 in one of my PCs to this very day, it runs 24/7. And that was probably one of the worst "early" SSDs for failure rates.
A lot of Samsung drives have had firmware self-destruct degradation bugs lately for some reason, so absolutely avoid those.
People miss how much of a security risk using W10/11 is, because Microsoft is constantly pushing updates that break your install like an XP-era virus used to, or just straight up doing malicious things adware and other malware used to do. These are not "security minded" operating systems. You are not "better" for using Win10 in terms of security.
I actually RTFA and was thinking, "this dude should upgrade and run Win7 like me," - and was pleasantly surprised to find he is.
The only issue I've had was a few months ago: Google Chrome now gives a small nag-bar on startup, supposedly dismissible with a flag, but the flag itself either no longer functions for some smirking, sanctimonious reason or, more likely, was hilariously never tested and has since ironically regressed.
The only gripe from XP->Vista->7 is the respective mediocre, to amazing, to abysmal local search functionality.
Regarding 0-days: back when I had virtual valuables and wasn't a complete random, I was specifically targeted and smitten with a TeamViewer 0-day. But, like the author notes, that would have happened regardless of my subscription to any one of the myriad of dubious AV products of the era.
Win7, with its incredible native and specific backward compatibility mode, supports more apps than any other virtual ecosystem. 25+ years of great software, some of which runs smoother than its great descendant counterparts. It would be a shame not to at least have a box sitting around for the occasional sporadic encounter with an inane, archaic file type needing to be fandangled into a newer format.
Steam just announced[0] their client will no longer work on Windows 7 come Jan 1st 2024. I figured this would happen as soon as I noticed they adopted Chromium Embedded Framework, guaranteeing that Google's supported OSes become Steam's supported OSes. I'm not exactly sure how I'm supposed to play Steam games from 2004 (or whatever) on the OS they were designed for, when Steam won't even run on that OS anymore. I guess I can hope that compatibility layers work well enough in newer Windows versions? Just kidding, my Windows gaming PC will now be a Linux gaming PC :)
Ah sick, but will they actually launch? IIRC the DRM requires that you sign in at least once a month (or something) to refresh the activation stuff so you can actually launch the game(s).
That was the era when my systems felt “most stable.”
Ok, I was also slipstreaming my installation CD-Rs and really putting time into it. Yet. The things that used the most memory and power were probably Flash Player and Counter-Strike. Those were not bad times!
I seriously never got malware, and always wondered what everyone else was doing differently than I.
If only filename search is needed, then Everything would be enough, and it's free.
But, surprisingly, most of the time I find what I need with the built-in search in the Win10 Start Menu. Although I had to slap its hands to stop it showing me web results.
Once upon a time updates were something I'd get excited for. Now it's a dread of what will break next, what new advertising will assault my eyeballs and waste my attention, and what telemetry will subvert my privacy or deteriorate my control over my device.
This is why I've "just said no" to Windows ≥ 10. I really want it, but these anti-features need a reliable off switch.
To be fair, most of those are uniquely Windows problems. If you pick a halfway decent Linux distribution, you will never see advertising, telemetry, or lack of control, and updates are a mostly positive experience. I switched from Windows 7 to Linux around 10 years ago and it's been an improvement in ~99% of my use cases. The only thing Windows really does better is games and the occasional (rare) piece of software which doesn't have a good equivalent for Linux. In those 10 years I've only found myself running a Windows VM a handful of times to get at some critical functionality I needed from Windows-only software. Switching was one of the best IT decisions I ever made.
Corporations like Microsoft have successfully hammered this idea into people's heads that the evil hacker known as Anonymous will hack into your fridge and make it explode and burn your house down if you forget to update it for two weeks.
Almost nobody who makes a big deal out of this has ever actually been affected by it, they're just parroting what they're told.
I mean, I used to say that updating was important when updates were painless and worked. Now that we've successfully convinced everyone to always update, corporations use it to push fresh malware/ads.
>Almost nobody who makes a big deal out of this has ever actually been affected by it, they're just parroting what they're told.
Either that or they're paid to come and clean up the mess after a company has been hacked due to one or two random servers not being updated in the last few years.
Servers (especially corporate ones) are an entirely different matter. This post is specifically about personal devices that probably aren't running any server software and are almost definitely behind a firewall.
Windows, and more generally Microsoft, updates regularly break things. Each time I see the update notification I kind of shudder. A notable one from 2018 was literally deleting user files. [1] Whenever there's an update for e.g. Visual Studio, I think most have learned that it's smart to wait for the update to the update. And see if there's an update for the update to the update, before finally risking going through with it. And I'm talking about the release channel!
It gets even better there as well, until somewhat recently after Microsoft would release a Visual Studio update, older versions would be removed from their site, and there was no way to roll back. So if you chose to update, and it broke stuff, then you were in a fun spot. Fortunately that is no longer the case, but ugh - bad memories. This is one of the big reasons I'd like to move over to Linux, but games + work make it difficult to pack up and move. For all my bitching about Microsoft's practices in general, I don't see myself moving away from Visual Studio any time soon, and it's Windows only.
Apple has also long decided to hide the scrollbar. And they also change stuff around constantly. This is one of the reasons I moved away from Mac.
Not saying your point isn't valid but it goes for Apple too.
I'm on FreeBSD now myself (with KDE, which has a ton of options so I can finally set things up the way I like again).
Apple was actually pretty much aligned with how I liked things, but in the past years things have become much worse. I hated the flat redesign, I hated mission control and its multiple desktops in a row instead of a grid, and many more changes that were forced on me without having a choice.
In Ubuntu, there is a setting to have all security updates happen automatically every day in the background without ever having to bother the user. And it's been that way for over a decade. You don't have to be alerted, wait, and restart your whole computer for every little thing. I'm surprised how Mac/Windows still never bothered to catch up. The user should not be bothered by trivial maintenance tasks that the OS should handle.
Hell, I'd do this for all updates if Firefox didn't require an immediate browser restart every time it's upgraded.
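For anyone hunting for it, that's the unattended-upgrades mechanism; on a stock Ubuntu install the relevant bits boil down to roughly this (the values shown are the usual ones once it's enabled):

    // /etc/apt/apt.conf.d/20auto-upgrades
    APT::Periodic::Update-Package-Lists "1";
    APT::Periodic::Unattended-Upgrade "1";

    // /etc/apt/apt.conf.d/50unattended-upgrades -- restrict to the security pocket
    Unattended-Upgrade::Allowed-Origins {
        "${distro_id}:${distro_codename}-security";
    };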
> You don't have to be alerted, wait, and restart your whole computer for every little thing. I'm surprised how Mac/Windows still never bothered to catch up.
I can't comment on Mac, I have no experience with it.
On the Windows side, your statement is simply way overstated and, honestly speaking, not true. Using your words: hell, it can even update video/networking drivers _online_, without a restart, not even mentioning smaller things like Defender definition updates.
Apps installed from the Store auto-update without requiring a restart as well.
Firefox shows a tip that an update is available and patiently waits; nothing "requires an immediate browser restart".
Heck, you can even configure devices on your local network to act as a cache for downloaded update data, so the same stuff isn't re-downloaded from the internet to each workstation on your network.
You are either stuck in the WinXP-era past or spreading misinformation on purpose.
Coming back to Ubuntu, a question regarding
> In Ubuntu, there is a setting to have all security updates happen automatically every day in the background
will it honor that you are on a metered connection, like a WiFi hotspot shared from a phone?
My daily driver is FreeBSD and it doesn't have this, but even still it's not much of a bother at all. I just reboot when there is a need to and am back running in 5 minutes.
Of course Linux has this live patching thing to the kernel which is pretty cool. But I thought you needed an Ubuntu One subscription to access that on Ubuntu?
Your point may or may not be true, but it doesn't follow from your logic.
I've had more e-ink devices fail by being knocked into the river while fishing than I have from any other cause of accident, does that mean fishing is the most common cause of e-ink device death because it's my personal anecdotal experience? I doubt it.
That's a bad analogy. It would be more applicable if there were a nebulous group of people dedicated to killing the largest number of e-ink devices by any means necessary so they focus on the most popular ones because it is easier to do so.
From a personal end user standpoint, most people will never be the direct target of an attack (and if you are, you are probably in a situation where being 100% up to date won't save you). If they get owned, it will be by a drive-by exploit or because they directly executed malware.
Drive-by exploits tend to target the most popular systems.
>My belief is that computer security is highly overrated. With a decent firewall, which every router has these days, only your own activity can expose you to attack from hostile actors. All internet browsers also help you by warning when you are about to do something stupid like executing code from a remote source.
This is a ridiculous thing to say. The reason that there are so few misleading ads, so few zero days being exploited, and in general a much less tricky landscape today is the unyielding and unrelenting effort of people improving computer security.
I can open youtube on my phone right now and find an ad saying "everyone who clicks here will get $1000", misleading ads are still everywhere even if they might not be malware. And I doubt ads exploiting browser 0-days were actually that common back in the day, ads in general weren't nearly as common.
I still use XP 90% of the time when I just need a quick VM to run some odd Windows utility or other. It's lightweight, it's not filled with the intrusive privacy issues of 10 or 11, and a surprising amount of software is still compatible.
Same. I have WinXP on VirtualBox for all my Windows work. I do have a Win7 image on hand, but I've only had to use it twice over the past several years.
None of Microsoft's operating systems were as good as Windows XP and at least one (8) was a complete disaster, the shock waves from that catastrophe still rumbling around.
By failing to reproduce that success, Microsoft has ceded most of the moral market share to Apple. Though, as a Linux user, every time I see a Mac desktop I can't resist noticing how often it locks up and displays that rotating "busy" mouse cursor in place of what the proud Mac owner actually wanted to do at that moment.
This is not even a radical viewpoint, which is why the downvotes seem irrational. I have seen "normies" pontificating on how Windows XP was perfect for them and how they have seen no benefit going forward.
If I am being maximally charitable, the benefits that come afterwards are drivers, gaming-related (DirectX) or security-related (which is not a user-facing feature). All of them are nice, but not at the cost that the end user perceives: a slow, painful experience, with an OS that crashes randomly and has weird inconsistencies everywhere.
I have mostly stuck to Ubuntu for desktops (since 8.04) and MacOS for laptops (since Snow Leopard).
(Unrelated, I also notice the busy cursor, which is why I went back to Ubuntu once I was able to get a desktop)
My daily driver computing timeline is something like Win3.1, 95, 98, ME, XP, OSX 10.4-10.11, then some combo of Ubuntu/Pop_OS.
I think Windows peaked with XP and macOS with Snow Leopard, with the caveats that Spotlight-like searching in both was primitive/nonexistent at that time and would have been a great improvement, and XP didn't have virtual desktops.
That 2001-2006 era featured systems that had learned a bunch of lessons from 10-20 years of mainstream GUI computing, but hadn't been consumed by feature bloat and a bunch of Internet connectivity and convergence stuff that I'm not interested in.
XP's design was apparently polarizing but I loved it. I was in a modding phase at that point and loved making my own themes.
For me, the big change came with Windows 2000. It had the user benefits of 9x in a stable package. That was the first version that I could play games on and also leave running for months, just like a Linux box.
The funny thing though is that Windows 2000 was never meant for consumers playing games.
They would have to put up with Windows ME, at least in Microsoft's eyes. Which was the first version you could play games on and leave it running for hours between crashes, lol.
I also used Win 2k Pro and it was an amazing leap coming from 98, yes. My laptop came with ME but it sucked so bad I changed it after a week.
I did not use NT at home, but I occasionally used it at school. For me at that time, the ability to play games had been as strong a requirement as the ability to be stable and run word processing, email clients and VC++. So I would use Win95 and curse its (in)stability and reboot several times per day, but not switch to NT. Also, NT cost an arm and a leg for a student.
Win2k was a huge improvement in every respect. My 2c.
I was in a computer science program and the department literally just gave Windows 2000 keys away. They had a booklet with sheets of stickers and just said take whatever you want. It was easy to give to your friends.
When Windows XP came out, Microsoft unfortunately made them report every key given out, and the free ride was over.
> When it debuts in February, Windows 2000 Professional will sell for an estimated retail price of $319, the same as its predecessor, Windows NT 4 Workstation.
Thank you for the info! As I said, I definitely remember Win2k, but not how I got it. My guess is OEM CD to be most likely, then bootleg, then full retail.
Another option was student bookstore. Microsoft sold its wares at universities at steep discounts.
Edit: a sister comment jogged my memory. I bet it was either a giveaway by an EECS dept or a heavily discounted disc from the university bookstore. Which is why I do not remember paying a noticeable sum for it.
They had actually planned one but withdrew it at the last moment, replacing it with ME, which was a hastily scrambled version of 98 with a 2000 look on it. And it was extremely unstable, at least at launch.
I still wonder how they would have thought this was better in any kind of way. A windows 2000 home release could not possibly have had as many problems as ME.
I remember attempting to run Windows NT 4, but it lacked a lot of hardware support. Windows 2000 Professional fixed a lot of that. I consider Windows 2000 to be the first stable AND usable (for a desktop) version of Windows.
Windows NT before 2000 had problems with games and such; 2000 worked well because most games were by then being written to target XP (and 2k had better compatibility with 98 anyway).
My thoughts exactly! Except I wouldn't call it bling, I'd call it a hideous Fisher Price aesthetic.
I stuck with 2000 as long as I could before gaming (and maybe.. a certain version of Visual Studio?) eventually forced a switch to XP (with the "classic" theme).
I’ve been using MacOS as my daily desktop for over a decade, switching from Linux. Seeing the busy cursor is so rare I can’t recall the last time. And I have tons of things open all the time.
Maybe it's related to some cloud service? (However, MS Word used to give you a fair amount of exposure to the beach ball. But I'm not familiar with any of the later versions.)
I've been working professionally using a Mac for 4ish years now. I can count on a single hand the number of times the cursor has been stuck spinning / loading.
95 was trash, 98 was gold. XP was great, Vista was trash. 7 was great, 8 was trash. 10 was gold, 11 is trash.
Seems Microsoft has to squeeze in at least one bad version between good ones.
But never has a new Windows version been made available that people didn't initially hate, XP, 7, and 10 included. It's not until the version after gets released that people really start appreciating what once was.
You have missed Windows ME between 98 and XP, which was so awful people don't remember it anymore. For a revolutionary product, Windows 95 wasn't that bad.
95 Good, 98 Bad, 98 SE Good, ME Bad, XP Good, Vista Bad, 7 Good, 8 Bad, 10 Good.
Counting 98 SE as a separate release does seem like cherry-picking in order to make the pattern work. Then again, you could also say the cycle started with Windows 98 (putting the whole of that release into the good pile), and you'd still have a remarkably consistent pattern across two decades and seven versions of Windows. Honestly makes you wonder if there's some deeper cause related to Microsoft's organizational structure or something.
If you're breaking out 98 into two releases, then you should probably break out XP into pre-service-packs and post. IME it wasn't until SP3 that XP really felt solid, depending on what hardware you were running.
>And then there's 11 which doesn't really feel like a new version at all to me.
That's because it isn't, Windows 10 and 11 both identify internally as NT10.0.
For some context: Windows Vista, 7, 8, and 8.1 identified as NT6.0, 6.1, 6.2, and 6.3 respectively; Windows 2000 and XP identified as NT5.0 and 5.1 respectively.
For all the rightful hate 11 gets for ads and tracking crap, once you turn it off it feels... fine. I even don't mind the UI look and really like the new window management, but I don't use windows for anything serious anymore, only gaming so maybe there are other issues.
Some things about Win95 are lost to nostalgia, as well. Like broken drivers, esp. in concert with up-to-date Office, which came with issues of its own, like broken export formats. (It took more than a year for serious support for PostScript printers, etc.) Win95 became only somewhat functional with Windows 95B. (This one, however, started the IE integration.)
I had the one PC in that era that WinME was smooth sailing on. It was a Packard Bell with a 200 MHz Pentium 1 (96 MB of RAM at the time, I think). For whatever reason, and combination of hardware, ME had no issues running. I had fewer blue screens on it, and overall found it to be enjoyable and much more stable than 98 and 95c were on that machine.
I would personally argue that Windows 2000 wasn't bad, it was just painful in the beginning. It was the transition point from the 9x kernel/setup to the NT kernel/setup and some manufacturers navigated that change better than others in the beginning.
It was the first stable Windows OS that could run all the latest games and desktop applications without constantly crashing or needing a periodic reinstall. It was their best desktop OS at the time. XP was just 2000 with an ugly default theme and added bloat.
95's stellar reception in particular is probably still unmatched to this day.
The only shift in user experience that I can equate it to is maybe the jump from candy bar and flip phones to the iPhone and Android. In addition to the UI, it brought real multitasking to a consumer OS.
Windows 11 cranked up the [user-unwanted] telemetry, mandated TPM 2.0, and eliminated hardware support for certain CPUs for no other reason than 'because security' (overridable with a registry hack, at least for the moment).
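(The registry hack in question, at least the Microsoft-documented one for in-place upgrades, is a single value, sketched below for reference. It still requires TPM 1.2, and the separate LabConfig Bypass* values cover clean installs from the setup environment.)

    Windows Registry Editor Version 5.00

    ; Allow upgrading to Windows 11 on an unsupported CPU / TPM-2.0-less machine (TPM 1.2 still required)
    [HKEY_LOCAL_MACHINE\SYSTEM\Setup\MoSetup]
    "AllowUpgradesWithUnsupportedTPMOrCPU"=dword:00000001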
Instability aside, Windows 98 marked the point where Internet Explorer leeched itself into the operating system to the point of no return (only to be finally reversed in the past year). The complaints of social media apps being "preinstalled" with Windows 11 seem to echo the channel bar, Active Desktop and everything else that arrived with IE4. I believe that the notion that Windows 98 was good came when IE5 was released and pulled back on some of those integrations, which happened to coincide with the timing of 98 Second Edition. In any case it was by no means great, but it was good enough.
I don't think that's true. Vista was so hated, that 7 was pretty much universally heralded as a big improvement, even if half of that was the ecosystem getting used to the new driver model introduced by Vista.
And as others have observed, 2000 was an immediate hit with everyone who tried it.
If you think 10 is "gold" and 11 is "trash" I assume you're suffering from some sort of Stockholm syndrome from using 10 too long and haven't actually tried 11 yet because it's just a reskinned 10 (and visually, it looks a lot better).
> I assume you’re suffering from some sort of Stockholm syndrome
Periodic reminder that “Stockholm Syndrome” was a complete fabrication, based on no clinical evidence, invented on the spot by one of the architects of an unnecessarily violent police intervention to discredit an ex-hostage who was making legitimate criticism of it, because they had no rational defense.
Its invocation in debates is also typically an attempt at a thought-terminating trope meant to evoke an emotional response that covers for the lack of an actual argument, which fits the history of the term well.
> haven't actually tried 11 yet because it's just a reskinned 10 (and visually, it looks a lot better).
I've been on 11 since it was released and I find it to be objectively a worse experience. On Win 10, I could game on one screen and stream a video in a browser on the other. On Win 11, I get stuttering on the streamed video.
It's also failed to update twice now. I invested hours into debugging the issue the first time. This time it's been days and I'm about to just do a fresh Win 10 install over this nightmare.
> It's also failed to update twice now. I invested hours into debugging the issue the first time. This time it's been days and I'm about to just do a fresh Win 10 install over this nightmare.
My current 11 install has never failed to update, but I have experienced various update failures before on 10. This sort of issue is present in both versions, it's just random whether you'll experience it or not, and I wouldn't be surprised if the stuttering issue is just some random driver incompatibility too.
> This sort of issue is present in both versions, it's just random whether you'll experience it or not,
No, it's not random, which I know because I stumbled upon the solution. The problem was that it was still using an old 100 MB partition as the EFI boot partition, apparently too small for the update to work with. Nowhere in any of the system logs did it mention this. No error message said anything about it. Ridiculous.
> and I wouldn't be surprised if the stuttering issue is just some random driver incompatibility too.
Why? My system is the same hardware after the update. In fact, the only thing that's changed (and this happened a while after the Win 11 'upgrade') was a video card that's 3-5x faster than the old one.
Some people claim this was mostly due to bad 3rd party drivers and not the OS itself. But even if that's true, from the user perspective it's very hard to tell the difference.
Ubuntu has always been this weird OS for me. It's not free enough or fast enough to be Linux; it feels like a knockoff of something, but I can't put my finger on it.
Their DE is just such a horrible POS. I had one extension/addon (?) which, due to some JS problem (all those extensions/addons are written in JS), was slowly, over hours, hogging more RAM, until eventually OOMing the entire DE.
Sure, this is just a one-off bug, but please just read that to yourself out loud. Addons are JS. They can OOM the machine. Their interface runs over so many different abstractions that there is a whole web browser in there.
So yes, the author may have had a horribly slow experience with Ubuntu, and I believe that. Try LXDE or LXQt or MATE or anything a little more lightweight, and you may have a very different experience.
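To make the failure mode described above concrete, here is a deliberately trivial sketch (hypothetical code, not the actual add-on, and not tied to any particular shell's extension API) of the kind of slow JavaScript leak being described: a periodic callback that holds references forever, so memory creeps up over hours until something gets OOM-killed.

    // Hypothetical sketch of a slow add-on leak (TypeScript):
    // every tick allocates a chunk and keeps a reference to it,
    // and nothing ever clears the array, so resident memory only grows.
    const retained: string[] = [];

    setInterval(() => {
      retained.push("x".repeat(100_000) + new Date().toISOString());
    }, 1000);

Left running under Node (or ts-node), its RSS just keeps climbing. A real extension buries the same shape under far more abstraction, which is exactly why it takes hours to notice.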
There is no browser, only a JS interpreter. I think that addons should not be downloadable; users should write them themselves, and if there are bugs because of that, it's only the user's fault.
It seems strange to blame a distribution for a problem you had with a DE. Just change the DE. Or download one of the many Ubuntu flavours that have your preferred DE preinstalled.
Worse, they had a problem with a third-party add-on they installed. An add-on that likely would've been either two orders of magnitude more complex or straight-up impossible on Windows or macOS.
I've had my fair share of OOMs and crashes because of shell customization on Windows and they were all far harder to deal with.
My dirty little secret is that I like Windows XP. I know it inside out and it maximizes my productivity while sitting at the computer.
I wish our industry would realize you don't need to move everyone's cheese around and slap on new lipstick every few years. Each time you do it throws away all the muscle memory your users have built up.
Sometimes I lament that pixels are so cheap to repaint. Imagine if someone reorganized where you keep your dishes on a regular basis.
I can feel this. My main work computer was/is a late 2008 Mac Pro, running OS X 10.9.5 (the latest macOS you can run officially on this). After one and a half years, I've still not transitioned fully to the shiny yellow M1 iMac standing next to it. The usability and productivity on the Mac Pro, its OS and the apps on it are much better, and, on top of that, it's all software that I own.
The real dealbreaker is evergreen browsers and websites updating from whatever is available on GitHub, making up-to-date browsers a requirement. The newest browser you can run on this is Firefox 78.15.0esr, which, funnily enough, tells you that it's up to date.
I also second that security by out-of-date obscurity may be a real thing. (The attack vectors available just don't scale if you're not a target prominent enough to make this a sensible investment. Moreover, old hardware gives you things like ECC memory, which is also immune to rowhammer attacks, as it's slow and not that tightly integrated. Having no privileged network connections to any cloud services may help as well.)
(However, I'm writing this on a rather fresh Apple Silicon MBA, because, well, the Web…)
> The newest browser you can run on this is Firefox 78.15.0esr
This made me wonder if there are still working dev kits/tools with browser components available in such a way as to provide a fairly simple pathway from basic dev skills (or just "web browser example" code) to an up-to-date browser engine running on 10.9.5?
Thinking of somewhat more obscure dev tools too, e.g. PureBasic, etc.
This would have been a viable path when WebKit was still "a thing" and enjoyed some integration. I guess, with WebKit2, things became more complicated and resource-heavy. Given the rapid development cycles and feature releases, I doubt an integrated web view would be able to keep up nowadays. Especially if nothing short of 100% compatibility with current Chrome will do.
(That said, Firefox 78.15.0esr works still fine in most cases, but I doubt that this will be the case much longer, mostly because of CSS feature adoption.)
>I am not entirely sure why some websites do not work properly in Mypal browser. But a growing number is not. The problem is always content loading, or rather a failure to load all or parts of a website's content. My best guess is that the JavaScript code in these malfunctioning homepages are too modern and that the JavaScript engine in Mypal browser is somehow limited in how it interprets the code.
That is because MyPal is a fork of Pale Moon, which forked from Firefox back when Firefox was still Firefox (before it removed extension user freedoms, adopted Chrome's extension model over its own, took on the heavy RAM bloat of multi-process, etc.). Pale Moon only implemented "web components" last week. Sites that use web components features like custom HTML elements (defined by JavaScript) would just render as blank spaces.
I wish Pale Moon hadn't added support, but it should filter back into MyPal eventually.
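For anyone who hasn't run into "web components": here is a minimal sketch (the element name is made up) of a JavaScript-defined custom element, which shows why the failure looks like blank space. All of the visible content is created at runtime, so on an engine without the customElements API nothing is ever rendered and the unknown tag stays empty.

    // Minimal custom element: its markup only exists once this JS runs.
    // In a browser without customElements support (older Pale Moon or
    // Mypal builds), <greeting-card> remains an unknown, empty tag,
    // i.e. a blank space on the page.
    class GreetingCard extends HTMLElement {
      connectedCallback() {
        this.attachShadow({ mode: "open" }).innerHTML =
          "<p>Hello from a custom element</p>";
      }
    }
    customElements.define("greeting-card", GreetingCard);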
I do still like the 95/98/XP paradigm best in terms of a start button, non-grouped buttons for the different windows, and some quick launch icons. Luckily, because I run Linux, I can have my pick of the litter: Cinnamon, MATE, even KDE with some heavy configuring will do it. I wonder what the author didn't like about Linux; it seems like it's perfect for someone who's stuck in the past like me, if it's not a software compat problem (which from the sound of the article it isn't).
Exactly! I have run this setup forever; for years and years in Xfce, these days in KDE. Recently I found a couple of screenshots, 1996 and 97, from my Windows days. I was surprised - and maybe slightly concerned - to see how precisely they matched my current layout, right down to ridiculous details. Way back then, I needed a third-party utility to tweak the clock in the lower right corner to my liking. These days the tweaking is integral, but damn it, the clock looks the same.
Windows XP on its own, behind Windows Firewall and a router firewall, and not doing anything too controversial, may be fine. Where the risks become higher is when you have many machines across the network. That one host lowers the overall security of everything else around it by limiting the ability to disable weak protocols.
Backwards compatibility with Windows/AD has always caused issues, with Active Directory becoming such a juicy target because of the way systems interact with one another.
Phew, relatable. I get this kind of well-rooted feeling with my Linuxes all the time.
Just this year I moved my T420 from a phenomenal Xubuntu 18.04 install that was _just_ great, to PCLOS. I'm happy with the result after a lot of work, but I know there are still a ton of things that aren't what they should be.
Hundreds of keyboard shortcuts yet to migrate, hundreds of little add-ons skipped, like "did I grab whatever the equivalent of gimp-addons is, since I couldn't use apt-clone", hundreds of little scripts un-integrated from my desktop, including genmon scripts that do valuable things, which don't even have an equivalent for porting to KDE.
Before that it was...Ubuntu Netbook Edition (UNE). Tiny screen but wow it worked amazingly well on my Linux-reseller MSI Wind Netbook. I could hardly bear the upgrades away from that, to different distros, since everything worked super well.
It's hard to feel torn between such UserLand concerns (my stuff is how I like it!) and regular ol' userland concerns (my software does what I need, with the following somewhat clever workarounds).
> Exactly why even very light-weight Linux versions like Lubuntu feels slow on my computers I do not know. Maybe they lack the drivers for hardware optimization compared to Windows.
I remember this EXACT feeling coming from Windows XP, lol. I never did figure out what the difference was, but it definitely went away for me starting around Windows 8 IIRC. Today I can't discern a difference except that occasionally it seems like Linux is faster by default.
It also felt incredibly unjust back then, to a lot of us happy Creative Labs users, to migrate to Linux and learn that our slick cards & low latency didn't work that way anymore, or at all.
If I could, I’d still be running Windows NT 4.0 just for the simplicity and GUI. It was the best OS to come out of Redmond. I wish they had stuck with their “workstation” strategy, but unfortunately not.
The actors attacking old software are mostly state actors preparing industrial sabotage for cyber warfare. Most XP instances are industrial equipment with old PCs as control equipment.
For malware developers those old boxes are uninteresting. Too slow, too few, not worth the effort. They are the sloths of the predator-and-prey ecosystem of software security.
I wonder if he would've been happy running ReactOS or something like Haiku or SerenityOS.
It seems that the major reason why XP worked for him was how dead simple it was. The fact is that there's no modern "simple" OS. I know. I've looked. The point-and-click simplicity of the 90s is quite plainly no longer out there. Even macOS, championed for its supposed simplicity, has grown a large number of extraneous features and an outright hostility to those who try to use it in a nonstandard way.
Come to think of it, it wouldn't even matter. All of the OSes I've suggested struggle with the same issue as WinXP, which is that there's no modern browser support. SerenityOS seems to be the only one making any actual strides toward fixing that with its Ladybird browser (which could theoretically be ported to the other OSes), but even that is a work in progress.
But at the end of the day they're just not good enough reasons.
This author doesn't believe that security is important because their crypto currency hasn't been stolen yet.
I don't know how one could possibly do a better job of conveying that they don't care and are just coming up with nonsense reasons to justify their bad computing habits.
They're free to do whatever they want, just like all of us... but I still cringe at the idea that somebody who doesn't know better could read this article and think running Windows 7 now is a good idea.
I switched to Linux from Windows XP. Linux is great in almost every aspect except for memory usage, lots of tiny bugs, and things constantly breaking. If you are not fine with running commands in a terminal, it might not work for you.
Recent example: in Fedora the firewall in its default configuration blocks DHCP packets when you try to share Internet via Bluetooth. In Windows tradition, no message is displayed on screen nor printed to the logs. Wireshark also doesn't show that the packets are dropped by the firewall. It takes a lot of time to figure out why the client sends DHCP requests and never gets a reply. Why do they enable the firewall by default? And if they enable it, why didn't they configure all the necessary services properly? I am not running a server in a datacenter; it is more an obstacle for me. By the way, Windows XP had no such problem.
But Linux is better than Windows. For example, Windows 11 doesn't let you use the computer until you configure a network connection. It has data collection without opt-out, advertisements, and involuntary updates.
And Linux is better than MacOS because it doesn't make you pay three times more for hardware and get a laptop with soldered RAM and SSD. It is surprising that they have premium prices and at the same time try to save several cents on sockets.
If only Linux didn't have those tiny bugs everywhere, it could be a viable alternative for people not familiar with the console.
I sometimes wonder how much of the console barrier is just culture, and some things being deemed okay to treat as "unapproachable" in a memey way. The same way corporate incompetence is maybe judged more lightly than volunteer incompetence in tech, because of some social prestige aura.
The act of entering some specific text into a field on a computer is something many people experience with the URL bar, or even (for most users) the Google bar. The act of following a tutorial somewhere to achieve a thing is also something that is nowadays practiced in society. Maybe even more so than earlier, even if people expect the tutorial to be a video for some reason.
This makes me think the gap between Linux having a more approachable knowledge ecosystem and people being willing to acquire a little more computer competence should be realistically possible to close.
I was going to comment that many Linux projects can have questionable hyper-"safe" defaults, another example being KDE forcing you to reboot on GUI update at some point. (I'm aware that apparently people were reporting bugs that would go away after rebooting, and they couldn't figure out a better mitigation.) But on the other hand, there are probably distros other than Fedora willing to forgo security and ideological purity for a more hassle-free start for the average user. So there's choice.
I find it more of an issue that with Linux you have to specify the distro or DE/WM every time, because the ecosystem's fractured nature is both a strength and a flaw.
Lots of people laud Mint, but I found Cinnamon to be ugly under the hood when going off the beaten path.
I've enjoyed Plasma a lot more, but found Kubuntu to ship it with a myriad of bad configurations and defaults, whilst my experience with Manjaro KDE has been great.
On one I often had to rely on the command line to deal with stuff, or to get rare packages and get them working.
On the other there's a UI for everything, and I flip some checkboxes to have access to the default repositories, the AUR, snap store, flatpaks, etc., so that I'm rarely lacking if I'm not opinionated about the source.
Someone can come up with a laundry list of complaints and someone else will have experienced none of them, due to different hardware, distro or WM, and this becomes evident when discussing with opponents and detractors.
Isn't that basically already the case? The CPU is a System on a Chip. Not much else on that mainboard besides CPU/RAM. The M1/M2 laptops are relatively serviceable compared to the previous generation.
The number of people in this thread who, like the author, are not just comfortable with running EOL software but wear it as a badge of honor is truly staggering.
You would think that if there were one place where the danger of having no possibility of patches when the next “heartbleed” (read: major software vulnerability affecting critical libraries) comes around would be obvious and understood by all, it would be a forum mostly composed of software engineers.
They should really try XFCE; it feels extremely similar to old-school Windows (95/XP). You can even theme it to look unbelievably similar to old Windows. I feel like them installing the latest Xubuntu with XFCE and Firefox with an ad blocker would let that old laptop run for many more years of service. Heck, the latest version of Wine probably runs all their old XP software just fine as-is, too.
Change its appearance theme to the Windows one (in the settings app), disable the bottom bar/dock and move the top bar down to the bottom. Now you've got a typical Windows-style start menu. XFCE is very flexible and you can configure the docks/bars to feel like old Windows, old Mac, etc. Its file manager (Thunar) is basically a perfect copy of Windows file explorer from the XP era.
Loved XP once the Fisher-Price theme was turned off. Loved the now-small footprint.
But, it's hard to give up the real terminal, now that you've used it, isn't it? Modern Firefox, Python, tons of software just doesn't support it any longer.
A shame really. I'd love a modern Windows 2000. ReactOS isn't quite there yet.
> But I store some crypto currency on my computer and I make crypto transactions from time to time. If my computer had been compromised by an invisible attacker I figure that attacker would have had an interest in crypto assets.
Well that's an interesting idea for detecting certain classes of hackers.
The author obviously likes security updates. He isn't running an unpatched, day-zero Windows XP (which is a hoot to expose to the internet, btw); he is running fully patched Windows XP.
The issue he says is productivity. With maybe some amount of trying to squeeze more time out of old computers. I can empathise with the first point but not the second.
Productivity isn't a function of just the OS; it's also hardware- and user-skill-related. I feel like he could be just as productive on either flavour of 22H2 as on Windows 7.
I still have a WinXP 64-bit as my main OS, with Office 2003 too, same as the author.
I have a Win7 machine, for gaming, not directly connected to the internet. I hate it so much it's never going to be my main OS.
I'm forced to use Win10 at work, and I hate it even more.
I'm slowly switching to Linux (in a VM in WinXP64). From Linux, currently I'm only using Firefox without a desktop environment; anything else is too difficult, too incompatible, too slow or too bloated. To me, Linux desktops really look and feel a lot like Win10, except with a lot more bugs and lacking Windows' Control Panel (what's left of it, anyway...).
Anyone focusing on constant updates must be really young, in my opinion. I understand your point of view. I was like that too. When you grow older you will start to notice how the OS you grew up with could do 90% of what your current OS does, using just 1% of the memory, 1% of the disk space and maybe 10% of the CPU. At some point you will want to stop this constant fight to get minimally better features (or even fewer features) at the cost of ever-faster new hardware and, at the same time, an ever-slower UI.
Only then will you be able to understand my point of view.
Reading this article, I feel old age holding the author back from adapting to new systems, and I just end up in the mindset that I never, ever want to be this person.
After some time you become tired of having to chase GUI elements that were hidden behind ribbons, chasing hidden scrollbars, fighting with various search dialogues, or tabs on the title bar when you want to move the window to another monitor, etc.
Adapting to ever-changing systems that are reduced in functionality and remove control for you just means you're a hamster in a spinning wheel, mindlessly going after the latest trend.
I hope to never ever be that person, but you do you.
Use Windows 95. It would fly on the OP's computer. These types of posts are the same as an Instagram clout-chasing teenager's posts, but in the tech world. Weird.
If you care about security and want to run an old Windows with peace of mind, you should try Windows inside Qubes OS, https://www.qubes-os.org. The latter relies on hardware virtualization, which makes it fast and secure: https://www.qubes-os.org/doc/windows.
I don't really see why windows XP is a better UI. There's lots of stuff that is majorly missing compared to a more modern OS.
This sounds like many people I talk to, in that they refuse to learn or make any changes in their life. I don't think this is a positive attitude to have. He also sounds like a major cheapskate, and my country is full of people like this; it is very hard to argue why spending your money is a good thing (within reason).
>My belief is that computer security is highly overrated. With a decent firewall, which every router has these days, only your own activity can expose you to attack from hostile actors. All internet browsers also help you by warning when you are about to do something stupid like executing code from a remote source
HN users might be young, but C# malware will run everywhere. Also, remember Blaster/Sasser? Good times.
Wasn't there a Linux distro that was trying to look and feel almost exactly like Windows XP? I'm sure it's not perfect but it'd be worth a shot if you were so used to XP.
There have been many attempts over the years (remember Lindows?), but I would argue that making Linux look like Windows XP is an exercise in frustration, because small differences become even more annoying as they crop up when you don't expect them, lulled by a false sense of familiarity. An OS Uncanny Valley, if you will.
I think his Windows 7 build will be fun. It will have just as long a tail as XP, thanks to the huge amounts of RAM it supports. I'm sure the XP machine he's getting off of would blow a Pi4 out of the water utility-wise. I don't mind this article one bit. Thanks for sharing!
Hey, Congratulations on your upgrade to Windows 7!
Not 48 hours ago, I spoke highly of the benefits of Windows 7[1], as a user myself.
It is absolutely possible to use Windows 7 reasonably securely, if you take the appropriate precautions (again, see my comment).
> I can use the latest versions of Google Chrome
Firefox? Yes. Chrome? No.
Chromium 109 is the last version to support Windows 7. Here's the last working ungoogled variant [2].
Switch to LibreWolf[3], a Firefox-based browser with user.js modifications[3] pre-installed. Or if you don't trust LibreWolf, use Firefox and manually add the same user.js[4].
It's better for security than Chrome could ever hope to be.
Your web browser's JavaScript continues to be the predominant way for malware to make its entry. So just make sure to take the appropriate security precautions elsewhere, as mentioned in my comment [1].
> When Firefox stopped receiving upgrades I switched to Mypal browser, an open source browser specially made for Windows XP. It is cruder than Google Chrome but does the job most of the time.
Have you tried K-Meleon on Windows XP[5]? It's based on old Firefox (pre-Australis) and still gets updates.
-----------
Windows 7 is the last legit good Microsoft Operating System.
It truly is a wonder and a delight to use and modify.
Anyway, all the very best on your Windows 7 journey. May it serve you well.
I have a Windows 7 Professional system with my startup's software in some boxes and, since my office is now functional enough, am about to get that system powered up and running again.
So, here's a piece of information relevant to security and versions of Windows. I'd like to give exact and really clear references instead of working just from memory, but for such references just now I'd have to do a lot of digging.
From memory, I discovered that one of the security updates to Windows 7 Professional (as I recall, the last one) was the same as for a version of Windows Server.
So I guess: since that version of Windows Server was intended for some of the most serious business computing on the planet, that security update should be pretty good, maybe even good enough to still be pretty good now!
I've been intending to use Windows 7 Professional as the server for my startup, but ... I'm also considering some version of Windows Server. I do notice that the versions of Windows Server appear to go much longer, years longer, before "end of life" or "end of support" than the non-Server versions of Windows.
So, I'm considering getting a version of Windows Server 2019. I've done the shopping and am intending to buy.
Also, I'm thinking of using my computer with Windows 7 Professional as my general-purpose and software development computer (it is where I wrote the 100,000 lines of code for my startup) and plugging together a new computer as the server for the Web site, etc. That computer would run Windows Server 2019.
I would like to know more about Windows Server, e.g., version 2019, as in
Why not a fresh start with Windows 11? Not that I'm a fan of it or anything, but at least it's the latest and the greatest (in terms of Windows) and will be well supported moving forward, especially compared to ancient versions of Windows.
It’s my humble opinion that a setup like this is only “not hacked” from a combination of dumb luck and low value. Maybe that’s enough for most people. IDK
No disrespect, if I was still a windows user I would be on win7 too, however I can’t help but think about how naive this was with regard to security while I read it. I personally don’t know how XP could make a person more productive, but I really can’t understand throwing all caution to the wind and doing mental gymnastics to rationalize it.
I admit it’s possible that being on XP has some obscurity benefit, just like if you only used DOS. Having a firewall is de rigueur as noted in TFA, but far from foolproof. No single layer of security is airtight.
Probably the OP has heard of defense in depth, maybe not. Security of any system is porous. It's like layers of Swiss cheese, and so long as all the holes don't line up, you don't get owned. Given that Windows exploits often affect versions all the way back to 3.11, XP might be more holes than cheese.
For quite a long while I feel like my operating system has been Chrome and VS Code, maybe a few other things. I work on both Windows and mac, and forget for extended periods of time what computer I'm even using.
That’s a fair point, and to some extent I think it’s true - especially in modern networked environments.
Short of booting directly into the browser, you’ll still need some minimal UI, at least for switching between VS Code and your browser of choice. That’s the part where the modern OS should not get in the way, but often does.
OP spent next to no time exploring Linux, which is a shame. Running modern Debian on a Core 2 Duo is barely a better option than running a modern version of Windows. Alpine would likely run far better.
I maintain that XP had the best UI of all Windows, but you (or at least I) can't stick with it; I need the latest updates in order to use the latest software and get my work done.
0patch is an additional tool that is worth checking out. Their engineers reverse engineer Windows patches and backport certain ones to out-of-support OSes.
The author keeps talking about 0-days but fails to mention the myriad of known kernel-level vulnerabilities that are simply left unpatched in older OSes.
You should also quote the rest of the paragraph then:
> Maybe I have been hacked but simply have not noticed. That is also a possibility. But I store some crypto currency on my computer and I make crypto transactions from time to time. If my computer had been compromised by an invisible attacker I figure that attacker would have had an interest in crypto assets.