Hacker News

This is not a sensible or technically sound thing to do. It is purely irrational. Win7 is already stuck on Chrome 109, the last version to support it.

You are not gaining much by doing this, and are assuming significant risk. Even if specific portions can be quantified away, it's the unknown unknowns that should scare you.

Try a Linux distro again maybe? With some non-default themes and Wine, it really should make a better WinXP than WinXP.

Or at least use something like https://github.com/kirb/LegacyUpdate or the pirated ISOs that get updates from the Windows Embedded POSReady channel -- Win7 POSReady is supported until late 2024, and XP was until 2019.

There are safe ways to do this, but I worry the author may not be aware of them.




The author said that their experience with Linux was slow (Debian and Lubuntu were examples) and that they had no intention of using Linux because of these experiences.

Personally, I don't know what they're talking about; I've always had a smoother/faster (albeit a bit clunky/glitchy sometimes) experience with Linux than with Windows. Even when I dual-booted Win7 and Ubuntu, Linux was always faster. Currently, I'm using XFCE w/ Arch on a 6-year-old laptop. Programs open very fast, I get regular software updates, and I've been running it for work for four years.


I would bet on GPU driver issues. Some cards without the right driver can force CPU rendering, making everything extremely slow.


Just saw this one on my ancient ThinkPad. Mesa 21.3.5 is fine; Mesa 23.0.1 fails to pick up the graphics card and drops to llvmpipe software rendering.

It has to be said that llvmpipe seems usable even with Firefox, but not as good as reverting to mesa-21.3.5 so accelerated graphics actually works.
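For anyone hitting the same thing, a quick way to check which renderer Mesa actually picked (a sketch; `glxinfo` usually comes from the mesa-utils package):

```shell
# Print the active OpenGL renderer; "llvmpipe" means Mesa fell back
# to software rendering instead of using the GPU driver
glxinfo | grep "OpenGL renderer"

# Also check on the kernel side whether a DRM driver bound to the card
lspci -k | grep -A 3 VGA
```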

(17 years is a good run and I have my eye on a nice clean X201)


Maybe the OP just needed to enable e.g. Nouveau with the appropriate kernel module parameters?
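(A sketch of what I mean, assuming a GRUB-based distro; the exact parameter and value depend on the card, so treat this as illustrative only:)

```shell
# Illustrative only: force nouveau kernel modesetting on at boot.
# Add the parameter to the kernel command line in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet nouveau.modeset=1"
# then regenerate the bootloader config and reboot:
sudo update-grub

# Or set it as a module option instead, via modprobe.d:
echo "options nouveau modeset=1" | sudo tee /etc/modprobe.d/nouveau.conf
```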

I am regularly using a 14 year-old MacBook on plain X and it is surprisingly fast.


> just needed to enable e.g. Nouveau with the appropriate kernel module parameters?

And that's why Linux will stay on the server for the majority of people, not on their desktop.

"You just need to enable something with some parameters and you're all set!"

Seriously?


Do we need this comment parroted every single time someone mentions Linux on this forum? I get it, it's too complicated for you, can we move on instead of rehashing this point over and over again?

In the past week we have had two major Linux discussions, and in each thread a semi flame war starts because someone had to voice yet another "this is why Linux is so hard for my nan to use", as if it were a novel or constructive observation. It's not.


> Do we need this comment parroted every single time someone mentions Linux on this forum? I get it, it's too complicated for you, can we move on instead of rehashing this point over and over again?

Yes, if it is a reply to someone who "tried Linux and it was slow, so didn't use it". If system configuration debugging were to their taste, they would have done it. So it's the "just configure parameters <x>" that is the pointless rehashing.


>I get it, it's too complicated for you

If it is too complicated for a HN reader it is going to be too complicated for 99% of people in the world. Get off your high horse.


I agree. People tend to forget that a whole generation of computer users was breast-fed by Microsoft well into their 30s.


I mean, we're talking about a fourteen-year-old computer. With a couple of command-line options you can be supported on a modern, updated OS, which is more than you can say for any other major OS.


Once I tried to use a newish Nvidia GPU on an old CPU. It was the pandemic and parts were hard to come by.

The minute Windows Update automatically loaded the Nvidia driver in the background, the screen would go fully black, with no way around it.

Searching online, the combination was unsupported by Nvidia and there was nothing you could do.

Linux with nouveau worked flawlessly. No kernel parameter, just boot using a live usb and everything works.

Yes, sometimes you can hit a weird hardware issue and need to revert a firmware blob or add a kernel parameter to disable something. These are rare occurrences, not the norm.

Every platform has its quirks. It's just not true that Windows or macOS is perfect; you are simply used to all their weird quirks.


There's a good chance that on Windows 10 you would get a laggy 800x600 desktop after install with such an old config, which is worse.


The guy was willing to hopscotch browsers when it was required.

Seriously.


Amen. I bought a printer and trying to make it work with Ubuntu is just hell.

Ubuntu is great, but it would be nice if there was some Ubuntu+ addon subscription service where I could pay to make the bullshit go away. I've got other stuff to do than still trying to get peripherals to work, in 2023.


The problem with Linux is that people buy random hardware without any research and expect it to support Linux. Try installing macOS on a random laptop or Windows on an M1 Mac and you get the same result.

There should be an official compatibility list for laptops of all price ranges, and whoever buys something not on the list needs to deal with the issues themselves.


The problem with this argument is that the exact same crowd of people is saying

"just use linux instead of windows!"

at the same time as

"you can't expect linux to work on everything! do your research!"

So which one is it? Is Linux an OS that you can just replace Windows with straight away, or is it not? Most people don't care about the underlying reasoning why their printer doesn't work on linux - they just know it would have worked on windows fine.


Are you sure it's the exact same crowd? There are many Linux users who are quite fine with suggesting it as an option, while proposing a more rational approach to transitioning. Heck, part of the reason for live media is to ensure that everything works before taking the dive.


We're talking about a 21-year-old OS as the alternative. Back then, Windows would also require significant manual configuration to get it running. Windows may have become more "automagic" these days, but at least in the Windows XP era a common part of installing Windows was going to the local library to look at guides and manuals for setting up less common hardware.

Even something as simple as setting up a sound card with recent drivers required going to weird, slow-loading Taiwanese websites with no English text to get the current drivers right from the manufacturer of the sound chip.

Even worse, with XP you had no GPU accelerated desktop at all. Sure, lower latency, but the PC noticeably struggled even moving windows if something happened in the background.


I never once went to the library or struggled to install XP, nor have I ever encountered anyone who did. It quite literally pretty much just worked. Maybe it comes down to hardware choice?


Then you pretty much only saw the tail end of XP, at the beginning it was very much a struggle. Especially with old network adapters and SCSI devices being problematic, as well as many old 9x drivers not running on XP anymore (after all, XP was the first consumer Windows on NT).


I was using XP prior to public release, and used every version of it, including Media Center edition. Like I said, and like you stated in a very roundabout way whilst dismissing what I had to say, it comes down to choice of hardware. For the average Joe, there weren't problems, they were building machines with current-gen hardware and eschewing yesteryear hardware too. Anybody who had problems just bought new hardware, they didn't muck around at the library. Perhaps libraries where you are provide better information, but libraries in the UK at the time were pretty much the last place you would go for technical documentation.


> Anybody who had problems just bought new hardware

And if you do so, Linux works just fine as well. But you were comparing Windows XP to people trying to install linux on old, specialty hardware bought for use with Windows, so we’ll have to keep the same circumstances as well.


Yes, seriously.

I know how to enable something with parameters.

if others don't - that's fine. They can learn or they can deal with other operating systems.

I really don't care.


> I really don't care.

I don't think that this sort of attitude will result in an environment that's encouraging for more people to use Linux distros. A better approach might be going straight into suggesting whatever information helped you in the past.

For example, the Arch Wiki typically has useful information on many topics: https://wiki.archlinux.org/title/Nouveau

When dealing with other distros, there can be more specific sites too, like: https://askubuntu.com/questions/1032357/how-to-switch-from-n...

Not all of those are always up to date, though, so some digging around might be needed. Many will just give up or not even try, if they're faced with a dismissive attitude. Dialogue around what is the most helpful, accurate and up to date guide would be better!


> I don't think that this sort of attitude will result in an environment that's encouraging for more people to use Linux distros.

The Linux crowd seems not to have made up its mind about what it wants to reach as a product for the end user. Or even whether it wants end users at all, rather than another cool kid to hang out with on IRC and dig inside the OS. Some say: I want Linux to be used by everyone! Others say: works for me and I don't care about the rest [of the losers who cannot read hex dumps, ha ha ha]. Somehow the success of Chromebooks is being ignored and not learnt from.

Thus, without a clear goal, mission, product vision, and a focused team of product managers, it remains amorphous [as I see it]. There is definitely something to learn from the WSL project made by Microsoft: they found the need, they did it. You may even see it as a cathedral vs. bazaar issue.

The only distantly focused effort I'm aware of here is Canonical/Ubuntu (I'm not sure about RH/SUSE efforts): they are working on MDM with Intune, they have at least some telemetry (so they're not totally blind to real user cases), they have a Pro edition, and they even cooperated to be the first WSL distro, which naturally pays back in brand awareness, common approaches, and so on.

Make your mind, Linux.


Yes, seriously.

You can either have a system where somebody else is deciding on your behalf how your system is going to be configured, or you can have a system where it's up to you to opt-in to the stuff you want. You can't have both.


I think he's talking about perceived UI speed. Things like moving the mouse, starting an application or dragging a window respond fast on Windows because of how the system is set up (mouse cursor practically has top priority and graphics are hardware accelerated). This is especially noticeable on slower hardware and can cause a machine to feel 'snappier'.


You should try, say, Debian 12 on an old computer with encrypted disks. It's unusable.


How old is old? I use Debian on an 11-year-old cheap laptop, and it runs fine. No GNOME, of course (it's the performance equivalent of Windows on pretty much any hardware).

The laptop was something like $300 back in 2012.


Without GNOME and disk encryption, of course you can run Debian on a toaster.


Encrypted disks would be the key issue here; older CPUs likely can't handle the encryption work required.
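A quick way to check whether that's actually the bottleneck (a sketch, assuming a LUKS setup with `cryptsetup` installed):

```shell
# Does the CPU have hardware AES (the "aes" flag in /proc/cpuinfo)?
# Without it, full-disk encryption falls back to much slower software AES.
grep -m1 -o '\baes\b' /proc/cpuinfo || echo "no AES-NI: expect slow encryption"

# Measure actual cipher throughput for the defaults LUKS would use:
cryptsetup benchmark
```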


> This is not a sensible or technically sound thing to do. It is purely irrational.

It is not purely irrational; in fact, the article spells out exactly why the author does this: as a user, he finds the Windows XP interface much better than that of any other OS.

I can't say I fully agree personally, but the core of the problem, and really the reason I don't see any of this as irrational, is that people get used to the interfaces of the OSes and programs they use, and updates pretty much always tend to come with changes to those interfaces that are seen as degradations.

Though I do disagree with his assessment of Linux: while perhaps whatever he tried was slower than Windows XP (I remember reading that Lubuntu doesn't focus on low-end systems anymore), Linux can be much faster. And it is also the only mainstream OS where you can have the latest and greatest underlying kernel and libraries without being forced to change your desktop interface (though depending on your choices, some DEs can take more work than others).


Don't MATE or other FOSS DEs try to emulate an XP-like feel, but a bit more modernized?


There are definitely options there. IceWM for example is very much like old windows, and is blazingly fast.


From the mainstream stuff, XFCE might be close after some configuration (especially if you install some of the more Windows-like themes) but MATE/GNOME2 isn't really that XP-like. Trinity (KDE 3.x fork) would be closer though.

As mentioned in another reply IceWM would also be very close - at least compared to classic Windows - but that only gives you the taskbar and window themes.


posting about it is kind of irrational


Complaining about posting about it is irrational too


who is complaining and why is that irrational?


This is a perfect example of modern-day superstitions, in no way different from the superstitions of stereotypical hairy cavemen sitting around the fire. You react to a program release that is a mere three months old, with a perfectly capable layout engine that has gathered an enormous amount of functionality over who knows how many man-centuries across itself and its library dependencies, as if you were told to drive an ox-cart. The talk about “safety” has long become a creed repeated thoughtlessly again and again. There's an evident manipulation in making people fear hackers and viruses so much, when those who make the most money on controlling users' systems are in fact the ones who offer “better safety”.

Updates for the sake of updates is not a passive, neutral status quo. It benefits certain participants (both the ones on top of the power hierarchy capable of rotating the hamster wheel faster and faster, and the ones on the bottom, who, in accordance with this or that fashion, slap together something that is not supposed to work at all unless you update everything) while others pay the price. And it's the path of least resistance that people choose.

In the end, it's just marketing. If you can add a couple of commas here and there to make the whole thing crumble, which allows you to announce “newer better version with support for new technology, etc.”, it's way easier than explaining what you change, and why, to the inquiring public. The IT public, as we can see, is pathetically neutered, and acts like careless kids in the free-for-all amusement park.

Also, the author can totally get Linux with the leanest DE he can still stomach, and have all the fresh software. Core 2 Duo with 4 GB of memory is still a decent system, especially with an SSD. However, his problem is NOT the software, his problem is that the websites he uses stop working.



I honestly have no idea whether you are supporting what was said with that link, or objecting to it.

Anyway, I was talking about being inquiring. When you look at that list, surely, some questions do rise. “Why should I worry about all of that to read three paragraphs of text on some webpage?” “Which actions have made it so?” “What should be done to fix it?” Right?


I am objecting to what you are saying. It's terrible advice and an obtuse vision of the complexity of software nowadays, especially web browsers.

Yes, a 3-month-old release of Chrome is not suited for day-to-day use. I linked all the known and published CVEs for Chrome, but if you want to nitpick you could "just" check those which start with 2023-*.


OK. Shouldn't we ask questions about real or perceived “complexity of software nowadays”?

Why does displaying a piece of information that would fit onto a single 80×25 text mode screen absolutely require exposing to a third party a potentially (and, as mentioned, effectively) vulnerable WebHID functionality (which is non-standard, and seemingly only exists to make ChromeOS a less mediocre operating system), various WebGL libraries and wrappers, ever-growing JavaScript and CSS engines, and thousands of other entities? Someone who grants the whole internet access to local service ports by not using a firewall is considered a fool, but at the same time “non-foolish”, “security conscious” people start their browsers and see no problem in all the services embedded in them.

Isn't relying on a constant (and never ending) stream of updates from white knights in the holy castle in the manner you describe just a subscription model without a defined price?

Who controls the Web? Is controlling the web client enough for that? What benefits the endless rat race might give to them?

How come there's a hidden dependence on corporate products and their support cycles even in the process of using seemingly “open” technologies, say, for government sites and services? Is “I have no idea, my code absolutely requires latest libraries” a valid excuse?

Can mindless acceptance and circular finger-pointing between web developers, library authors, browser developers, and users solve these problems? What needs to be done?


I feel like the author and everyone else in this thread probably uses computers differently


> It is purely irrational

My assumption is they like the UX of Windows XP better than later versions of Windows. I sympathize, because I think that flat software UI design has made computers less fun to use, but I just use theming software at the expense of the stability of my explorer.exe


"We're going to write security updates for this OS but we won't give them to you" is such psychopathic behavior.


Writing security updates costs money, and as the software gets older, the cost generally goes up too. Is it wrong for them to decide, after a reasonable period, to stop spending that money on outdated versions? What happens, then, when a small subset of customers says “we need those updates so much, we’ll pay extra for them”? Is it wrong to take their money and spend it on writing the updates they request, with a reasonable margin on top? But once you are selling updates to that small subset of customers, if you start giving the very same updates away for free to everyone else, those customers are going to ask for their money back. I don’t see any “psychopathy” here, just rational business decisions.

This doesn’t just happen with operating systems. Many vendors of on-premise proprietary software do the same: databases, middleware, ERP suites, etc. Most of them have a special program for customers who want to stay on really old versions, in which they pay extra $$$ for updates compared to customers willing to move to a more current one. If you accept proprietary software as ethical, I can’t see how this practice can reasonably be accused of being unethical.


The problem with the newer options is that they are not paid strictly with money anymore. Windows 10/11 use a ton of telemetry you can't turn off.

I use them anyway but only for gaming to limit my exposure.

> Many vendors of on-promise proprietary software do the same-databases, middleware, ERP suites, etc-most of them have a special program for customers who want to stay on really old versions

You can't really compare enterprise class software with something that is sold directly to consumers.


> The problem with the newer options is that they are not paid strictly with money anymore. Windows 10/11 use a ton of telemetry you can't turn off.

Enterprise, Server and Education licenses support "Diagnostic Data Off" telemetry level. I think they'll sell Server to anyone, and even though Enterprise is only sold to "businesses", it doesn't cost much to set up a corporation, and now you are a business customer.

Not saying I'm a fan of Microsoft's telemetry policies, but it's not like people are forced to buy Windows; especially nowadays, there are other options: macOS, Linux, Chromebooks. I don't have to worry about telemetry on my Linux box, and Apple is a much more privacy-friendly company than Microsoft. If people don't like Microsoft's policies, they can vote with their feet.

> You can't really compare enterprise class software with something that is sold directly to consumers.

Microsoft no longer supports Windows 7 for consumers, only for certain enterprise use cases. So in the context in which it is still supported, it is enterprise class software, and other enterprise class software is a valid comparison.


> Apple is a much more privacy-friendly company than Microsoft.

I'm not in agreement here. Apple likes to paint themselves as more privacy-friendly, yes. Whether they actually are is another story.

They have much better PR, which keeps them from doing stupid skullduggery like the unwanted news, "special offers", and "payment plans" in Edge. But a Mac calls home a lot even when you turn telemetry "off".

I'll try the Enterprise version (I have MSDN), but last time I checked it I am pretty sure it gave the same options.


> I don’t see any “psychopathy” here, just rational business decisions.

That's just a prettier word for the same thing.

And bringing up Oracle as your reference for ethical behaviour is certainly... a take that doesn't really say what you think it says.


I used to work for Oracle. I'm still friends with people who I worked with at Oracle. Many of my current colleagues used to work for Oracle. My current manager, and his manager too, used to work for Oracle.

Did we always agree with the decisions of Oracle executive management? Nope. Of course, so long as you work there, you can't say that publicly. But, even though there's quite a few things that Oracle did while I was there that I strongly disagreed with – I can't see the problem with charging extra for supporting really old versions. It is just common sense, and it is part of the package the customer signed up to at the start – the collateral given to the customer as part of every deal explained it. Many other vendors do it too. A consumer might legitimately claim they didn't understand what they'd signed up to, but that's not believable when a billion dollar company signs a million dollar deal.

Oracle has over 100,000 employees today. I have no idea how many ex-employees there are – I've met so many over the years, we are everywhere – but I guess it must be well over 1 million by now.


You are deliberately putting in extra effort to make your own product less secure. Often with consequences that will primarily be felt by people far down the impact chain, who probably didn't even know that the choice was being made.


The extended update programs are/were only available to corporate users as a pay-to-subsidize-availability program that costs more per year than the original OS license. While you may disagree about whether that's good, jumping straight to labeling it psychopathic behavior seems more like an argument that came from a conclusion than the other way around.


0patch is free for personal use, going back to Windows 7.

https://0patch.com/pricing.html


A lot about modern proprietary OSes is psychopathic lol


I think a lot of companies agree with you, more or less. Well, companies don't have personalities or psychologies, so psychopathy is not really possible. But charging for security updates is too much of a reputation risk.

Of course, because they are profit-driven entities and can't charge for security patches, they just stop writing them.


Companies are not unaccountable entities. They're headed by people who make and carry out these decisions. And withholding security updates as a means of planned obsolescence is absolutely a consciously directed strategy. In many ways companies act as too much of a shield for bad behavior. When a "company" does something awful, it's not the company but the executive leadership doing it.


s/security updates for this OS/software/


Linux Mint + a non-networked VM with WinXP.



