Not surprised though, with AMD being able to patch the Linux kernel and scheduler to its liking in order to squeeze out the maximum performance for its architecture, versus depending on Microsoft to have the good will to figure it out on their own dime.
I'd be curious whether the businesses buying these $20k+, 96-core HEDT workstations actually run Windows on them, making this a big enough problem for Microsoft to bother addressing, or if it's Linux all the way for them anyway so Windows isn't even on the radar.
Anyone here in the know?
Also, obligatory: "But can it run Crysis?" (in software render)
I've worked at companies where we buy these types of workstations for FEA/CFD/ML.
If the company has an overbearing IT department, we run Windows for "security". At places where we move fast and break things, we run Ubuntu and IT tells us to manage it ourselves.
For anyone wondering why we don't use the cloud or a server: we do, too. But model setup, licensing, and small jobs are easier and quicker to do locally.
> "...depending on Microsoft to have the good will to figure it out on their own dime."
They don't have to figure it out on their own; everybody maintains close contact. For example, Intel has a field office literally one block away from Microsoft's main campus.
One would assume AMD also has a field office similarly nearby. (EDIT: they do, a couple of blocks further.)
Funny you mention this, because a lot of big visual arts studios use Macs for the artwork in the Adobe ecosystem and are heavily invested in Linux for the 3D rendering pipelines as they moved away from SGI/BSD, with Windows almost nowhere on the radar.
I'd estimate the big-name shops running Windows are a minority, with only smaller indie studios like Corridor Digital being all-in on Windows because it's a jack-of-all-trades, can-run-anything OS, and managing that one ecosystem without sysadmins and an internal IT department is a lot cheaper and easier for small businesses of amateur-professionals.
The majority of professional movie and television VFX work is now done on Windows and Linux computers, with Linux especially being used for the rendering portion of the process.
Mac was the dominant platform for a long time, but Windows caught up and zoomed past Mac about a decade ago. Same or better performance, cheaper hardware, cheaper software, easier to upgrade/repair, and more choice for all of the above.
There's a reason that Apple has to pay big bucks to Olivia Rodrigo and others to use Apple products to film and edit their music videos: it's because Apple fell out of favor with creatives and Apple is trying to buy its way back in.
Yup - A while back (not sure if changed much since), Weta Digital people would talk about being pretty much mostly Linux for modelling/rendering/etc and heavily Mac for audio. Very little Windows.
The improvements coming to Wayland's HDR/color management are likely to help with that, as the features they're aiming for appear to beat Windows's more slapdash implementation: per-window color management, where window contents are accurately tonemapped and composited within the widest color space the monitor supports.
Adobe would need to be incentivized to port their suite over for it to be taken seriously, but maybe Wine could bridge that gap at first.
Will Wayland be able to render mixed HDR and SDR content correctly, with e.g. an HDR video on YouTube rendering with extended range while the rest of the screen renders as normal?
Currently only macOS can do that. With Windows you have to choose between SDR and HDR display modes which affects everything on screen regardless of type, which makes SDR content look dingy in HDR mode.
On that subject, anyone know why shifting to HDR dims everything that way? My mental model of it is that SDR brightness goes from 0 to 100, and HDR brightness goes from -100 to 100, and that turning on HDR moves everything not HDR-aware down to the bottom of the brightness space.
I could look this up, but never think about it outside of conversations like this and figure it might be more fun to talk about it.
Well, it's pretty arbitrary. What is missing from most image (or sound) data is metadata to say what physical intensity is represented by the signal. I.e. how many nits (or decibels) should be emitted for this signal.
AFAIK, most encoding standards only define a relative interpretation of the values within one data set. And even if standards did have a way to pin absolute physical ranges, many applications and content producers would likely ignore this and do their own normalization anyway, based on preconceptions about typical consumers or consumer equipment.
To have a "correct" mixing, you would need all content to be labeled with its intended normalization, so that you can apply the right gain to place each source into the output. And of course there might be a need for user policy to adjust the mix. I think an HDR compositor ought to have gain adjustments per stream, just like the audio mixer layer.
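A toy sketch of that idea in Python. Nothing here is any real compositor's API, and the reference-white figures are illustrative, not from any spec:

```python
# Toy model of per-stream gain in an HDR compositor (hypothetical, not a real API).
# Each stream's relative signal is scaled by its labeled reference white
# (the normalization metadata discussed above) and a per-stream user gain,
# then clamped to the display's peak brightness.

DISPLAY_PEAK_NITS = 1000.0

def to_nits(signal, reference_white_nits, gain=1.0):
    """Map a relative signal in [0.0, 1.0] to absolute output nits."""
    return min(signal * reference_white_nits * gain, DISPLAY_PEAK_NITS)

# An SDR stream labeled with a 203-nit reference white next to an HDR
# stream mastered to 1000 nits; gain lets user policy re-balance the mix.
sdr_white = to_nits(1.0, 203.0)              # 203.0 nits
hdr_peak = to_nits(1.0, 1000.0)              # 1000.0 nits
hdr_dimmed = to_nits(1.0, 1000.0, gain=0.5)  # 500.0 nits
```

The point is only that correct mixing needs a reference-white label per stream; without it there is no principled value to put in the second argument, which is exactly the missing-metadata problem described above.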
In single mode where it’s just SDR, it’s mapped to take the full range of the display up to whatever is deemed a comfortable cap.
In a mixed HDR/SDR mode, the H is the range above the S, so it doesn't make sense to scale it up. I prefer Apple's terminology of Extended Dynamic Range because it's clearer that it's range above the SDR range.
Now you could say that you intend for that SDR to be treated as HDR, but without extra information you don’t know what the scaling should be. Doing a naive linear scale will always look wrong.
Because Windows maps SDR content to the sRGB color space, which nobody except designers uses. Most monitors today ship with a much brighter, higher-contrast, vivid color profile by default. If you toggle your monitor to its sRGB profile, you should see colors that look very similar to SDR content in Windows HDR mode.
I don't like it either, but there is surely no way for Microsoft to choose a color profile that looks the same as with HDR toggled off, given how many monitor manufacturers there are in the market. I think choosing the safest option, sRGB, is understandable.
I have a monitor with 187% sRGB, 129% Adobe RGB, and 133% DCI-P3 gamut volume. But to get correct sRGB colors with maximum coverage on it, I need to clamp the ICC profile via novideo_srgb. Without it, sRGB content looks oversaturated in the orange spectrum.
It's more like SDR goes from 0 to 255 and HDR goes from 0 to 1024. In SDR mode, 255 = (say) 500 nits while in HDR mode 1024 = 1000 nits and thus 255 = 250 nits so SDR content looks dimmer.
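Spelled out in Python. The 500-nit and 1000-nit figures are the analogy's, not any particular display's:

```python
SDR_MAX_CODE = 255
HDR_MAX_CODE = 1024  # keeping the analogy's number; real 10-bit signals top out at 1023

def sdr_mode_nits(code):
    """In SDR mode the display maps code 255 to (say) 500 nits."""
    return code / SDR_MAX_CODE * 500.0

def hdr_mode_nits(code):
    """In HDR mode the same display maps the top code to 1000 nits,
    but un-rescaled SDR content still only reaches code 255."""
    return code / HDR_MAX_CODE * 1000.0

print(sdr_mode_nits(255))  # 500.0
print(hdr_mode_nits(255))  # ~249: the same SDR white renders dimmer
```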
SDR goes from 1-100. HDR goes from 0.01 to 100. Twice as many orders of magnitude difference from bottom to top. So if you peg the top to max brightness in both cases, the HDR looks brighter because the contrast is bigger.
(Note that this is an analogy. In other words, it's wrong, but a way of looking at it)
I am super excited for this, Wayland seems extremely promising. I'll probably try Fedora out again soon - my last experience with Wayland was GNOME and it was super nice.
> versus depending on Microsoft to have the good will to figure it out on their own dime.
Cutler stated in a recent interview that he had a 96-core machine as one of his two daily drivers. I wondered at the time if he pre-announced that CPU since the press about it seemed to reach me a few days later.
Epic Games runs mostly Windows Threadrippers for Unreal Engine development. Compiling Unreal faster, or anything really, even Windows itself is a compelling argument.
Back in the day I used Linux for rendering my Blender animations because my tests had shown it was 10% faster than the same machine running Windows 10. With render times of more than 8 days this quickly paid off.
I noticed that AMD/Xilinx Vivado synthesis runs 20% faster on WSL2 than on Windows. After seeing such differences using it on plain Windows becomes unpleasant.
Yeah, when I was doing more of that sort of thing I'd generally use ISE or Vivado on a big Ubuntu server and VNC or X-forward it to my machine (the X forwarding could be a bit janky with Vivado but it was OK for getting things going in a pinch).
In the context of Linux adoption on the desktop it is not big news, unfortunately. To my mind, overall end-user experience is already better in Linux compared to Windows where similar tasks are more complicated, even before the raw performance starts to matter.
Better performance would be a prerequisite indeed - no one wants to move a slower OS - but then there must be other convincing reasons to drive the desktop user adoption. And no roadblocks.
Edit: sorry, rephrased a lot, but the core idea is hopefully the same.
> And yet, users don't convert to Linux. So there must be other reasons, and performance benefits don't seem to work so far.
For enterprise desktop purposes:
One central reason is that there is no Excel for GNU/Linux. Accept it or not: the workflows of lots of departments are deeply intertwined with Excel (and it would take an insane amount of work to replace Excel by some other spreadsheet application). Another important reason is that GNU/Linux is not some singular operating system, but a proliferation of various different distributions - this is not something that enterprise customers like.
We run thousands of Linux desktop workstations from low-end workstations for developers to dual-socket machines with >1TB of memory at work. That's primarily because productivity for our use cases is so much higher with a Linux/Unix environment than Windows that it's not even remotely funny. However, almost all users still have either the standard corporate-issue Windows laptop, or a Windows VM on their workstation. Practically exclusively for Desktop Excel-based administrative tasks.
Generally speaking, the vast majority of our userbase likes their Linux workstations far better than their Windows machines, and that's after experiencing the significant downgrade to Gnome desktops due to Red Hat removing KDE.
As a dev, I wish I worked there. IBM was the only other place I worked where many devs, and even a few thousand ordinary users like marketing, ran Linux. Also, I wasn't aware Red Hat removed KDE; what an odd, crappy move. KDE rocks, Gnome not. RPM-based distros sooner or later corrupt themselves; I run and recommend Ubuntu or Kubuntu (for KDE people), with Snap disabled via a small apt preferences file.
I rarely, if ever, use Excel. But I find that Outlook in the browser wipes the floor with local Outlook, be it "OG" or "new" Outlook. Everything is much smoother, there are no window widgets crapping up (right now, the min / max / close and window title of my new outlook are black on a dark grey background - I'm using the windows dark theme). This holds even in Firefox on Linux – it's how I mostly interact with MS Office.
What's also fun to me is that the other day, I tried "opening in Word" a .docx attachment I received in my locally-installed "new" Outlook. It didn't even try to open my locally installed Word or ask me anything. It uploaded the file to OneDrive without asking and proceeded to launch Word online to read it.
The end user experience is way better on Windows than Linux. And I say that as someone who uses Linux (Ubuntu Gnome) daily. You really have to appreciate the freedom of choice that FOSS brings you to appreciate Linux. If you don't care, as most people do, you're probably better off with Windows.
Of course you can counter my reasoning by listing a lot of problems with Windows. And I can assure you that most people don't care about the stuff you (as a Linux user) would care about.
Windows just works. It has Office, Excel, Word, PowerPoint. It's familiar. Instructions for any kind of software contain Windows instructions. Friends and family can help you if you have problems. Etc.
(Windows is the only OS that can handle different fractional scaling factors properly for multiple displays. Fractional scaling on Linux is a joke, and macOS doesn't care about low DPI screens at all so everything looks terrible.)
I agreed with this for a long time, but in the past 4ish years I no longer agree. On Windows I regularly run into frustrating issues with audio, drivers, printers, random odd errors.
On linux (currently PoP OS, first on a System76, now a Dell) I have not been running into issues. The biggest issue I ran into was strange Wifi connection issues, which System76 had me submit logs for and identified it as a failed Wifi card. But my audio, drivers, printer, scanner and day to day tasks just work.
Yes there are features that are missing on one platform or another. But day to day is now much more pleasant for me on desktop linux as I run into far less frustrating issues.
I'll reinforce what the parent is saying: for most users, Windows just works. Although I don't like Macs too much, they also just work.
I'm a Linux fan, but I can get myself productive in a new Windows install with no grief. For Linux, I always need to go online to remember "that config change" I had to make to get things working for me - and even then, from time to time I still have to tweak and work around a problem or two as I add more stuff to my computer. What you just wrote as your biggest issue is actually a fairly common Linux experience - submit or search logs from some piece of hardware to diagnose some little glitch.
I'm willing to put in the time, but nobody else in my family is.
I'm not sure I agree that "most people don't care about the stuff you (as a Linux user) would care about". Some people put up with Windows problems because they don't know it could be different; to them, that's just how computers work. Take automatic updates, for example. More than once I've heard from a friend how they lost unsaved work or an overnight render because their PC rebooted for updates. On Linux, meanwhile, months-long uptimes are the norm, and you can trust it won't reboot on you out of the blue. Same with ads and other Windows annoyances.
Other people are bound to Windows by their software needs. You're right that many important programs are just not available on Linux (to your Office example I'd also add the Adobe suite). But the end user experience of Windows itself has been downright abysmal for a long time now, and for casual users who don't need much besides Chrome and a couple other Electron apps, modern Linux desktop might genuinely be the better choice.
I'd say the windows experience is maybe more familiar, but not better. As another commenter said, people are just used to computers being crappy the windows way. And even that familiarity may not be all that great with all the things moving around in windows 11.
As another commenter has said, window management is a shitshow, and I don't even mean "missing X feature I love from this other WM". Virtual desktops are broken. You have UAC windows sometimes going to the front, sometimes staying in the back. Sometimes they're at the front covering your old window but they don't have focus, even though the caret is blinking. You get windows maximizing behind the taskbar (Teams). Window management is handed-off to applications, so a broken application will poison the rest of the window management.
> Windows is the only OS that can handle different fractional scaling factors properly for multiple displays
I don't know what you mean by "properly", likely not the same thing as I do.
That it kinda sorta works as in "can turn it on"? But even basic OS things are borked. Have you tried opening the start menu after switching scaling factors?
My use case: I have a laptop running at 100%, and an external monitor running at 150%. If I boot it up this way, the start menu looks fine. If I plug or unplug the external screen while running, the start menu breaks. But here's the kicker, which shows the quality engineering: pressing the start button shows an OK menu. Start typing looking for something, and it becomes blurry! This is under windows 11 with all the latest updates.
Also, Wayland on Linux supports this, too. I don't use Wayland, but I understand that whatever problems there are, as in "not all apps are compatible" is the same problem with windows: the app has to cooperate. Try opening some configuration panels in Windows on a 200% screen, and you better have some good glasses on hand.
> The end user experience is way better on Windows than Linux.
Window management in Windows 10 is so terrible that it makes TWM look modern.
Cluttered title bars, so you need to aim for some free space to be able to move the window to another monitor; 1px window borders, so good luck resizing windows on HD and UHD monitors. Gray on gray. Terrible scrollbars.
And the best one: Programs keeping files open after you close them.
20% extra performance matters a lot to anybody buying a $5000 Threadripper machine. If they didn't care about performance, they could have saved a lot of money...
I have this suspicion that Windows has basically a layer of analytics tracking over the top of the DE that slows the system for most desktop uses, and that they essentially "tunnel" through that layer when a program actually needs the performance. Like gaming or exporting a video project.
This is why Windows is able to benchmark high for specific tasks, but for overall usage it feels very slow compared to every linux desktop.
That's good to know! Is there any formal proof of said theory? Like what is this analytics layer called, and is the tunneling something that devs are aware of? Just interested in putting some names to this theory so I don't sound like a loon when I try and describe it.
It doesn't actually work that way, but the net result is the same. Windows is a Trojan horse that manifests itself as an OS but is really just a delivery mechanism for crap from Microsoft. OneDrive. Cortana. Edge. It's all so invasive.
I hope the community really gets behind Snap or Flatpak or one of the other systems for bringing a modern permissions and privileges system to the Linux desktop. It would help me be a lot more comfortable recommending Linux to non-technical people.
What distro would you recommend to somebody like me who wants to be asked before an application gets access to my location, microphone, camera, network, etc...?
Flatpak... I'll warily try it for "user facing" apps.
But after a clusterfuck of trying to get a Flatpak Jellyfin to work (HW accel fell over, and it wanted me to dumpster-dive into Flatseal to unsandbox it so that media OUTSIDE OF HOME could be accessed... I never got around to even making it start at boot), I'll shy the hell away from it for daemons.
As someone who has been using Linux exclusively for my personal machines since 2007, here's the chicken-and-egg problem that the Linux world is constantly battling: new technology is always being released and pushed out before it actually works.
It's happened constantly since I started using Linux: KDE 4.0, PulseAudio, btrfs (to be fair, most distros didn't really push btrfs as a default), Wayland, etc, and now Flatpak and Snap.
And it's not to say that I don't understand the dilemma: you need to get user feedback to improve the tech, and the pool of Linux users is very small (for desktop stuff). But what that means is that desktop Linux is almost always in a state of partial brokenness. (And please also note that I'm giving a 100% pass on driver issues--that's a given when hardware manufacturers simply only care about working with Windows.)
Honestly, when it comes to "just working," desktop Linux is worse today than it was when I first installed it on my budget laptop in 2007. Wayland is mostly fine for my uses, but others still struggle with its missing features; snaps/flatpaks are slow, bloated, and have strange behaviors because of the permissions models that are NEVER going to be obvious to a casual computer user; and managing an Ubuntu install is still really bad UX.
As an example of the latter, I usually run an "expert" distro on my personal machines, but I decided to slap Ubuntu on a PC that I hooked up to my TV, and after a year or so of just chugging along and running updates whenever prompted, it started displaying errors that my boot partition was full. You've gotta be kidding me... THIS is supposed to be the "easy" distro for normal humans? How the hell do you expect non-tech people to micromanage the /boot partition and delete old kernels and shit?
So, while I agree with all of the technical issues and complaints about things like permissions, security, etc, that advocate for Wayland and Flatpaks/Snaps and whatever, you can't pretend to care about adoption while also pushing out half-baked technology to "stable" versions as though everyone will be happy with beta-at-best functionality.
I think part of it is because a lot of these more user-oriented features (desktop compositing, sandboxed app stores, hdr, etc.) don't get a lot of attention on Linux until after the two major commercial operating systems get them and prove people actually want them.
The monied interests driving a lot of Linux development do not have the same need for these things as end users. To them, Linux is a server, or a lightweight embedded OS, or needed for some other specialized use case.
I'm not sure what distros you have been using, but Linux Mint and Debian (if you set them up right) are extremely stable. Snaps have been slow, but I have never experienced that with Flatpaks; a reasonable critique is that they take a lot of disk space and too much time updating different versions of underlying tech like GTK or Qt.
Ubuntu has not been a good distro for a few years now. Even Fedora is more stable now and you get new software.
We never had as good Linux desktop situation as it is now. The biggest complaint is that if you want to use graphics optimized programs you kinda have to use Nvidia and that's still problematic on Linux.
My experience does not match your assertions. The Linux desktop has never been more accessible.
My mother-in-law has been happily running Manjaro with Gnome for a while. Now, truth be told, she was navigating Gnome 2 Ubuntu more than a decade ago, but things were mostly not working back then. Upgrading versions was a minefield with mostly terrible results.
Maybe the state of things suited you better somehow. I just can't imagine how.
> Honestly, when it comes to "just working," desktop Linux is worse today than it was when I first installed it on my budget laptop in 2007.
I bought a new Alienware PC in summer 2022, installed Ubuntu immediately, wiping Windows. Everything worked. Still does. This happens on every new Dell (or Alienware), HP or Lenovo desktop or laptop I've bought the past 2 decades. I can't buy a new machine that has issues with Linux, what on Earth could I be doing wrong that they all work.
I shouldn't feed the trolls, but I'll take the bait this time.
Read my comment again. In it I literally say that I run Linux exclusively on my personal machines. I don't have a Windows computer in my home and the only Mac I have that's still running macOS is a work-issued laptop. What are your Linux-never-Windows credentials? Because I bet they don't beat mine unless you're just significantly older than me and have been doing the same as me but for longer.
If one were to read my comment history, they could only conclude that I'm a huge Microsoft hater/skeptic.
Given that the ONLY mention of Microsoft or Windows in my comment was a statement that hardware manufacturers only test their drivers with Windows, this comment is just a strange emotional reaction to my point.
> Everything worked. Still does.
Read the rest of the thread about Flatpak and Snap. Read people's complaints about Wayland. Dig up some old discussions about PulseAudio and Xorg and how we used to have to hand-edit xorg.conf files to set up multiple monitors. If you insist that there are no problems, shortcomings, or missing functionality with Flatpaks, Snaps, and Wayland, you're deluding yourself.
I'm in that older+longer cohort so I'll comment...
I think you are talking about workflow-breaking changes or stability issues during transitions to newer Linux bits. You explicitly excluded hardware/driver issues that a lot of people gripe about with Linux on laptops. I think your experience here can vary dramatically depending on your choice of Linux distribution, your upgrade tempo, and your expectations.
I've had some frustrations with the UEFI transition and secure boot, because it forced me to do research when I just wanted my system to boot. I've similarly suffered some regressions with MATE recently, where it boots to a black screen and I had to dig around to find that switching to a text console and restarting lightdm would get it unstuck. I'm struggling to answer whether these are "driver" things or not.
But, I've been on Fedora for ~20 years now and am happily oblivious to Flatpak, Snap, or even Docker because they have nothing to do with my day to day experience. I also don't think I've dipped my toes into Wayland yet. I've been using XFCE or the MATE desktop rather than GNOME. But I do remember when ALSA was the new thing, then pulseaudio, and now pipewire. I mostly didn't care unless trying to setup some specific sound peripheral. I never had btrfs anywhere except an experimental system, because I always customized partitioning and never really considered the defaults to matter.
If I exclude hardware/driver issues, it seems like this same story applies to Windows. It all depends on what part of the ecosystem you consider to be part of your platform experience. Some people might have some favorite apps that essentially work the same as 25 years ago, while others have experienced multiple upheavals as third parties abandoned or hijacked an old favorite application or forced some kind of migration. It's only gotten worse with all the rent-seeking cloud integrations with everything.
The thing about Snap and Flatpak is that while Canonical is heavily pushing snaps, the community seems to be mostly behind Flatpak.
What this boils down to for regular use is that some big companies only offer a snap for their proprietary app while most open source stuff seems to be on flatpak. With significant overlap of course.
Ideally, we get the best of both worlds. The isolation technology that Flatpak uses (bubblewrap) isn't exclusive to Flatpaked apps, so someone could probably make a distro with isolation by-default if they were motivated enough.
> What distro would you recommend to somebody like me who wants to be asked before an application gets access
Use an immutable distro like Silverblue or Kinoite. Those are more-or-less Flatpak-only, extremely stable and fairly rigid with security.
> The isolation technology that Flatpak uses (bubblewrap) isn't exclusive to Flatpaked
I wish Flatpaked apps actually used the isolation (most don't, so http://flatkill.org/2020/ is still true today), but that isolation is mostly under the control of the packager, not you.
On this point, Deno ( https://docs.deno.com/runtime/manual/basics/permissions ) and pledge.com ( https://justine.lol/pledge ) are better; you have to explicitly allow-list what you want, instead of hoping that the potentially very untrustworthy and third-party packager had your best interests in mind and denylisting what you don't want.
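As an illustration of the allow-list style, this is roughly what a Deno invocation looks like (flags per the permissions docs linked above; the host, path, and script name are placeholders):

```shell
# Nothing is granted by default; each capability is opted into explicitly.
deno run --allow-net=api.example.com --allow-read=./data fetch_weather.ts
```

Omitting a flag means the corresponding access is denied at runtime, rather than depending on a packager's denylist.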
It would be nice to grant permissions in a fine grained way and not just have a blanket accept or reject. For example, if I download a weather app, chances are it will want location and internet access. If I don't want it to have my precise location I should be able to deny location but grant access to the internet.
Enough of the community is behind Flatpak that it's possible to run an immutable distro and get most of your apps as Flatpaks without much trouble.
Snap is terrible; there is no reason to consider it. Mostly because there is only one app store allowed and it's really slow to launch apps.
> I hope the community really gets behind Snap or Flatpak
NO. No. No. No Snap. No flatpak. No rolling release distribution nightmares either. Standard LTS for me (with snap and flatpak disabled with an apt preferences file).
A standard stable LTS system with the ability to update your gui apps is basically what flatpak is all about. I agree though that Canonical is trying too hard to move system packages over to snaps.
Snaps are a dumpster fire on Ubuntu. In particular Firefox.
Flatpak... I tried a dosbox flatpak and it was 3 gigs. Regular package? 3 megs.
I run Linux mint to avoid them as much as possible. All of the flatpaks pretend nobody has files somewhere other than the home directory, and softlinks don't work. Have a mdadm raid in /mnt? Can't see it.
So basic frequent use cases can't be supported by a "modern security and permission system". Then no thanks.
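For what it's worth, Flatpak does ship a stock CLI for granting access beyond the sandbox defaults; `flatpak override` is what Flatseal's toggles do under the hood (the app ID below is a placeholder):

```shell
# Grant an installed app read/write access to a mount outside $HOME.
# org.example.MediaServer is a placeholder; `flatpak list` shows real IDs.
flatpak override --user --filesystem=/mnt org.example.MediaServer

# Show what the override changed:
flatpak override --user --show org.example.MediaServer
```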
All communication about this with the developers of snap, for example, is dismissed with a high amount of condescension. Which is pretty typical whenever you have to deal with security: they impose the dogma, and you have to comply with it.
And the delay whenever the snap has to interact with the file system: there's like a five-second delay every time it happens, even when accessing an SSD-based file system. That is unacceptable.
Look, we do not have infinite Moore's law steppings, or gigahertz of serial speed, coming down the pipeline. Moore's law is running out. We can't be adopting some system that steals 50% of the practical performance of the user interface. Hard disk drives are going to stop getting bigger; the technologies we use for increasing disk size now already reduce the reliability of the drive.
So snaps and their ilk really are not something I look forward to. For now I just run distros like Linux Mint which don't use them.
> I tried a dosbox flatpak and it was 3 gigs. Regular package? 3 megs.
You are counting the runtime size. That is shared between all apps that use the same runtime. As I already had it installed for another app, the download size for the dosbox flatpak on my fedora machine was 1.6MB.
I also recommend installing flatseal to easily see and manage file access permissions of flatpak apps.
By default, it shoves it into some parallel file system within the snap.
Oh did you forget to actually explicitly save it into the downloads directory? Yeah, you have no idea where it is now.
Oh and of course when the download file explorer widget opens up, you can't get the mouse focus. Probably because there's some security boundary between Firefox and the file system.
Usability of Firefox vastly decreased as a result of snaps.
Finally, and this isn't a big deal to me: apparently a lot of the open source people don't like it because the Firefox snap is basically just a call out to the internet that pulls down a huge binary blob.
This is why I've been using Firefox downloaded directly from Mozilla for a while now instead of the broken one shipped with Ubuntu. I just need to manually update via the "Help -> About" every once in a while, but things look much better otherwise.
It's nearly a year now that I am using Ubuntu 22.04 (and that the Firefox from Mozilla has no problems) so I don't remember all the details. I also tried using the apt version of Firefox as others also suggested. But it eventually got automatically replaced with the snap version after an update.
Yes, it's stupid. Recent Ubuntu releases have added meta packages to apt that actually instruct snap to install instead. You have to first add the Mozilla Firefox PPA and then use apt pinning to say that you want the "firefox" package from this repo and not that one. But then it works forever after.
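Concretely, the pin is a small apt preferences file, something like this (assuming the Mozilla Team PPA has already been added; the 1001 priority is the conventional "prefer even over the installed version" choice):

```
# /etc/apt/preferences.d/mozilla-firefox
Package: firefox*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1001
```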
It's a well known fact that firefox startup time when packaged as snap is significantly worse. This should be less noticeable for actual use.
> On Firefox 119 I thought I was losing my mental faculties (even more) since I kept losing track of the cursor for a whole day. Turns out that it’s a Windows “feature” where you hide the cursor when you start typing into some input box. Maybe this is a Firefox-wide regression on Linux though. Difficult to find other examples of it on Linux via Google.
Firefox hides the cursor while typing, but it should unhide as soon as it's moved (at least it does for me)
> - More fun than a problem: `firefox --version` spews a warning: `update.go:85: cannot change mount namespace according to change mount ...`
> - My dear XCompose wasn’t respected because of course you have to copy it to inside some dang `/home/me/snap/common/utensils/computer-applicances/apps/Firefox` tree
Those should both be snap specific.
> - The keyboard just does not work sometimes. No other apps seem to have this problem. So it’s apparently not some idiotic “applications that use X work but under Wayland it doesn’t” (or vice versa). Does a restart help? No. Seemingly only a computer reboot.
I don't know about that one. Could be Ubuntu or snap, but it does not happen on Arch. Out of curiosity, are you running firefox through xwayland or native wayland? (It says so on the about:support page under "Window Protocol")
I hope for a future where we are not trapped between Windows, MacOS, and Linux. They are all very complicated and carry a lot of state. They're great for general-purpose computing, but it would be cool if you could use the same hardware in these GP computers with lightweight firmware that doesn't get in the way and can be tuned to a specific task better. A simple, responsive GUI that could run your specialty software etc. with direct hardware interaction.
> Simple, responsive GUI, could run your specialty software etc with direct hardware interaction.
Are a "simple and responsive" network protocol stack and file system, for example, also included in the FW? There is a reason why systems that used to work like that (game consoles) started to ship with a general purpose OS to handle such things some time ago.
Is there really performance to gain? I doubt a well configured OS is slowing the CPU down.
The alternative could be GPU-like devices: a PCI-E card with sockets for the chip and some RAM, running its own firmware. Intel tried that with its Xeon Phi line, and they opted to run Linux (called uOS) on the accelerator board, so apparently that was never an issue.
I've been playing around with Ubuntu, training myself for the day that I have to give up Windows 10. My only complaints are that Linux can be tedious (everything is just 43 commands away); it requires a greater understanding of not just what the system is doing, but how it is doing it; and the documentation is written like a Wikipedia article on advanced nuclear theory.
Windows and OSX have, for the most part, just worked, and it was easier for me to understand what was going on, and how to engage tasks, fix problems, and use the system. With Linux it can feel like I'm fighting with the system at times.
I'm enjoying the experience but it has not been without frustration. Perhaps I should have chosen another flavor?
No worries. I have installed Windows 11 lately and I find it tedious. It constantly wants to register me somewhere, asks me to set this or that as the default, and it is always updating something and asking me to reboot. And why can't I remove certain things from my taskbar? So, I guess, we both have our difficulties. Like always.
Windows 11 should be classified as a hate crime. Just the fact that I'm deceitfully "required" to create a Microsoft account just to use the system is an abomination. On some systems, I've seen that there is a way to bypass that account creation and just have a local account, but it was so hidden and deceitful that it was… emotional.
This isn't any different than being "required" to create an Apple account. I agree that it is a problem that both systems make you jump through hoops to use them without doing that, but it doesn't seem to elicit the same emotions that come from it being Microsoft.
They're both trillion dollar companies that only care about keeping you in their walled garden.
The same goes for the Microsoft account. That's the whole point. I have been happily running Windows without a Microsoft account for years. You can just skip the step during installation.
Actually, Windows 11 Home does require one. And I don't mean "You can just be offline" or "Just click use Domain Account", but actually "Without Internet and a Microsoft Account, you cannot continue".
Maybe someone found a script/custom installation image, but the previous posters point about it being required stands for that one SKU at least. (I hope that there's a EU version that one can acquire in the future that gets around that, but I don't know if the Account requirement falls under the current legislation.)
Indeed there isn't. I bought an OEM license for gaming and decided to unplug my ethernet just in case. Wasn't asked to create an account. Not sure if this is the only way to avoid it short of installing one of those cool corporate builds.
As both a Windows and Mac user, Windows seems way more aggressive about pushing this online account. I don't know, it's got that kind of hectoring tone: well, you've been stalling for 2 months and now it's time to have your login created.
> And why can't I remove certain things from my taskbar?
You actually can, it's just they don't expose most of it to the user. However people have built tools [1] that give you a pretty UI to manage those more complex configs and apply them to windows. Bonus points in that you can use all the old windows UI elements from previous versions of windows and mix and match them as you choose.
Honestly I couldn't tolerate Windows 11 without something like ExplorerPatcher.
Be careful. I used this for a while, but then Windows updated and ended up bricking my normal user account. I had to go into safe mode and figure out the incantations to uninstall and resolve it. All during a high-priority time for needing my computer, of course.
Yeah I'm not sure how long ago you used it but ExplorerPatcher nowadays is setup so that it disables itself if explorer crashes more than like twice in a row so that you can debug the issue. Normally it's just a matter of fetching the new symbols (which happens automatically) and occasionally updating. Afterwards you can relaunch EP... or you can just leave it disabled if you are busy.
I've been able to disable most of the things I do not like on win11 pro, so far, but if it weren't for how much I use Ableton I would be all in on Linux at this point.
If anybody knows anyone at Ableton, please tell them to do a Linux edition :P
I honestly don't understand how macOS and Linux are better with respect to OS updates. With Ubuntu I always have to update and reboot, almost weekly. And with macOS if there's an update, you can at least schedule it, but not without this big bright red notification badge that's always in your face and can't be dismissed.
>And why can't I remove certain things from my taskbar?
IDK, because the real question here is why can't YOU do it? Because that's definitely possible, anyone can do it with a few clicks from the GUI, there's nothing stopping you from doing that.
Pretty sure my 5 year old nephew can figure out how to get to Right click -> Taskbar setting, where stuff can be removed from the taskbar.
There are plenty of faults with Win11, but the fact that you haven't managed this is more an issue on your side ("I tried nothing and it doesn't work") rather than with the product in question.
Excuse the bluntness but I'm tired of seeing all the FUD spread on HN and I need to correct it.
How do I remove Teams or the stupid weather badge from the taskbar? I can remove Explorer and Edge with a right-click; just those two I cannot. It is inconsistent.
That it is not impossible with a few more clicks, and that I have to know where to find it, well, duh.
>How do i remove teams or the stupid weather badge from the task bar?
In the time you have written this rant/question here, you could have typed it instead into Google, Copilot or ChatGPT and gotten your answer already, instead of falsely complaining that "it's impossible to remove something" when it's not.
Yes, some things will always be unintuitive to the user that's new to any OS, but that's the issue with any other OS including MacOS and Linux distros when you're brand new to them and use it for the first time.
There's always gonna be a learning curve and you'll need to Google a bit to figure some stuff out in the beginning regardless of your OS of choice, but that's a long way from saying "it cannot be done" when the main issue is you can't be bothered to Google something basic.
Excuse the bluntness here in my comments, but how do people manage to get into highly paid technical careers while not being able to Google "how to remove X from Windows taskbar?".
Sometimes I think people complaining about Windows are just too lazy to look through the settings and search for the things they want to change. Windows has a lot of things to complain and be 'angry' about, but 90% of the issues I see people mentioning are completely solvable with a few clicks.
i really dislike this line of reasoning. Windows has a much larger installed base so finding support and solutions are pretty easy. Whether that is a video, a blog or a forum, you can always find someone with the same problem.
No, windows nagging you to register is not in the same ballpark as your audio not working or your video resolution changing with multiple monitors. No, windows updates adding or removing features are not equivalent to the problems linux has, many of which are showstoppers.
You're trying to be snarky and this is one reason why so many regular Joes are reluctant to adopt linux. you never know when you will need the terminal and when you will encounter an unhelpful response like this.
So serious. I find the linux community indeed very resourceful. And the ones for windows, after you have clicked thousands of banners away, that tell you the workaround for avoiding the registering after a long text of nothing, i find them resourceful too.
My point was, that it is always difficult to enter a new domain. So it is a shared pain.
My limited point here is that linux has a lot of problems that are showstoppers. You sometimes cannot do what you want to do.
The average person cares about usability first. Visually, Linux is excellent and very usable. The problem is, should things go wrong, the level of technical skill needed to use community resources is relatively high.
On a superficial level, maybe. But generally it’s still very unpolished, inconsistent and has poor UX design. Of course it’s a matter of taste.
Not sure why your comment is being downvoted though? I find it hard to imagine how anyone could disagree with it from the perspective of an “average” computer user.
> But generally it’s still very unpolished, inconsistent and has poor UX design.
Came here to say that. If you stick to the 2 or 3 desktop apps that they offer and do everything else in a terminal, then no worries. But the minute you try to get things done, beyond just the terminal - it comes apart.
>I find the linux community indeed very resourceful
I do too. As a “power user”, however, it feels that for people who don’t know how to, fear, or don’t want to use the command line and don’t know how to read documentation, accomplishing even basic stuff on Linux would be a struggle.
The old joke goes, though, that you have to prompt-engineer the Linux community to get a useful response, e.g., "Linux sucks because I can't do X" instead of "How do I do X?"
I don't disagree with you, but have you ever had to google a Windows issue in recent history? You get 90 links of 'Microsoft experts' that ask you to run "sfc /scannow" or "chkdsk". None of them having a clue what they are talking about.
I think it is usually easier to fix windows if you have 'audio not working' but windows 10+ has started to have a lot of non trivial issues and there are no simple google fixes. I have had serious problems as an IT worker with EVERY large windows update in the last 3 years.
On the other side of things, I have seen linux desktop issues that are a nightmare to figure out, including audio not working after a recent update.
The trick with linux isn't to learn the flavour, it's to learn linux - once you get an overview of how all the bits work then switching distro is (much) more straightforward.
Ubuntu is an excellent first choice because it's one of the major distros, so you'll be able to google stuff more easily.
My personal recommendation would be Fedora, and (if you want a GUI at least similar in approach to Windows 10) the Cinnamon desktop, which is excellent.
Where do I learn modern desktop Linux? I am using it on servers since 1993 and used for desktop 2004-2017 (and was very unhappy with it) so my desktop knowledge is severely outdated.
This is almost me. I used Linux on and off over the last 15 years. Mostly in servers, but I often installed a linux partition to try it out.
I always went back to Windows since Windows was much simpler, and I thought KDE and Gnome were similar anyway, so I didn't see any benefit in switching.
I permanently switched to Linux in August. I found out about window managers, and now I see a real benefit in using Linux over Windows. When I'm forced to use Windows at work, I try to emulate the functionality of a WM.
What helped me is support for most of my devices. Linux progressed a lot in the last few years.
Can you hotplug a GPU, or do you still need to (effectively) reboot? (I know it's only the window manager that needed a restart, but if all programs run under it, well.)
What do you mean? Switching between integrated and dedicated GPU? Or opening your case and directly removing or adding a GPU?
If its the first, I think there is support for this. There's NVIDIA optimus for NVIDIA for instance.
The second one, I never thought it could be a use case, even less that Windows would even support that. I always turn off my computer to do anything on my motherboard.
Oh yeah, didn't think of that one. Following what's written on the Arch wiki https://wiki.archlinux.org/title/External_GPU, Xorg doesn't support it and never will. Wayland seems to support it, but I don't know what that implies in use. The issues for KDE, Gnome and wlroots are all merged.
I'm not sure what there is to learn, if you have a good working knowledge of Linux servers. This is exactly why I prefer the terminal to any of the GUI components that may exist. Much more stable interface. As far as programs that I use daily, other than a few obscure VPN clients, I've rarely, if ever, needed anything that I couldn't find on AUR. For that reason and for reasons of having up-to-date software available, I like something in the Arch family for Desktop use. On the other hand, .deb packages are often available for more obscure software like those VPN clients...
> I'm not sure what there is to learn, if you have a good working knowledge of Linux servers.
Everything about the graphics and windowing stack, for starters. Why isn't my external display being detected after waking from sleep? Why is it always set to the wrong resolution at boot? Why doesn't VLC have any controls? Why is this simple Unity game suddenly running at 2 fps?
Thermal management, CPU governors, that sort of thing. Not really an issue on servers, occasionally misbehaving on laptops.
Occasional stuff around device connectivity - bluetooth speakers, NetworkManager, etc.
I would be really interested in what you were unhappy with. The only major thing that truly changed after 2017 was that Wine became almost universally compatible.
It was freakish how few issues I had in 2019 when I switched completely.
Ubuntu dist-upgrade breaks my machine so badly I need to spend days putting it together again.
Switched to Arch
Upgrade. Bluetooth breaks. MFC device breaks. One of the two almost always.
Compared to this the only thing I dislike with Windows upgrades is how it can wake up the machine even from S0 state (wtf!) to upgrade and reboot. But aside from that, it's a complete nonevent when it upgrades. It's been near six years and I never had an issue.
I had a client using an F5 VPN requiring MFA and there was no MFA support except on Windows. I got it working with an old Firefox running as root (because that supported old style extensions and I managed to find an old extension that worked). The only question regarding this setup was from the head of IT at this company asking but not receiving any responses on how to do this.
Weird enterprise wifi is always a headache.
I just got a GPD G1. Here's my experience getting it to work:
1. Plug in power.
2. Plug in the Thunderbolt cable
3. Install AMD Adrenaline driver only (only step that requires a tiny bit of expertise -- don't install all the stuff)
4. Reboot
5. It works.
6. If I unplug it, machine goes back to the nVidia GPU inside the laptop. No fuss.
Tell me true, is this going to work on Linux? I seriously doubt it.
2017 was in the early phase of a pretty disruptive era for Linux - e.g. systemd, audio, Wayland, app sandboxes, etc. I get the feeling things are settling now, and stabilising / consolidating into a better place overall with less churn. But for a while things weren't as stable as they typically were before that.
I tried wayland maybe a little over a year ago and the copy and paste behavior was bad enough to switch back. I recently switched to get freesync to work on multimonitor and the issue is now manageable.
App sandboxes I feel less positive about - I dislike them since I think they are only fixing stuff that is already unacceptable to happen.
As for audio - yes, while PulseAudio was in use I remember some issues with that (mostly related to S/PDIF). This has also lessened with PipeWire.
Qubes is cool, but Tails is for Tor and is meant to be a boot-from-live-cd or flash drive.
The whole point is that you want all instances of Tails to look the same on the network so they can't fingerprint you. Once you start making it a daily driver, picking up a lot of cookies, making it your own, etc. then the point of using it is moot.
The duplication of work is staggering. The best people at Mint are working on the same problems as those at Fedora and PopOS and Qubes and SUSE… an infinite loop.
Any comment on Arch one you get going? I find the documentation quite thorough for the niggly bits, and have used it to troubleshoot Ubuntu (taking into account differences between the distros).
I am an Arch enthusiast since I like building my system up and learning how things are put together, but it is not a terribly good platform for learning how to use or manage a Linux system. One of the issues is that you end up with a system that is tailored to your own needs. While that sounds great, you have to make an effort to generalize those skills since no one else will have a setup quite like yours.
Thank you. The documentation is really important for me, I guess. I feel more comfortable with documentation that carefully lays everything out; screenshots are gold.
Is there a book that you would recommend that would help a novice like myself learn Linux better?
IMO, the best way to get comfortable with Linux is to get comfortable with the command line, because although every distribution is going to have different UI and built-in apps, the command line is going to stay pretty consistent. Also, a lot of troubleshooting you Google is going to involve interacting with the command line, and it's essential to understand what the commands you're executing are actually doing.
I'd recommend The Linux Command Line by William Shotts to get started.
Was trying to configure a network bridge for a VM just the other day from the CLI. The guide (for Ubuntu, which I was also using) used nmcli (NetworkManager); tried it, and: command not found. Back to searching, and StackOverflow nudged me toward systemd networking, which didn't work either. Turns out my system was using Netplan. Three different systems to handle networking, really? OK, ChatGPT, convert this nmcli command to Netplan; sure, here you go, just put this in your Netplan config file and apply. Ended up with a botched network config on a headless system.
netplan(.io) is an abstraction layer on top of either NetworkManager (GUI installs) or systemd-networkd (servers/non-GUI) and is not really needed except as a convenience for Canonical's own designs for automated mass deployments especially linked to cloud-init. Under the hood it just converts its YAML configuration files into the syntax for the underlying actual network management tool.
For NetworkManager it'll write the config file to /run/NetworkManager/system-connections/ and for networkd to /run/systemd/network/ on EVERY boot since /run/ is a tmpfs (file-system in RAM).
For almost all servers, and most workstations, netplan is an unnecessary indirection since most hosts (including containers) have pretty static network configurations that only require writing once (to /etc/NetworkManager/system-connections/ or /etc/systemd/network/ ).
nmcli is the NetworkManager command-line tool. There is also nmtui for a text user interface. These are terminal alternatives to the GUI applets such as nm-applet (network-manager-gnome) or plasma-nm for KDE.
networkctl is the CLI interface to systemd-networkd. There is no widely used GUI interface to it (yet).
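To make the bridge case from the grandparent comment concrete: on a netplan-managed Ubuntu host, a VM bridge is declared in the netplan YAML rather than via nmcli. A minimal sketch (the interface name is a placeholder; adjust to your hardware):

```shell
# Netplan renders this YAML into systemd-networkd (or NetworkManager)
# config under /run/ on every boot, as described above.
sudo tee /etc/netplan/01-bridge.yaml <<'EOF'
network:
  version: 2
  renderer: networkd
  ethernets:
    enp3s0:            # placeholder NIC name -- check `ip link`
      dhcp4: false
  bridges:
    br0:
      interfaces: [enp3s0]
      dhcp4: true      # the bridge, not the NIC, gets the address
EOF

# 'try' rolls the change back automatically if you don't confirm,
# which is exactly what you want on a headless box.
sudo netplan try
```

On a host without netplan you would write the equivalent `.network`/`.netdev` files for systemd-networkd, or use `nmcli con add type bridge` under NetworkManager, which is where the three-systems confusion comes from.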
That's the exact experience I went through about a year ago trying to set up a bridged VM on a headless Ubuntu system. I mean right down to the sequence of nmcli, systemd, and Netplan, winding up with wiping it all away and just running Virtual Box on a way overpowered and mostly idle Win 10 system. Because I just wanted to run a VM connected to my local LAN.
Linux networking and DNS resolution, while working fine for the happy path, are a dumpster fire from a system management viewpoint. Especially if you want to do anything even mildly off-script. And I say this as a Linux user since before the kernel hit 1.0.
I don't know, maybe it's just a documentation problem. The accumulated junk of 50 years of obsolete documentation that you have to wade through to find out that the whiz-bang Linux distro you're using today is not the Linux which worked fine last year.
And that right there is why Linux is so frustrating to start with. It is easier to solve most problems via the command line; however, a normal user should never ever have to touch a command line. Everything for a common person should be easily accessible via a gui.
> Everything for a common person should be easily accessible via a gui.
They are. It's just that there are many different GUIs. Each distro by definition has different ones, and you can easily change them yourself, etc.
There are dozens of sound GUIs, hell, I've tried 4 redshift GUIs before settling on QRedshift. It's so easy to write software for Linux that you have countless opportunities. That's why the command line is easier - it always works(ish).
I've been using an immutable Fedora for quite a while now and it's amazing. I use toolbox and run everything in containers, including GUI apps, so not even using Flatpak which I'm not a big fan of. I think I have like 3 or 4 packages installed on top of the base image. The peace of mind while upgrading can't be overstated. Highly recommended.
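For anyone curious what that workflow looks like in practice, a rough sketch (the container name is arbitrary):

```shell
# Create a mutable Fedora container that shares your $HOME, then enter it
toolbox create dev
toolbox enter dev

# Inside the toolbox, dnf works as usual -- nothing touches the
# immutable host image
sudo dnf install gcc make

# GUI apps launched from inside the container can reach the host's
# Wayland/X11 socket, so they show up on the desktop like native apps
```

The host stays a small, atomic base image, which is why upgrades are so low-stress: everything you've layered on lives in throwaway containers.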
Just installed Onyx yesterday on my work machine! So far so good, some niggles as always (it's my first time using Budgie), but overall fairly usable and easy.
Same here but the other way. I find Linux much simpler to grasp than endless hard-to-navigate menus, windows where each one is from a different land, the registry, the shell, and what not. I rarely use Windows, but when I do, it feels like MS either ruined it or I simply forgot how to do things. I used to think like you, so I guess it's just a matter of taking the leap and learning a new OS.
Bonus points: no ads, no Edge or Drive bullshit, and nearly any small quirk that annoys me can be gotten rid of (which admittedly can take some effort, BUT it can be done).
Example:
- Linux: just type the command vetted by hundreds of people on StackOverflow to edit the config file
- Windows: open app, click this and that... Where is it? Ohhh, outside the screen, because it uses a super duper stylish gaming UI. Let me change system-wide UI scaling just to see the thing I want to toggle. No, there was no way to just drag that window.
If you're having to use the terminal every day for anything other than your own work, something has also gone wrong (actually, even then it should be optional, but I find it easier to run make/gcc/etc. directly vs using an IDE), so I think it's a fair comparison.
> I include installing a fresh copy of Windows to resolve the inevitable corrosion that Windows has over time.
Remember that one from the past and countless times I had to do this for others because Windows would become a mess.
Nowadays I just upgrade old Linux distros that have been running for 5+ years, because of security concerns or out of boredom. Magically, they still run the same as a fresh install.
I've been a Linux power user since the late 90s, and I can tell you that if you're a Windows or Mac power user with zero Linux experience, you're not going to have a good time in Linux.
Linux has come amazingly far, but I still wouldn't recommend it for everyone.
It's all about context. You're a power user so you're going to want to do advanced things, which requires total immersion in a completely different ecosystem.
But my 80 year old father on the other hand, he just wants to edit documents, scan images, browse facebook and play solitaire. He can run Fedora Silverblue with no problems.
I have been using Linux for 16 years in total and 7+ years as my primary OS.
Ubuntu itself is a performance hog due to snaps. And Ubuntu is NOT beginner friendly. As an ex-IT guy and a Linux enthusiast: it was not beginner friendly when I deployed it to developers and tech people at my company, and it is not beginner friendly when I install it for friends & family.
I used to use Ubuntu for work until last week. I used it because of the worry about Ubuntu compatibility, since most work-related tools support Ubuntu if you want to use Linux at work. It used to use 7.5-8GB of RAM and >40% CPU. So I switched to Linux Mint. It is based off Ubuntu, but it is faster & better for UX. I have been using Linux Mint since Monday. RAM usage went down to only 6GB max and <25% CPU usage, for the same workload. Since it is Ubuntu compatible, Linux Mint will support all that work-related stuff for which they need Ubuntu. And I have a lot of horror stories with Ubuntu.
I recommend Linux Mint if you want point release distro. It literally takes care of you and works OOTB. The only problem is it doesn't have wayland support yet. If you have the time to invest (which I think you don't since you mentioned tedious), and only if you have time, I recommend Arch Linux.
Another green flag is to prefer community-led/oriented Linux flavours over corporate-funded ones for long-term UX. Way too many cuts and bleeds to trust a corporate-run Linux distro, CentOS and Ubuntu being the latest examples. The only exception so far has been openSUSE, IMO. And I have heard good things about Pop!_OS as well; I haven't used it enough to comment on it though.
I really see FreeBSD as my north star. But it is not yet there for my work and desktop usecases.
Arch can be made easier if you use Manjaro[0], for example. I used Manjaro for years before switching to Mac, and it was rock stable. I had Timeshift[1] installed and set up to be able to roll back just in case I messed something up. In general though, I never felt like I had to mess around with the terminal and a bunch of commands, only if I really wanted to.
I used Manjaro for years before moving to Arch Linux myself. Manjaro broke their 2-week testing cycle promise. It used to be stable. Then they started cherry-picking systemd and KDE directly from upstream instead of the traditional Arch stable repo -> Manjaro unstable -> Manjaro testing -> Manjaro stable cycle for a package. Things started breaking, and they were hostile to the community, which resulted in a new forum. I have both technical and moral reasons to ditch Manjaro, being an early ~2016 Manjaro user.
I also have a sour taste regarding Manjaro because I was one of the minority of people who supported them when they wanted to kickstart a company with Manjaro. I am also going to ignore the whole issue about the leadership problems and ousting of Jonathon (R.I.P. Jonathon!). I walked away from Manjaro because there was no technical reason or joy to use Manjaro, and partly for Jonathon. He was the glue for our community. Old Manjaro users know how awesome their community was initially.
Thank you for the detailed explanation! I didn't know about any of that. I either got "lucky" and switched to Mac before all these happened, or just didn't pay enough attention to see these changes.
Is it possible that some of the perceived performance gains come from comparing an old installation (with lots of tools and packages that are no longer used) with a brand new installation?
I don't think so, based on my experience. I have enough experience with snap performance issues. When I was using snaps in another distro (Manjaro), removing snaps shaved a nice 11-12 seconds off my boot time. Snaps add boot time. They are slow to start. They also have these performance issues. What I didn't know, or somehow missed, was that the Firefox snap was eating way more memory than I thought. And the whole package was not appealing or solving my problems.
Linux Mint, in my past experience, has always stood tall even in old installations. So I don't expect that as a reason. And I don't expect LM to slow down either. But thanks. I will keep an eye on it and see.
> Arch is terrible due to their bare testing package release policy.
It is NOT. This is the biggest mystery I have about people's perception of Arch Linux. Arch provides the latest upstream stable packages, unlike point releases. NOT developer branches of packages, BUT STABLE versions of a package as soon as it is available. Point releases like Ubuntu give you old versions of software, and backport security patches against that version from the latest stable packages, which again leads to possible bugs etc. Apart from Linux, you will never see anyone calling the latest stable version of a piece of software "bleeding edge". You don't call iOS 17 bleeding edge; it is the latest stable version of iOS. Windows 11 is not bleeding edge; it is the latest stable version of the Windows OS.
I have a 5+ year old Arch installation from which I am writing this comment. It is fast, easy to use, and predictable. Point releases always give you an easy jumpstart, and are often a pain to maintain in the longer run. Arch definitely requires an initial time investment (it is, after all, a DIY distro, not a managed distro), but it is easier to maintain. I use Arch because I am lazy. I don't want unpredictable changes.
I don't think I am going to complain getting firefox 119 (which is the latest STABLE version at the time of writing) within a week or two. Especially in these times where we should be using latest packages for security and stability.
Also, repos for Linux distros (like stable/unstable/testing) are about testing a package against the distro. It is NOT checking the stability of the software itself; it is checking the stability of a software package against a particular distro.
Not who you're replying to but I'm sorry, he's right.
The GRUB incident last summer was the last straw for me. I found myself with an unbootable system. The r/arch sub was radio silent. r/EndeavourOS posted a sticky about the GRUB issue. On researching it more, they were using a build of GRUB off master (not even a release!). When I asked an Arch dev if he thought it was appropriate to do that, given he'd just left thousands of people without a bootable system, he dismissed me with something to the effect of 'If you can't repair your bootloader maybe you should use an easier distribution'.
You know what? I agreed with him. I formatted my laptop and installed PopOs and it's been really nice running a system where the devs actually seem to care about stability.
Oh no. I am sorry that happened to you. My experience have been pleasant from the community. But I understand that might not be everyone's experience.
Might I suggest setting up Timeshift, or a filesystem like Btrfs with rollback features? I never had to use them in my 5+ year old Arch install, but I did once while using Manjaro (an Arch derivative). If the system breaks, I boot a Linux Mint live USB, which bundles Timeshift, and revert to the last working state. Then you just wait for a fix before you update again.
I suggest this because my experience has been that point releases create problems in the long term and hence are not without issues either. So this might be helpful.
PS: I have heard nice things about PopOS. Hope things are fun over there. :)
Arch has an explicit policy that if there are breaking changes in a package, they will push it regardless, and it is your task to read about them yourself in the package update/release notes.
No sane person is going to do that for the many many packages that their system is comprised of.
SUSE holds fresh packages for a little while and puts them through a more rigorous testing process before pushing them onto Tumbleweed. This leads to much, much less breakage.
Maybe you aren't aware, but Tumbleweed is also a rolling distro. This is why I specifically mention it as a replacement for Arch.
This is an interesting point you make. I agree that this is the policy and I am not a fan of it either. The solution is keeping an eye on the forums and news, and yeah, that can sound problematic. But like I mentioned just above, I've had zero issues with plain sudo pacman -Syu for the last 5 years. I have only done manual intervention twice, both times as per https://archlinux.org/news/. One was something I can't recall; the recent one was the JRE/JDK change. Nothing broke.
Tumbleweed updates come in big batches because of the way they update, which I am not a fan of either. Not to mention, I am a big fan of the Arch community. Moreover, although I have only heard good things about openSUSE (the AUR is based on openSUSE tech, IIRC), I would like to use only community-led/oriented distros; they are always better for end users. Arch especially, whatever technical problems exist, is always improving things. Look at archinstall: now I can have a fresh Arch install in under 5 minutes.
Linux Mint also looks pretty close to Windows, and they seem to explicitly try to accommodate Windows-experienced folks, with a lot of preinstalled tools and a similar UX.
I run NixOS with KDE on both my desktop and laptop, and whilst it is closer to Windows than Gnome is, it's not as close as Mint, or rather, Cinnamon. KDE looks close to Windows with the default UI, but the UX differences are more noticeable.
Another good distro for Windows users would be Zorin, with their Windows layout.
It's the complete opposite for me. With Linux I'm in control of the OS and can make it work the way I like. Sure, you need to know what you're doing but the learning curve these days is rather easy. With Windows and macOS I feel like they're treating me like an idiot and I always have to settle for 80-90% of the behaviour I want if I'm lucky.
> Windows and OSX have, for the most part, just worked
I bought a System76 machine running their own PopOS distro and everything Just Works as I'd expect, with the added benefit of actually having the ability to muck around with the innards of my system when I want to, unlike Mac or Windows, which are increasingly locked-down, opaque, and user-hostile. Neither Apple nor Microsoft are consumer desktop OS-focused companies these days; the former is a mostly a phone manufacturer and the latter is a confusing mess that might be best described as an enterprise software company. They have no material incentive to care about the quality of Mac or Windows, and it shows; their desktop OSes are afterthoughts.
It has touchpad gestures, and I even used a GUI program (I think it's called Touche, trivially installed via the app store) to customize the gestures further. It's quite powerful: it lets you assign arbitrary keys or shell scripts to gestures and scope them by context. For example, I assigned four-finger swipe left/right to Alt+Left/Right inside Firefox to perform back/forward navigation (this was before Firefox started shipping gestures natively).
As for browser hardware video acceleration, I sure hope so (the model I got has a beefy GPU) but I'm not sure how to check, and in any case I reflexively watch all videos in 480p after years of watching my old Windows laptop overheat while trying to decode video so I'm the wrong person to ask. :P
Firefox has had hardware video decoding enabled by default on Linux for years now; only Google Chrome doesn't want to bother doing the work to enable it. That said, in some cases hardware video decoding may be unavailable due to IP issues that forbid the necessary code from being distributed freely.
I switched off of OSX as my primary machine around 2015 and moved to full time Linux. Just decided to commit and fully dove in. I tried several different flavors for months at a time from Fedora to Ubuntu. Eventually I settled on PopOS.
I was really happy with everything up until my work shifted to management and spending > half my day on Zoom calls. For some reason, I periodically would have issues with peripherals. Picking the wrong mic, having to close and reopen Zoom, correct camera not working, etc. None of this is an issue if you're on a laptop but it regularly was a problem since I used my machine as a desktop all day.
I decided I finally need to get a machine with a beefy GPU this year with all of the LLM stuff happening (plus my son is getting into PC gaming) so I bought an Alienware desktop with Windows. First Windows computer I've owned since 2005. Now I have a 3 screen setup where my middle and left screen are the Windows computer, my fully loaded System76 Meerkat is on the right using Barrier and 99% of my development work is remote on that machine.
No issues with the peripherals for meetings on the Windows machine. Still greatly prefer the Linux machine for everything else but the reliability of knowing that my peripherals are going to work for meetings has been important. Plus, I started a tech podcast (Carolina Code Cast) and that's been important for all of the A/V that goes with it.
All that to say, there are tradeoffs. Eventually, I hope that Linux will have first class peripheral support. I'd like nothing more than to install it on this Alienware machine one day.
> Still greatly prefer the Linux machine for everything else but the reliability of knowing that my peripherals are going to work for meetings has been important.
As a long time Ubuntu user, used to use Skype, now Teams, Google Meet, Google Voice (used to be Hangouts) and sometimes Zoom, I have had no issues, or no more than when work makes me use Windows, when using peripherals.
> "For some reason, I periodically would have issues with peripherals. Picking the wrong mic, having to close and reopen Zoom, correct camera not working, etc."
This problem is probably even worse on windows for me, FYI.
Have you tried a modern Fedora? I don't know about Alienware but for me in a wide range of laptops (including Intel Macbook Pros) hardware issues are a thing of the past.
Ubuntu is mostly fine and a good middle ground for new users between bleeding edge (Arch, Fedora) and outdated software (Debian) IMHO. It's also probably the best place to start simply because of the mountains of documentation that exist for it because of its large userbase.
I would also like to add that you'll often find answers telling you to use a couple of commands when you could instead use existing GUI tools. That might be because commands are faster for people who already know them, but often they're not the only way.
Instructions like "type this and that in a terminal" are short and perfectly reproducible. Instructions like "open Settings, find This, click That" are less precise and can leave people stuck halfway.
Even on Windows forums there are often NET USE or Powershell commands mixed with instructions to perform the same task using GUI apps.
As an ex-UNIX zealot who came back to Windows around Windows 7 and then settled on using GNU/Linux from VMware, I'd say Windows has gotten worse, but still not bad enough for me to use GNU/Linux as my main OS.
Just last weekend I spent a good part of it fixing how installing clang messed up Ubuntu's system clang, plugged in via LAN cable because, after all these years, my little Asus netbook still cannot keep a stable WLAN connection to my router.
I feel the exact opposite. Changing things in Linux is usually just a checkbox in the settings. Whereas changing things in Mac or Windows can be an uphill battle, digging through registry keys, or installing third-party software just to change basic features.
For example, how do you rebind the "switch window" hotkey on a Mac from Cmd+Tab to Alt+Tab? There's no way to do it out of the box. You need to install some third party program called AltTab just to set hotkeys.
How do you disable tracking in Windows 10? Well here's a 50 step plan. And you better pray that none of these settings will be reset the next time there's a forced update.
Your Windows link is interestingly all examples of "just a checkbox in the settings", no registry editing, no group policy fiddling and definitely no obscure DOS/PowerShell commands to paste!
(Personally I go further with Win10 - disabling web crap in the start menu does need a registry change. I gather Win11 is worse still.)
OTOH, a Linux change I wanted to make recently was to rename Gnome's "Files" and "Files" (!) apps to "Nautilus" and "Nemo" so I could tell the f*king difference. Right-click & rename? Nope! We're in editing-little-files-land straight away.
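For the curious, the "little files" here are .desktop entries, and a copy under ~/.local/share/applications shadows the system-wide one. A rough sketch of the rename, assuming the usual GNOME entry name org.gnome.Nautilus.desktop; the stub fallback is only there so the snippet runs even on systems without Nautilus installed:

```shell
# Rename GNOME's Nautilus entry from "Files" to "Nautilus".
# A copy in ~/.local/share/applications overrides /usr/share/applications.
entry=org.gnome.Nautilus.desktop
mkdir -p ~/.local/share/applications
# Copy the system entry if present; otherwise write a minimal stub
# (the real file carries many more keys than this).
cp "/usr/share/applications/$entry" ~/.local/share/applications/ 2>/dev/null ||
  printf '[Desktop Entry]\nType=Application\nName=Files\n' \
    > ~/.local/share/applications/"$entry"
# Change the unlocalized name; localized Name[xx]= lines would need the
# same treatment if you want every language changed.
sed -i 's/^Name=.*/Name=Nautilus/' ~/.local/share/applications/"$entry"
```

After logging out and back in (or restarting the shell), GNOME should show the new name. It works, but the original point stands: there is no right-click rename.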
About 15 years ago I received the advice, "If you want to be hacker, stop using Windows and start using Linux." Today, I am well aware that this is not the only path to enlightenment, but I count it as some of the best professional advice that I have ever received. This is colored by the direction my career has taken me: the technologies I use are open source, and that fits much better into the Linux box than others.
It's also not about writing code. Sure, when using Linux exclusively, sometimes you might have to hack together a little script to make your computer do what you want, but that's really not necessary, especially in the year of our Lord 2023. It's about tooling. So many younger devs that I meet still have an irrational fear of the command line, an inability to use built-in documentation (like manpages), and again a fear of trying (because web browsers exist). Worst of all is these younger devs' lack of understanding of Unix permissions. We all know the guy who just pastes `chmod -R 777 .` or something from StackExchange. Since most of our production software still lives on Linux, knowing the proper way to configure these environments is valuable (though unfortunately undervalued, in my opinion, since improper configuration can still "work fine").
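To make the permissions point concrete, here is a hedged sketch of the deliberate alternative to a blanket 777; the /tmp/demo_app layout and the specific modes are made up for illustration:

```shell
# The lazy fix: chmod -R 777 .  -- now any local user or compromised
# service can read and modify everything. The deliberate version instead
# decides who needs what:
mkdir -p /tmp/demo_app/logs
touch /tmp/demo_app/logs/app.log
chown -R "$(id -un)":"$(id -gn)" /tmp/demo_app   # owner: the service account
chmod 750 /tmp/demo_app           # owner rwx, group r-x, others nothing
chmod 770 /tmp/demo_app/logs      # group (e.g. a log shipper) may write here
find /tmp/demo_app -type f -exec chmod 640 {} +  # files: rw owner, r group
```

The octal digits are just owner/group/other bit triplets (r=4, w=2, x=1), so 750 reads as rwxr-x---. Once that clicks, "which mode do I need" stops being a StackExchange lottery.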
Using Linux full-time for years will make you more than comfortable. And yes, it probably will take years. You may come to prefer the terminal to most of the GUI wrappers provided in desktop Linux distros. You won't even notice when you come out the other side. You'll realize that everything only seemed like it was 43 commands away because you only knew 2 commands to begin with. Typing the most common commands will be second nature and take less time than moving your hand to the mouse. Anything that does need to be reasoned out and typed slowly you will learn to embed in a script, with comments so you can remember how it works, and that will save you even more time.
Most importantly, in the end you will have the confidence in your abilities to write long, condescending comments on Hacker News. I kid of course, but you will no longer fear tooling (though you may grow weary of it - looking at you nodeJS), and I truly believe that's a more important and difficult skill than reading and writing code of all sorts.
Interesting, my experience is the opposite: since Windows 10 came out it became almost unusable, with random problems even after debloating scripts. Ubuntu has been my go-to system for a while now, and I basically just want to watch YouTube and edit text.
My biggest gripe with Linux is that it's an incredibly brittle system that you can easily blow up: through no fault of your own (a GRUB update once made my system unbootable), through trying to work around Ubuntu's shortcomings (I installed an Nvidia driver from a PPA, which broke my system), or through curious experimentation gone wrong.
Combined with the constant weird bugs ranging from annoying to potentially system-breaking, and the lack of resources you have for troubleshooting (because few people use it, and the system changes so fast that existing posts become outdated), it's really hard to fix, too.
I honestly try to never update my Linux box, unless I'm prepared to invest time into potentially having to fix it. And Windows' annoying updates are absolutely nothing compared to the daily barrage of packages, each triggering a restart.
> With Linux it can feel like I'm fighting with the system at times.
All your points are valid, and it's been a meme for a while now that the year of linux on the desktop is always next year.
HOWEVER, consider the following:
When you're on Linux you're fighting the shortcomings of the operating system (and learning stuff as you go), whereas on Windows/macOS you're fighting companies actively trying to screw you over (and over and over again, always in new ways).
The question now becomes: what fight are you willing to fight?
Most of your interaction with your computer is via the Desktop Environment. Which is Gnome in a default Ubuntu installation. If you're not particularly interested in exploring alternate operating system philosophies, I'd recommend looking into other Desktop Environments to use on Ubuntu. In particular, I'd recommend XFCE or Mate since they behave conventionally.
One thing to understand when moving from a proprietary to an open source OS is how much control you have over the system. Just about any part of it can be disabled or upgraded or swapped out for something different. If you learn where the seams are and how the different pieces work together, it gives you far greater control over your computer than any proprietary OS. If you expect everything to just work, you may be setting yourself up for disappointment.
I have been switched for a year now, for the 4th? 5th? time in my life, and it looks like I'm gonna end up back on windows once again.
The perpetual problem with linux is that it is written and maintained by people who love linux. Its great if you just need a machine to check email, and it's great if you are well versed in linux OS structure and know the CLI command set/structure through and through.
But if you are middle of the road power user, linux is just a constant annoying nightmare of reading forum posts and copy+pasting seemingly random strings of characters into the terminal hoping that it will fix the sound issue on the video you're trying to play.
My personal experience has been that Windows is easier to get up and running as long as I want to use it as designed. Getting off that happy path turns into a fiasco for me. I'm sure a lot of my difficulties would be trivial for a Windows expert to fix, but I'm not one.
For me, it's way easier to bend Linux to my will.
(I'm typing this on a Mac. I spent my first months on a Mac trying to make it act like my Linux desktop, and hated it. Then I decided to try a month doing everything the Mac way instead, and ended up loving it. Go figure.)
I've been having a reasonably good experience with Pop OS, which is Ubuntu-based but different in a bunch of ways. The makers (System76) put it on the computers they sell so they have a strong incentive to make it "just work", and that seems to have mostly worked out well.
As someone with years of light Linux experience but no patience for pissing about with config all day, my golden rule is to avoid updating anything ahead of the version that the distro bundles. E.g. no I shall not be updating my Nvidia drivers independently!
I like docker for arbitrary command line software I haven't checked the source for, Flatpak (or Snap/AppImage I guess) for big complicated GUI software, and if those aren't options I sometimes run things in a VM if I don't trust them not to screw up the system (hello Tizen Studio!). I ended up installing Blender from Steam, of all places.
I hear you, but Windows seems like a black box in a lot of ways. I don't have any trust that what it tells me about running processes is accurate. Under Linux I feel like I am actually in control of the machine on a much greater scale. And I'm sure you know what they say about great power...
As others have also said, you should start by learning the basic concepts, not a specific flavor/distro, to understand how everything fits together. You sound like someone who could very well succeed at that, since you haven't given up yet :)
Once I installed arch I never looked back.
I've heard good things about the rolling distros from SUSE and Fedora as well, but Arch feels more vanilla than the others to me.
If you have a lot of time and passion, you could also have a run at Linux From Scratch. You do it once for the learning, and then choose a distro of your liking.
I'm also very fond of Arch Linux and I use it on my laptop.
However Arch is very much DIY and is great for the person who wants to get their hands dirty because with Arch you assemble and maintain the installation yourself.
For someone who wants things to work out of the box with no (or minimal) tinkering, I would recommend a different distro.
Arch is a distro for learners and enthusiasts. But it would be frustrating for someone who just wants a working OS.
Yeah the control you get over Linux is both good and bad. You can pretty much get it to work exactly how you want, but getting there is very complex, and remembering CLI commands is extremely difficult for me even after many years of doing linux server admin.
It's slowly getting better though, things are starting to have sane defaults more often, and there are more easy to use GUI settings for things, instead of needing to deal with stuff that requires having documentation open on the side.
My benchmark for 'good' is that I shouldn't have to look at documentation or remember CLI commands to set up and use an OS.
So... your complaints are about Linux as a desktop, direct-use environment. The linked article is about a datacenter device (or at least a dedicated build box; virtually all 7995WX chips are going to be deployed headless, for sure). But all the skills you need to deploy and administer a dedicated 96-core monster are ones you can productively apply to install packages or whatnot on your Ubuntu/Fedora/Arch/Mint/whatever laptop.
But also, all that said: if you want a Linux laptop where everything "normal" works with no fuss, just buy a Chromebook.
I have used the main OS. Linux (plain LTS Ubuntu) has worked way better than anything else, as long as the software was available. With Figma and an iPad I do not need a Mac or Windows anymore. I still have a separate hard disk for Windows for an occasional game.
Sometimes I reboot and people are surprised how quick I am back. Also I use only wired accessories, so that might be a factor.
Reading the responses to this comment is interesting; wildly different experiences with Windows and Linux, likely due to the different use cases involved, the hardware the OS is running on, and the software/drivers installed.
My 2c: If you use common PC hardware with standard peripherals, and stick to software in your distro's package manager, you're fine. Ubuntu and Mint are popular and easy-to-use choices. Most people have an easier time than I do, but it feels irresponsible not to post this; I've been trying Linux every few years for the past 20. My experience hasn't deviated much from this:
I've found that once you start installing drivers for non-standard hardware, or installing software not in your package manager, there's a pattern of
- C+P text into CLI
- Errors
- Sudo edit system config files
- Browse internet forums, Stack Overflow, and generally channel XKCD: 979. Tens of browser tabs open; hopefully one has the answer.
- End up kicked out of the GUI after a reboot
- The system is totalled, in that it's easier to do a clean install than fix it.
It feels like there's a tension between "you're not supposed to use sudo except in special cases" and "you have to use sudo to do anything".
When you say Ubuntu I presume you mean the default Gnome version? If you're really looking to replace Windows then KDE (i.e. Kubuntu) is far closer in terms of UX and 22.04 is pretty solid in my experience. Still tedious at times of course.
What kind of tasks are you talking about here that require N commands? Sometimes tasks really do take that many steps, because you're doing more complex things than you could without the CLI.
>My only complaints are that Linux can be tedious, everything is just 43 commands away
That is because you are using Ubuntu, which is a prehistoric Linux OS. Try Fedora Cinnamon.
People complain about Linux, but they're using something that is deliberately a decade old for 'stability'. Meanwhile I have zero issues with Fedora Cinnamon's stability. I think Ubuntu and its children are cashing in on name recognition; it's time to move on.
> My only complaints are that Linux can be tedious, everything is just 43 commands away
Perhaps your training was done with chatgpt or some other procedural text generator because that hasn't been the case for quite a while now. Some people should stick with poorly engineered operating systems such as Windows, since stepping out of one's comfort zone is not for everyone. Some people _need_ to be told what to do and how to think in all aspects of life.
43's an exaggeration, but when something goes wrong the command-line is generally the only way to fix it even in the friendliest of distros.
In the old days in order to effectively use Linux at all anyone would have had to become familiar with the terminal and how it all works. Nowadays users only have to do that when something goes wrong - and of course that's great progress, but it does mean that those users are less prepared to deal with issues.
It's also easier to get help than it used to be ("RTFM n00b"), but that help is almost always in the form of some arcane command without any explanation of what it does, so it doesn't really help anyone learn.
Personally I'd like to see all this stuff hooked up to a GUI, with easily accessible docs for everything built in.
(I think Windows is actually quite good, unfortunately parts of Microsoft have become very user-hostile. I wish they'd just say "buy the new Windows! It's $200, we'll support it for 6 years, it will have no ads, it won't require any "cloud" and it will not harvest your personal data.")
> Personally I'd like to see all this stuff hooked up to a GUI, with easily accessible docs for everything built in.
That is indeed something I'd also like to see more of, but the number of times I've needed the command line is quite low. Ironically, I have more issues with my Windows install on a very modern laptop: drivers suck, I need to use the command line to disable Windows' built-in spyware, file sharing is flaky, and so on. My Linux machines all work; Linux on the same laptop just works. Also, Linux, not being a commercial OS, is expected to have some rough edges, but overall it beats Windows in all technical areas except where vendor lock-in kicks in.
I ran an experiment a while ago. My windows work machine takes a good 3 minutes to compile one of my C++ projects, but a Linux VM on that same machine builds the same project in 45 seconds.
On a work machine, that's more likely to be due to excessive background services installed by the company, particularly antivirus (which is known to kill disk performance, which means it's particularly awful for compiles). Don't get me wrong, I despise windows on its own merits, but I don't think that particular case is a fair comparison; there's a good chance that a completely clean windows VM in the same circumstances would also be faster. (Now, that might still be slower than linux, at which point we would have a fair comparison)
The general consensus I've found while researching this problem is that Windows Defender inserts itself into the compiler process to monitor it for malware.
It's not even that, windows forces you to use Defender. It's always on, and you can only disable it temporarily. Even when it's disabled, the process still seems to be active and consuming resources.
It's Norton all over again, except now the malware is an integral part of the operating system and it's harder to remove than any real virus.
Why wouldn’t the first assumption be that the bulk of these differences arise from compiler and runtime library (memcpy, etc.) efficiency distinctions rather than the OS?
I appreciate that to the end user it doesn’t matter which parts of the stack are better tuned, but framing this as “Windows vs. Linux” seems unjustified without more evidence.
I can't find any mention of the compilers, compiler settings or whether any prebuilt binaries were used. That makes it even harder to distinguish between compiler and runtime performance impacts.
I have two rigs here at home. A 12700k gaming box with a high end Nvme disk. Also, a 13900k workstation with Nvme disks as well. The gaming box runs Windows 11, and it feels generally snappy until I’m navigating the filesystem or dealing with compression. On the flip side the other box is running Debian 12 and feels significantly faster. Could be 12th gen vs 13th gen but I don’t think that’s the case.
I have a Zen 2 Threadripper box that I dual boot (it's a workstation for coding that I sometimes play games on), and I can confirm. I don't know what the exact technical reason is or why nobody in Redmond is embarrassed. But anything relating to the filesystem is simply dog-slow on Windows, and the more high-end your system is, the more this becomes the major bottleneck.
It's the first thing I disable whenever I want to do some serious disk I/O.
Also Windows still uses NTFS which dates back to 2000. For compatibility reasons, fair enough. But surely NTFS wasn't designed with SSDs in mind, let alone NVMEs.
It would be nice if Windows Defender toned down its aggressive on-the-fly scanning of anything I/O related. For any I/O activity, it regularly consumes up to 30% of my CPU. Moving files to a different directory revs Windows Defender up even if it has scanned that folder before.
I wish there were granular control over Windows Defender so that it wouldn't be aggressive with every I/O activity. I assume Microsoft did this intentionally due to the rise of ransomware attacks and the unfortunate number of inept people who mindlessly click anything; Microsoft doesn't want to be liable for being lax.
> There is only so much you can do while still maintaining Microsoft's legendary backwards compatibility.
...like adding support for more filesystems.
I'm curious about what (and when) other parts of the IO stack have been updated. Something has to have changed to add trim support: I'm guessing the scheduler.
Indeed. It's impressive how they managed to extend it.
Wikipedia provides some hints:
> Although subsequent versions of Windows added new file system-related features, they did not change NTFS itself. For example, Windows Vista implemented NTFS symbolic links, Transactional NTFS, partition shrinking, and self-healing.[21] NTFS symbolic links are a new feature in the file system; all the others are new operating system features that make use of NTFS features already in place.
It's because the NT filesystem is optimized for a different workload. This is why WSL2 (which you should be using) uses ext4 (that's the default with an Ubuntu distro, at least).
IIRC there is a filesystem filter layer on Windows where various apps like antivirus hook themselves, which is why filesystem access under WSL1 was actually faster than on native Windows.
Isn’t this hilarious? In order to achieve high performance on windows you need to access your file system via a virtual machine. Typically it’s the other way around.
My experience has always been that a Linux workstation environment outperforms Windows simply on account of the known poor performance of filesystem stuff on Windows vs Linux.
It's just different trade-offs. Windows has a user space env that does a shitload of background filesystem indexing/scanning. And then corporate environments usually cake on virus and compliance checkers on top of that. And then my understanding is that NTFS itself has always been optimized for different workloads than most Linux filesystems.
But right back into the 90s it was always like this.
When I boot this Ryzen laptop of mine into Windows, the core DE comes up really fast but there's then a good 2-3 minutes of busy cursors and background I/O hogging the machine before it becomes responsive. Ubuntu takes longer to boot, but then is immediately snappy as hell.
For mass market, consumer user stuff, the trade-offs make sense, I guess.
This alone isn't surprising, but what I really want to know is if this hit is something fundamentally wrong with Windows or is it the overhead of all of the stuff that Windows runs that doesn't need to run, since it mentions "out of the box" windows.
Is the performance difference as large on a LTSC version of Windows 10? If it needs something in Windows 11, would it be as big of a difference on the LTSC version of Windows 11 when that comes out (I know we can't answer this at the moment)?
I would never use Windows on a server, but I am very curious about where the difference really is.
Part of why I am really curious is that on my Steam Deck I run the LTSC version of Windows 10 and got better performance on my games on Windows 10 vs SteamOS. That is just gaming, but it has made me more curious about what the reality is.
Doesn't windows have some limitation where a process will only see up to 64 threads by default and it'll have to opt into some new APIs to use more? Maybe some of the results are an artifact of that.
What about Debian or other Linux flavors? Is this some Ubuntu proprietary thing, or open source stuff? I would love for Debian to have this the same as Ubuntu.
Ubuntu is such a turn-off experience for servers. The cloud-init feature is atrocious.
You download a "server-live" ISO: 2GB. Yet it uses cloud-init to download the required files anyway. Why did I download a 2GB file if you're going to pull everything from the net regardless?
There is no option to install without the network short of removing the network adapter.
It boggles me why I am being asked to upgrade the installer while installing. What's the point of that? "There is a new version on GitHub"...
If you're shipping an installer, it should install, not ask me to download a newer version. Why is there a new version? Is the bundled one riddled with security holes?
It decided to just pick up a DHCP IPv6 address and use it, without any confirmation.
I set up a 50G LUKS-encrypted LVM volume, yet discovered that it only allocates 25G, leaving the other half floating.
You need to know YAML to configure networking.
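For reference, the YAML in question is netplan. A minimal static-address sketch, where the interface name enp1s0, the addresses, and the file name are all assumptions for illustration:

```yaml
# /etc/netplan/01-static.yaml (hypothetical example)
network:
  version: 2
  ethernets:
    enp1s0:                      # your NIC name, from `ip link`
      dhcp4: false
      addresses: [192.0.2.10/24]
      routes:
        - to: default
          via: 192.0.2.1
      nameservers:
        addresses: [192.0.2.1]
```

You can validate with `sudo netplan try` (which auto-reverts on timeout) before committing with `sudo netplan apply`, which at least softens the "one wrong indent and you're locked out" problem.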
Resolv.conf is taken hostage by systemd-resolved, which doesn't work!
The whole process was infuriating. This was an install for an email server yesterday.
My first experience with the 'live' ISO forced me to find how to get the traditional non-live ISO instead. I was trying to set up a software RAID, and it would just crash no matter how I tried to configure it. Tried the same thing in the non-live version and it worked perfectly first time.
Sadly now the live installer is the only option. I recently set up a machine to run folding@home, and it took me four attempts before the installer didn't crash and I could get a working machine. The trick was to run with all defaults and not change anything. Insanity. How the support forums aren't flooded with complaints is beyond me.
Interesting; as a user of Ubuntu for over a decade, I had nearly forgotten that Windows exists. I didn't know anyone besides gamers used Windows these days.
Pardon my ignorance, but can we then infer that upstream, Debian will have similar or better results? What distinguishes Ubuntu in this case from, let's say, less bloated distros? Just a better PR department, or are there any technical reasons to single it out?
We’ve heard this for every version of Windows for the past twenty years or more.
When XP was new, there were people refusing to upgrade from Win2000 to “Fischer-Price Windows”.
Well, all versions except Vista — everybody seemed happy to upgrade to Windows 7. (Of course the lesson Microsoft drew from that smooth upgrade was to blow up everything for the next version. “They want tablet interactions, they just don’t know it!”)
Win8 (and Win8.1) also had the same reception. People were, of course, more than happy to move to Win10, which contained most of the under-the-hood improvements from Win8 and had a more traditional UI. (Also, with Vista → 7, it didn't hurt that machines had gotten more powerful in the meantime, so the extra RAM usage didn't really matter much anymore.)
Oh no. I absolutely agree with the GP. I was fine with every Windows version after and including XP, until I received a company laptop with Win 11. I have a big fat list of things that are super annoying or outright bugs.
Anecdotal, but I went 2000 -> 7 -> 10, skipping XP, Vista, and 8. Given that cadence, will hopefully be skipping 11 as well and waiting for whatever is next.
If Microsoft backs off core Windows development, which it certainly seems like they are, this is what we'd expect to see... not a big bang where it suddenly stops working, but every year it just falls a bit further behind Linux, until in 4 or 5 years the consensus will slowly creep over to Windows being noticeably worse than Linux without any big-bang switchover... and it'll sneak up on Microsoft as much as it sneaks up on everyone else.
To be honest, Linux as a desktop environment is already more usable than many people realize. People kept expecting a Big Bang where someone would put out The Release That Solves Everything, but what happened instead is that it just kept creeping up and creeping up. A similar thing happened to Linux gaming; Valve never put out a big Here's Linux For Games release, it just crept up and crept up to the point that it basically runs my entire library, and single-player gaming is now to the point that anything that doesn't run in Linux is the exception, not the rule.
And Microsoft's problem is that if they let this happen, they're going to be in a world of hurt when they decide to get in gear and fix it, because of their backward-compatibility commitments. Fixing all that up and trying to make everything go fast is going to be hard with all these systems interacting with each other. While we Linux users may piss and moan about systemd transitions or all the audio transitions, in the Linux world, as new subsystems come in, the old ones actually go away, so it isn't dragging anywhere near as much backward-compatibility baggage around. It helps that there aren't any project managers demanding that Windows LDAP for COM Apartments Over Embraced-And-Extended USB be jammed in and supported 10 years later because one big client uses it, which means the rest of us get to be running a WLCAEEUSB service on our computers for the next 15 years, causing occasional mysterious crashes when it spuriously grabs a USB device.
It seems to me Microsoft is in grave danger of walking itself into an MBA trap, where investment in Windows doesn't pay off this quarter so they skip it, repeat for 15 or 20 quarters, and then they wake up one day and realize that, one quarter at a time, they threw away one of the most useful business positions in the history of the world.
Interesting thought, but you might be missing the hardware perspective. I switched from Windows to Linux this year, and the biggest issue has been drivers and hardware compatibility: audio available via headphones but not (laptop) built-in speakers, occasional display issues depending on the monitor setup, occasional WiFi issues. If hardware innovation stopped, Linux could catch up easily... but it won't, so there is a constant driver treadmill that manufacturers mostly handle for MS on the Windows side, but Linux has to handle somewhat alone. Perhaps the manufacturers will start to help out more and more on the Linux side too, as more users demand compatibility.
I've had a much worse time with hardware on Windows than Linux these last couple of years. My work laptop runs Windows and will get stuck in a BSOD boot loop if I turn it on with external monitors connected. My USB keyboard and mouse disconnect every few minutes.
Windows may as well not even have Bluetooth, for how well it works. Some devices always show as connected despite not being present. I have to un-pair my headphones every time I want to connect. Sometimes I have to reboot my computer to get them to work. Windows also doesn't even support acting as a Bluetooth audio sink: on Linux, I can play audio from my phone to my computer and pipe it back out into my Bluetooth headphones. Windows 7 supported this, but 10 does not.
I've even used some weird old PCI cards that only have XP drivers, but work fine with Linux.
The only hardware I've had trouble with in the last 5 years is Nvidia cards. And that's a deliberate decision made by Nvidia to not support Linux. AMD cards work perfectly.
I haven't seen problems with audio or WiFi on Linux in over ten years.
"I've had a much worse time with hardware on windows than Linux these last couple of years."
And I want to emphasize and underline my point here that if Microsoft does just let Windows languish, we are going to gradually see this more and more every year. There won't be a big bang wakeup call. There will always be people who can say "But I haven't had a problem". What you won't be able to easily tell on the forums is that the ratios will slowly-but-surely shift, but again, with no big-bang wakeup.
I just bought a brand-new fully-AMD-based laptop for the family, stuck the latest Ubuntu on it, and almost everything works. The only thing that I've found that doesn't is that if I play a game and push it out the HDMI port, we get audio dropouts. It isn't everything on the HDMI port, we watched a lengthy YouTube video with no dropouts. But wifi, bluetooth, all the keyboard controls, battery, everything else I've seen is working.
I honestly think we've already passed the peak and are already in the slow decline. Microsoft has already pushed too hard and alienated too many users. Now they'll slowly keep losing market share while desperately squeezing the remaining market for all it's worth until there's nothing left. Their momentum will carry them for an unfortunately long time.
> some weird old PCI cards that only have XP drivers, but work fine with Linux.
I will say, supporting old hardware is probably Linux's strong suit, because their model of upstream first means that maintenance tends to get carried a lot further than proprietary drivers ever would.
>I've had a much worse time with hardware on windows than Linux these last couple of years.
I don't doubt that, as I have had the same experience. But somehow Windows has to get worse from non-technical users' perspective to be real trouble for Microsoft. So far my spouse still thinks Windows is the normal thing, despite using a rather trouble-free, fast Mac at home for many years alongside a horribly slow and problematic Windows laptop provided by her employer.
So even with their own first-hand experience of Windows problems, such users consider Linux or Mac the difficult/exotic thing.
I genuinely, truly do not understand how Microsoft, let alone any user, can accept the current state of Windows' Bluetooth stack. It's completely broken, core features are simply not implemented. Microsoft themselves provide a better Bluetooth UI in one of their WPF demo repos than the one shipped with Windows.
Maybe I'm biased because my job involves building Bluetooth hardware and software for Windows, but even just trying to use normal devices like headphones or controllers is absolute anguish.
Bluetooth on Linux isn't perfect: I frequently have to unpair and repair my headphones, but at least it's feature complete. Anything that Bluetooth can do is exposed. Linux even comes with some really advanced CLI utilities to manage Bluetooth devices. On windows you're lucky to find a WinRT function for what you want, and it probably doesn't even work.
Windows 7 had a complete Bluetooth implementation, but Microsoft decided to rewrite everything from scratch. I guess the intern they delegated it to had to go back to their job at Starbucks before they finished.
Linux blows away Windows hardware support on everything except for graphics cards.
I set up dual boot on a newish Dell laptop recently and Windows couldn't even access the SSD or the wifi card without installing extra drivers. I haven't had problems that bad on Linux since about 2006.
Even dual booting is less of an option these days. If you aren't careful, windows will clobber your boot records and force itself to be the default and only boot option.
I've even heard of Windows erasing grub during normal updates.
I just erased windows from my personal machines. I've been exclusively on Linux for 5 years or more
While I agree with you broadly that hardware support is a problem for less popular operating systems, I can personally say that Fedora has been a dream for me, hardware-wise, for a while now. My 7900XT GPU just works. My printer just works. My webcam just works. My headset just works. My audio interface just works (the audio I/O, at least; I haven't actually ever tested the MIDI I/O).
I'm certain I've gotten lucky, and I'm certain there's plenty of edge cases to be ironed out, especially with more specialized hardware. But I think a lot of common consumer goods are in a good place right now, at least in my personal experience.
My latest PC was a pretty randomly chosen Chinese PC from Amazon that advertised Ubuntu compatibility. In other words, expect that advantage to slowly slip away from Microsoft too.
It's been years since I really put any effort into checking before buying - it had gotten to the point I could assume that issues would be minor and/or resolved soon after purchase.
Those PC's look like great value and availability and are clearly very popular - have squeezed out other options like Intel NUC. Very tempting.
But I've heard that you pretty much never get a BIOS update and the quality of any such update is going to be dubious. Nothing sinister about this - it is simply a bargain basement option.
So, you are really throwing security out the window.
Security shouldn't be much of an issue with most modern CPUs. The microcode can be updated at boot time by the kernel, even if the one loaded by the BIOS is out of date. So microcode level patches are taken care of.
That leaves attacks on secure boot, which could be feasible with a bad implementation, but I doubt most home users would have to worry about an evil maid attack.
I'm at the beginning of my third year on desktop Linux, and I've managed to install it on all the hardware I've thrown at it, ranging from 10+ year old PCs to a recent Ryzen workstation and a Framework laptop.
I'm not a Linux guru by any means; only one distro has given me problems: openSUSE. I'm currently on Fedora.
> If hardware innovation stopped, Linux could catch up easily... but it won't, so there is a constant driver treadmill that manufacturers mostly handle for MS on the Windows side, but Linux has to handle somewhat alone.
Hardware innovation hasn't stopped, but there's a lot less diversity than in the past. For mainstream desktop, there are only three video card vendors (mobile is different, and servers have ASPEED's 2D cards to work with); sound cards are pretty much dead, with HDAudio taking care of most of it (though jack-detection issues persist); Broadcom is poised to buy up all the storage card vendors; and networking vendors are limited.
Intel is pretty good at supporting its products for Windows, Linux, and sometimes FreeBSD. If you buy only stuff they make, you can cover all your peripheral needs and move on with your life.
Most other vendors of important devices have code-dumped drivers for Linux, and sometimes FreeBSD, for at least some of their devices. This gets more frequent as there's more consolidation. The Linux drivers might not be great, and the code is usually full of unexplained magic values, but it's a start.
There's also the potential of running Windows drivers elsewhere; it sometimes works for network drivers (something something NDIS)
On the other hand, once something works on Linux, it may keep working for a long time without depending on the vendor to release a new version for each new major Windows version.
For me it's a cheap old flatbed scanner whose last Windows drivers were for XP, but it still works on a rolling-release distro.
> Perhaps the manufacturers will start to help out more and more on the Linux side too, as more users demand compatibility.
I do believe that manufacturers would be willing to help out if GNU/Linux was sufficiently standardized that they can simply drop some code/binary that works on basically every GNU/Linux distribution (just as they can basically do on Windows). The problem is that GNU/Linux is not some single operating system, but a proliferation of lots of different distributions.
the parent comment explains why dumping a blob is bad, particularly for drivers. it allows vendors to depend on all sorts of undocumented and unreliable behavior and prevents upstream improvements.
Agreed. I have a Linux computer upstairs, and the Wifi dongle (I know, but it's fine!) broke. Buying a new one, or even knowing what to buy, that doesn't require me to USB key some random drivers (probably cloned off GitHub) is a tricky task.
I have a mix of systems, and have had for several decades; Linux works on all of them, old and new, no issues. What you're saying seems odd, almost like it was entirely made up.
It goes without saying that your experience is completely irrelevant, unless you can prove that you had my hardware working out of the box on one of those devices.
Have a look at this simple tutorial[0]. Compare it to "plug it in" on Windows.
Without having read your link yet, I don't understand why it's so difficult to accept when I say that all the devices on all the Dell (and Alienware), HP, and Lenovo PCs I've bought over the past two decades just work on Linux.
Hey, which Dell, HP, or Lenovo laptops did you use in that time? I really, really want to know which systems don't have any issues on Linux. Also, what Linux version did you use? Pretty please?
Windows 11 has become an ad delivery vehicle with Microsoft milking what life Windows has left in this fashion. Linux has largely commoditized operating systems and the major DEs here excel over Windows in many ways, further putting the nail in Windows for the desktop. Combine this with many applications being available with any web browser, the only reason to run Windows is for the legacy Windows applications.
> To be honest, Linux as a desktop environment is already more usable than many people realize.
This will be true once the stability of graphics drivers on Linux improves. On every major kernel update, I am always bracing for everything going blank because of driver issues.
People want to game, and a lot of gamers want Nvidia. The driver may or may not work, but if it has a reputation for maybe not working, people will decide against it.
It's also possible that people who find out Nvidia cards are unreliable on Linux carry that over to decisions about other systems where it shouldn't be such a big deal.
You may have a good specific point, but I think people will always say "once X is fixed people will switch to Linux".
When windows users try Linux for the first time, they'll (rightfully) point out problems, while their Linux friend will just say "oh that's normal, you just have to do Y" and the exact same thing would be true in the reverse situation.
Some people are just more tolerant to being in an unpredictable environment than others. I find myself highly tolerant to noise, unpredictability and things breaking. Of course I have my limits, but when comparing myself to my best friend, he will literally have none of it (he's a macos user)
I've used nvidia for close to 2 decades on various PC hardware machines, desktops to laptops, Dell, HP, Lenovo and nvidia (binary drivers in the Ubuntu/Kubuntu distribution) work 100%. Zero issues. Old and new machines. Very odd eh?
Even as a gamer, I prefer to use cheap monitors, wasted too much money on expensive ones in my younger days only to see them break after a few years then the money is gone. I can afford better, I'm just cheap. I imagine I will start using a better monitor in coming years and if I see issues, I'll definitely honestly report them and discuss them. Another thing to think about, more pixels means more electricity used to keep them lit and more electricity to shove images around on the screen for the GPU, higher electricity bill, more carbon in the atmosphere.
I wonder how many HiDPI users also complain about global warming in other forums.
That was my experience too, but a decade ago while running NVidia+Intel hardware. Since switching to AMD+AMD my home machines have only run into the issue once or twice, and then only when running something odd like multiple graphics cards.
Performance is not the limiting factor keeping people from switching to Linux.
If I could take my current desktop PC, swap to Linux, and all the hardware and all the software I like (including games) would Just Work(tm) without me spending weeks to months diagnosing (and inevitably giving up on some of what I want), I'd swap in a heartbeat.
But that is not the reality we live in.
It's gotten better, but frankly I doubt it ever will really get to the point of usability for someone like me.
IMO, once Adobe comes to linux, or they lose their position in the industry to competitors, we will see the shift.
And the main reason is that Apple doesn't support Nvidia. There are limits to what you can accomplish editing video without a beefy GPU, so you're trapped on Windows if you're using Adobe Premiere.
I mean, games like Valorant, which require anti-cheat rootkits, will still keep us on Windows forever, but those are edge cases.
Adobe is not the only problem. Hardware support, codecs, media support. It all just does not work that well, and I think will not work well for a long period of time.
Video editing has for years not been a real option on Linux.
Sure, there are programs, but they often crash, or encode several times slower than the corporate paid crapware.
For this, you have to talk to the people you are buying your hardware from. They are getting your money, not Linux. They are also the ones supporting that hardware under Windows, not Microsoft. Tell them what your needs are, and vote with your wallet.
> codecs, media support
Funny; for years, I've been installing ffdshow (ffmpeg-based codecs) into Windows to have decent codec support in Windows; even today, many users still install VLC (or mpc, mpc-hc, mpv, mpv.net, or other clone) to get media playback without any issues.
Adobe won't come to Linux until they would lose money by not getting there.
They won't lose money unless there's a competitor that eats their lunch: basically something competitive like DaVinci Resolve, but for the other parts of their suite.
Microsoft should invest in Proton, make Windows apps work seamlessly on Linux, and then just ship a "Windows" replacement for Wayland that runs on the Linux kernel...
Then, and only then, will we have "The year of the linux desktop"
I really dunno why people advocate ripping out the kernel and using Linux instead, while keeping Windows userspace. The NT kernel is the best part of Windows. It has built-in asynchronous real-time I/O. A completely plug-and-play driver model. Far more advanced permissions model. It is far ahead of the competition.
The userspace and UI of Windows are very straightforwardly modified; that's why they have changed so much with every iteration of Windows. Telling people that we should keep 'Windows apps' and then rip out the kernel and use the Linux kernel instead is like saying someone has a small rash, so let's do chemotherapy. It's breaking a butterfly on a wheel.
I want to go back to the Windows 2000-era UI, with the same advancements in the kernel and core Windows technologies we have today.
I mean, from a pure engineering POV this is actually backwards from what would be excellent.
The core Windows NT kernel is quite excellent, and was always in some ways far superior to what Unix offered. E.g. proper asynchronous I/O from day one. Dave Cutler and crew did a bang-up job. Linux is only getting competitive things like io_uring now(ish).
The problem was the shitty API and user space they papered in front of it with. Lack of a proper CLI shell, slow filesystem, the godawful "registry", a low quality, unsafe windowing toolkit (Win32 API), and then, after WinXP, bad UX / environment stuff. And, of course, the fact that everything was proprietary, closed/closed-source, and the $$ licensing story for running a WindowsNT server was always garbage.
As a user operating system none of this mattered so much because the sheer mass of the market meant that driver support and application support was always going to win out on Windows.
But these days, with all my attached devices being USB-Cish things that just implement USB standards I have zero issues in Linux. In fact, audio interface support works better for me in Ubuntu than it does in Windows right now.
But people who were serious about running servers naturally stuck with Unix, despite it being in some ways technically inferior on paper -- because that experience on NT just wasn't as good. Remote administration via GUI. Anemic CLI. MS proprietary everything. Per-user $$ licensing stuff. Tight coupling with IIS & SQL Server, which has their own licensing stuff, etc. etc. etc.
Anyways, I think it would be a shame if MS gave up on the NT kernel.
With WSL1 they did an interesting job of getting Linux user space running overtop of the NT kernel, but they dropped that approach with WSL2 and went with a more VM approach.
> Anyways, I think it would be a shame if MS gave up on the NT kernel.
I can't say this enough. The NT kernel is really the best part of Windows. Almost everything that people complain about in Windows 11 is in the userspace. Start menu nerfed? Right-click menu hidden? Settings versus Control Panel? Cortana, Copilot? There's nothing in the kernel or lower-level OS constructs that requires any of these.
Funny, I just went back and edited my comment to add in filesystem where I didn't have it before and then saw your reply.
Because, yes, NTFS is just slow as hell. I have never been clear why but as I said elsewhere, various *nix filesystems have always outperformed it. I understand that on paper NTFS was technically superior back when Linux was stuck with ext2 & ext3 as standard, but in practice it just made for a really slow system.
Hehe, that thought was going through my mind as well just now. It's not really such a far out idea; if Microsoft wanted to, they could certainly put the manpower behind it to make it happen. It would also be the most delicious plot twist ever.
But I don't think this is in the cards for the foreseeable future.
> If Microsoft backs off core Windows development, which it certainly seems like they are, this is what we'd expect to see...
If this is really true, I am deeply saddened. I grew up using almost nothing but Windows since 98, and have used every version (except Windows Me) since.
As a desktop and enterprise OS, Windows is pretty fantastic to develop with, natively, contrary to many opinions here. Technologies like COM, .NET (Framework), Win32, Direct{2D, 3D}, and the Windows API are all very powerful and fairly future-proof, too.
Core Windows/NT development is extremely fun, interesting, and the choices that the NT kernel developers made in the 90s put it light years ahead of the competition. Consider its advanced permissions model (ACLs versus UIDs and GIDs), its plug-and-play driver model, and its ability to run on a wide variety of hardware.
Linux supports ACLs too, but they are not enabled by default. On plug and play, Linux wins by a mile out of the box, as the kernel has much better built-in support than Windows 10, which often has to reach out to Windows Update.
On COM/.NET, today almost nobody cares. Modern ad-hoc small and medium corporateware is bound to Java like a hellish curse, and in the rest of the market only Direct3D matters; even that less than before, because for mobile/console gaming Vulkan might work everywhere.
Also, in the enterprise, most shops are migrating to Unix backends and web frontends for lots of tasks. Log in once, run everywhere. Does your management software work on a phone, a tablet, and a desktop with an adaptive GUI? Then most of the company doesn't even need a Windows device.
Heck, most non-tech companies have outsourced their IT to third parties, and their user accounts run on Windows virtual machines hosted on Unix servers. And modern BIOSes can run most of their setup over PXE/netboot and connect to those servers to boot their images; so, well, Windows on the desktop is more and more irrelevant.
Lots of LoB applications in various industries are still developed using .NET (even for new developments).
> Also, on the enterprise, most people it's migrating to Unix backends and web frontends for lots of tasks.
Web applications are not a good choice for every task (which is why lots of new LoB applications in the enterprise are still developed as Windows desktop applications).
> Linux supports ACL's, too, but they are not enabled by default.
Oh, it is, just nobody bothers (chances are, `getfacl .` in your homedir will succeed). They are also POSIX ACLs, not NT ACLs, so they have slightly different behavior.
Is it OK to just mention a common misconception that's still prevalent about Linux?
Rolling releases provide STABLE versions of software/packages. A rolling release is NOT "bleeding edge", since it is NOT a development branch; it is the latest stable version provided by upstream (the developers/creators of the software). Point releases come from a time before continuous development and faster update cadences.
We should be using rolling-release distros like Arch/Void/openSUSE more, because the Linux ecosystem is far more stable than you think these days. I find more breakage in point releases than in rolling releases.
Use stable software, not old software that gets fixes backported from the latest stable version, which each distro then has to maintain itself, leading to more bugs. Just use the latest stable versions. Use rolling releases. :)
You don't call iOS 17 bleeding edge, or Firefox 119, the latest version of Firefox, bleeding edge, right? They are stable software versions.
ALSO, HARDWARE: Using Linux-compatible hardware will make the entire experience pleasant. There is this notion that Linux works on anything. It does, BUT NOT 100%. You'll lose Bluetooth, WiFi, or some other things. If you want a good experience, don't go for vendors like Razer laptops and then complain that Linux is not working properly. Use supported hardware. There are so many options these days.
I don't know the issues you faced. But Arch in itself is a DIY distro. Not a managed rolling release like Void/OpenSUSE etc. This means you are expected to maintain it. Any updates which need manual handholding usually comes up at https://archlinux.org/news/.
But I am glad you use whatever works for you now. :)
If you are curious, here is what I do. I update every two or three weeks. I have Timeshift set up, which means if anything bricks (which hasn't happened in the last 5 years), I just roll back with zero terminal usage. If I have time, I use the terminal to fix it; otherwise I just roll back and wait for the fix. Also, I have seen a lot of issues in point releases that get fixed by just a software update in rolling releases.
This might sound like a lot. But usually this means going to https://archlinux.org/news/ and see if there is any new news which needs manual intervention. Usually there won't be. And read the packages that are getting updated. If I do have time, I usually skim through forums for any outstanding issues. But I have become sooo lazy and confident about Arch Linux that I haven't in months.
Yes, but "unstable" refers to Debian's repo. The unstable repo contains packages that haven't yet been fully tested against the rest of Debian; it doesn't mean the software itself is unstable, just that it's unstable with respect to Debian. There is a huge misconception and misunderstanding about what the Stable/Testing/Unstable repos mean. Unfortunate indeed.
> We should be using rolling release distros like Arch/Void/OpenSUSE
No thanks, standard Ubuntu or Kubuntu are great. I don't want a rolling release nightmare like Windows. Steam screwed up by not using at least Debian (base for Ubuntu/Kubuntu) in their Steam console.
The software that I need will run on Windows, but only barely, always with great effort and abysmal stability, and sometimes not at all.
ruby/rbenv, nvm/node, rust/cargo, docker, nvim, ripgrep, git, rsync, ssh. All of which I sometimes, somehow managed to get running at some point when helping co-workers with their Windows machines. But the amount of fiddling and trial-and-erroring is terrible. For example, PuTTY is a fantastic beast, but it's nowhere near as well integrated and easy to set up and use as SSH on Mac or Linux, if only because the latter already have it. Most of this was before WSL, though. I presume with WSL it's much easier, but that's because it's running on Linux...
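For comparison, on anything with a stock OpenSSH client, per-host setup is a small config fragment (the host name, user, and key path below are made up for illustration):

```
# ~/.ssh/config -- hypothetical entry; after this, `ssh buildbox`
# just works, and git, rsync, and scp pick it up with no per-tool setup.
Host buildbox
    HostName build.example.com
    User deploy
    IdentityFile ~/.ssh/id_ed25519
```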
I use a chunk of those. They work absolutely fine on WSL2.
My windows machine can quite happily pull shit out of Excel files, into R/Python stack on Linux and squirt them back out into SQL Server with no real cohesion issues. Also I mostly spend all day with VScode open on a Linux box from windows.
I also maintain old windows stuff on windows and loads of infra from Linux. It all just works together fine. And it stays working (unlike my 25 year long hate affair with the linux desktop)
Oh and I own a Mac as well for when I want to do some shit I actually care about rather than just earn money from.
You mean some games built for Windows run better on Linux (and I'd be curious to know which ones actually run better). Some games barely run on Linux, some just don't work, some get broken by new versions, etc.
And computers aren't just for gaming. Some people actually work with their computers. Let me know when photoshop or autocad run smoothly on Linux.
Cyberpunk runs exactly 3fps lower on NixOS than on windows during benchmarks on ultra, with dlss and ray tracing. Totally negligible performance difference but being on Linux is not.
If a game that runs bleeding edge features on windows can be made to run on Linux, then it’s safe to say that anything can run on Linux.
If autocad or photoshop don’t work it’s not because Linux can’t handle it, it’s because the authors of those programs don’t want you to. It’s DRM, every time; not some limitation on what wine can do.
WoW also crashes randomly every 30 minutes under Lutris after the latest update, which is quite annoying. This happens on both my wife's computer and mine. She has an AMD GPU and I have an Nvidia one.
That, and Windows being the only OS with good fractional scaling. My main problem with Linux is that, in order to have tolerable fractional scaling, you have to use tons of hacks to make sure everything is running on Wayland. And even then some basic stuff is just broken, like JetBrains IDEs just not supporting scaling (except on KDE but then other stuff like icons are broken), drag and drop being broken in VS Code, cursors sometimes going absolutely massive, etc. macOS straight up doesn't support fractional scaling, which is even worse (good luck using a 1440p 24" display with it).
There are a huge number of business-specific Windows-only apps out there, because for so long Windows (and DOS before it) was what everyone ran on. So when you hired some college CS student for three months to create that custom app for your business workflow, it ran on Windows. And it's still there running the business, and so you can't change because that would mean replacing the app. Source code? What's that?
Had to figure out which way to go, with the assumption that production app builds would run through the cloud... currently Threadripper cannot beat the Intel 13900 and 14000 series. Given the different boost setups, I'm not sure why the difference.
Assuming money is no object: If you want fast cores, get an Intel. If you want lots of cores, get AMD.
I was with you until you said 4090... of course that is going to smoke an M3; it's a $2000 GPU. When you consider that the M3 is a CPU and GPU combined, it is a pretty compelling offering.
Apple will not be value for money for desktops, especially if you need lots of RAM and disk. A top-of-the-line Max won't beat a mid-range 40xx. For a laptop, though, build quality, battery life, screen, etc. matter. I was looking for a personal desktop 6 months back; building an Intel-based one was a no-brainer. And now I can upgrade SSD/RAM and graphics as my needs and budget evolve.