I can say our company standardized on Macs for developers back when Macs were much better relative to other laptops, but now most of the devs use them begrudgingly. The BSD userland is a constant source of incompatibility with Linux, and the package systems are a disaster. The main reason people are not actively asking for alternatives is that most use the Macs as dumb terminals to shell into their Linux dev servers, which takes the pressure off the poor dev environment.
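To make the BSD userland complaint concrete, one commonly cited incompatibility (my illustration, not necessarily the specific pain point here) is in-place editing with sed: BSD sed on macOS requires an explicit backup-suffix argument, so a script written against GNU sed on Linux breaks as-is:

    # GNU sed (Linux): -i alone edits the file in place
    sed -i 's/foo/bar/' config.txt

    # BSD sed (macOS): -i takes a mandatory backup-suffix argument;
    # pass an empty string to skip creating a backup file
    sed -i '' 's/foo/bar/' config.txt

The same class of divergence shows up in date, stat, and other core utilities, which is why cross-platform shell scripts on macOS so often end up depending on GNU coreutils from Homebrew.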
The things the Mac is good at:
1) It drives my 4K monitor very well at 60Hz.
2) It handles lid-open/lid-closed and monitor plugged/unplugged transitions consistently well.
3) It sleeps and wakes reliably.
4) The built-in camera and audio work well, which is useful for meetings, especially these days.
None of these things really requires x86 or ARM. So if an x86-based non-Mac laptop appeared that handled points 1-4 and could run Linux (putting me closer to our production environment), I'd be all over it.
I think you've hit the nail on the head, but you've also summarised why I think Apple should genuinely be concerned about losing market share amongst developers now that WSL2 is seriously picking up traction.
I started using my home Windows machine for development as a result of the lockdown and in all honesty I have fewer issues with it than I did with my work MacBook. Something is seriously wrong here.
I think Apple stopped caring about developer market share a long time ago and is instead focusing on the more lucrative, hip, young Average Joe consumer.
Most of the teens to early-twenty-somethings I know are either buying or hoping to buy the latest Macs, iPads, iPhones, and AirPods, while most of the devs I know are on Linux or WSL. But devs are a minority compared to the Average Joes who don't code yet are willing to pay for nice hardware and join the ecosystem.
Looking at the architecture slide from Apple's announcement about shifting Macs to ARM, they want people to use Macs as dev platforms for better iPhone software. Think Siri on chip, Siri with eyes and short-term context memory.
And as a byproduct perhaps they will work better for hip young consumers too. Or anyone else who is easily distracted by bright colours and simple pictures, which is nearly all of us.
I have no idea where in Germany you're based, or what industry you work in, but in the Berlin startup scene, there's absolutely a critical mass of development that has coalesced around macOS. It's a little bit less that way than in the US, but not much.
When I go to Ruby conferences, Java conferences, academic conferences, whatever, in Europe, everyone - almost literally everyone - is on a Macintosh, just as in the US.
Because every conference is its own bubble of enthusiasts, and software engineering is a lot more diverse than Ruby: it spans everything from C++ kernel devs to firmware C and ASM devs.
Even the famous FailOverflow said in one of his videos that he only bought a Mac because he saw that everyone at conferences had Macs, so he figured they must be the best machines.
Anecdotally, I've interviewed at over 12 companies in my life, and only one of them issues Macs to its employees; the rest were Windows/Linux.
True, but it is full of developers using Windows to deploy to Windows/Linux servers, with Java, .NET, Go, Node, C++, and plenty of other OS-agnostic runtimes.
Given that the US overwhelmingly dominates software development (including for the cloud), I think the claim that this is only a US phenomenon is somewhat moot. As a simple counter-point, the choice of development workstation in the UK seems to mirror my previous experience in the US (i.e. Macs at 50% or more).
My experience in Germany and Austria mirrors the GP's: Windows/Linux laptops are the majority, with Macs present mainly in well-funded, hip startups.
I think you are vastly underestimating how many people use a Mac (or Windows without WSL) to develop for the cloud.