This is an article about why Longhorn failed. If you want to know why Vista failed, here are two big reasons:
1. In Vista, there was a completely new antivirus framework that AV vendors had to use in order to support Vista. Good idea in theory, but because it was brand new (and because vendors had to write brand new code to support it), it had a fuckton of problems, the biggest being an enormous performance degradation compared to XP. Like, horrendous. And since every laptop you could buy in the store shipped with 3rd-party AV, because that's how OEMs make $$, whoops, every laptop is absolute garbage out of the box.
2. Windows Vista completely rewrote the USB stack, an extremely complicated, timing-sensitive component (like, add a printf, break a device kind of sensitive), and about 80% of the way through, the lead architect who planned the rewrite left the company. Meanwhile, there were a _ton_ of devices which weren't actually correct wrt the USB spec, but just happened to work with the timings that WinXP exhibited purely by accident. A friend of mine made a lot of money in bonuses by slogging through bug after bug in the USB stack trying to make it right for Vista SP1.
It's funny, because Vista shipped a ton of great features that then got attributed to Win7 - it really set the foundation for the modern Windows OS until Win10 came out.
So, if you learn anything from the story of Windows Vista, it's this: Performance is not a Feature, but it sure can make all of your other features not matter at all
> It's funny, because Vista shipped a ton of great features, that then got attributed to Win7
I have heard that some people at Microsoft consider Vista a kind of public Beta for Windows 7. Maybe not intentionally, but that is the way things worked out on the road from XP to 7.
And Windows 7 is, as far as Windows systems go, very nice. Very stable, decent performance on not-too-old hardware, I like the UI very much.
And a whole new graphics driver architecture - with Windows 7 getting the praise there too. But it was Vista that got the teething trouble, and the third parties that didn't have their driver support together.
The headline was "NVIDIA Responsible for Nearly 30% of Vista Crashes in 2007", but the perception was that Vista was terrible.
I think that in reality most companies (up to a certain size) or projects have people who are pivotal: enablers or highly influential figures. Not only charismatic leaders but also technical wizards, etc.
They might not be irreplaceable, but they might be hard to replace, and if they leave at the wrong moment, it could jeopardise everything.
I guess having a management team made up of several Steve Jobs might arguably feel safer for the shareholders, but it's probably hard to find them and make it work.
There are risks to manage; for instance, treat them nicely so whole teams won't quit in rage. Treat the unicorns even nicer.
Tell that to Apple. :) At a certain level, very good people make a big difference day to day with their leadership, vision, political prowess, technical ability, etc. Losing them DOES damage that no plan B can cover.
There was also the sudo-like UAC thing. I totally think it was the right thing to implement, and it improved security for power users a lot, but it pissed everyone off for being annoying as hell (and it taught people to "approve" everything, which is bad).
It became much less of a pain in the ass with Windows 7's implementation, but even Vista's version was a step in the right direction.
MS always seems to have had a schizophrenic relationship with API-exposing automation.
It's like they have "Make it easy to manage" as a priority, then don't talk to admins with non-MS experience, and then build features in a bizarre way.