This is an article about why Longhorn failed. If you want to know why Vista failed, here are two big reasons:

1. In Vista, there was a completely new antivirus framework that AV vendors had to use in order to support Vista. Good idea in theory, but because it was brand new (and because vendors had to write brand new code to support it), it had a fuckton of problems, most of which contributed to an enormous performance degradation compared to XP. Like, horrendous. And since every laptop you could buy in the store had 3rd-party AV because that's how OEMs make $$, whoops, every laptop was absolute garbage out of the box.

2. Windows Vista completely rewrote the USB stack, an extremely complicated, timing-sensitive component (like, add a printf, break a device kind of sensitive), and about 80% of the way through, the lead architect who planned the rewrite left the company. Meanwhile, there were a _ton_ of devices that weren't actually correct wrt the USB spec but just happened to work with the timings that WinXP exhibited purely by accident. A friend of mine made a lot of money in bonuses by slogging through bug after bug in the USB stack, trying to make it right for Vista SP1.

It's funny, because Vista shipped a ton of great features that then got attributed to Win7 - it really set the foundation for the modern Windows OS until Win10 came out.

So, if you learn anything from the story of Windows Vista, it's this: Performance is not a Feature, but it sure can make all of your other features not matter at all




> It's funny, because Vista shipped a ton of great features that then got attributed to Win7

I have heard that some people at Microsoft consider Vista a kind of public Beta for Windows 7. Maybe not intentionally, but that is the way things worked out on the road from XP to 7.

And Windows 7 is, as far as Windows systems go, very nice. Very stable, decent performance on not-too-old hardware, and I like the UI very much.


I have Windows 7 on a maxed-out 4GB IBM laptop for personal use. Don't see a reason to go anywhere else as long as I can stay. Although XP was also fine.


That must be due to Vista being so late; all the OEMs had barely any lead time to get on board with it. Whereas by the time 7 rolled around, it was old news.


I like it a lot. I declined the free upgrade offer. Why take the risk?


And a whole new graphics driver architecture - with Windows 7 getting the praise there too. But it was Vista that had the teething trouble, while the third parties didn't have their support together.

The headline at the time was "NVIDIA Responsible for Nearly 30% of Vista Crashes in 2007", but the perception was that Vista was terrible.


Indeed. Unfortunately, no one cares how immaculate your suit is if you forgot to wear pants.


Or in other words, don't buy a bespoke suit jacket and vest, then pick up pants from Walmart right before your wedding.


> the lead architect who planned the rewrite left the company.

This alone explains a lot.


That explains what, exactly? You should never allow a culture where one employee's departure can endanger the whole project.


I think that in reality, most companies (up to a certain size) or projects have people who are pivotal, enablers, or highly influential. Not only charismatic leaders but also technical wizards, etc.

They might not be irreplaceable, but they might be hard to replace, and if they leave at the wrong moment, it could jeopardise everything.

I guess having a management team made up of several Steve Jobses might arguably feel safer for the shareholders, but it's probably hard to find them and make it work.

There are risks to manage; for instance, treat people nicely so whole teams won't quit in rage. And treat the unicorns even nicer.


Tell that to Apple. :) At a certain level, very good people make a big difference day to day with their leadership, vision, political prowess, technical ability, etc. Losing them DOES do damage that no plan B can cover.


> You should never allow a culture where one employee departure can endanger the whole project.

And yet, in practice, that's how the majority of shops are run.


It suggests that Microsoft had such a culture, at least at the time.


Any reference I could read on this matter?


Who was this?


There was also the sudo-like UAC thing. I totally think it was the right thing to implement, and it improved security for power users a lot, but it pissed everyone off by being annoying as hell (and it taught people to "approve" everything, which is bad).

It became much less of a pain in the ass with Windows 7's implementation, but even in Vista it was a step in the right direction.
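For what it's worth, the approval flow that apps trigger looks roughly like this; a minimal Win32 sketch (assuming you just want to relaunch something elevated; the target path is only an example):

    #include <windows.h>

    /* Minimal sketch: launching a program elevated. The "runas" verb
       is what makes the shell pop the UAC consent prompt. */
    int wmain(void) {
        SHELLEXECUTEINFOW sei = { sizeof(sei) };
        sei.lpVerb = L"runas";                    /* request elevation */
        sei.lpFile = L"C:\\Windows\\regedit.exe"; /* example target */
        sei.nShow  = SW_SHOWNORMAL;
        if (!ShellExecuteExW(&sei)) {
            /* ERROR_CANCELLED means the user clicked "No" on the prompt */
            return (int)GetLastError();
        }
        return 0;
    }

The prompt itself is all-or-nothing, which is exactly why people learned to click "approve" reflexively.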


It was also hated by admins, since the transition to the new automation interfaces was half-assed and broke a ton of client scripts.


MS always seems to have had a schizophrenic relationship with API-exposing automation.

It's like they have "Make it easy to manage" as a priority, then don't talk to admins with non-MS experience, and then build features in a bizarre way.


They also introduced a GPU-accelerated desktop, IIRC.


Microsoft's incredible journey with .NET.



