Calling the presentation "playful and light" is not how I, or most of the internet, would describe it. The presenters felt like they were on the verge of tears, it was strangely off-putting, and the corporate emotion in the presentation was cringeworthy. And that's if you could even watch it without the stream crashing.
It really was off-putting the way the main guy was saying words like "this is so exciting" while having zero facial emotion and a dead look in his eyes. Felt like somebody was holding a gun to his head and making him read off a prompter.
I'm actually wondering if Stratechery watched the same presentation. I don't think anyone could watch the same presentation and call it "playful" or "light."
It reminds me of the new customer support scripts where they make agents use empathetic phrases. "Awesome, let me help" and "That's frustrating" sound worse coming from humans who don't care.
All the Microsoft presentations have that now - every second sentence describes something as "awesome". It's really off-putting; I don't mind if someone is feigning enthusiasm, but they lay it on so thick.
You don't always have to smile or emote to project engagement. If you look at him introducing the Surface Laptop 2 at about 1m30s[1] through to 2m -- the passion and energy are unmistakable. Listen in particular to his voice, and watch how he moves around and what he does with his hands.
Compared to that, and especially to a casual observer, he looked disengaged at the Win11 launch event. But it's probably a consequence of having to perform in front of a camera and not being able to move around as much.
Reddit comes up with a lot of things, but now they're mocking Panos by calling him Thanos, and Microsoft didn't help itself considering over 50% of PCs don't meet the system requirements.
I enjoy watching Apple's keynotes with the presenters delivering extremely polished and scripted rundowns of each new product and feature; they come across as having tailored their emotions to fit the desired tone of the presentation. It's very different to me from the way that Panos comes across when he presents things.
I must be in the minority here, but I actually find it very refreshing to watch him speak in these sorts of events. I'm guessing that you are mostly talking about Panos when you mention that "presenters felt like they were on the verge of tears" since the other presenters seemed to me much more relaxed and "normal", for lack of a better term. From my own perspective he comes across as genuinely being excited about the work they're doing and doesn't seem to have a problem falling into his own rhythm while presenting.
Intellirogue's comment mentioning his facial expressions also made me realize that his presentation style might just be closer to what I see in myself; I find myself in situations quite frequently where I am genuinely excited about something and those around me think that I'm feigning excitement. Whether or not his delivery style is universally more palatable to people, I personally feel more connected to it and there is something about it that ends up coming across to me as more relatable and sincere.
I also think it might be a positive thing for us to leave open the possibility that some of these execs are as passionate about these things as anyone on HN would want them to be. I would love to have people at Microsoft and Apple and Google creating things that they feel are so incredible that they're on the verge of tears.
I wish people were that excited and felt it was that level of incredible too. However, the presentation felt, to be honest, forced, like the entire emotion was an act.
Nothing in the presentation was something to cry over. Nobody in the audience is ecstatic, nearly crying, about how Teams is built into the OS, or that the Start Menu button is now in the center and how that now puts you in the center (his words). We all know this isn't a big deal and yet he's so emotional about it, and that's off-putting and makes it feel fake and forced.
It's an extreme reaction to details that, frankly, most of us are watching in horror. Upsetting 26 years of muscle memory is a good idea? Building Teams into Windows, which almost no family uses and which is a resource hog, is a good idea? An idea so good it's worth trying not to cry over?
" they come across as having tailored their emotions to fit the desired tone of the presentation."
They are coached and it's scripted.
Watching an Apple Presentation is like watching really bad actors try to read a dramatic script, which is fine, because they're not actors and there shouldn't be a script, but it's still funny.
It's sometimes a little bit comical - when Tim Cook raises his voice to say something more poignant - you can just see the 'forced' nature of it. I can imagine the poor coach, like a high school choir director flailing her hands, trying to get 'a little more emote'.
Watch their hand movements - sometimes it's really overly expressive and unnatural.
In particular, listen to the pacing. For a long time now they've really slowed down the cadence, it's almost odd.
In the end I don't think we should read any of this into anything. It's basically irrelevant.
But they are cringe in the way a Pixar movie is; it's kinda meant to be hokey at times to humanize the presenters without trying to be too smart. They understand the image they are presenting really well, whereas Microsoft has no idea what they are fumbling into.
Few things are more cringe than that Mission Impossible sequence from the one last year. It's worrying how many yes-people it must have gotten through to be made.
I think "reigning supreme" is about more than units … I think of "mindshare" and per-unit value.
20 years ago, if you asked people "what does a computer look like?" they thought of Windows, and they were excited about it. Today, it's Apple.
It's the classic "what devices 16 year olds are asking their parents for?" survey, which hovers around like 90% pro-Apple. That's a future dollar, and one that will very likely go to Apple (unless they trip over themselves / get too high on their supply).
Also! Apple undoubtedly gets the most value-per-unit … App Store sales ($138/user/year! [1]), iCloud subs, branded accessories, line-extension stuff (HomePod, Apple TV, etc.). So while they may not dominate in unit percentage, they probably get like 80% of the total revenue* in the space.
> It's the classic "what devices 16 year olds are asking their parents for?" survey, which hovers around like 90% pro-Apple.
Are you sure about that?
Windows dominates the high-end computer gaming market, which a lot of this demographic is into. That's a $40+ billion market, representing a large number of Windows users.
It doesn't really FEEL like Windows has this kind of market share, does it? I think if you could factor out corporate sales data from market share figures, this would be much more instructive, and would explain why Windows is no longer driving the computing world. From my circle of family, friends, and acquaintances, I think the desktop market share is more like 50/50 (vs Mac) when people are voting with their own dollars.
Outside of the US, no amount of marketing would convince a significant number of people to buy a product for home use that has nearly 100% of the same functionality (maybe even slightly less) at a 2x markup. Elsewhere they just convinced most people that the iPhone is a status thing.
Apple has 1.65 billion active devices (iPhone, iPad, Mac). 1 billion active iPhones alone.
That's neither here nor there, though, and I would hardly say that means that Apple "reigns supreme", or remotely close. It seems like they're an important player among important players.
Microsoft, Apple, Google (with Samsung as a footnote), and to a much lesser degree Sony dictate our digital life now.
I mostly agree with this point. The only way it's true is that the iPhone is so popular and is both a higher-spending and less fragmented platform compared to Android, so most mobile software goes iPhone first.
But IMO there really isn’t a single dominant platform right now, with Windows, Mac, iOS, Android, and Linux all being highly relevant in different niches.
How could it not be 1.5bn? Every business desktop and laptop computer runs MS Windows and Office (yes, we all know there are exceptions, but they're negligible relatively speaking). If anything, I'm surprised it's that few.
Did we watch the same Windows 11 announcement? It was horribly cringey and corporate.
I swear at one point a presenter said something like "the start button is in the center because the product puts you at the center," and made assertions about how Windows enhances how you "engage with the product."
I agree. And the start button wasn't even at the center. A random app is in the center; the Start button is still on the left of a centered group of icons.
I don't think I've seen Panos Panay less enthusiastic than this in any of his presentations.
I can see why Microsoft might insist on Windows Home users having a Microsoft account, but I don't get the reason for Windows Pro users also being required to have a Microsoft account.
Because she won't back up her files, and if she gets ransomware she'll lose them. But if she has a Microsoft account, they'll be on OneDrive. Which she would never have set up if it wasn't for Microsoft's nagging.
Exactly. Windows Defender comes with Windows for free, and does a pretty good job protecting home users. However, ransomware is still the most prolific cyber threat for home users, and Defender can't do much against it if the malware executes before Defender can stop/detect it.
Integrating OneDrive automatically for Photos, Documents, etc., can seriously aid in protecting those important files for end users, since the un-encrypted versions of their files will still be available on OneDrive after a successful attack.
What if Defender was only an option for Microsoft account users? Then it'd be "look at these greedy assholes that just want to collect all your data", when really it'd be an effort for them to protect the reputation of Defender by forcing full functionality by default.
I'll preface by saying I'm glad Windows Defender doesn't do this automatically, but there are security software suites which do provide some protection against cryptoware, at the cost of performance. They tie into the write commands at the OS level and cache the changes while they watch the behavior. If they think the actions are probably malicious, they lock the flagged processes out of making any additional file changes and rewind all the changes those processes have made.
I've yet to actually see it work against real ransomware (I have never experienced a ransomware attack either professionally or personally), but I've definitely seen secure-delete programs attempting a low-level overwrite with random data get interrupted by such systems. I once told sdelete.exe to wipe an entire directory. It furiously ran, did a bunch of disk IOPS, halted a minute or two into the process due to a permissions error, and all my files were fine. A second or two later the security software notified me of a cryptolocker infection it had prevented and happily showed me the command I had entered moments before.
It's no free lunch though. IOPS performance is a little lower and there's more CPU/RAM usage per IOPS when the protection is on.
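For anyone curious, the "cache writes, watch behavior, rewind" idea looks roughly like this in the abstract. This is only a toy TypeScript/Node sketch, not how commercial suites actually work (they hook file writes at the kernel/filter-driver level and attribute changes to specific processes); the watched directory, shadow directory, and burst threshold here are all made-up assumptions:

    // Toy illustration of behavior-based ransomware detection + rollback
    // (userland only; real products intercept writes at the kernel level).
    import * as fs from "fs";
    import * as path from "path";

    const WATCHED_DIR = "./documents"; // assumption: directory to protect
    const SHADOW_DIR = "./.shadow";    // assumption: where known-good copies live
    const BURST_THRESHOLD = 50;        // assumption: >50 changes per window looks malicious
    const WINDOW_MS = 10_000;

    // Snapshot everything once up front so there is something to rewind to.
    fs.mkdirSync(SHADOW_DIR, { recursive: true });
    for (const f of fs.readdirSync(WATCHED_DIR)) {
      fs.copyFileSync(path.join(WATCHED_DIR, f), path.join(SHADOW_DIR, f));
    }

    let recentChanges: number[] = [];

    fs.watch(WATCHED_DIR, () => {
      const now = Date.now();
      recentChanges = recentChanges.filter((t) => now - t < WINDOW_MS);
      recentChanges.push(now);

      // A burst of modifications across many files is the classic encryption signature.
      if (recentChanges.length > BURST_THRESHOLD) {
        console.warn("Suspicious write burst detected; restoring shadow copies.");
        for (const f of fs.readdirSync(SHADOW_DIR)) {
          fs.copyFileSync(path.join(SHADOW_DIR, f), path.join(WATCHED_DIR, f));
        }
        recentChanges = [];
      }
    });

The real suites also block the offending process from making further changes, which a userland watcher like this can't do.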
To be fair, they aren't (at least not yet). You can still set up a local account.
As for why/whether Microsoft should decide that, it's kinda their call. Plenty of other apps/services/devices require an account of some type. (effectively) All require agreement with their terms of service.
By your logic, why aren't you irritated that Microsoft requires any type of account at all for Windows (same for any other OS)?
With Xbox I'm unable to play offline games if the Xbox knows it needs an update, even if I have the network connections disconnected. There's no logical reason for this. I was in a situation last year where I had purchased an Xbox with a game that was allegedly playable offline but was unable to play it because the Xbox Live service was down (but somehow the console knew it was lacking an update).
In light of this, I don't trust that some future outage of a Microsoft service won't prevent me from logging in to or using my own computer, which would be totally unacceptable.
You can skip it with Pro. It's easy to overlook, but the option is there.
You can also set up a new Windows 10 Pro installation without a network cord plugged in, and it dramatically simplifies the setup process, skipping the whole MS account dance other than telling you that you can do it if/when you connect the machine to the network later.
I begrudgingly switched to Macs about 10 years ago, but still use Windows at work and admin my kids’ machine because they wanted to build their own and play games.
Windows at work is locked down and I guess functional for the thousands of people who want to write word docs and just need things to work.
My home Windows is a nightmare to admin and has made my kids not like computers (particularly sad because my youth was spent poking around Windows). It has malware despite all the Windows bloat that constantly pegs the CPU. It frequently prompts for admin for unknown reasons. It forces updates that take an hour. And it has things that don't work, like screen time limits that block the UI but still allow processes to run, with audio and video.
I have a Pi-hole and it's constantly calling out to numerous services.
It's such a terrible experience from an admin and user perspective. I hope Windows 11 helps. I suspect that we'll just give up games and buy a PS or Xbox and the kids will live on iPad or Android with a game system.
Windows has an opportunity to be a hub, but their stuff is so bad right now.
Was the Windows team/division being split and put partly under Azure and partly under Office 365 discussed here? I'm curious about commentary from Microsoft folks.
"PCs became “good enough”, elongating the upgrade cycle"
This is really visible for me. I had historically owned relatively recent hardware. No longer. My various PCs are all 4th, 5th, and 6th gen Intel. I do a non-trivial (though not huge) amount of software development, use a fair amount of native apps, etc. In other words, not just "surfing the web". And the performance is fine/acceptable. I care much more about 16GB of memory and an SSD than I do about the CPU or even the GPU.
I don't know, however, how common that is. I'm sure there are a lot of people for whom recent hardware still matters a lot.
Nadella: In our case at Microsoft, I’ve always felt that, at least the definition of a platform is: if something bigger than the platform can’t be born, then it’s not a platform. The web, it grew up on Windows. Think about it. If we said, “All of commerce is only mediated through us,” Amazon couldn’t exist, if we had somehow said, “We’re going to have our own commerce model.”
Yes, it was born on Windows, and you did everything in your power to strangle the web in its crib. And now, when it's convenient, you try to take credit for birthing it?
There's some wiggle room here, depending on how you "count". The first HTTP client and server were certainly created as NeXT apps, and I love NeXTSTEP so I love mentioning this trivia, but the reality IMO is that the web people actually used in the 90s, and the web that evolved into what we have now, doesn't really resemble Tim Berners-Lee's vision for what he called "the web" or his client/server.
So as far as popularizing the web for the masses and developing the de facto WWW so many people experienced, I think the earlier DOS and other Unix browsers like Mosaic were more instrumental, with Internet explorer and Netscape really "birthing" the web we know.
> So as far as popularizing the web for the masses and developing the de facto WWW so many people experienced, I think the earlier DOS and other Unix browsers like Mosaic were more instrumental, with Internet explorer and Netscape really "birthing" the web we know.
In 1995, Windows 95 was just being born. The majority of people used protocols other than HTTP to explore the internet. I was searching for a DOS browser to run on the university computers (diskless DOS machines) to see the new web. Then MS bought Spyglass (or something) and destroyed Netscape by bundling IE with Windows. Windows did not bring the web to the masses any more than Netscape, Chimera and the others did. They just happened (as today) to hold a monopolistic position in the market and exploited it.
This is actually kind of an amazing story when you dig into it. Microsoft licensed Spyglass -- which was the commercialized version of the original Mosaic -- on a royalty basis. Microsoft's version of Spyglass was Internet Explorer 1.0, part of the commercial "Microsoft Plus!" package. But then Microsoft released IE separately for free, and when it was bundled with Windows, a bit of accounting sleight-of-hand treated it as $0 direct revenue (after all, it's just freeware they're not making you download separately, right?). So, the royalty payment to Spyglass became: $0. They still paid a fairly small flat minimum, but that was it.
In case you or others are not aware: "amazing" and "awesome" are commonly inferred to have positive connotations in the native English-speaking world, but the dictionary definitions of these words carry no such connotations.
Amazing: causing great surprise or wonder; astonishing.
Awesome: extremely impressive or daunting; inspiring great admiration, apprehension, or fear.
So, "Amazing" is not incompatible with "Despicable". It's like a magnitude without a direction (whereas "despicable" is magnitude and direction).
We might even see MS popularizing Progressive Web Apps!
> That is going to be the fundamental challenge in such a world, but we feel that there are ways. One of the ways I look at this is you can light an Android app or a PWA app or a UWP app on Windows in the future, or even today, for some of the new AI APIs.
Would it be fair to say that the web we know today, with a few big players that gather all our data was born on Windows but the web we know and love, with an abundance of personal websites was born on Unix?
Probably not, but there is some correlation I guess
Probably not. Most people in the 90s had their personal websites in some shared hosting service, while clients across the board were and still are predominantly Windows.
Giving him the benefit of the doubt, he's not talking about literal invention, but rather adoption. Most early internet users were definitely on Windows and Internet Explorer was the dominant browser for many years. Even that take glosses over them cheating Netscape out of their spot, but really IE was a cutting edge browser at its peak.
I mean, AOL was a pretty big player for adoption of the internet and they had their own version of a walled garden. Before that, there were a lot of people using lynx on a text terminal in libraries/gopher clients/dialup to a local university. By the time Netscape/Internet Explorer were making it to the scene, they were competing for something already established and growing.
IE6 was actually a perfectly fine browser at launch, the trouble was that it stayed current for so long that getting IT infrastructure to abandon it took far longer than it should have. XP and Server 2003 had a similar predicament.
> IE6 was actually a perfectly fine browser at launch, the trouble was that it stayed current for so long that getting IT infrastructure to abandon it took far longer than it should have. XP and Server 2003 had a similar predicament.
My worst experience with Internet Explorer didn't even have to exist. Why did Windows 98 do Windows Update with Internet Explorer? Was there a technical reason why Windows Update had to be coupled with an open web browser session? My memory isn't that good but from what I recall it was really painful over dial up (technically up to 48kbps but in practice, you'd be lucky to get half of that).
Windows in 1994 was not the kind of walled garden where Microsoft could have forced users exclusively onto MSN. When the web turned out to be the winner, Microsoft had to adapt and build their own browser; they couldn’t just tighten the App Store screws like Apple would do in a similar situation on iOS.
I don't think you can compare what Apple and Microsoft are doing at two completely different points in the evolution of the internet. When Microsoft embraced the internet we didn't have ubiquitous TCP/IP yet. It was essential. Today we have the bottom layers, so what businesses choose to do on top is their, well, business.
He very carefully does not say "born on Windows" but "grew up on Windows". Be sure that the phrasing of a CEO's words in a press interview is very carefully chosen and checked.
Hah, QuakeEd, Quake's editor, was born on NeXT (written in Objective-C), but then ported to Windows (MFC) as that was (and still is) the typical developer's machine (or DOS back in the day).
Not to mention that MS was actually pretty late to the party. They didn't put any chips down on the growth of the web until Win98, which could already be marketed as "web focused" because... the web was already huge.
Someone please correct me if I'm wrong about the above, since I'm going off childhood memories here.
Internet Explorer was first released on August 16, 1995 [1] as part of the Microsoft Plus! pack for Windows 95. That said, IE 1 and 2 were pretty paltry efforts and Microsoft's attack against Netscape didn't really get going until IE 3, which came out in '96 [2]. As I understand it, this was also the first release of Internet Explorer that was packaged with the operating system, as it was shipped with the Windows 95 OSR2 service pack.
You are correct. Their initial attempts were messy back then too. By the time the iMac G3 came around in late '98, the web was the #1 reason people were buying computers and Apple was marketing how much easier their product was to set up for the web. Microsoft was truly, remarkably bad at winning the race, so they went off trying to shoehorn IE onto the Mac, which was a winning bet when the iMac became the #1 selling machine.
Around that time Gates made a speech that they were turning the company around to focus on the Internet. Unsaid was that they were quite late to the party.
Before that you had to install TCP/IP separately on any non-Unix boxes. Personally I had been using the net for years at that point.
Ha. Mainstream Windows at the time didn’t even have TCP/IP software to connect to the internet. I think it came in a later revision of Windows 95. Windows 98 did have it installed though.
Indeed. In fact it was J Allard's[0] 1994 memo "Windows: The Next Killer Application on the Internet" that caught the attention of Bill Gates. Before then it was clear many (most? everyone else?) at Microsoft didn't see the potential of the internet.
It was his memo that led to reshaping Microsoft's direction with Windows and the internet, with the inclusion of TCP/IP in Windows 95.
Allard went on to be a key member of the original Xbox launch team and Xbox Live. Unfortunately a lot of what he worked on post-Xbox was not as successful and he left Microsoft around a decade ago.
Since then he hasn't been involved in much of note, or at least nothing to the same scale and impact as his earlier work. Although topping his internet memo and the launch of Xbox and Xbox Live is pretty hard to do to be fair :)
95 shipped with a TCP/IP stack at launch but it wasn't checked for install by default. The big thing missing was IE 1.0 didn't ship with the OS initially, you had to get it via the Plus! pack until the later versions of 95. Came with pinball too though.
My first 'job' was going around to professors' houses and setting up their modems and Trumpet Winsock so they could connect to the internet through the university's modem pool. I was 12 and my dad came with me; the professors would reach out to him to get me to set up their internet. Hilarious!
Nadella's talking about how the web "grew up" on Windows; it's just rhetoric about how the dominant (client-side) portion of the web was facilitated through Windows.
>portion of the web was facilitated through Windows
I'm old enough to have lived through IE6. They did everything they could to hold the web back a generation, because they saw the future, and it was web services, and the open web doesn't exclusively run on Windows.
They did not, however, prevent you from downloading and installing Netscape like Apple is currently doing with iOS.
They absolutely tried to, and were only stopped because the Justice Department told them that bundling IE with Windows was an illegal use of monopoly power. It's interesting that Nadella gives Microsoft credit for that and not Attorney General Janet Reno, under whose authority the Justice Department pursued the anti-trust case against Microsoft that led to browser bundling being declared anticompetitive.
Bundling isn't the same thing. They proactively pushed IE on Windows users which was deemed anticompetitive, but they never prevented you from using Netscape or Firefox. And the only reason that decision was made then and is not applicable to iOS now is that Windows overall was so dominant as a desktop OS that it effectively owned the market. Apple can say that users who don't like Safari can just go buy an Android.
Your memory might be faulty. They got in trouble for bundling Internet Explorer with Windows, but at no point did they ever make any attempt to make it so you could not install an alternative browser.
> prevent you from downloading and installing Netscape like Apple is currently doing with iOS.
I don't know about Netscape, but I currently use a browser other than Safari as my default browser. Unless you mean the underlying framework is WebKit?
The funny thing about that phrase is that it was invented to describe Active Directory's relationship with LDAP and Kerberos. Active Directory is definitely one of Windows's killer applications, and rather than attempting to kill it, Microsoft instead makes a killing off of it.
"In 1996, the iframe tag was introduced by Internet Explorer; like the object element, it can load or fetch content asynchronously. In 1998, the Microsoft Outlook Web Access team developed the concept behind the XMLHttpRequest scripting object.[4] It appeared as XMLHTTP in the second version of the MSXML library,[4][5] which shipped with Internet Explorer 5.0 in March 1999.[6]"
This was Microsoft's single biggest contribution to the early web. Everyone likes to hate on IE but actually it did include a lot of innovation.
I remember this time as well and it's not really true. They pushed a lot of proprietary features like ActiveX, but IE overall was far more sophisticated than Netscape. It really pushed the DOM (albeit a hacky, proprietary DOM) as a way to dynamically manipulate pages. They also birthed XMLHTTP, which was the original mechanism behind AJAX. Netscape was pretty Stone Age by comparison.
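For context, the pattern that XMLHTTP made possible looked roughly like this once it spread beyond IE -- a hedged sketch in TypeScript, where the /api/data endpoint is made up and the ActiveXObject ProgID is the classic IE5-era one:

    // Feature-detect the standardized XMLHttpRequest and fall back to
    // Microsoft's original ActiveX control (the XMLHTTP object mentioned
    // above) on old IE.
    declare const ActiveXObject: new (progId: string) => XMLHttpRequest; // IE-only global

    function createXhr(): XMLHttpRequest {
      if (typeof XMLHttpRequest !== "undefined") {
        return new XMLHttpRequest();                 // modern browsers, IE7+
      }
      return new ActiveXObject("Microsoft.XMLHTTP"); // IE5/IE6
    }

    const xhr = createXhr();
    xhr.open("GET", "/api/data", true);              // hypothetical endpoint
    xhr.onreadystatechange = () => {
      if (xhr.readyState === 4 && xhr.status === 200) {
        console.log(xhr.responseText);               // update the page without a reload
      }
    };
    xhr.send();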
Some new technologies that are now crucial to web standards were developed as a part of IE. But to say Netscape was "Stone Age" in comparison is disingenuous and/or a misremembering of the past, especially given the five years after Netscape died, when the web stagnated on IE6 while Firefox and Safari slowly gained market share.
IE had many issues supporting other crucial (and standard) technologies like CSS, images + alpha blending, basic ease-of-use features (tabs, accessibility, extensions), security (a good chunk of the cipher suites was missing), etc.
It really does not matter what MS tried to do back then, as the average consumer was using Windows to access the internet.
Sure, MS was bad. Doesn’t change the fact that most people first experienced the internet through Windows. And it was that audience that brought business and more money.
I'm sorry, but your memory is either of a very specific community or just naive. Just the sheer number of applications around web development for Windows vs Mac at the time would disprove that belief. For every Cyberduck, you have a WinSCP, FileZilla, CuteFTP, SmartFTP, CoreFTP, CrossFTP, etc. Especially since the Macromedia suite was also available on Windows.
Having been in the community pretty deeply in those days, I can assure you that Apple didn't gain any sizeable market share until Web 2.0. The prototype.js + Rails developers definitely preferred OS X for their tool suite. The growth of backend developers, in general, led to Apple's growth, via developers needing terminal access and easy installation/management of backend tools.
I think I'm going back farther than you think. WebSTAR, BBEdit, and of course the Macromedia stuff, which was in its infancy. Macs seemed like a bargain compared to, say, a WebForce Indy.
I'm talking pre-Macromedia. The mid-90s to mid-00s were ruled by Windows developers on the web, with a slice for Linux and Mac users. Mac didn't become any sort of influence until sometime between 2006 and 2009 and certainly didn't gain its current popularity with web developers until 2012-2013.
If you're talking pre-Windows 95, certainly Mac was more popular than after its release. But the OS also had a much more sizeable marketshare[1] and the inverse was true (Windows was devoid of software and tools). At the same time, however, the web was also much smaller and certainly still in its infancy.
I think it's not just rhetoric. E-commerce was the thing that brought money to the Web, that let it grow up and grow big. And the vast majority of those transactions were made by people sitting in front of monitors with a browser on Windows.
MS wanted the web tied to IE and Windows, but I don’t recall them trying to kill the web or positioning themselves to take a percentage of all web commerce.
He didn't say it was born on Windows, he said it grew up on windows. Per your quote:
>In our case at Microsoft, I’ve always felt that, at least the definition of a platform is: if something bigger than the platform can’t be born, then it’s not a platform. The web, it grew up on Windows. Think about it. If we said, “All of commerce is only mediated through us,” Amazon couldn’t exist, if we had somehow said, “We’re going to have our own commerce model.”
That's hard to argue with. AOL made it, where Prodigy and CompuServe withered, because it embraced the WWW on .... Windows. I would also agree that Microsoft didn't do much to have the web grow up on Windows; in fact they actively thwarted it in many ways, including their embrace, extend, extinguish strategy. They mistakenly believed that controlling the browser was the key to controlling the internet.
In fact MS not only invented XHR (and the reason it's called that is that it originally was a semi-documented internal Windows API), but they also invented a bunch of HTML-native features that are currently being reintroduced into HTML5. And then there is the "JS frameworks" front-end crowd that will actively push back against doing anything non-trivial in native HTML5.
It's even richer than you think. This is not true:
> it was born on Windows
It was born on the NeXT operating system. That was the Steve Jobs creation that Apple bought and used as the foundation for OS X. Here's the actual history:
He said the web grew up on Windows. Internet Explorer invented XMLHttpRequest, paving the way for interactive web pages that can dynamically fetch data.
You can have web pages that dynamically fetch data without XMLHttpRequest. For example, one technique is to dynamically inject a <script> tag referencing the URL of a server-side CGI script which collects the data and sends it back encoded as a call to a JavaScript callback function to be executed on the client. It's not as clean, and it can only manage GET requests—so the URLs need to be unique to prevent caching—but it works.
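If anyone wants to see what that script-tag trick looks like in practice (the pattern later formalized as "JSONP"), here's a rough sketch. The endpoint and callback name are hypothetical, and the server has to respond with JavaScript that calls the named function:

    // Dynamically inject a <script> tag pointing at a server-side endpoint
    // that returns something like: handleData({"user": "alice"});
    function fetchViaScriptTag(url: string, callbackName: string): void {
      const script = document.createElement("script");
      // GETs may be cached, so add a unique query param to bust the cache.
      script.src = `${url}?callback=${callbackName}&_=${Date.now()}`;
      script.onload = () => script.remove(); // clean up once it has executed
      document.head.appendChild(script);
    }

    // The globally visible callback the returned script will invoke.
    (window as any).handleData = (data: { user: string }) => {
      console.log("got data without XMLHttpRequest:", data);
    };

    fetchViaScriptTag("/cgi-bin/get-user", "handleData"); // hypothetical CGI endpoint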
Windows was already dominant at the time. No other OS was as widely deployed, popular and well-known at the time. Most people I knew in the 90s had their first contact with the internet through a Windows box.
Microsoft may have been late to the party but, considering how they basically owned the client-side, they had time to recover. In just a few years IE ate 90%+ of the browser world.
Yes, but that doesn't alter the fact that Bill Gates saw the web as an existential risk for Microsoft, and did everything in his power to ensure that the web evolved in a way that would benefit Microsoft in general and Windows in particular. I find the second part of Nadella's quote to be especially ironic:
Think about it. If we said, “All of commerce is only mediated through us,” Amazon couldn’t exist, if we had somehow said, “We’re going to have our own commerce model.”
That is, in fact, exactly what Microsoft tried to do. They tried to force their own proprietary standards on the web. It was only after they were slapped down by the Justice Department that they stopped engaging in anti-competitive behavior and created the space for alternative web browsers to grow. Even then, it was almost too late. How many decades did we wait for IE6 to finally die so that we could write web sites that would even pretend to conform to modern web standards?
That's not the same thing. Bad as it was, MS/IE never prevented websites from existing, or even partially gated them by requiring MS approval, the way console video games and the iOS App Store do.
He said "grew up" as you quoted, then you said "it was born on Windows" which is untrue, and renders your entire point invalid.
While I agree that Microsoft worked hard to hold back innovation on the Web in its early days, you are changing the words of a quote so you can then attack the quote, and that's not a good look.
In the nineties we came within an inch of having the MS Internet. If IE with ActiveX had succeeded at displacing JavaScript we would be condemned to having one corporation controlling the entire end-user internet experience. We should count our lucky stars that they screwed up Windows Mobile so badly.
I didn't catch that quote -- that's an amazingly dumb thing to say. It's so dumb that I'm inclined to believe it was a mistake, rather than something Nadella believes. Maybe the PR folks had a little too much say in that statement because more than a few BS meters exploded.
Ignoring the statement for a moment, Nadella killed "Windows First" thinking. It took me a long time to believe they were actually doing it, but the evidence kept piling up: SQL Server, and then Teams (a component of Office); the fact that SQL works better on Linux than on Windows (embarrassingly so in frequent cases); Outlook 365 in Firefox working better on any OS than Outlook on Windows; and the .NET Foundation / open source .NET Core going from "a sort-of .NET for those of you who insist on using non-Windows stuff" to working better for nearly everything than the "Full Framework"[0].
Had that same sentence been altered by one easy-to-slip-on word, it would have been far more accurate ("ignoring" facts rather than outright lying about them): "The web, it grew up with Windows", or alongside Windows. Accurate would be "dragged Windows with it, kicking and screaming at times" and "Windows was actively hostile toward the web", but I don't expect the best CEOs of the least important companies to be that honest. I'd love it, but it might not fly on Wall Street.
Had Microsoft continued in its monopolistic behavior, it's arguable that MS tried to, and possibly could have, delayed the success of the Internet via its refusal to support internet protocols natively in the OS (Windows 95's earliest version lacked Internet Explorer, but also didn't include tooling -- IIRC -- to speak SLIP/PPP or anything outside of NetBIOS, including TCP/IP/UDP, by default). The browser wars could have done without the "kill Netscape at all costs" behavior, but even that produced tangible benefits at the end of the day.
I can't pretend to read his mind, but I doubt he doesn't recognize the stupidity in that statement -- it's possible, but it would be contrary to what I've experienced.
I'd love to know what Microsoft would have done with Nadella at the helm during that important time. I don't think what I know about Microsoft today means they'd have embraced the open web back then. I'd take it as a strong positive if Nadella, today, believed he would have led Microsoft here all the way back then, but there's so much hindsight in that conclusion. Most major players at that time were looking to "lock in users, using the internet, somehow", so it wasn't an obvious, correct choice that one could sell to shareholders. All of those major players (save Oracle) made a hard turn at some point and look nothing like they did then, culture-wise (Oracle was negatively affected by it all, but Larry Ellison's reality distortion field appears to be alive and well over there). In a lot of ways, "locked-in users" is the internet we have today, but the companies from back then largely didn't become the winners in today's locked-in world (save Amazon).
[0] See my other comment for the variety of things that Microsoft at least owes to Nadella's willingness to reshape internal culture in a way I wouldn't have expected would be possible.
Nadella’s true colours showing through. A few years back on here people were virtually giving the guy a metaphorical hand job about the new improved Microsoft.
So reality is as I described: hubris and marketing resulting in nothing but stagnated excrement for everyone unfortunate enough to have to deal with them. Again. Same as under the previous executives there.
If Microsoft hadn’t had their hands in our technology future we’d be further along and better off. All they have done is set our expectations low.
I’ve been a Microsoft developer, admin and architect for 30 years as well as a director of a gold partner and probably directly put tens of millions of dollars of revenue in their direction.
Hell I even have met Bill back in the early 90s on a couple of occasions and got some books signed.
I’ve also been on the end of having products pulled from under me causing massive rewrites of stable LOB applications, on the end of audits and bugs that are so abhorrent that their own internal division heads got involved at the time.
What's worse is that I burned possibly tens of years of human lives on things which were directly their fault, through absolute incompetence and negligence towards their customers, or through them burning their entire roadmap for something new and shiny.
So quite frankly I think I’m fairly qualified to tear them a new one when I see fit.
Edit: as always I notice that the moment the US wakes up any critical posts about MSFT magically start plummeting. Reputation that important? Earn it!
I remember that over a decade ago, around 2006, people thought Google would overtake Microsoft's business by offering free ad-supported alternatives to many Microsoft products. Hardly. Both have grown to dominate their respective industries.
It's interesting too that Google kind of stopped working on features in Docs/Sheets, so they never hit parity with Office. Close enough doesn't work for Excel.
Had they actually tried to replace Office, I think they could have.
It is very interesting what happened. Google did overtake Microsoft in many areas:
- The most popular operating system
- The more popular document suite
- Higher profits
Microsoft survived and thrives today even if they don't have the leadership in many areas.
With Azure's success, Windows is no longer Microsoft's most profitable product/service, and it hasn't been for a long time. Chromebooks got a small but stable fraction of the education and laptop market. Many activities that required a desktop can now be done more easily on a smartphone, tablet or smart TV. Web services are, by definition, multi-platform, making OS-specific features unneeded. The set of reasons to use a desktop won't disappear, but it is still getting smaller and will continue to do so for some time.
It is a matter of market demand... Windows is no longer as valuable as it was and will eventually become a second-class legacy product. When that will happen is hard to predict.
"What gives Microsoft more freedom-of-movement, though, is that Windows is no longer the core of its business. This remains CEO Satya Nadella’s biggest triumph"
Apple recognized this years ago about its OS, of course, but I'm not sure I agree with the statement above. The entrenchment of Windows in the enterprise is huge for Microsoft, and keeps the door propped open there for most of their efforts.
What I mean by 'keeps the door propped open' is that a generation of 'Microsoft stack' developers and development has created a corporate infrastructure landscape that has been paved and fenced by Microsoft's Windows (and to a degree, the .NET ecosystem).
This is an incredibly Silicon Valley-centered viewpoint. I'm sure the guy makes some good observations, but how can you take anyone seriously who starts with a comment like this? Apple is insignificant outside of the English speaking first world: North America, UK, Aus/NZ. I wish some of these people from those markets would get outside their bubble from time to time. There's a huge market out there that has never used nor wanted an iPhone or any other Apple product.
Not that they are bad, they just aren't all that special.
> Apple is insignificant outside of the English speaking first world
Japan has a higher share of iOS than the US or UK. Also China is ~20% iOS. Sounds small until you realize that percentage equates to more iOS devices than in the US.
Also Switzerland or Sweden have a lot of Apple users. All those countries have one thing in common: everything is very expensive there. So people have a lot of money and they can easily afford Apple products.
In Argentina for example an iPhone is around 2500 USD (100% import tax), an average citizen needs to save for a year to afford an iPhone.
An average Argentinian office worker should be able to put $2500 per year aside. If he doesn't need a new washing machine or new furniture, he could get an iPhone. But he would need to spend all his savings on just one thing.
Grandparent post was speaking about Apple generally: "There's a huge market out there that has never used nor wanted an iPhone or any other Apple product."
Yup. The starting price of an Apple computer in India is almost 10 times the median salary of a young worker / executive.
You will not find Macs in India, except among the 1%, or in movies.
I am sure that is true for most of South Asia.
It's Windows all the way down. There is a huge black market for pirated Windows copies here, and a decade back the standard PC setup was a basic Intel-based system loaded with a pirated Windows OS from a CD that cost about 2 USD.
Globally, about 270 million personal computers (including laptops) [1] and 1.5 billion smartphones [2] are sold per year. Apple makes up about 17% of global smartphone market share [3] and 55% of US market share [4].
How can Apple "reign supreme" over the industry, to the point where they're facing significant anti-trust pressure, while also having negligible marketshare? Clearly there is more to this supremacy than raw marketshare.
US marketshare and mindshare among the (US) people who decide such things. If all the relevant politicians and judges and their friends and families (presumably all wealthy) have iPhones...
As well as having 55% US marketshare as other posters have mentioned, they account for 42% of global smartphone profits despite only having 17% marketshare.
In the consumer market, maybe. In reality you are only taking into account your own country; in all other countries Macs are rare and are typically only used by creators (photographers, musicians, video makers, etc.) or by someone who develops iOS applications (because you don't have an alternative, or at least not legally; a lot of people just have a Hackintosh VM to compile).
Nowadays a lot of people that used a Mac have gone back to Windows, because, let's face it, it's more stable, it's faster, and it consumes fewer resources. I was once a macOS user, and the latest release was so slow on my 2015 MacBook Pro (which is still pretty good hardware) that it made me format everything and install Linux on it. Now it seems to me that I have a brand new computer...
In the professional market there is only Windows. In every big company everything is Microsoft: Windows, Office, Active Directory, Exchange, Windows Server, Azure. Probably the reason is that IT departments like Windows more since it's easier to maintain than Macs.
Regarding the iPhone, they are not as widespread as they are in the US. In my country iPhones have about a 19% share, and it's one of the European countries where they are most prevalent, and still it's a minority. Most people use Android phones and don't want to spend more than 200 euros on a phone.
Probably in the US iPhones are so widespread because you usually get them with your mobile plan, while in my country no one buys a phone that way, since you end up paying something like double the price and you lose the freedom to change provider as you wish.
I find this baffling. I'm in Sweden and I do see a Mac every now and then, especially among artists and designers. But most people I know use nothing but Windows and every place I've ever worked used Windows. The only place I've seen a lot of non-Windows machines is the university. They had a fairly even split between Windows and Linux.
As the parent mentioned, Macs are extremely popular in the SV development scene. Even large, stodgy 70,000 person companies like Cisco consider them the standard.
IBM, in a historic twist of fate, rolled out Macs to their entire fleet unless you opted out. They also said that using Mac, contrary to what you'd expect from the purchase price, saved them buckets of money.
People (consumers) see the Mac prices and balk because they're comparing to the Acer Aspire they can buy at BestBuy on discount for $300.
Businesses see the Mac prices (barring upgrades) and see that they're cost-competitive with their contemporaries.
Dell XPS, HP Elitebook, Lenovo Thinkpad Txx (or Carbon X1) are all in spitting distance and in some cases even more expensive than Macbooks.
That's before you even talk support contracts or yearly licenses (the OEM license on the unit is rarely the one you'll use, usually you have a volume license and a KMS server which also costs a lot of money).
Exactly. From a business' perspective (and according to IBM's own study so this isn't conjecture), Macs are significantly cheaper in the long term, and only slightly more expensive in the short term if even.
Also, when it comes to management for a small or even medium-sized business, look at Apple's approach to management: https://www.fleetsmith.com. Dead simple pricing, dead simple UI, and they provision your devices out of the box without any manual setup. You can just buy 50 MacBooks, Apple will automatically ship them set up to your employees for no charge, and you'll pay several bucks a month per device for management.
Now compare that to how you manage a Windows deployment. By comparison, it's unintelligible. You need to upgrade every device to Windows 10 Pro or Enterprise as part of a contract to join a domain, set up a domain server with Active Directory to have a user store, add connectors to connect the directory to the domain (which may or may not require a Windows Server, which may or may not require the arbitrary Windows Server CALs), and then you have to manually set up and wipe each device out of the box unless you managed to get a special deal with the OEM...
> and only slightly more expensive in the short term if even
As I found with Lenovo and their bizarre continued use of 1080p as a baseline everywhere in CURRENT_YEAR, if you specifically go looking for laptops with even just 1440p displays the prices quickly inflate towards Apple levels.
I don't see how having to replace an entire computer for a hardware fault like a broken SSD can save you a lot of money. Maybe in the old days, when Macs were still professional machines.
Nowadays Macs are the nightmare of every IT department: the smallest job that takes a couple of minutes on a PC, like changing a hard drive or a RAM module, is major surgery on a Mac, where it's still possible at all and not everything is soldered onto the main board.
You have an OS problem on a Mac? Good luck wasting hours reinstalling the operating system manually and then installing all the software. On Windows? You have your custom install ISO with all the drivers and software you need and you install it automatically. Or you just have a stack of cloned disks and, in case of a problem, swap the hard drive; on most enterprise computers, which have tool-less access to everything, that's basically the fastest way.
Also, Windows machines are usually favored by IT departments because they work well with all the Active Directory stuff, meaning you can manage everything centrally. Yes, you can join an Active Directory domain even on a Mac, but you can do fewer things than with a Windows machine.
And surely you have to develop a lot of custom tooling to administer a network made of Macs, while on Windows you can do that with standard tooling.
As mentioned by the sibling comment, there are a lot of iPhones here. Not enough that I would say Apple "reigns supreme" over the smartphone market and certainly not of computing as a whole. But including smartphones absolutely does make the claim less dubious.
Apple only reigns supreme over the niche it's carved out for itself, which has maybe grown a bit over the last years, but will still always be a minority (globally) if you compare it to Windows/Linux on desktop machines or Android on mobile. And that's ok for them, because they don't want to reduce their profit margins.
What I observe in NL (EU) is: the mass market is moving away from home PCs to handheld devices, either iOS or Android (about 50/50); for example, my dad uses an iPhone and iPad, my mom an Android phone and an iPad. Lots of students and devs/start-ups are using or switching to a MacBook.
I wonder if consumer desktops will be replaced by "phones" connected to a keyboard/mouse/screen at some point. It kind of would make sense. If you could run Windows11+Android apps on a new type of good phone this would be great imo. It would save lots of desk space too
I don't think he intended it to mean "general purpose computing" but "consumer computing," he mentioned iOS in the next line. If you include mobile devices, in the US at least, I can see the argument, although it still isn't a great one considering Android.
It doesn't "reign supreme" in consumer computing outside the US either. Rather, it's hilariously down - Apple has a mere ~2% market share in India (the largest smartphone market in the world after China).
Even that seems to be limited to, as another comment said, the English speaking first world. The reason why I don't have an Apple device is not that I can't afford one. I can easily afford their stuff, but I want a Linux computer and a phone that is cheap enough not to worry about. Plus, their smug, hyperbolic marketing is really off-putting.
If we're talking about phones, Android is very low maintenance as well.
If we're talking about Real Computers, Linux is more rough around the edges, sure. But it has Freedom. And it's a fantastic system for software development.
I can’t speak for android now but when I used it 4 years ago I actually suffered irrecoverable data loss due to buggy SD card handling and was constantly bugged by it every two minutes (Motorola handset) for various updates and issues. It did not respect my attention however hard I tried to configure it to.
So it went. I can’t trust it. Not only that with Google’s business model I can’t rationally invest in it. Oh and the handset was abandoned after 12 months. My backup iPhone 5s got an update on 14th June at 7 years old.
As for real computers, yes. The Mac I have is a nice terminal for Linux on a server but Linux on the desktop is not rough round the edges: it’s a nightmare for daily use.
Linux on desktop is very far from a nightmare. Sure, if you don't know what you're doing and you mess around with things, things will break.
If you use it as you would use Windows or Mac OS, then there are no more issues with Ubuntu from my experience.
I actually prefer Ubuntu over the other two, as the web developer experience is much better than on Windows and macOS. It's nice to use native tools. It's nice to be able to trust profiling. The reverse is also true for some things, where the other two win and you would have to use replacement software on Linux.
I've been through this discussion a thousand times before. I bought a T495 laptop officially tested with Ubuntu, and a kernel update broke it entirely due to bugs. X just craps itself and the whole thing hangs. I don't have the time or inclination to deal with stuff I expect to just work.
I do most of my dev work in Linux, but using the Mac as a terminal.
Like so many other things in economics, the top 20% of the market could very well be more lucrative than the bottom 80%. Hence Apple's perceived "supremacy".
They're barely even competitors, IMO. Apple's where you go to pay a premium for the closest thing to a just-works, well-integrated OS & device ecosystem that the market provides, for home users or individual pros (or small businesses that don't do much in the way of "fleet management").
Microsoft (rather, some manufacturer shipping MS-bearing machines) is where you go if you don't want, for whatever reason, to pay a premium for that, or you need the best business-oriented integration & management the market provides.
That covers about 99% of all personal and business computer users, between those two—the remaining sliver is mostly Linux and the BSDs. But there's not a ton of market overlap between them, I don't think.
I think you're missing the fact that Windows does offer a lot on the home front. Games and stuff like HTPC are just better on Windows.
Anyways they are competitors because people do switch between them. Currently I have a MacBook Pro, my next laptop absolutely will not be a MacBook Pro. It's the worst laptop I've ever owned.
"Games and stuff like HTPC are just better on Windows." But for how much longer? My first foray into M1 Mac ownership has shown me that for all but the twitchy first person shooters, Apples entry level chip could easily end my reliance on a separate Windows desktop for gaming. If I wasn't overly addicted to assets and mods in Cities:Skylines and thus need at least 32GB of RAM to support my oversubscription habit I'd still be rocking an M1 Mac today.
And Apple is just getting started. I was amazed at how many Windows-only games I could get to work with either CodeWeavers' commercialized, drop-dead-simple-to-use WINE implementation or the Windows ARM beta running on the Parallels beta.
If Apple's new SoC is able to woo more and more people to Macs, as I think it will, at some point developers will notice, the quantity and quality of Mac ports will improve, and I may not even need Windows running on my Mac to run all the games I'm interested in. Most of the games I run today have Mac versions already (although that didn't factor into my buying decisions since, as I mentioned, I also have a Windows box).
I look at this similar to the Blackberry before the iPhone. The year before the iPhone, if you had told me that Apple was coming out with a phone that would utterly displace Blackberry within the next five years, I would have laughed you out of the room - but that's exactly what happened. Nothing is guaranteed, but I wouldn't be shocked to see a significant shift away from Windows for quite a few segments of gaming.
Apple can't excel at games because powerful gaming hardware goes against their "thinness at all costs" philosophy.
Your post reminds me of the person who said Microsoft Office was going to go out of business when iWork for the web launched in 2015. I think any argument that goes "this reminds me of the iPhone" is a poor argument.
The GPU in the M1 is about as powerful as an Nvidia 1650 Ti in laptop form. Unless they heavily prioritize GPU development or work with AMD/Nvidia, they'll never be a player in the game market due to a serious lack of GPU power.
Absolutely, for many CPU-bound loads the M1 is a great processor. It just lacks GPU power, which is what you need for games.
> I think you're missing the fact that Windows does offer a lot on the home front. Games and stuff like HTPC are just better on Windows.
Well, yes, if you have any software that is must-have and it's only available for (or best on) one OS, that'll be what you choose.
Incidentally, I'd not put Windows at the top of my list for HTPC use. It would probably rank third or fourth, behind Linux and Android (if I didn't want to pay anything for the OS and wanted more of an appliance, or if I were using anything short of a quite beefy computer; Win10 is high-overhead and uses tons of disk) and behind macOS (if I wanted a desktop experience on the TV and most advanced media stuff to Just Work without messing with config files; old Mac Minis are great for this), unless I were also trying to PC-game on the TV.
>It's a very weird perspective to think that Mac's 5% market share actually matters and is a threat to Windows.
Yes it matters, and yes it is a threat. At least in certain markets.
Mac usage share has increased from near-zero to 30% in the United States over the past fifteen years, while Windows has dropped from 90%+ to 60% in the same time period [0]. The trend is very similar in other wealthy markets. Windows isn't going anywhere, but the era of Windows hegemony in wealthy markets is now over.
The problem Microsoft has with Windows is that their moat (win32) is turning into an anchor. All that pandering to enterprise customers has created a hostile attitude towards end users.
win32 is "pandering" to enterprise customers? What would pandering to non-enterprise customers look like? Their non-win32 APIs (XAML/UWP/winrt) have all failed to gain market share.
Microsoft fleeced IT users with restrictive practices and overpricing. The Windows OS is still a rickety machine of labyrinthine code. Explorer is still pathetic. Nadella talks like an Indian politician.
>Microsoft, like Apple, is responding by doing what they do best, but, because it’s Microsoft, it’s the exact opposite of Apple: instead of more deeply integrating and doing everything themselves in an attempt to appeal to consumers, they are opening up and removing limitations in an attempt to appeal to developers, and by extension consumers who don’t want to be bound into Apple’s ecosystem.
This attitude is why I think Microsoft will continue to be successful and why Nadella is a much better CEO than Ballmer.
This article sums up a lot of my thinking pretty well, and at the same time I didn't find any of it particularly earth-shattering; it all seemed like a rather obvious place they've been heading[0]. But after reflecting on it for a moment I realized that ... this industry sometimes makes you think in terms of minutes, not years, and ... holy crap, are we really here?
Shortly after Microsoft started realizing the threat that i-devices posed to the entirety of their business (anyone who had used an iPhone/iPod Touch saw what was coming), they reacted by digging in ("patriotism"), with policy changes like "We'll buy anyone who wants one a Windows Phone 8 device..." because they were already buying an iPhone for almost everyone who asked for one, and they needed a way to say "we're going to make it a lot harder for you to show our customers that you prefer an iPhone." My best friend and I had jobs developing/architecting large MS-related things, so we regularly assessed the marketability of our skill sets. We both concluded: "Microsoft has one way out of this: give away[1] Windows, and probably start making Office work on Linux, or at least provide easy integration points for the open-source community to integrate with AD/Exchange[2]." I separately concluded that I needed to start using a Linux desktop, full time, because no part of me believed they'd do it. It's, culturally, too foreign for them.
And then they started executing on exactly that plan, consistently. When they'd make the inevitable "Microsoft-centric" decision and get blowback, they'd unexpectedly listen, and either thoroughly explain the decision or resolve it.
Windows 10 isn't free, as anyone who's built their own desktop knows. But it has effectively become the last version of Windows you buy, upgraded for life, and they made that retroactive to Windows 7 (do they still do this? not sure). Stuff mostly just works with regard to Linux interop -- NFS is a problem in some contexts, but most of the things that seemed to exist "just to make it painful to interoperate between the two platforms" are gone, and a lot of that is because Microsoft put money behind intentionally making "The Enemy OS" work, with care spent not just to "make it work" but to continuously fix the parts that aren't quite good enough.
The best part of it, though, is that they're operating as the "opposite" of their past behavior with open standards/Linux in general. I wouldn't have been surprised to see an OK implementation of Linux interop on Windows (Windows Services for Unix was the start of that idea), with no attention paid to getting MS apps running at all on the Linux side. I'd have been quite surprised if they wrote *anything* for Linux, but they made the important things work well on Linux.
I'm an enthusiastic full-time user of OpenSUSE Tumbleweed, with all of one disused Windows OS running in a virtual machine, who writes software almost exclusively in C# (console, web and even an occasional GUI, at home, anyway).
It's not just that they "open sourced a bunch of really important things"; it's how they approached opening up things like .NET Core. They completely failed with the full framework's "Shared Source" crap. They recognized the complete loss of trust that all of us felt (and I am, and have been, a happy-minus-a-few-things Microsoft customer all of my career).
How many companies much smaller than Microsoft release their frameworks/libraries/tools/software as open-source products but simply refuse to accept a single pull request? Basically "developing it transparently" -- a good thing -- but failing to "develop in the open". How many tiny little edge cases around the various framework Collections required me, in a high-performance scenario, to write an ugly hack because my performance edge case wasn't worth fixing? Microsoft didn't settle for developing transparently while accepting only internal PRs. They didn't stop at the next step either, benevolently (but probably reluctantly) accepting the occasional PR. They actively encouraged developers to submit pull requests, with swag and recognition and, most importantly, a willingness to accept PRs that aren't all that important to Microsoft (or maybe to anyone but that one developer, as long as they don't negatively impact anything). They even hosted it on GitHub, despite owning/pushing a competitor at the time, because "that's where everyone really is". And a look at some of the core logic around collections (and Enum, for Pete's sake) shows that all of those previous edge cases are not only solved, they're solved really, really well[3].
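(Illustrative aside, not from the comment above: a minimal C# sketch of the kind of built-in Enum operations that used to push people toward hand-rolled libraries on the full framework and that are fast out of the box in modern .NET. The Permissions enum is a made-up example, and the generic Enum.GetValues<T>/TryParse overloads shown are the .NET 5+ ones.)

    using System;

    // Hypothetical flags enum, purely for demonstration.
    [Flags]
    enum Permissions
    {
        None  = 0,
        Read  = 1,
        Write = 2,
        Admin = 4
    }

    class EnumDemo
    {
        static void Main()
        {
            // Parsing a name into a value without reflection gymnastics.
            if (Enum.TryParse<Permissions>("Write", ignoreCase: true, out var p))
                Console.WriteLine(p);                           // Write

            // Strongly typed enumeration of values (generic overload, .NET 5+).
            foreach (var value in Enum.GetValues<Permissions>())
                Console.WriteLine($"{value} = {(int)value}");

            // HasFlag was a notorious perf trap (boxing) on the full framework;
            // in .NET Core the JIT reduces it to a simple bit test.
            var mine = Permissions.Read | Permissions.Write;
            Console.WriteLine(mine.HasFlag(Permissions.Write)); // True

            // Name formatting is also far cheaper than it used to be.
            Console.WriteLine(mine);                            // Read, Write
        }
    }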
God, please give me a time machine so I can see the reaction when I explain all of this to 12-years-ago me. I "would will have been[4]" more likely to think that this future version of me appeared out of some alternative timeline that I'd never actually see.
[0] Not a dig at the author/topic; I just felt that he was explaining something that "most people who care at all about this topic already know".
[1] Ideally "open source it" but I knew that was extremely unlikely simply because of legal reasons that are far more complicated than I care to understand.
[2] This was a long time ago. I'd say "Office for Linux" is a necessity, and the situation is acceptable today with the decision to ship a Teams for Linux client (albeit with fewer features); and I prefer to use Outlook on the web no matter the OS, since it doesn't endlessly hang.
[3] I was downright disappointed that a really nice generic Enum library that I wrote is completely pointless in .NET Core. Every single thing that was 10-100 times faster in that library is now slower than just using the built-in stuff. And .NET's implementation uses less memory.
[4] See Douglas Adams commentary on the difficulties of time travel and grammar.
> But it's effectively become the last version of Windows you buy, upgraded for life, and they made that retroactive to Windows 7 (do they still do this? not sure).
I upgraded -- for free -- a laptop running Windows 7 to Windows 10 about six months ago, long after the free upgrade period ended. I have a Windows 7 box that brings up a nag screen that Windows 7 is out of date and I should upgrade to Windows 10. This screen shows up every few days. So, they still do it.
For what it's worth, I realized after I wrote that original comment that I have a (rather powerful) desktop in my basement that I stopped using several months ago (really, my usage dropped off substantially about 2-3 years ago). I had just done a large hardware upgrade and re-activated Windows 7 Ultimate when a project heated up, which kept me from using it for almost six months. Then the PSU started to flake out, making the machine unstable under moderate load (even though the PSU tested A-OK), resulting in another six months of reluctant, infrequent use. It became so infrequent that the last time I touched the machine, it was to unplug the Ethernet cable as a precaution against accidentally using a two-years-out-of-date, out-of-support OS to check my e-mail.
I intended to upgrade it to Windows 10, but one of the worst projects of my life put that off for three months. By then I had grown accustomed to a new laptop loaded with Tumbleweed, which matched my work laptop, and I just never got around to it.
One of the de-motivators was expecting that I'd be forking out money for the upgrade. That's disappeared now, and I had already planned on sorting all of this out this weekend, because I'm at the point where, for the one or two apps with no Linux version of my preferred tool, either my muscle memory in that app is so strong that using the alternative is *tedious as hell, especially for a rare need*, or the alternative is so bad that calling it an *alternative* is highly misleading. Plus, I'm sitting on a 2070 that I acquired from family near MSRP to add to the machine's existing 1070s, and I'm pretty sure my kids will kill me if I wait any longer.
It's impossible for Microsoft and Google to compete with Apple in the US. Every YouTube, TikTok, Hollywood, and Instagram celebrity uses AirPods/iPhones/MacBooks, which means Windows and Android will forever be associated with being a poor man's version of whatever Apple is offering.
You might say Apple's only dominant market share is in the US (50%), but the reality is that the US dictates what's cool and what's not.
Among silly comments on HN, this is the silliness of a 14-year-old. Ya can't be serious.
Hospitals, Wall Street banks and hedge funds, almost all the big F500 corporations, and the people who actually do meaningful work in laboratories, engineering plants and refineries, and factories making everything from microchips to cars to aeroplanes are more than likely to be on Windows.
You've been misled, which makes sense since Apple has more of an effect on impressionable people. The world literally runs on Windows. The internet runs on Linux. Apple makes premium consumer devices.
> Hospitals, Wall Street banks and hedge funds, almost all the big F500 corporations, and the people who actually do meaningful work in laboratories, engineering plants and refineries, and factories making everything from microchips to cars to aeroplanes are more than likely to be on Windows.
By saying something about iOS. Engineering plants don't run iOS.
Facebook could start fighting back against Apple by giving influencers with Android more visibility, or worsening React Native on iOS. I wonder how much of an effect it would have.
Apple is already screwing over Facebook (look at the features in iOS 15), which is why they could consider doing this.
Apple's philosophy is that being first is not the same as being best. You can be in fifth place, but if you do it "best", that won't matter. That doesn't mean they're necessarily copying, either; we've had leaks that they've been experimenting with foldables for years.
>The first is that Steve Jobs unabashedly hated styluses. When introducing the first iPhone in 2007, the Apple founder said, “Who wants a stylus? You have to get ’em, put ’em away, you lose ’em. Yuck! Nobody wants a stylus. So let’s not use a stylus.”