What moat? I see people reimplementing open source models in a matter of weeks. These are not huge code bases. And when you want to run those models at scale, every cost saving will make a big difference given their enormous computing requirements. I think it will become a cut-throat market.
Windows is also compiled for ARM, and Microsoft literally sells its own Surface Pro computers with ARM processors[0]. So this is a terrible analogy, for a lot more reasons than that one. Rather than go through all the reasons it's a bad analogy, I'll just get straight to NVidia's long-term moat:
NVidia has 6x the cash on hand (>$30B) of their closest competitor, AMD ($5B), and all of it can go towards GPU R&D rather than being split between CPU and GPU R&D (AMD has 10% of the $65B GPU market but 33% of the $50B CPU market — their CPU business is almost 75% of their total business). If changes are needed to stay competitive, NVidia has the war chest to adapt. Their cash on hand alone would let them fumble an entire release cycle and still catch back up.
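A quick sanity check on the arithmetic above (all figures are the rough estimates from this comment, not audited financials):

```python
# Rough figures from the comment above; treat them as estimates.
nvda_cash = 30e9          # NVidia cash on hand, >$30B
amd_cash = 5e9            # AMD cash on hand, ~$5B
cash_ratio = nvda_cash / amd_cash
print(cash_ratio)         # 6.0 -> "6x the cash on hand"

gpu_market = 65e9         # ~$65B total GPU market
cpu_market = 50e9         # ~$50B total CPU market
amd_gpu_rev = 0.10 * gpu_market   # AMD's ~10% GPU share -> $6.5B
amd_cpu_rev = 0.33 * cpu_market   # AMD's ~33% CPU share -> $16.5B
cpu_fraction = amd_cpu_rev / (amd_cpu_rev + amd_gpu_rev)
print(round(cpu_fraction, 2))     # 0.72 -> CPUs are "almost 75%" of AMD's business
```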
Sure, if ROCm ever achieves parity with CUDA then Nvidia’s margins will finally decrease but the only way they’re facing any existential threat is by massively fumbling internally. If you have reason to believe that Nvidia will start executing poorly, then that would potentially be a valid concern — but so far they’re continuing to do great work despite their massive lead and their prices reflect their technical lead over their competitors.
On Windows: you forget that historically Windows could not run on ARM, and it took MSFT years to make it work (hence it was a real moat), and now that it works it is a real threat to Intel's grip on the PC market. So it exactly proves my point: you want to know how locked in your customers are to your product.
And as Bezos says, your margin is my opportunity, and nvidia makes humongous margins right now. That will attract investment in competitors. And if a competing GPU is half as powerful but a third of the price, it will eat market share and compress margins. Unless customers cannot switch, but I don't believe that to be the case.
Training those models costs tens of millions. Running them at scale costs some order(s) of magnitude more than that. At some point those AI initiatives will need to return a profit. Do you really think developers won't have to learn a new API? User inertia works for retail users; I don't think it works so much for people who get paid to do it or have a financial incentive to switch. Provided there is a suitable alternative.
A 60 P/E ratio means you need 60 years of current profits to get back the price you paid for the company. That's very, very high by historical standards.
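Spelling out that arithmetic as a toy model (it assumes flat earnings and that all earnings flow back to you, which is the simplification the payback reading relies on):

```python
# Toy payback model: price is 60x one year's earnings, earnings never grow.
pe_ratio = 60.0           # P/E = 60
annual_earnings = 1.0     # normalize earnings per share to 1 per year
payback_years = pe_ratio / annual_earnings
print(payback_years)      # 60.0 years to recoup the price at flat earnings
```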
As Sun Microsystems' CEO, Scott McNealy, said after the bubble burst:
"At 10 times revenues, to give you a 10-year payback, I have to pay you 100% of revenues for 10 straight years in dividends.
That assumes I can get that by my shareholders. That assumes I have zero cost of goods sold, which is very hard for a computer company. That assumes zero expenses, which is really hard with 39,000 employees. That assumes I pay no taxes, which is very hard. And that assumes you pay no taxes on your dividends, which is kind of illegal. And that assumes with zero R&D for the next 10 years, I can maintain the current revenue run rate.
Now, having done that, would any of you like to buy my stock at $64?
Do you realize how ridiculous those basic assumptions are? You don’t need any transparency. You don’t need any footnotes. What were you thinking?"
High P/Es are more of a projection of growth than an expectation of time to pay back.
Also, McNealy’s comment isn’t quite applicable because stocks aren’t bonds. There’s an inherent value to the ongoing operation beyond just paying the earnings out.
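To make the growth point concrete, here is a toy payback calculation; the 25% growth rate is an arbitrary illustrative assumption, and 100% payout of earnings is still assumed:

```python
def payback_years(pe, growth):
    """Years until cumulative earnings reach the purchase price,
    assuming earnings grow at `growth` per year and are fully paid out."""
    earnings, cumulative, years = 1.0, 0.0, 0
    while cumulative < pe:
        cumulative += earnings
        earnings *= 1 + growth
        years += 1
    return years

print(payback_years(60, 0.0))   # 60 -> flat earnings: the naive 60-year reading
print(payback_years(60, 0.25))  # 13 -> 25%/yr growth shortens payback dramatically
```

The same multiple implies a very different payback once growth is priced in, which is why a high P/E is better read as a growth projection than as a literal waiting time.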
1) Cisco had good hardware, but nothing that commodity gear couldn’t also do. Nvidia have a substantial moat.
2) Cisco had a p/e of 700! Nvidia is at 60.
3) Cisco had a decent order book during the boom, but nothing to justify their price. Nvidia’s order book is… unholy.
I do think there's a lot of hype around the various FOMO/ridealong stocks - but nvidia, I earnestly think, remains undervalued.