I think their point is the billions in private investment which preceded those millions.
I think this is a common issue in computer science, where credit is given to sexy "software applications" like AI when the real advances were in the hardware that enabled them, which everyone just views as an uninteresting commodity.
> I think their point is the billions in private investment which preceded those millions.
But the "billions" didn't precede the "millions". They're just completely incorrect, and anyone that knows even a tiny amount about the actual history can see it immediately. That's why these comment sections are so polarized. It's a bunch of people vibe commenting vs people that have spent even like an hour researching the industry.
The history of semiconductor enterprise in the US is just a bunch of private companies lobbying the government for contracts, grants, and legal/trade protections. All of them would've folded at several different points without military contracts or government research grants. Read Chip War.
You seem to be arguing that the second the government touches anything, everything it does gets credited to the government funding column. Seems simplistic to me, but you can believe what you like. Go back far enough and there was only private industry, with basically no government funding until the space race.
Either way, the fact remains that the billions spent developing GPUs preceded the millions spent to use those GPUs for AI. Not sure what it has to do with polarization of the comment section. I assume it's just people seeking an opportunity to heap abuse on anything close to a representative of the evil "other side".
> Go back far enough and there was only private industry, with basically no government funding until the space race.
How do you think the railroads were built in the US? The bonds of the Pacific Railroad Acts date back to the 1860s. Pretty easy to build a railway line when the government foots the bill.
Government funding of research. We were talking about the NSF, after all, not free markets versus central planning.
On that, though, I read somewhere that the hierarchical, committee-led way the funding agencies operate is the same way communist systems dole out money for everything else. Not sure if they were being completely serious.
From 1901 up to FDR's election in 1932, 5 Americans won Nobel Prizes in the sciences. There was not much government funding back then, and not much was going on either.
So your argument is that nothing is communism? The fact that it's a single large organization allocating resources is rather key to the whole point. That the same organizational structure is doing it is what's interesting to me, anyway. I suspect this line of thinking is too triggering for some people, though.
A corporation is not an economic system, just a tiny participant in one. And I'd describe their decision making as hierarchical, yes, but driven by middle managers implementing the agendas of higher-ups, not necessarily by committees. When they operate by committee they tend to be at their worst...
Many industries are uninvestable in their early days. How many get to the point where private funding makes sense without initial government funding for fundamental science and research? Where will we be in 15 years if the government pulls funding from agencies like the NSF? We might find that the private money at that point is funding those future industries in other countries instead.
Seeing all the recent tariff fights and actually learning the story behind some of these industries, I am increasingly of the opinion that other countries take over industries through specific agendas that target them and then maintain a large degree of monopoly over them. The US has not reacted much because each country only took one industry or so, and it was a way to manipulate or appease them or whatever, but it is turning into death by a thousand cuts. I definitely think the US government needs to be a lot more involved than it has been, in a range of ways. That list of ridiculous-sounding cancelled NSF grants wasn't it, though. If you're talking about the SBIR program, that is pretty tiny; I assume it will continue, since it's legally set at 2% or whatever.
> You seem to be arguing that the second the government touches anything, everything it does gets credited to the government funding column.
Absolutely not. That is an obviously bad-faith interpretation of my comment.
> Either way, the fact remains that the billions spent developing GPUs preceded the millions spent to use those GPUs for AI.
Again, you're just obviously completely factually wrong to anyone who has even a modicum of casual interest in the history of these technologies.
> Not sure what it has to do with polarization of the comment section. I assume it's just people seeking an opportunity to heap abuse on anything close to a representative of the evil "other side".
And one more time for the people in the back. Anyone with any amount of actual knowledge on the topic at hand can immediately dismiss your entire argument because it isn't based in anything resembling fact. It's just you wishing or hoping that it might be somewhere close to true. This is just that scene from Billy Madison: "Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul."
I wonder if it has more to do with the approachability of software. If I even began to think I could compete with NVIDIA by delivering similar hardware, I'd very quickly realize I was an idiot. Meanwhile, as a single individual, there are still a reasonable number of commercial software markets I have some chance of tackling or competing in. As software complexity rises, it's becoming far less tractable than it was in, say, the 90s, but there are still areas that individuals and small sums of capital can enter. I think that makes the sector alluring in general.
Hardware is just capital intensive in general, and that's before counting all the intellectual capital needed. So it's not that it's uninteresting or a mere commodity to me; it's a stone wall: whatever is there is there, and that's it, in my mind.
That difference in difficulty is kind of the point. Imagine, as an extreme, that a company makes a machine that performs certain functions depending on which button combinations you press. A second company gets a patent for using the first company's machine to do various tasks by pressing various button combinations, uses of the machine no one had thought of yet. Now the second company has all the bargaining power in the market and so gets giant margins, despite doing a tiny fraction of the work it takes to make those tasks possible.
I wonder if our current system ended up this way because it is the most efficient in terms of specialization, or because the patent system drove things in this direction: the people last in line dealing with customers (i.e., those making the software layer) have the best information about what tasks customers want to do with their computers, and hence patent the solutions first, leaving hardware vendors no choice but to serve the software monopolies (one after another since the '80s).