I believe his underlying assumption is that the API is so cheap that there's no way they're making money off of it. Yes, it's paid, but that doesn't matter if they're losing money on every API call.
Not sure what you mean by “individual units”, but the suggestion is that it costs more than they charge, i.e. it’s not profitable, and the more they sell the more they lose.
My point was "making it up on volume" is largely irrelevant when it comes to mass market web-apps.
Costs are relatively fixed outside of infrastructure, and potential customers are any number up to and including the internet-connected population of the world.
The marginal cost of a new subscription is way less than they charge. The more they sell, the less they lose, even if they're still losing overall to gain market share.
ChatGPT isn't a web-app. It takes serious hardware to run the model, so more users means more hardware. If they're charging per token (which they are), then costs will scale linearly with usage, assuming 100% utilization per node. Anything less than 100% means an even greater loss.
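To make the utilization point concrete, here's a toy calculation. The dollar figures and throughput are made up for illustration; the point is just that a node costs the same whether it's busy or idle, so cost per token rises as utilization falls:

```python
# Hypothetical numbers, for illustration only.
node_cost_per_hour = 10.0                  # assumed cost to run one inference node
tokens_per_hour_at_full_load = 1_000_000   # assumed node throughput at 100% load

def cost_per_token(utilization: float) -> float:
    """Cost per token served; the node costs the same whether busy or idle."""
    tokens_served = tokens_per_hour_at_full_load * utilization
    return node_cost_per_hour / tokens_served

full = cost_per_token(1.0)  # node cost spread over the maximum number of tokens
half = cost_per_token(0.5)  # same node cost, half the tokens: double the cost per token
```

Halving utilization doubles the cost of every token served, which is why "anything less than 100% is an even greater loss."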
This depends on the quantum stepping of compute capacity....
That is, what is the upgrade cost to expand capacity as new customers are added? If, for example, adding 1 million new users requires $200,000k in hardware expenditure and $20k in yearly power expenditure, but your first-year return on those customers is only going to be $50k, you're in a massively money-losing endeavor.
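The arithmetic in that example works out like this (using the comment's own hypothetical figures, which are illustrative, not real cost data):

```python
# Figures taken directly from the example above; all hypothetical.
hardware_cost = 200_000_000   # $200,000k one-time hardware outlay for 1M new users
power_cost_per_year = 20_000  # $20k yearly power expenditure
revenue_per_year = 50_000     # $50k first-year return from those users

first_year_loss = hardware_cost + power_cost_per_year - revenue_per_year
print(f"First-year loss: ${first_year_loss:,}")  # First-year loss: $199,970,000
```

With those numbers the revenue doesn't even cover the yearly power bill, let alone the hardware step, which is the "massive money-losing endeavor" the comment describes.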
The point here is that we really don't know the running and upkeep costs of these models yet.