They're already bleeding money as far as I'm aware, so either they jack up prices enough to make up for $5bn in losses, or Sam Altman manages to convince investors to contribute another funding round.
Considering he was willing (and delusional) enough to ask for $7T for AI chips, I'm sure he'd try.
My guess, as other commenters have mentioned, is that they'll start building integrations with CRMs, ready-to-deploy RAG apps for enterprise knowledge bases, etc. There's still money to be made; how much, I don't know.
I've been using the GPT-3 API for some work stuff since 2021, and I remember prices dropping dramatically when GPT-3.5 Turbo came out. Now they're engaged in what I presume is a price war with Google and Anthropic. Anthropic already charges more, even for its Haiku model, citing "increased intelligence", yet it doesn't beat 4o-mini in benchmarks.
While there's money to be made in further integrations (hell, that's where I see most of the productivity gains from these tools coming from), OpenAI has already spent billions developing them, money they'll have to recoup somehow, and soon.
This is also ignoring the elephant in the room: open models. Soon enough, models like Llama will be able to match or even surpass ChatGPT, at which point why would any sufficiently large company pay for the API when they can run their own model, especially once all those GPUs used for training flood the market?
But then again, plenty of large companies still use AWS even when it makes no sense to go serverless, so they might have a market to capitalise on.
We live in interesting times for tech: Moore's law is dead, Intel is falling, layoffs are everywhere...
I sure picked the best time to go to university for Computer Science T-T
Those GPUs also cost a bomb to run, and LLMOps isn't super easy. I'm working with a large OEM manufacturer right now as a consultant; they're also experimenting internally with LLMs, but they have the resources to run those models. I don't see smaller companies having enough resources to experiment with various models at scale like they do.
>I sure picked the best time to go to university for Computer Science T-T
Going to be a bit contrarian here: while it's true that jobs can be tough to find and layoffs are discouraging, I genuinely believe it's also one of the most exciting times to be in the CS field. The fact that we can actually talk with an "algorithm" is still quite bonkers to me, because I remember fiddling with RNNs and LSTMs just to predict the next word or two in a sentence. There are still ways we can leverage it to make something really cool. Perplexity is one. Phind is another. Notion's AI integrations are great.
I graduated about four years ago and faced my own setbacks, including getting laid off from my first job. But despite those hurdles, I've managed to find my footing and am doing reasonably well now.
Just hang in there champ.
>Those GPUs also cost a bomb to run, and LLMOps isn't super easy. I'm working with a large OEM manufacturer right now as a consultant; they're also experimenting internally with LLMs, but they have the resources to run those models. I don't see smaller companies having enough resources to experiment with various models at scale like they do.
True, I sorta conflated running Llama on your PC with what large companies are doing.
Not to mention I was somewhat conflating ChatGPT the product with OpenAI the company. What I argued was that soon enough ChatGPT itself won't be that special compared to open-source models.
OpenAI the company is in the weird position of both having a moat and drowning in it. They have a huge advantage in skilled experts, engineers, and know-how, giving them a first-mover advantage, especially now that they're practically another subsidiary of Microsoft.
But they also have the notable disadvantage of having spent billions upon billions of dollars developing a model that, in the end, is little or no better than what one can get for free off the internet.
A small company with a few dozen specialists could offer a comparable product at a fraction of the cost, simply by not having to pay back the cost of developing their own model.
I feel like OpenAI will end up in a weird place soon, maybe something like a cloud provider for companies: useful for smaller ones where brand recognition and reliability matter, but having to compete with more specialised companies offering a similar service built on Llama. And at some point, large companies could just run open-source LLMs on their own servers with their own teams, bypassing OpenAI entirely.
The biggest winners here are those new small AI consulting teams that don't have to spend nearly as much, since they fine-tune models that are already made.
You probably know way more about these things than I do. What do you think of this prediction?
It doesn't sound as terrible for developers as I first thought, though it pains me to see how many people quit or never went into software development because of the AI hype. We lost a third of our class in 2023, and I assume things are even worse in America and other developed countries.
OpenAI's position is indeed paradoxical. They have a considerable lead in terms of expertise and infrastructure, yet that very advantage comes with the burden of their substantial development costs. A small, nimble company leveraging open-source models can undoubtedly provide some competitive pressure by offering similar capabilities at a lower cost.
Despite these challenges, I believe OpenAI has strategic avenues to sustain and grow. Their investments in integrations, enterprise solutions, and reinforcing the reliability and scalability of their models can maintain their edge. The trust and infrastructure they offer might still be appealing enough for many businesses to stick with them, similar to the AWS analogy you mentioned.
As for the job market and the future for developers, I see your point. The AI hype has indeed introduced some volatility. However, I am cautiously optimistic. The evolution of AI and its integration into various fields will eventually balance out, creating new opportunities even as it displaces others. I still believe we’re in a transformative period where mobility and adaptation within the CS field could lead to exciting new prospects.
To your last point, it's indeed tough to see talented individuals shy away from software development due to the current uncertainties. However, I hope this phase will pass, and those who remain will likely find themselves at the forefront of some groundbreaking developments. Let's hope our lord and saviour J-Pow has many more rate cuts for us in the future.