Sure, when availability and SLAs kick in… but reselling APIs will only get you so far. Perhaps the whole pros/cons-of-cloud argument kicks in here too; I won't go into it. We may well be on the same page, or we may both have valid arguments. Your comment is appreciated, indeed.
But then is the author (and are we) talking about experience in reselling APIs, or experience in introducing NNs into the pipeline? Not the same thing IMHO.
Agreed that OpenAI provides very good service, Gemini is not quite there yet, Groq (the LPUs) delivered a nice tech demo, Mixtral is cool but lacks in certain areas, and Claude can be lengthy.
But precisely because I'm not sticking with OAI, I can restate my view: someone that good with prompts can get the same results locally, if he knows what he's doing.
Prompting OpenAI the right way can be similarly difficult.
Perhaps the whole idea of local inference only matters for IoT scenarios, or whenever data is super sensitive (or the CTO is too stubborn to let it embed and fly). But if you start on day 1 with WordPress provisioned for you, ready to go in Google Cloud, you never come to understand the underlying details of the technology.
There must also be a good reason why Phind tuned their own model to offer alongside the GPT-4 APIs.
Disclaimer: tech education is a side thing I do, indeed, and I've been doing it in person for a very long time, across more than a dozen topics, which I think entitles me to an opinion. Of course business is a different matter and strategic decisions are not the same. Even so, I'd not advise anyone to blindly use APIs unless they appreciate the need properly.