Hacker News
hobs on July 6, 2023 | on: GPT-4 API General Availability
And you get only 20K tokens per minute, where a decent-sized question can use up 500 tokens; pretty much a joke for most larger websites.
https://learn.microsoft.com/en-us/azure/cognitive-services/o...
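The complaint above boils down to simple quota arithmetic. A minimal sketch, using the numbers from the comment (a 20,000 tokens-per-minute limit and roughly 500 tokens per request; the exact quota depends on the Azure deployment):

```python
# Back-of-envelope throughput under a tokens-per-minute (TPM) quota.
# Numbers are taken from the comment above, not from any current quota table.
TPM_LIMIT = 20_000        # assumed default GPT-4 tokens-per-minute limit
TOKENS_PER_REQUEST = 500  # a "decent size question" per the comment

max_requests_per_minute = TPM_LIMIT // TOKENS_PER_REQUEST
print(max_requests_per_minute)  # 40 requests per minute at best
```

At roughly 40 requests per minute, shared across an entire site's traffic, the quota is easily saturated, which is the point being made.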
weird-eye-issue on July 7, 2023
That's the *default* limit for GPT-4, which has more demand than any other LLM in the world.
hobs on July 7, 2023
Which just demonstrates my point: saying "go use Azure" doesn't solve anything.
weird-eye-issue on July 7, 2023
Yeah, for GPT-4 they aren't even accepting new customers.