
I suspect it's a balancing act between the AI being generally willing to help and avoiding responses like this, e.g.:

https://www.sandraandwoo.com/wp-content/uploads/2024/02/twit...

or it just telling you to google it




What (hypothetically) happens when the cost to run the next giant LLM exceeds the cost to hire a person for tasks like this?


Given that current models can accomplish this task quite successfully and cheaply, I'd say that if/when that happens, it would be a failure of the user (or the provider) for not routing the request to a smaller, cheaper model.

Similarly, it would be the failure of the user/provider if someone thought it was too expensive to order food in, but only because they were looking at the cost of chartering a helicopter from the restaurant to their house.
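The "route to the cheapest capable model" idea can be sketched in a few lines. This is a hypothetical illustration, not any provider's actual API: the model names, costs, and capability scores are made up for the example.

```python
# Hypothetical cost-based router: send each request to the cheapest model
# believed capable of handling it. All names and numbers are illustrative.

MODELS = [
    {"name": "small",  "cost_per_mtok": 0.15,  "capability": 1},
    {"name": "medium", "cost_per_mtok": 3.00,  "capability": 2},
    {"name": "giant",  "cost_per_mtok": 75.00, "capability": 3},
]

def route(required_capability: int) -> dict:
    """Pick the cheapest model meeting the capability requirement."""
    candidates = [m for m in MODELS if m["capability"] >= required_capability]
    return min(candidates, key=lambda m: m["cost_per_mtok"])

print(route(1)["name"])  # easy task -> "small"
print(route(3)["name"])  # hard task -> "giant"
```

The point of the analogy: only requests that genuinely need the "helicopter" model should ever be priced at the giant model's rate.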


Realtime LLM generation is ~$15/million "words". By comparison, a human writer at the beginning of their career typically earns ~$50k/million words, up to ~$1 million/million words for experienced writers. That's roughly 3.5 to 5 orders of magnitude.

Inference costs have several orders of magnitude of headroom before they approach raw human costs, and there will always be innovation driving the cost of inference down further.

This also ignores that humans aren't available 24/7, that their output quality varies with what's going on in their personal lives, that LLMs can respond faster than humans (reducing the time a task takes), and that human output can require more laborious editing than an LLM's.

Basically, the hypothetical seems unlikely to ever become reality unless you've got a supercomputer AI doing things no human possibly could because of the amount of data it operates on (at which point it might exceed the cost, but no competitive human would exist).


the R&D continues



