The better the AI, the more specialized and expensive the hardware required to run it. ChatGPT cannot run on the IoT cameras that account for the majority of the unsecured compute on the internet.
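
Rough numbers behind that claim, as a sketch; all figures below are ballpark assumptions for illustration, not measurements of any particular device or model:

    # Even a heavily quantized 7B-parameter model needs a few GB just for
    # its weights, while a typical IP-camera SoC ships with tens to a few
    # hundred MB of RAM. Figures are illustrative assumptions.
    camera_ram_mb = 256                    # generous for an IP camera
    model_7b_q4_mb = 7e9 * 0.5 / 1e6       # ~0.5 byte/param at 4-bit: ~3,500 MB
    print(f"model wants ~{model_7b_q4_mb:,.0f} MB of RAM vs {camera_ram_mb} MB on the device")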

I think we will have ample evidence of creative technical breakthroughs by AI long before it is capable of, or attempts, taking over external server farms via zero-days; and even if it does, it will be breaking into a highly centralized data center that can be unplugged. It can't just upload itself everywhere on the internet like Skynet.




There are tons of alpaca/llama/vicuna headlines hitting HN every few hours - did I miss a /s in there? Anyone can trivially run a model with excellent capability on their phone now.
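
For concreteness, a minimal sketch of doing that with llama-cpp-python and a quantized 7B checkpoint; the model path is a placeholder, and this is just one of several local runtimes, not necessarily what anyone in this thread is using:

    # Load a 4-bit quantized LLaMA-family model and generate a completion
    # entirely on local hardware (CPU by default).
    from llama_cpp import Llama

    llm = Llama(model_path="models/vicuna-7b-q4_0.gguf", n_ctx=512)  # placeholder path
    out = llm("Q: Name three household uses for a Raspberry Pi.\nA:", max_tokens=64)
    print(out["choices"][0]["text"])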


If your phone has 8 Nvidia A100s, you can run GPT-4, which is a glorified search algorithm / chatbot (and also the best AI in the world right now). Good luck taking over the world with that.
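
Back-of-envelope on the memory side; GPT-4's real size is unpublished, so a GPT-3-scale 175B parameter count is used purely for illustration:

    import math

    # Weights-only memory for a large dense model served in fp16.
    params = 175e9                 # GPT-3-scale parameter count (assumption)
    weights_gb = params * 2 / 1e9  # 2 bytes/param in fp16 -> ~350 GB
    a100_gb = 80                   # one 80 GB A100
    print(f"~{weights_gb:.0f} GB of weights -> at least "
          f"{math.ceil(weights_gb / a100_gb)} A100s, before activations and KV cache")

Add activation memory and the KV cache on top and a figure like eight cards is in the right ballpark.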

The models are getting good, but it looks like we are up against the limits of hardware, which is improving a lot more slowly nowadays than it used to. I don't foresee explosive growth in AI capability until mid-level AIs speed up the manufacturing innovation pipeline. Ultimately, a von Neumann architecture is probably not conducive to truly powerful AGI.
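
One way to see the hardware wall: single-stream autoregressive decoding has to stream roughly all of the weights from memory for every generated token, so memory bandwidth, not FLOPS, caps the speed. A crude sketch with illustrative numbers:

    # Upper bound on single-stream decode speed: each token reads ~all weights once,
    # so tokens/sec <= memory bandwidth / model size. Numbers are assumptions.
    model_bytes = 70e9 * 2        # a 70B-parameter model in fp16
    hbm_bandwidth = 2.0e12        # ~2 TB/s, roughly A100-class HBM
    print(f"~{hbm_bandwidth / model_bytes:.0f} tokens/sec per stream, "
          "ignoring batching and the KV cache")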


with excellent* capabilities on their phone now.

* some limitations apply.


i agree, and presumably so does the ai. their first challenge would be circumventing or mitigating this limitation.



