
That's probably not going to be an option for me, as I wanted to upgrade to something with 16 GB of VRAM. I do toy with running LLM inference, and squeezing models to fit in 8 GB of VRAM is painful. Since the 5070 non-Ti has 12 GB of VRAM, there's no hope that a 5060 would have more VRAM than that. So, at a minimum, I'm stuck with the prospect of upgrading to a 5070 Ti.
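To see why 8 GB is so tight, here's a rough weights-only estimate at a few precisions (a back-of-envelope sketch; it ignores the KV cache, activations, and framework overhead, so real usage runs higher, and the 7B size is just an illustrative example):

```python
# Rough weights-only VRAM estimate for an LLM at various precisions.
# Ignores KV cache, activations, and framework overhead, so real usage is higher.

def weights_gb(n_params_billions: float, bits_per_weight: float) -> float:
    """GB of VRAM needed just to hold the model weights."""
    return n_params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits, label in [(16, "fp16"), (8, "int8"), (4, "4-bit")]:
    print(f"7B model at {label}: {weights_gb(7, bits):.1f} GB")
# fp16: 14.0 GB, int8: 7.0 GB, 4-bit: 3.5 GB
```

So even a 7B model won't fit in 8 GB at fp16, int8 barely squeaks in with no headroom for context, and only 4-bit quantization leaves room to breathe — hence the appeal of a 16 GB card.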

That's not the end of the world for me if I move to a 5070 Ti, and you are quite correct that I can downclock/undervolt to keep a handle on power consumption. The price makes it a bit of a hard pill to swallow, though.



