Hacker News

The post says it uses "2GB or less" of VRAM.

A 1B-parameter transformer is at the small end of model sizes these days.
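For a rough sense of why "2GB or less" is plausible for a 1B model: weight memory is roughly parameter count times bytes per parameter, so 1B parameters fit in about 2GB at fp16 and in even less when quantized. A back-of-the-envelope sketch (illustrative only; it ignores KV cache, activations, and runtime overhead, which add to the real footprint):

```python
# Bytes per parameter for common weight precisions.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_vram_gb(n_params: float, dtype: str) -> float:
    """Approximate GB needed just to hold the weights."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

for dtype in BYTES_PER_PARAM:
    print(f"1B params @ {dtype}: ~{weight_vram_gb(1e9, dtype):.1f} GB")
# fp32 needs ~4 GB, fp16 ~2 GB, int8 ~1 GB, int4 ~0.5 GB
```

So the "2GB or less" figure lines up with fp16 weights, and quantized variants would come in well under that.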



