
First part is correct, the second part is not. GPT-Neo is a 2.7B-parameter model, while the largest GPT-3 is 175B (OpenAI offers various sizes, up to 175B). I appreciate the sentiment and what EleutherAI is doing with GPT-Neo, but there is no open-source equivalent of the full GPT-3 available for the public to use. Hopefully it's just a matter of time.



GPT-J is 6B and comes pretty close. In practice, I haven't noticed a difference.
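If you want to try it yourself, here's a minimal sketch using the Hugging Face transformers library with EleutherAI's public checkpoint (the prompt and generation settings are just illustrative, and the full-precision weights need on the order of 24 GB of memory):

    # Minimal sketch: load EleutherAI's GPT-J 6B checkpoint and generate text.
    # Assumes `pip install transformers torch` and enough RAM/VRAM for the weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
    model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

    prompt = "The open-source alternative to GPT-3 is"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs, max_new_tokens=40, do_sample=True, temperature=0.8
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same code works for the smaller GPT-Neo models (e.g. "EleutherAI/gpt-neo-2.7B") if you swap the checkpoint name.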

Keep in mind there are also closed-source alternatives: for example, AI21's Jurassic-1 models are comparable, cheaper, and technically larger (albeit somewhat comically so, at 178B instead of 175B parameters).




