If nothing else, actually training a model that big is a formidable engineering challenge.
Googling around, it looks like most neural networks have somewhere in the neighborhood of tens of thousands of parameters. GPT-3, at 175 billion parameters, is much, much bigger than most of its peers.
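To make that gap concrete, here's a minimal sketch (assuming PyTorch; the layer sizes are arbitrary illustrations, not any particular published model) that counts the parameters of a small network in that "tens of thousands" range and compares it to GPT-3's 175 billion:

```python
import torch.nn as nn

# An illustrative small network: a 2-layer MLP of the kind that lands
# in the "tens of thousands of parameters" neighborhood.
model = nn.Sequential(
    nn.Linear(100, 128),  # 100*128 weights + 128 biases = 12,928 params
    nn.ReLU(),
    nn.Linear(128, 10),   # 128*10 weights + 10 biases  =  1,290 params
)

total = sum(p.numel() for p in model.parameters())
print(f"small model: {total:,} parameters")   # 14,218

gpt3 = 175_000_000_000
print(f"GPT-3:       {gpt3:,} parameters")
print(f"ratio:       {gpt3 / total:,.0f}x")   # roughly 12 million times larger
```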