> In contrast, its ability to explain grammar is terrible.
To be fair, grammar can be an advanced subject, even if it seems easy on the surface (it isn't). But isn't that the same problem for every other subject? Outside of encyclopedic knowledge, it struggles with many subjects at higher levels (e.g. programming beyond the basics, math, and so on).
I think this is partially an issue of availability of data.
Something else to consider is that languages differ as to the complexity of their grammar.
Much of the research and training of LLMs has been done in English, but if these models were trained on as much data in other languages as in English, I wonder:
- Would LLMs do better or worse on intelligence and other tests if they were tested in those languages?
- Would conversing with LLMs be easier or harder in other languages?
Some languages, like Loglan or Lojban, might be especially suited to this sort of testing and interaction, as they were designed to be easy for computers to parse.