>> But isn't that the nature of scientific hypotheses? They are tenuous, only valid until you come across some data that tells you "aha, there's another piece to this rule".
Chomskyan grammars (CFGs) are widely used in compilers and similar tools to model computer languages, and even for limited subsets of natural language they don't do half bad.
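To make that concrete, here's a minimal sketch of what a CFG buys you for a tiny English fragment. It uses NLTK's CFG and chart parser (my choice of library for illustration; the toy grammar is entirely made up):

    import nltk

    # Toy grammar for a tiny English fragment (made up for illustration).
    grammar = nltk.CFG.fromstring("""
        S  -> NP VP
        NP -> Det N
        VP -> V NP
        Det -> 'the' | 'a'
        N  -> 'dog' | 'cat'
        V  -> 'chased' | 'saw'
    """)

    # A chart parser recovers the phrase structure of any sentence the grammar covers.
    parser = nltk.ChartParser(grammar)
    for tree in parser.parse("the dog chased a cat".split()):
        print(tree)
    # (S (NP (Det the) (N dog)) (VP (V chased) (NP (Det a) (N cat))))

Anything outside that tiny lexicon and rule set simply fails to parse, which is exactly the coverage problem at issue here.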
The problem with phrase structure grammars is that they're very costly to develop and maintain, and so far no such grammar has managed to model the whole of a natural language.
If there were a way to learn CFGs with good coverage from text, they'd be in much wider use, but unfortunately grammar induction is hard.
Also, Google has its own political reasons for not wanting to use grammars. They champion neural networks and statistical AI. It's their schtick, innit.