I am working on a system that parses the English language with a hand-written compiler and stores the resulting IR in a database, so that all human knowledge can be searched and queried in all its facets (who said what, when, and where) using the English language. I believe a database is the key to NLU, and that machine learning is mostly useless for true NLU: as far as I know, machine learning currently has no good way of interacting with a database, and without a full database of human knowledge, including who said what, it is impossible to truly understand human language. Storing all human knowledge in a neural network just isn't practical anytime soon.
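To make the idea concrete, here is a minimal sketch in TypeScript (the language the first-generation interpreter is written in) of the kind of faceted fact store described above. Everything here is illustrative: the names `Statement` and `FactStore`, and the flat record shape, are my assumptions, not the project's actual IR.

```typescript
// Hypothetical sketch of a faceted fact store: each parsed statement
// records who said what, when, and where, and any facet is queryable.
interface Statement {
  speaker: string;  // who
  content: string;  // what (a real system would store parsed IR here)
  time: string;     // when
  place: string;    // where
}

class FactStore {
  private facts: Statement[] = [];

  add(s: Statement): void {
    this.facts.push(s);
  }

  // Return every statement matching all facets given in the filter.
  query(filter: Partial<Statement>): Statement[] {
    return this.facts.filter(s =>
      (Object.keys(filter) as (keyof Statement)[]).every(k => s[k] === filter[k])
    );
  }
}

const store = new FactStore();
store.add({ speaker: "Alice", content: "water boils at 100 C", time: "2020", place: "London" });
store.add({ speaker: "Bob", content: "it rained", time: "2021", place: "Paris" });

console.log(store.query({ speaker: "Alice" }).length); // 1
```

A real implementation would of course index the facets in a database rather than scanning an array, but the query shape, filtering statements by any combination of who/what/when/where, is the same.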
I wrote a new programming language, Eek, because handling the complexity of all this was impossible for me in current languages, which lack built-in support for asynchronous database access and parsing. The first generation of the language is working, albeit as an interpreter written in TypeScript, and with it I wrote an English-language parser and a simple database. Now I am working on a better, LLVM-based implementation of Eek. I started this project about three years ago, and it will take a few more years before it is even demoable...