
I have a mathematics (though not physics) degree and I didn't understand your question at all; "forques" appears to be either a place name in France or a word in Old French or Catalan. I assume ChatGPT was correct in re-spelling "forques" as "torques", but have you tried asking Claude using words that do appear on the Internet?



Unlike you, both LLMs were familiar with geometric algebra and used the relevant terminology.

Testing on something widely known isn’t likely to stretch these systems.


I'd expect them to do better when the input uses words that appear more often in the training data.

This very thread is the fifth Google hit for `"forques" geometric algebra`; the third and fourth hits are the same paper; the second is https://bivector.net/PGAdyn.pdf, which appears to have coined the term; and the first doesn't define it.
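
For anyone else who bounced off the word: my reading of those PGA notes (hedging here, since I'm reconstructing rather than quoting) is that a "forque" is the single bivector that packages a force together with its torque, so the Newton and Euler equations collapse into one rate equation for the momentum bivector:

    % Sketch of my reading of the PGAdyn notes, not a quotation:
    % in 3D plane-based geometric algebra, linear and angular momentum
    % combine into a single momentum bivector P, and force and torque
    % combine into a single "forque" bivector F, so in the world frame
    % the combined Newton--Euler equation is just
    \[
      \dot{P} = F
    \]
    % with F the sum of the applied forques, e.g. a force of magnitude
    % \lambda acting along the line \ell contributing the bivector
    % \lambda \ell.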

I (logic, computability, set and type theory) am in no position to say whether it's a standard term in geometric algebra, but I strongly expect LLMs to do much worse on queries whose words barely appear in their training set (for which I take Google search results as a proxy). Even if they have the knowledge to answer, I expect them to answer better when the question uses common words. I do know that when I asked your question to ChatGPT, it silently re-spelt "forques" as "torques".
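
To make the frequency-proxy idea concrete, here's a toy sketch (my own illustration, with a made-up miniature corpus; in practice you'd want a large n-gram dataset or actual search hit counts): count how often each query term occurs, and treat a near-zero count as a sign the model has barely seen the word.

    from collections import Counter

    # Toy stand-in for a training corpus; in reality this would be a
    # large n-gram dataset or a web-scale index, as discussed above.
    corpus_text = """
    torque torques force forces geometric algebra bivector
    rotor motor momentum angular velocity
    """

    counts = Counter(corpus_text.lower().split())

    for term in ["torques", "forques"]:
        # A rare coinage like "forques" gets a near-zero count, which is
        # exactly the case where I'd expect an LLM to quietly "correct" it.
        print(f"{term!r}: {counts[term]} occurrence(s)")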



