Hacker News

> But how is any information that isn't testable trusted? I'm open to the idea ChatGPT is as credible as experts in the dismal sciences given that information cannot be proven or falsified and legitimacy is assigned by stringing together words that "makes sense".

I understand that around the 1980s, the dream was that people could express knowledge in something like Prolog, including test cases, which could then be deterministically evaluated. This does work, but surprisingly many things cannot be represented in terms of “facts”, which really limits its applicability.
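A minimal sketch of what the parent means (the names are illustrative, not from the thread): knowledge is written as facts and rules, and a “test case” is just a query that either succeeds or fails deterministically.

```prolog
% Knowledge base: facts and a rule.
parent(tom, bob).
parent(bob, ann).

grandparent(X, Z) :- parent(X, Y), parent(Y, Z).

% "Test cases": queries evaluated deterministically against the knowledge base.
% ?- grandparent(tom, ann).   % succeeds
% ?- grandparent(bob, tom).   % fails
```

Anything that fits this fact/rule mold evaluates cleanly; knowledge that is probabilistic, vague, or context-dependent is what resists the encoding.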

I didn’t opt for the Prolog electives in school (I did Haskell instead), so I honestly don’t know why so many “things” are unrepresentable as “facts”.




I bet GPT is really good at Prolog; that would be interesting to explore.

"Answer this question in the form of a testable Prolog program."


You lost this bet: write append3/4, which appends three lists into a fourth, such that the query append3(Xs,Ys,[e],[]) terminates.
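For readers unfamiliar with this classic exercise (my gloss, not the commenter's): the catch is goal order. The obvious pairwise definition does not terminate on that query, while a reordered decomposition does.

```prolog
% Naive version: append the first two lists, then the third.
% For the query append3_naive(Xs,Ys,[e],[]), the first append/3 goal
% enumerates candidate Xs/Ys pairs forever while the second goal
% keeps failing, so the query never terminates.
append3_naive(Xs, Ys, Zs, XsYsZs) :-
    append(Xs, Ys, XsYs),
    append(XsYs, Zs, XsYsZs).

% Reordered version: split the result list first.
% For append3(Xs,Ys,[e],[]), append(Xs, YsZs, []) has exactly one
% solution (Xs = [], YsZs = []), and append(Ys, [e], []) then fails
% finitely, so the whole query terminates (with failure).
append3(Xs, Ys, Zs, XsYsZs) :-
    append(Xs, YsZs, XsYsZs),
    append(Ys, Zs, YsZs).
```

The point of the exercise is that logical correctness alone isn't enough; termination depends on the order in which goals are tried.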


Did you give it a try?





