
You don't need these criteria when you can see in advance that something is impossible.

I think something that only learns to reproduce text cannot become an intelligent actor.

It's necessary to act in an environment with feedback.

And while it of course depends on the definition of intelligence, the article is about the Gödel machine, which is essentially a fancy term for AGI.




You need the criteria in advance to even know if the thing is impossible.

We don't know the extent of our ignorance about intelligence.

> I think something that only learns to reproduce text cannot become an intelligent actor.

> It's necessary to act in an environment with feedback.

Ok, but text adventures are a thing, so that doesn't rule out learning from text.
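To make that concrete, here's a minimal, hypothetical sketch (my own illustration, not from the article): in a text adventure both the actions and the feedback are plain text, yet the agent is still acting in an environment and receiving feedback.

  # Hypothetical toy text-adventure environment: actions in, textual feedback out.
  rooms = {
      "hall": {"north": "library", "desc": "A draughty hall. Exits: north."},
      "library": {"south": "hall", "desc": "Shelves of books. Exits: south."},
  }

  def step(state, action):
      """Apply a text command and return (new_state, textual feedback)."""
      room = rooms[state]
      if action.startswith("go ") and action[3:] in room:
          new_state = room[action[3:]]
          return new_state, rooms[new_state]["desc"]
      return state, "You can't do that here."

  state = "hall"
  for command in ["go north", "go west", "go south"]:
      state, feedback = step(state, command)
      print(f"> {command}\n{feedback}")

The point of the sketch: nothing about "it's all text" prevents there being a state, an action, and a consequence the learner can observe.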

And RLHF has humans as part of the environment giving the feedback (that's the H and the F in RLHF).
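A rough sketch of what that loop looks like (the function names generate_candidates and human_preference are stand-ins I made up, not any real library's API): the human's choice between candidate outputs is the feedback signal, and a reward model fitted to those choices then drives the reinforcement-learning step.

  # Hypothetical sketch of the preference-collection half of RLHF.
  import random

  def generate_candidates(prompt, n=2):
      # Stand-in for sampling several completions from a language model.
      return [f"{prompt} ... completion {i}" for i in range(n)]

  def human_preference(a, b):
      # Stand-in for a human labeller choosing the better completion (the H).
      return a if random.random() < 0.5 else b

  preferences = []  # (prompt, chosen, rejected) pairs used to fit a reward model
  for prompt in ["Explain Gödel machines", "Summarise the article"]:
      a, b = generate_candidates(prompt)
      chosen = human_preference(a, b)
      rejected = b if chosen is a else a
      preferences.append((prompt, chosen, rejected))
      # A reward model trained on `preferences` then supplies the F (feedback)
      # when the policy is optimised with RL.
  print(preferences)

So the "environment with feedback" the parent asks for is there; it just happens to include people.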



