Hacker News

'It's not AGI yet' - the implication is insufferable. It's a language model that is incapable of any kind of reasoning; the talk of 'AGI' is glib utopianism, a very heavy kind of koolaid. If we had referred to this tech as anything other than 'intelligence' - for example, if we had chosen 'adaptive algorithms' or 'weighted node storage' - we'd likely have a completely different popular mental model for it.

There will be no 'AI model' that is 'AGI', rather, a large swath of different technologies and models, operating together, will give the appearance of 'AGI' via some kind of interface.

It will not appear as an 'automaton' (i.e. a single processing unit), and it certainly will not arrive in an 'aha moment'.

In 10 years, you'll be able to ask various agents of different kinds, which will use varying kinds of AI to interpret speech and infer context, and which will interface with various AI APIs. In many ways it'll resemble what we have today, but with more nuance.

The net appearance will evolve over time to appear a bit like 'AGI' but there won't be an 'entity' to identify as 'it'.




> incapable of any kind of reasoning

If this were true the debate would be a hell of a lot easier. Unfortunately, it is not.


In fact, comments like the one you are responding to are the most effective way to respond to ‘it hallucinates’.


There is no reasoning, which is why it will be impossible to move LLMs past certain kinds of tasks.

They are 'next word prediction models' which elicit some kinds of reasoning embedded in our language, but it's a crude approximation at best.

The AGI metaphors are Ayahuasca Koolaid, like a magician duped by his own magic trick.

There will be no AGI, especially because there will be no 'automaton', i.e. no distinct entity that elicits those behaviours.

Imagine if someone proposed 'Siri' were 'conscious' - well nobody would say that, because we know it's just a voice-based interface onto other things.

Well, Siri is about to appear much smarter thanks to LLMs, and be able to 'pass the bar exam' - but ultimately nothing has fundamentally changed.

Whereas each automaton in the human world has its own distinct 'context', the AI world will not have that at all. Context will be as fleeting as memory in RAM, and it will be spread across the various systems that we use daily.

It's just tech, that's it.


> There is no reasoning

> elicit some kinds of reasoning

I know it's hard, but you have to choose here. Are they reasoning or are they not reasoning?

> next word prediction models

238478903 + 348934803809 = ?

Predict the next word. What process do you propose we use here? "Approximately" reason? That's one hell of a concept you conjured up there. Very interesting one. How does one "approximate" reason, and what makes it so that the approximation will forever fail to arrive at its desired destination?
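For what it's worth, the sum above is the kind of probe being argued over: a deterministic program computes it exactly, while a pure next-token predictor has to emit the digit sequence one token at a time. A minimal sketch (plain Python, no model involved; the numbers are just the ones from the comment):

```python
# Exact arithmetic is trivial for a deterministic program:
a = 238_478_903
b = 348_934_803_809
total = a + b
print(total)  # 349173282712

# A next-word predictor, by contrast, must produce this 12-digit
# answer token by token, with no guaranteed carry logic underneath.
assert total == 349_173_282_712
```

Python's built-in `int` is arbitrary precision, so the exact answer falls out for free; the debate is whether a statistical text model can reliably reproduce that behaviour.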

> Whereas each automaton in the human world had its own distinct 'context' - the AI world will not have that at all. Context will be as fleeting as memory in RAM, and it will be across various systems that we use daily.

Human context is fleeting as well. Time, dementia and ultimately death can attest to that. Even in life, identity is complicated and multifaceted, without a singular 'I'. For all intents and purposes, we too are composed of massive amounts of loosely linked subsystems vaguely resembling some sort of unity. I agree with you on that one. General intelligence IMO probably requires some form of cooperation between disparate systems.

But you see some sort of fundamental difference here between "biology" and "tech" that I just cannot see. If RAM were implemented biologically, would it cease to be RAM? I fail to see what's so special about the biological substrate.

To be clear, I'm not saying LLMs are AGI, but I have a hard time dismissing the notion that some combination of systems - of which LLMs might be one - will result in something we just have to call generally intelligent. Biology just beat us to it, like it did with so many things.

> It's just tech, that's it.

The human version is: it's just biology, that's it. What's the purpose of stating that?



