"They're super expensive pattern matchers that break as soon as we step outside their training distribution" - I find it really weird that things like these are seen as some groundbreaking endgame discovery about LLMs
LLMs have a real issue with polarisation. It's probably smart people saying all this stuff about knockout blows and LLM uselessness, but I find them really useful. Is there some emperor's-new-clothes thing going on here - am I just a dumbass who can't see that he's getting excited at a random noise generator?
It's like if I saw a headline about a knockout blow for cars because SomeBigName discovered it's possible to crash them.
It wouldn't change my normal behaviour, it would just make me think "huh, I should avoid anything SomeBigName is doing with cars then if they only just realised that."
Finding them useful is different from "I can replace my whole customer service department with an LLM", which is what the hype is convincing people is possible. You're not a dumbass; I hate LLMs and even I admit they're pretty good at coding.
I don't see the relevance of this paper to AGI, assuming one considers humans generally intelligent. Humans exhibit the same behaviour: each person has a complexity limit beyond which they are unable to solve tasks in a reasonable amount of time. For very complex tasks, even training becomes infeasible.
Marcus's writing is from the perspective of someone who is situated in the branch of AI that didn't work out - symbolic systems - and has a bit of an axe to grind against LLMs.
He's not always wrong, and sometimes useful as a contrarian foil, but not a source of much insight.