
Consensus in the machine learning community is that current architectures are enough for AGI; no new architecture is needed. There is also plenty of leeway in how much money we can pour into training. If the scaling laws for model capabilities still hold (and we have no reason yet to believe they won't), we have the resources, both financial and technological, to reach this within 10-20 years at most.



> Consensus in the machine learning community is that current architectures are enough for AGI, no new architecture needed.

I don't think there's anything like consensus on this. There isn't even consensus on whether AGI is possible in the first place.


There isn't even consensus on what AGI means.


A few counterpoints: Most developers prefer Claude 3.5 (or the slightly improved 3.7 released a few days ago), and 3.5 is already a year old. What was supposedly going to be ChatGPT 5 will be called 4.5 instead, after two multi-month and hugely expensive training runs. Grok 3 was trained on a massive cluster of 200,000 H100s and still isn't beating SOTA models.

There's certainly not "a clear way forward", even though one may hope the billions invested will lead to algorithmic breakthroughs.



