
Good article; the METR metric is very interesting. See also Leopold Aschenbrenner's work in the same vein:

https://situational-awareness.ai/from-gpt-4-to-agi/

IMO this approach ultimately asks the wrong question. Every exponential trend in history has eventually flattened out. Every. single. one. Two rabbits would produce a population with a mass greater than the Earth's within a couple of years if the trend continued indefinitely. The left-hand side of a sigmoid curve looks exactly like exponential growth to the naked eye... until it nears the inflection point at t=0. The two curves can't be distinguished when all you have is noisy data from t<0.
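To make that concrete, here's a minimal sketch (using NumPy; the curve parameters and noise level are illustrative, not from any real dataset): we sample the left half of a logistic curve with multiplicative noise, then fit a pure exponential to it. The exponential fit is essentially perfect, even though the underlying process is guaranteed to saturate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a logistic (sigmoid) curve f(t) = L / (1 + exp(-k*t)),
# which saturates at L. We only observe it well before the inflection point.
L, k = 1.0, 1.0
t = np.linspace(-8, -2, 50)                         # all samples at t < 0
y = (L / (1.0 + np.exp(-k * t))) * np.exp(rng.normal(0, 0.05, t.size))

# Fit a pure exponential y = A * exp(b*t) via least squares on log(y).
b, logA = np.polyfit(t, np.log(y), 1)
log_fit = logA + b * t

# R^2 of the exponential fit on the log scale.
ss_res = np.sum((np.log(y) - log_fit) ** 2)
ss_tot = np.sum((np.log(y) - np.log(y).mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"fitted growth rate b = {b:.3f}, R^2 = {r2:.4f}")
```

The fit recovers a growth rate close to k with R² near 1, i.e. the data alone cannot tell you the trend will flatten; only outside knowledge of the constraints can.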

A better question is, "When will the curve flatten out?" and that can only be addressed by looking outside the dataset for the constraints that will eventually make growth impossible. For Moore's law, for example, we could examine the quantum limits on how small a single transistor can be. You have to analyze the context, not just do the line-fitting exercise.

The only really interesting question in the long term is whether the curve levels off near, below, or above human intelligence. It doesn't matter much if that takes five years or fifty. Simply taking lines that are currently going up and extending them off the right side of the page doesn't get us any closer to answering that. We have to look at the fundamental constraints of our understanding and algorithms, independent of hardware. For example, hallucinations may be unsolvable with the current approach and require a genuine paradigm shift, and paradigm shifts don't show up on trend lines, more or less by definition.
