
The empirical scaling laws are evidence. They're not a deductive proof, but they're still evidence.

The scaling laws themselves advertise diminishing returns: loss falls roughly as a power law in compute, so each fixed improvement costs a multiplicatively larger budget, which looks like logarithmic returns. AI optimists never disputed this, so it's odd to bring it up as if it contradicted anything they've been saying.
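
To make the diminishing-returns shape concrete, here's a minimal sketch of a Chinchilla-style power-law loss curve. The constants (E, A, alpha) are made up to show the shape only, not the published fits:

    # Illustrative sketch of a power-law loss curve, L(N) = E + A / N**alpha.
    # Constants are hypothetical, chosen only to show diminishing returns.
    E, A, alpha = 1.7, 400.0, 0.34

    for n_params in (1e8, 1e9, 1e10, 1e11, 1e12):
        loss = E + A / n_params**alpha
        print(f"{n_params:.0e} params -> loss {loss:.3f}")

Each 10x in parameters buys a smaller absolute drop in loss, which is the diminishing-returns pattern described above.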

The scaling laws are kind of a worst case scenario, anyway. They assume no paradigm shift in methodology. As we saw when the test-time scaling law was discovered, you can't bet on stasis here.





