
I don't think training such large models on phones will be feasible even in 10 years, if only because phones prioritize power drain and ergonomics above almost everything else. Nobody is going to buy an iPhone 15 if it's the size of a brick and lasts 2 hours just because you're training and running models on it.

The focus instead should be on running large models locally on your desktop system at home. This is a better target in two ways: first, power draw is a non-concern there (AMD has already stated that average desktop power usage will reach 500-700W by 2025); second, once the model is running locally, new avenues of use open up, such as accessing it from your other devices without placing a heavy burden on those devices.
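To illustrate that last point, here's a minimal sketch of a thin client (a phone or laptop on the same LAN) sending a prompt to a model served from the home desktop. The host address, port, endpoint path, and JSON fields are all assumptions for illustration; the real values depend on whichever local inference server you actually run.

    # Thin-client sketch: send a prompt to a model hosted on a home desktop.
    # DESKTOP_HOST, PORT, the /generate path, and the JSON schema are assumed
    # placeholders, not any specific server's API.
    import requests

    DESKTOP_HOST = "192.168.1.50"  # assumed LAN address of the desktop
    PORT = 8080                    # assumed port the inference server listens on

    def ask_local_model(prompt: str) -> str:
        """Send the prompt to the home desktop and return the generated text."""
        resp = requests.post(
            f"http://{DESKTOP_HOST}:{PORT}/generate",  # hypothetical endpoint
            json={"prompt": prompt, "max_tokens": 256},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json().get("text", "")

    if __name__ == "__main__":
        print(ask_local_model("Summarize today's notes in three bullet points."))

The device making the request does almost no work; all the heavy compute stays on the desktop, which is the whole appeal.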


