
The gpt-oss-20b model has demonstrated that a machine with ~13GB of available RAM can run a very capable local model. If that RAM is GPU-accessible (as on Apple silicon Macs, for example), you can get very usable performance out of it too.
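As a rough sanity check on that ~13GB figure, here is a back-of-envelope sketch. The parameter count (~21B) and the ~4.25 bits/weight for MXFP4 quantization are assumptions I'm supplying, not claims from the comment:

```python
# Back-of-envelope memory estimate for a quantized local model.
# Assumptions (not from the comment above): ~21B parameters stored
# in MXFP4 at roughly 4.25 bits per weight.

def model_weight_gb(params: float, bits_per_weight: float) -> float:
    """Approximate in-RAM size of the weights alone, in gigabytes."""
    return params * bits_per_weight / 8 / 1e9

weights = model_weight_gb(21e9, 4.25)
print(f"weights alone: ~{weights:.1f} GB")  # ~11.2 GB
# KV cache and runtime overhead push the working set toward ~13GB.
```

The weights alone don't tell the whole story; the KV cache grows with context length, which is why the usable figure lands a couple of gigabytes above the raw weight size.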

I'm hoping that within a year or two machines like that will have dropped further in price.


