
Don't newer models use more energy? I thought they were getting bigger and more computationally intensive.


They use a massive amount of energy during training. During inference they use a tiny amount, less than a web search (it turns out you can be really efficient if you don't mind giving wrong answers at random and can therefore skip expensive database queries!).


Right, but the comment I was responding to suggested that GPT-3.5 used lots of energy and newer models use less.


Indeed, this is correct. See today's Claude Haiku 4 announcement for an example.


Looking at https://platform.openai.com/docs/pricing, GPT-3.5 is $1.50-$4 per million output tokens, and GPT-5 ranges from $0.40 to $120, with plain "gpt-5" (no qualifiers) going for $10/million.

GPT-5 is probably cheaper in the sense that gpt-5-nano is at least as capable as 3.5 while costing less, but the "normal" newer models are more expensive, and that's what people are generally going to be using.
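For concreteness, here is a minimal sketch of that comparison, using only the per-million-output-token prices quoted above; the model names and rates are taken from the comment, not verified against the current pricing page, so treat it as illustrative arithmetic:

```python
# Rough output-token cost comparison using the prices quoted above
# (illustrative only; check https://platform.openai.com/docs/pricing for current rates).

PRICE_PER_MILLION_OUTPUT = {
    "gpt-3.5-turbo": 1.50,  # low end of the $1.50-$4 range quoted above
    "gpt-5-nano": 0.40,     # cheapest GPT-5 tier mentioned
    "gpt-5": 10.00,         # plain "gpt-5" with no qualifiers
}

def output_cost(model: str, output_tokens: int) -> float:
    """Dollar cost of generating `output_tokens` output tokens with the given model."""
    return PRICE_PER_MILLION_OUTPUT[model] / 1_000_000 * output_tokens

for model in PRICE_PER_MILLION_OUTPUT:
    print(f"{model}: ${output_cost(model, 1_000_000):.2f} per 1M output tokens")
```

On those numbers the nano tier undercuts 3.5, but the default "gpt-5" costs several times more per output token, which is the point being made.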



