
I think what they're suggesting is that yes, Google spent some SWE hours to implement it, but they have (or will, over time) saved more than that in equivalent compute (by pushing it onto users). So, from Google's point of view, the savings outweigh the SWE hours on the company balance sheet, so it's a win.

But from the larger point of view of "everyone," this actually made the total cost worse. The cost/benefit analysis came out positive from Google's point of view, even though it made things worse in the larger perspective. Hence "tragedy of the commons," where the short-sighted behavior of some actors makes things worse overall.
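A back-of-envelope sketch of that argument in Python, with made-up numbers (every figure is an assumption for illustration, not a real Google cost):

    # All figures are hypothetical - none of them come from Google.
    swe_cost        = 500_000    # $ of SWE hours spent implementing it
    bandwidth_saved = 2_000_000  # $ Google saves in the first year
    user_cpu_cost   = 5_000_000  # $ of extra decode CPU/battery pushed onto users

    google_net = bandwidth_saved - swe_cost                  # Google's balance sheet
    global_net = bandwidth_saved - swe_cost - user_cpu_cost  # everyone's balance sheet

    print(f"Google's view:   {google_net:+,} $")   # +1,500,000 -> a win
    print(f"Everyone's view: {global_net:+,} $")   # -3,500,000 -> worse overall

The sign flip between the two lines is the whole "tragedy of the commons" point: the actor deciding only sees the first number.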




I don't think that metric applies here - "SWE years" is what you use when determining whether investing in a certain optimisation is worth doing.

In the case of VP9, there was no such tradeoff - it cost more storage, more CPU time, and SWE hours. What it saved was a very clearly quantifiable financial cost of bandwidth; the SWE-hour tradeoff didn't have much to do with it.

It would be applicable if the debate were about investing engineering time to speed up the VP9 encoder, for example.
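The "clearly quantifiable" comparison being described, sketched with invented numbers (every value below is an assumption; only the shape of the calculation matters):

    # Every number here is invented for illustration.
    swe_cost        = 2.0  # one-off engineering cost, $M
    extra_storage   = 0.5  # added storage cost per year, $M (a second copy of each video)
    extra_cpu       = 1.0  # added encode cost per year, $M
    bandwidth_saved = 6.0  # bandwidth cost avoided per year, $M

    yearly_net    = bandwidth_saved - extra_storage - extra_cpu  # 4.5 $M/yr
    payback_years = swe_cost / yearly_net
    print(f"pays for itself in {payback_years:.1f} years")  # ~0.4

On these (made-up) numbers the decision hinges on the bandwidth line alone, which is the claim above: the SWE-hour term barely moves the answer.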


SWE-years are actually used for two things: a) measuring ongoing cost and b) calculating whether an investment will pay off.

There are a few benefits over just using a dollar figure. It's a unit that naturally ties time to value, which makes it a natural fit for thinking about how SWEs spend their time. It's also convertible to other units, and the conversion factors can change over time without anyone having to revisit the minutiae of each estimate.

All these factors make it useful for both measuring ongoing cost and making investment tradeoffs.
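A minimal sketch of what "convertible, with factors that can change" means in practice (the rate below is a placeholder, not a real figure):

    # Placeholder rate; update it in one place as salaries and overhead
    # change, without touching any estimate expressed in SWE-years.
    DOLLARS_PER_SWE_YEAR = 350_000

    def swe_years_to_dollars(swe_years: float) -> float:
        return swe_years * DOLLARS_PER_SWE_YEAR

    # b) investment tradeoff: 2 SWE-years of work against $1M/yr of savings
    investment    = swe_years_to_dollars(2)  # 700,000
    yearly_saving = 1_000_000
    print(f"payback: {investment / yearly_saving:.2f} years")  # 0.70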



