Hacker News

We probably had similarly rigorous educations at that level. In SE we studied things like the '87 Wall St. crash versus Therac-25. The questions I remember were always about what "could or should" have been known, and crucially... when. Sometimes there's just no basis for making a "calculated risk" within a window.

The moral difference, then, is whether the harms are sudden and catastrophic or accumulating, ongoing, repairable, and so on. And what action is taken.

There's a lot in what you say about FB that I cannot agree with. I think Zuckerberg as a person was and remains naive. To be fair, I don't think he ever could have foreseen or calculated the societal impact of social media. But as a company, I think FB understood exactly what was happening and had hired minds politically and sociologically smart enough to see the unfolding "catastrophe" (Roger McNamee's word) - but they chose to cover it up and stay the course anyway.

That's the kind of recklessness I am talking about. That's not like Y2K or Mariner 1, or any of those cases where a very costly outcome could have been prevented by a more thoughtful singular decision early in development.




I’m talking strictly about the day-to-day engineering of pushing code and accidentally breaking something, which is what “move fast and break things” is about and how it was understood by engineers within Facebook.

You’ve now raised a totally separate issue about the overall strategy and business development of the company, and there you’d be right - if a PE license were required to run an engineering company, Zuckerberg would have had his revoked, and any PEs complicit in tuning for addictiveness should similarly be punished. But the lack of regulation of engineering projects that don’t deal directly with human safety, and of how businesses are allowed to run, is a political problem.


I see we agree, and as far as day-to-day engineering goes I'd probably care very little whether a bug in Facebook stopped someone from seeing a friend's kitten pics.

But on the issue I'm really concerned about: do you think "tuning for addictiveness" on a scale of about 3 billion users goes beyond mere recklessness? And what do we do about this "political problem" that such enormous, diffuse harms are somehow not considered matters of "human safety" in engineering circles?

Is it time we formalised some broader harms?


I think there are political movements trying to regulate social media. There are lots of poorly regulated sub-industries within the tech field (advertising is another one).



