
Sam Altman decided to irresponsibly talk bullshit about parenting because yes, he needed that marketing sound bite.

I cannot believe someone would wonder how people managed to decode "my baby dropped pizza and then giggled" before LLMs. I mean, if someone is honestly terrified about the answer to this life-or-death question and cannot figure out life without an LLM, they probably shouldn't be a parent.

Then again, Altman is faking it. Not sure if what he's faking is this affectation of being a clueless parent, or of being a human being.

Those aren’t the questions people will ask, though. They’ll go “what body temperature is too high?” Baby temperatures are not the same as ours, and the thresholds for fevers and such are different.

They will ask “how much water should my newborn drink?” That’s a dangerous thing to get wrong (outside of certain circumstances, the answer is “none.” Milk/formula provides necessary hydration).

They will ask about healthy food alternatives - what if it tells them to feed their baby fresh honey in some homemade concoction (botulism risk)?

People googled this stuff before, but a basic search doesn’t argue back about how it’s right or keep feeding you bad info in the same confident, emotionally persuasive way.


Agreed. I wasn't defending Altman!

I was mostly responding to the part about how those people should not be parents, but I must’ve misread the tone or missed something.

I was mostly arguing that Altman's statements, if taken at face value, show him to be unfit to be a parent. I stand by this, but mostly because I think people like him -- Altman, Musk, I tend to conflate -- are robots masquerading as human beings.

That said, of course Altman is being cynical about this. He's just marketing his product, ChatGPT. I don't believe for a minute he really outsources his baby's well-being to an LLM.


Ahhh ok thank you for clarifying that for me!
