
I feel like science reporting would actually be greatly improved if they simply had ChatGPT write the articles. The human "reporters" they have now are terrible.


Unless the GPT were also responsible for setting up the incentive structures, you'd end up with the same monotonous clickbait. The problem is people who give bad (science) reporting their attention, and therefore money. PopSci journals are just answering the demand.


Non-scientists aren't going to be very good at telling bad science reporting from good. There's no way to fix that: you can't expect non-experts to be good judges of the experts.

With GPT, at least the writing quality would be better.


Perhaps, but recognizing clickbait should be within the grasp of everybody, even if they aren’t experts on the subject matter. Maybe that’s where an LLM would fit - evaluating the likelihood of an article offering something of substance.


The whole point of clickbait is to get non-discerning, ignorant laypeople to click on stuff and watch it. If it didn't work, it wouldn't be the problem it is now. You can complain that people aren't intelligent or educated enough about things far outside their domain of expertise (which may be something as mundane as flipping burgers, though to be fair, making a great burger isn't easy), but your expectations clearly don't line up with reality. The entire field of marketing, after all, is built on weaponizing human psychology against people in order to extract money from them. People have fundamental weaknesses rooted in their biology, so complaining that they aren't overcoming their built-in limitations is futile, IMO.



