Hacker News

>As researchers have found with the internet more broadly, racism (i.e., often reflecting perspectives of those building technology) is built into social media platforms. For example, algorithms (i.e., a set of mathematical instructions that direct users’ everyday experiences down to the posts that they see) can often have centuries of racist policy and discrimination encoded.9

There's something ridiculous about this statement.



Overall this seems to be a very odd study with basically no recommendations.

It boils down to: you'll need to judge for yourself, and every child is different, but the internet, while great, is terrifying and can be traumatic.

So I’m not sure what we learned here.


I think it can be summed up in the following ways:

1) Parenting takes effort; use that effort to do 2-4.

2) Know what they are looking at, and prioritize face-to-face interactions.

3) Know the parents of the kids they are hanging out with.

4) Make sure that the parents of friends have the same values as you when it comes to social media.


Thank you for pointing this out. I'd love to hear their explanation of how any given algorithm is "racist".


This is just for YouTube shorts, but I notice a huge swathe of right wing content being blasted at me whenever I foolishly stray onto that part of the app. I'll get anything from Tate to Rogan to obscure 'Woke-critical' content straight away.

There is a study on the right wing nature of the Shorts algorithm: https://firstmonday.org/ojs/index.php/fm/article/download/10...

Nothing about it suggests other sites do the same thing, but if right-wing content is driving more engagement than left-wing content through outrage alone, it stands to reason that any reinforcement learning would prioritise it.
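To illustrate the mechanism, here is a minimal toy sketch of engagement-driven ranking. Everything in it is assumed for illustration: the item names, the click-through rates, and the simple click/impression scoring bear no relation to any real platform's system, which would be vastly more complex.

```python
import random

random.seed(0)

# Hypothetical content pools with assumed click-through rates. These numbers
# are made up for illustration, not measured from any real platform.
rates = {"outrage": 0.6, "neutral": 0.4}

# Track impressions and engagements (clicks) per item.
stats = {item: {"clicks": 0, "shows": 0} for item in rates}

# Simulate 1000 impressions of each item.
for _ in range(1000):
    for item in rates:
        stats[item]["shows"] += 1
        if random.random() < rates[item]:
            stats[item]["clicks"] += 1

# Score each item by observed engagement rate and rank descending. The feed
# reorders purely from the engagement gap; the system has no notion of what
# the content actually is, only how often people engage with it.
scores = {item: s["clicks"] / s["shows"] for item, s in stats.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)
```

The point of the sketch is that a difference in engagement alone is sufficient for one category to dominate the ranking: no editorial preference is needed anywhere in the loop.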



