Hacker News

The solution would be to revoke Section 230 protection from any platform that acts as a digital public square if it moderates beyond removing illegal content.

Of course they would try their best to be excluded, so they can have their cake and eat it too.

The entire point of section 230 is to allow platforms to remove non-illegal content [1].

Basically, there were two lawsuits about platforms hosting user content. One platform (Prodigy) tried to curate content to create a family-friendly environment; the other (CompuServe) didn't take anything down. The first platform lost its lawsuit, while the second won. Congress wanted to allow platforms to create family-friendly environments online, so Section 230 was written.

[1]: https://en.wikipedia.org/wiki/Section_230#


If something like that were put in place, any platform acting as a “public square” should also be required to disable all recommendation and content-surfacing features, algorithmic or otherwise, aside from search.

Those recommendation features already do plenty of damage even with platforms having the ability to remove anything they like. If platforms are restricted to only removing illegal content, that damage would quickly become much greater.


You need moderation for more than legality, though; otherwise you can't have open forums like this that aren't total cesspits.

Right:

* When a bot farm spams ads for erectile dysfunction pills into every comment thread on your blog, that's "legal content"!

* When your model-train hobbyist site is invaded by posters sharing swastikas and planning neo-Nazi rallies, that too is "legal content", at least outside Germany.

All sorts of deceptive, off-topic, and horribly offensive things are "legal content."
