Hacker News

"The Internet interprets censorship as damage and routes around it" is propaganda and cultural myth.

Conditioning liability limitation on being hands-off will only go so far. For one thing, most platforms are basically unusable if you try to turn them into the verbal equivalent of 2b2t[0]. You need to do spam filtering at the least, and that implies making editorial judgments about which users to take on. It is also possible to use free speech as a censorship tool - say, by harassing or doxing users as reprisal for speech.

Even if we walked this back to "utilities must have consistent rules and users can sue for unfair application of them", this is still historically lenient in terms of intermediary liability. The law does not chase pointers: in literally any other field of endeavor, we don't accept the idea that someone can facilitate a crime without being liable for it. That would be a massive loophole. But we accept it for defamation and copyright law because we convinced Congress to accept the propaganda of the Internet[1].

Under current law, Visa and Mastercard cannot disclaim liability the way Apple or Google can. This is why the anti-porn campaigns have been so successful at killing amateurs in that space. Apple might be prudish about allowing porn in their App Store, but they will at least accept filtering on an app[2]. And, as we've seen with cryptocurrency, letting those companies disclaim liability would be an absolute nightmare. Cryptocurrency has been an absolute boon to ransomware and scam enterprises that otherwise would not be able to take payments. So you will never convince Congress to carve massive, gaping loopholes in banking law the way we did for defamation and copyright law.

A decade and change ago, our biggest worry was that Comcast would try to turn the Internet into a series of cable TV packages. Now we have forced so many people to effectively immigrate to the Internet that they are in a position to demand that it actually work like cable (in that things they don't like can be sued into oblivion or taken down). The old Internet cannot exist in a world where it can be used against itself to destroy itself.

[0] The oldest anarchy server in Minecraft.

[1] For the record, I am not opposed to CDA 230 or DMCA 512. But we still need to recognize that these liability limiters have massively harmed the ability for plaintiffs to prosecute legitimate defamation or copyright cases.

[2] I am aware that Tumblr got smacked down for this, but this is not because Apple refuses to accept filtering on social networks. They got smacked down because they are absolutely terrible at filtering anything, and an app reviewer saw CSAM.




Copyright only hurts the small creators it was intended to protect. Hell, I work in a creative field, and to get employment that isn't soliciting freelance work on a project-by-project basis, most companies heavily push creative people to sign away all rights to the things they make on or off company time. I've only once successfully gotten a company to drop onerous IP language. The other times I've walked away from what were otherwise dream jobs.


> But we still need to recognize that these liability limiters have massively harmed the ability for plaintiffs to prosecute legitimate defamation or copyright cases.

Have they really, though? I guess plaintiffs might find it easier to get damages out of Facebook or Google, but it still seems to be very easy to get stuff taken down.


It's actually impossible to get damages out of Facebook or Google. Believe me, a lot of really big copyright plaintiffs have tried. The law around them is iron-clad.

The reason it's so easy to get things taken down is directly downstream of this. Social media companies will trip over themselves to take down content, and the standard for what constitutes a legitimate takedown notice is hilariously low. Remember how Bungie had to basically yell and scream at Google to get them to reverse an illegitimate takedown request an angry fan had made to tarnish their reputation? Google didn't want to lift a finger, because all the processes heavily encourage passing the buck.

Likewise, social media likes to use content recommendation algorithms because it also lets them get around the law. If you use humans to recommend or curate content, then you're a publisher, you get full copyright liability, and you become the big copyright punching bag[0]. But if you write an algorithm to make the same decision you would have made as a human, then you're in your safe harbor and there's nobody to sue.

And plaintiffs don't want easy takedowns, they want to sue a big company for lots of money... so that they'll settle and agree to a licensing deal. That's how copyright is supposed to work. When someone with money doesn't pay for the thing, you get to use the law to smack them until they pay for the thing. Nobody wants to sue individuals; it's expensive, time-consuming, and makes you look like an extortion artist.

But that doesn't work under DMCA 512, because there isn't a big company anymore. There's just an online service provider doing things "at the request of" billions of individual users.

We as hackers, myself included, love to complain about copyright. The problem is that copyright only inconveniences those who want to play by the rules. We get angry about it because we want to be told "yes" and can never, ever afford to actually pay to be told "yes". But if you don't care, it's quite easy to evade the system. Register new accounts, use a VPN, lightly disguise your infringement, and you make yourself far harder to sue.

[0] See Mavrix v. LiveJournal. This is the only part of law that I can think of where being willfully blind to abusive conduct is actually rewarded rather than being punished.



