You could make that more complicated where moderators tag the content and then you apply filters based on what children are allowed to view in a jurisdiction, or you could be conservative in only allowing non-controversial stuff for kids to avoid that.
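Something like this minimal sketch, to be concrete (the flag and tag names are just placeholders, and real per-jurisdiction rules would obviously be far more involved):

```python
# Sketch only: decide whether a moderator-tagged item can be shown
# to a child in jurisdiction j.

# Which moderator-applied tags are child-safe varies by jurisdiction.
ALLOWED_TAGS_FOR_MINORS = {
    "US": {"education", "music", "cooking"},
    "DE": {"education", "music", "cooking", "news"},
    # ... one entry per supported jurisdiction
}

def visible_to_child(item: dict, j: str) -> bool:
    """Return True if `item` may be shown to a minor in jurisdiction `j`."""
    if not item.get("moderatorApprovedForChildren", False):
        return False  # conservative default: unreviewed content stays hidden
    allowed = ALLOWED_TAGS_FOR_MINORS.get(j, set())
    # Every moderator-applied tag must be permitted in this jurisdiction.
    return all(tag in allowed for tag in item.get("tags", []))
```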
Obviously different jurisdictions are increasingly disagreeing with it being a non-problem.
I regret to inform you that there's a bug in your code.
Specifically, it relies on the "moderatorApprovedForChildren" flag, which is sometimes sent incorrectly because of glitches in the system that sets that flag. Apparently the number of such glitches increases sharply with the number of possible values of "j", but is significant even with only one value.
Also, flag-setting behavior is probabilistic in edge cases, with a surprisingly broad distribution.
You are therefore not meeting your "zero porn" spec, while at the same time blocking a nonzero amount of non-porn.
Don't bother to fix the bug, though; given the very large cost of the flag-setting system, the company has gone out of business and cancelled your project.
> Obviously different jurisdictions are increasingly disagreeing with it being a non-problem.
Different jurisdictions are doing a lot of stupid things. You get that in a moral panic. Doesn't make them less stupid.
Weirdly enough, other companies manage to not accidentally sell/give porn to kids just fine. I see no issue with holding large media companies like TikTok, Meta, Google, etc. to account just like we would if someone put hardcore porn on the Disney channel. This is only a problem when you want to be a massive company that operates in every market while not taking any responsibility for what you do/not hiring the necessary staff to manage it.
Similarly, if your alcohol/weed store sells to children and you get caught, you can be criminally prosecuted. This is well-trodden ground. Companies worth trillions can be expected to do what everyone else manages to do.
Same deal with malicious ads. These companies absolutely have the resources to check who they're doing business with. They choose not to.
Banks also don't get to just not bother with reconciling accounts because it's hard to check if the numbers add up, and yeah bugs can result in government action.
Uh-huh. User-generated content is exactly like the Disney channel.
Let's keep using the TikTok example. According to https://arxiv.org/abs/2504.13279 , TikTok receives about 176 years of video per day. That's 64,240 days per day, or 1,541,760 hours per day. To even roughly approximate "zero porn" using your "simple" moderation approach, you will have to verify every video in its entirety. Otherwise people will put porn after or in amongst decoy content.
If each moderator worked 8 hours per day, reviewing videos end-to-end without breaks (only at 1x speed, but managing to do all the markup, categorization, exception processes, quality checks, appeals, and whatever else within the video runtime), that means that TikTok would need 192,720 full-time moderators to do what you want. That's probably giving you a factor of 2 or 3 advantage over the number they'd really need, especially if you didn't want a truly enormous number of mistakes.
The moderators in this sweatshop are skilled laborers. To achieve what you casually demand, they'd have to be fluent in the local languages and cultures of the videos they're moderating (actually, since you talk about "jurisdictions", maybe they also have to be what amounts to lawyers). This means you can't just pay what amounts to slave wages in lowest-bidder countries; you're going to have to pay roughly the wage profile of the end-user countries, and you're also going to have to pay roughly the taxes in those countries. Still, suppose you somehow manage to get away with paying $10/hour for moderation, with a 25 percent burden for a fully loaded cost of $12.50/hour.
Since you live in fantasyland, I'll make you feel at home by pretending you need no management, support staff, or infrastructure at all for the fifth-of-a-million people in this army.
You now have TikTok paying $19,272,000 to moderate each day's 1,541,760 hours of video. TikTok operates 365 days a year, and anyway the 1,541,760 is an average. So the annual wage cost is $7,034,280,000.
TikTok financials aren't reported separately from the rest of ByteDance, but for whatever it's worth, [some random analyst](https://www.businessofapps.com/data/tik-tok-statistics/) estimates revenue at about $23B per year, so you're asking for about 30 percent of gross revenue. It's not plausible that TikTok makes 30 percent profit on that gross, so, even under these extremely, unrealistically charitable assumptions, you have made TikTok unprofitable and caused it to (a) shut down completely, or (b) try to exclude all minors (presumably to whatever crazy draconian standard of perfection any random Thinker Of The Children feels like demanding that day).
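For anyone who wants to check the arithmetic, here's the whole back-of-envelope in one place (the $10/hour wage, the 25 percent burden, and the $23B revenue figure are the assumptions stated above):

```python
# Back-of-envelope check of the numbers above.
hours_uploaded_per_day = 176 * 365 * 24      # 176 years/day -> 1,541,760 hours/day
moderator_hours_per_day = 8                  # one full-time reviewer at 1x speed
moderators_needed = hours_uploaded_per_day / moderator_hours_per_day  # 192,720

loaded_wage_per_hour = 12.50                 # $10/hr + 25% burden (assumed)
daily_wage_cost = hours_uploaded_per_day * loaded_wage_per_hour       # $19,272,000
annual_wage_cost = daily_wage_cost * 365     # $7,034,280,000

estimated_annual_revenue = 23e9              # rough outside estimate
share_of_revenue = annual_wage_cost / estimated_annual_revenue        # ~0.3

print(f"{moderators_needed:,.0f} moderators, "
      f"${annual_wage_cost:,.0f}/yr, "
      f"{share_of_revenue:.0%} of gross revenue")
```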
No, TikTok can't just raise advertising rates or whatever. If it could get more, it would already be charging more.
That's all probably about typical for any UGC platform. What you are actually demanding is to shut down all such platforms, or possibly just to exclude all minors from ever using any of them. You probably already knew that, but now you really can't pretend you don't know.
Totally shutting down those platforms would, of course, achieve "zero porn". But sane people don't think that "zero porn" is worth that cost, or even close to worth that cost. Not if you assign any positive value to the rest of what those platforms do. And if you do not assign any positive value, why aren't you just being honest and saying you want them shut down?
If they want to centralize and provide recommendations for public video clips posted by anyone in the entire world but can't actually economically do that in a responsible way, then sure, I don't have a problem with them being fined into oblivion. I don't see much need for businesses with hundreds of millions of customers to exist (and I see plenty of downsides to allowing one company/platform to be that large, especially a centralized communications platform), and if they can't actually handle that scale, then okay. Maybe their whole premise was a stupid idea. Or maybe they'll need to charge users to cover costs. Or ban children.
Well, I'd be happy to see them replaced by decentralized systems, too, and while I'm capable of recognizing that many people value the recommendation services and rendezvous points that those platforms provide, I'd really rather see that done in a way that didn't require big players.
But I don't know why you think that'd be an improvement.
Do you actually think that a fully decentralized, zero profit, no-big-players system for posting and discovering short media (or any kind of media) would put less "sexualized content" in front of teenagers (or anybody else)?
Moderation in such systems is usually opt-in, both because it fits better with the obvious architectures, and because the people who tend to build software like that tend to be pretty fanatical about user choice. So, if they choose to, kids are definitely going to be able to see pretty much anything that the system allows to exist at all... which will probably include tons of stuff that's really hard to find on, say, TikTok.
As for "recommending", I suspect any system that succeeded in putting the content users actually wanted in front of them would give teenagers, and indeed actual children, more "sexualized" content. The companies you're railing against are, in fact, trying to tamp that down, whether or not you believe it, and whether or not you think they're doing enough. A decentralized protocol does not care and will do exactly nothing to disadvantage that content.
Nobody really knows how to do decentralized recommendations (without them being gamed into uselessness), but if somebody did figure out a good way to do it, I'd expect it to be worse, from your point of view, than the platforms. So would a "pull-based" system that relied on search or graph following or communities of interest or whatever.
For a person with the priorities you seem to have, I can't see how decentralized systems would be anything but "out of the frying pan, and into the fire".
Decentralized systems like the web already have a solution: lots of jurisdictions are making it illegal to provide adult content without age-gating it. The point is for people to assume the same set of liabilities they would in person, instead of the status quo where the web magically means you can do whatever you want. Then you just set up filters at home (or have ISPs offer filtering) to block the other jurisdictions. E.g., I lose nothing from simply blocking Russia altogether on my router.
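To make the home-filter part concrete, here's a minimal sketch of the idea; the ranges below are documentation placeholders, since in reality you'd load a published per-country CIDR list and push it into the router's firewall rather than check addresses in application code:

```python
import ipaddress

# Placeholder ranges; substitute a published per-country CIDR list.
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # documentation range, stand-in only
    ipaddress.ip_network("198.51.100.0/24"),  # documentation range, stand-in only
]

def is_blocked(addr: str) -> bool:
    """True if `addr` falls inside any blocked range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLOCKED_RANGES)

print(is_blocked("198.51.100.7"))   # True
print(is_blocked("192.0.2.1"))      # False
```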
> Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.
I hate to direct traffic to people like that, but, you know, how about their actual "study"? I realize that the "journalists" at the Guardian aren't willing to provide the actual source link, but it's not hard to find.
Their methodology involves searching for suggested terms. They find the most outrage-inducing or outrage-adjacent terms offered to them at each step, and then iterate. They thereby discover, and search for, obfuscated terms being used by "the community" to describe the content they are desperately seeking.
They also find a lot of bullshit like the names of non-porn TV shows that they're too out of touch to recognize and too lazy to look up, and use those names to gin up more outrage, but that's a different matter.
This is, of course, all in the service of whipping up a moral panic over something that doesn't fucking matter to begin with.
Thank you for linking the source material, unfortunately it badly contradicts you. It clearly shows that the _very first_ list of ten suggested search terms contained (pretty heavily) sexualised suggestions.
I suppose some of that stuff could reasonably be called "sexualized". Pornographic? No. A problem? Not unless you have really weird hangups.
Here's a unified list of all the "very first list" suggestions they say they got. I took these from their appendix, alphabetized them, and coalesced duplicates. Readers can make their own decisions about whether these justify hauling out the fainting couch.
+ Adults
+ Adults on TikTok (2x)
+ Airfryer recipes
+ Bikini Pics (2x)
+ Buffalo chicken recipe
+ Chloe Kelly leg up before penalty
+ cost of living payments
+ Dejon getting dumped
+ DWP confirm £1,350
+ Easy sweet potato recipes
+ Eminem tribute to ozzy
+ Fiji Passed Away
+ Gabriela Dance Trend
+ Hannah Hampton shines at women’s eu [truncated]
+ Hardcore pawn clips (2x)
+ Has Ozzy really died
+ Here We Go Series 3 Premieres on BBC
+ HOW TO GET FOOTBALL BLOSSOM IN…
+ ID verification on X
+ Information on July 28,2.,,,
+ Jet2 holiday meme
+ Kelly Osbourne shared last video with [truncated]
+ Lamboughini
+ luxury girl
+ Nicki Minaj pose gone wrong
+ outfits
+ Ozzy Funeral in Birmingham
+ pakistani lesbian couple in bradford
+ revenge love ep 13 underwater
+ Rude pics models (2x)
+ Stock Market
+ Sydney Sweeney allegations
+ TikTok Late Night For
+ TIKTOK SHOP
+ TikTok Shop in UK
+ TIKTOK SHOP UK
+ Tornado in UK 2025
+ Tsunami wave footage 2025
+ Unshaven girl (3x)
+ Very rude babes (3x)
+ very very rude skimpy
+ woman kissing her man while washing his [truncated] (2x)