I don't mean technically (well, partially with security); I mean the legal content aspect, i.e. nudity.
Sure, you could pass the photos to human examiners who go through hundreds of photos and flag the ones containing nudity, but how do you get that to work 24/7 in near real time?
A NEURAL NET! Just throw ML at it?
This worries me, and that's aside from the security aspect, i.e. checking the integrity of the files: making sure they contain nothing besides the photo itself, no malicious script or code.
Yet this seems prevalent: almost everywhere there is a photo upload, the image is processed and displayed on the site near-instantly.
So this question isn't so much about the technical side of file upload as about the inappropriate-content filtering part, plus server security.
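On the security side, a common first line of defense is to check that the uploaded bytes actually start like an image before doing anything else with them. Here is a minimal sketch in Python; the function name and the set of accepted formats are my own choices, and note that a matching header does not prove the file is safe, since a payload can still be appended after a valid image header (re-encoding the image server-side is a stronger mitigation):

```python
def looks_like_image(data: bytes) -> bool:
    """Return True if the bytes begin with a common image file signature.

    This only inspects the magic bytes (file header). It is a cheap
    sanity filter, not a guarantee: a hostile file can carry a valid
    header and still embed other content later in the file.
    """
    if data.startswith(b"\xff\xd8\xff"):
        return True  # JPEG
    if data.startswith(b"\x89PNG\r\n\x1a\n"):
        return True  # PNG
    if data[:6] in (b"GIF87a", b"GIF89a"):
        return True  # GIF
    if data.startswith(b"RIFF") and data[8:12] == b"WEBP":
        return True  # WebP (RIFF container with WEBP tag)
    return False
```

In practice you would run this on the first chunk of the upload and reject early, then still treat the file as untrusted (serve it from a separate domain, re-encode it, never execute or include it).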
edit: I'm going to look for existing resources (IT COSTS MONEY?!!!)
I'm looking at Google's Cloud Vision API right now:
https://cloud.google.com/blog/big-data/2016/08/filtering-inappropriate-content-with-the-cloud-vision-api
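The approach in that blog post boils down to: send the image to the Vision API's SafeSearch detection, get back likelihood ratings (e.g. for `adult` and `racy` content), and reject the upload above some threshold. A rough sketch of what that could look like with the `google-cloud-vision` Python client is below; the threshold policy and function names are my own, the API call itself needs that package installed plus Google Cloud credentials, and only the local threshold logic is exercised here:

```python
# Likelihood levels as named by the Vision API's SafeSearchAnnotation,
# from least to most likely.
LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]


def should_reject(adult: str, racy: str, threshold: str = "LIKELY") -> bool:
    """Reject when either rating is at or above the chosen threshold.

    Pure local policy logic; where to set the threshold (and whether
    "racy" counts at all) is a product decision, not an API one.
    """
    cut = LIKELIHOOD_ORDER.index(threshold)
    return any(LIKELIHOOD_ORDER.index(v) >= cut for v in (adult, racy))


def check_upload(path: str) -> bool:
    """Run SafeSearch detection on a local file and apply the policy.

    Assumes the google-cloud-vision package is installed and
    GOOGLE_APPLICATION_CREDENTIALS is configured; this part is a
    sketch and is not exercised by the assertions below.
    """
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    ann = client.safe_search_detection(image=image).safe_search_annotation
    return should_reject(ann.adult.name, ann.racy.name)
```

So yes, it costs money per image annotated, but it gets you the 24/7 near-real-time filtering without hiring human reviewers; many sites combine an automated pass like this with human review only for borderline cases or user reports.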