Do you have suggestions? I am genuinely open to hearing them. I do, however, push back on the fundamental premise that we facilitate sex trafficking. Both Android and iOS have location sharing built into the OS - I don't think the feature is ever going away.
Invest in ML to spot patterns. I don't know specifically how it was done, but I heard that WhatsApp had a hard time with spam after end-to-end encryption, since they could no longer catch spam by examining the text. So instead they started looking for usage patterns and found things like accounts that were less than a week old sending out thousands of messages a week. That's an outlier in usage patterns. I believe credit card fraud systems do something similar: if you suddenly make a purchase far from home or in a category you never buy from, it gets flagged.
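To make that concrete, here's a minimal sketch of the kind of rule-based outlier check I have in mind, not what WhatsApp actually does. The field names and thresholds are invented for illustration; a real system would tune them against historical data or replace the rules with a learned model.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class AccountActivity:
    # Hypothetical fields a trust-and-safety pipeline might pull from usage logs.
    created_at: datetime
    messages_last_7_days: int


def is_usage_outlier(acct: AccountActivity, now: datetime) -> bool:
    """Flag brand-new accounts with unusually high send volume.

    Placeholder thresholds: under a week old AND more than 1,000
    messages in the last 7 days. Flagged accounts go to human review,
    not automatic action.
    """
    is_new = (now - acct.created_at) < timedelta(days=7)
    is_high_volume = acct.messages_last_7_days > 1000
    return is_new and is_high_volume


# Example: a 3-day-old account that sent 5,000 messages gets flagged.
acct = AccountActivity(created_at=datetime(2024, 1, 1), messages_last_7_days=5000)
print(is_usage_outlier(acct, now=datetime(2024, 1, 4)))  # True
```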
I'm happy to help brainstorm more ideas.
From the article and multiple police investigations, it sounds like your software is used by traffickers to run their illicit businesses. Having users register could be one option. If there's one guy and a bunch of teenage girls on a plan, that might be a red flag or at least worth further review. Perhaps your company could look at all the people who were busted and see if there are commonalities in how they use the app. My guess is that if multiple people who were busted were using it for trafficking, there are probably traffickers using your software right now.
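If it helps, here's a rough sketch of the plan-level red-flag check I mean, assuming you had registration data with ages. Every field name and threshold here is hypothetical; it's only a way to queue plans for human review, not an automated accusation.

```python
from dataclasses import dataclass


@dataclass
class Member:
    # Hypothetical fields collected during a registration flow.
    age: int
    is_plan_owner: bool


def plan_needs_review(members: list[Member]) -> bool:
    """Flag plans where a single adult owner is tracking several minors.

    Placeholder heuristic: exactly one owner, who is an adult, plus
    three or more minors on the same plan. Matches get routed to a
    human reviewer.
    """
    owners = [m for m in members if m.is_plan_owner]
    minors = [m for m in members if m.age < 18]
    return len(owners) == 1 and owners[0].age >= 18 and len(minors) >= 3


# Example: one adult owner with four minors on the plan gets queued for review.
plan = [Member(age=35, is_plan_owner=True)] + [Member(age=16, is_plan_owner=False)] * 4
print(plan_needs_review(plan))  # True
```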