The big difference is that with photos end-to-end encrypted, Apple can't (by choice or under government pressure) have human "content reviewers" inspect photos for unlawful content, as was the intention under Apple's 2021 plan [1] once a threshold of 30 hash matches was met.
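For context, the 2021 design only escalated to human review after an account crossed a match threshold. A minimal sketch of that gating logic might look like the following; the names and structure here are illustrative only, not Apple's actual implementation (which used NeuralHash plus cryptographic threshold secret sharing, so the server learned nothing about matches below the threshold):

```python
# Illustrative sketch of threshold-gated review, NOT Apple's real system.
# Assumptions: photo hashes and the known-CSAM hash set are plain strings.

REVIEW_THRESHOLD = 30  # the match count Apple's 2021 plan proposed

def count_matches(photo_hashes: set[str], known_hashes: set[str]) -> int:
    """Count how many of an account's photo hashes appear in the known set."""
    return len(photo_hashes & known_hashes)

def should_trigger_review(photo_hashes: set[str], known_hashes: set[str]) -> bool:
    """Escalate to human review only once the threshold is crossed."""
    return count_matches(photo_hashes, known_hashes) >= REVIEW_THRESHOLD
```

The threshold was meant to limit false positives from individual hash collisions, but the human-review step at the end is exactly the pressure point the rest of this comment is about.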
Although it started with CSAM, it wasn't clear which other illegal activities Apple would end up assisting governments in tracking. In countries in which [being gay is illegal](https://www.humandignitytrust.org/lgbt-the-law/map-of-crimin...), having Apple employees aid law enforcement by pointing out photographic evidence of unlawful behaviour (for example, a man hugging his husband) would have been a recipe for grotesque human rights abuses.
With photos encrypted, Apple can't be compelled to hire human reviewers to inspect them, and thus can't be pressured by governments enforcing absurd laws to pass on information about who might be engaging in "unlawful" activities.
[1] https://www.eff.org/deeplinks/2021/08/apples-plan-think-diff...