Perhaps with all the deplatforming, delisting, and other removals, the 'good' content is gone, and at the same time those removals are degrading their machine learning data sets. It's almost a self-imposed adversarial attack on result quality.
I think that the model is, somehow, irreparably broken. Remember that, when it started returning photos of black people in searches for "gorilla", they just stopped using it for "gorilla" searches.
My suspicion is that it's been poisoned by some combination of an improperly considered tagging process and malfeasance.
I do think it's hilarious that the one time Google doesn't drop a relevant term from the middle of a query is for something like "black haired man." Given how it usually treats terms, you might expect it to drop the "haired" and return pictures of black men, but no, it's almost exclusively white dudes with black hair (never mind that black and brown people also have black hair).