It’s another argument in favour of EA that it tries to cut past arguments like this. If you’re a billionaire you can do a lot more good by investing in a mosquito net factory than you ever could by hanging mosquito nets one at a time yourself.
The argument of EA is that feelings can be manipulated (and often are) by the marketing work done by charities and their proponents. If we want to actually be effective we have to cut past the pathos and look at real data.
Firstly, most people aren't billionaires. Nor do I think EA is somehow novel in suggesting that a billionaire should buy nets instead of helping directly.
Secondly, you're missing the point I'm making, which is why many people find EA distasteful: it focuses entirely on outcomes rather than internal character, and it arrives at these outcomes by abstract formulae. This is how we ended up with increasingly absurd claims like "I'm a better person because I work at BigCo and make $250k a year, then donate 10% of it, than the person who donates their time toward helping their community directly." Or "AGI will lead to widespread utopia in the future, therefore I'm ethically superior because I'm working at an AI company today."
I really don't think anyone is critical of EA because they think being inefficient with charity dollars is a good thing, so that is a strawman. People are critical of the smarmy attitude, the implication that other altruism is ineffective, and the general detached, anti-humanistic approach that people in that movement project.
The problems with it are not much different from utilitarianism itself, which EA is just a half-baked shadow of. As someone else in this comment section said, unless you have a sense of virtue ethics underlying your calculations, you end up with absurd, anti-human conclusions that don't make much sense to anyone with common sense.
There's also the very basic argument that maybe directly helping other people leads to a better world overall, and serves as a better example than just spending money abstractly. That counterargument never occurs to the EA/rationalist crowd, because they're too obsessed with some master rational formula for success.
Secondly, you're missing the point I'm making, which is why many people find EA distasteful: it focuses entirely on outcomes rather than internal character, and it arrives at these outcomes by abstract formulae.
No, I did not miss that point at all. I think it is WRONG to focus on character. That leads us down the dark path of tribalism and character assassination and culture war.
If we're going to talk about a philosophy and an ethics of behaviour, we have to talk about ACTIONS. That's the only way we'll ever accomplish any good.