
That hasn't defined EA for at least 10 years or so. EA started with "you should donate to NGOs distributing mosquito nets instead of the local pet shelter".

It then moved into "you should work a soulless investment banking job so you can give more".

More recently it was "you should excise all expensive fun things from your life, and give 100% of your disposable income to a weird poly sex cult and/or their fraudulent paper hedge fund because they're smarter than you."



This is what some EAs believe, but I don't think there was ever a broad consensus on those latter claims. As such, it doesn't seem like a fair criticism of EA.


You can’t paint a whole movement with a broad brush, but it’s true of the leadership of the EA organization. Once they went all-in on “AI x-risk”, there ceased to be a meaningful difference between them and the fringe of the LW ratsphere.



The people who run the forum you linked to.


That same link puts AI risk under the "far future" category, basically the same category as "threats to global food security" and asteroid impact risks. What's unreasonable about that?


> That hasn't defined EA for at least 10 years or so.

Meanwhile, on an actual EA website: https://www.givewell.org/charities/top-charities

* Medicine to prevent malaria

* Nets to prevent malaria

* Supplements to prevent vitamin A deficiency

* Cash incentives for routine childhood vaccines


GiveWell is an example of the short-termist end of EA. At the long-termist end, people pay their friends to fantasize about Skynet at 'independent research institutes' like MIRI and Apollo Research. At the "trendy way to get rich people to donate" end, you get buying a retreat center in Berkeley, a stately home in England, and a castle in Czechia so Effective Altruists can relax and network.

It's important to know which type of EA organization you are supporting before you donate, because the movement includes all three.


I assume that GiveWell is the most popular of them. I mean, if you donate to MIRI, it is because you know about MIRI and because you specifically believe in their cause. But if you are just "hey, I have some money I want to donate, show me a list of effective charities", then GiveWell is that list.

(And I assume that GiveWell top charities receive orders of magnitude more money, but I haven't actually checked the numbers.)


Even GiveWell partnered with the long-termist/hypothetical-risk type of EA by funding something called Open Philanthropy. And there are EA organizations which talk about "animal welfare" but mean "what if we replaced the biosphere with something where nothing with a spinal cord ever gets eaten?" So you can't assume that "if it calls itself EA, it must be highly efficient at turning donations into measurable good." EA orgs have literally hired personal assistants and bought stately homes for the use of the people running the orgs!



