I'm on Bay Area tech Twitter and there's a growing movement called effective accelerationism. E/accs lionize tech billionaires and call for increased investment in futuristic technologies like AI, decreased regulation, generally more rigid social hierarchies, etc.
I want to peacefully coexist with people in my social circles but it's hard to hide my disdain. I believe a lot of tech billionaires are essentially nihilists creating these stories as a way to increase their wealth and power.
A lot of their values seem to stem from doing too much Adderall, too much coke, and too many hallucinogens. Sometimes it seems like they came up with a lot of this stuff while playing video games on LSD.
Effective accelerationism doesn't value humanity. If anyone gets crushed beneath the wheels of technological "progress," if society turns out worse for the majority of people, well, that's not their problem.
You should call it out. It helps to hear differing opinions.
I'm a bit of a global warming pessimist. I don't think humanity is equipped to turn things around and that we are looking at significant changes, far more than anybody is willing to admit to in the media.
If I were a tech billionaire, I would be designing a sustainable, self-contained, hermetically sealed box you could fit a village of people into. A box of forest and tech that can feed and sustain my family and friends. I'd like to imagine a utopian medieval village where people are tradespeople and farmers by day, and write code at night.
It doesn't need to be in space, the earth will look like Mars soon enough.
Now that I think about it, I wonder if the Saudis are building something like that with "The Line".
> It doesn't need to be in space, the earth will look like Mars soon enough
You're disregarding how well humans are adapted to Earth conditions. An Earth ravaged by droughts, flooding, wildfires, volcanic eruptions, and superstorms is still more hospitable to humans than Mars - by several orders of magnitude.
I think the whole earth could look a lot like the Sahara or Gobi desert in a few generations. If the biology of our fertile soils changes too much, and nature can't evolve fast enough to adapt, we could lose a lot of what we rely on.
I think it's more likely that a few large famines will wipe out most humans before a total biological collapse, which would end the carbon emission problem.
Mars is colder than Antarctica, drier than the Sahara, and has lower air pressure than the top of Everest.
What little atmosphere Mars does have is 95% CO2, but it gets so cold in the Martian winter that the sky literally falls: about 25% of the atmosphere by mass condenses into solid CO2 "dry ice" at the poles. Mars has no ozone layer (not that you'd survive outside without a space suit anyway), and the entire atmosphere is so tenuous that a large coronal mass ejection that happens to hit the planet would kill basically every human walking or driving around on the surface.
The Martian soil contains calcium perchlorate (toxic to both humans and plants) at roughly a million times the perchlorate concentration found in water at literal Superfund cleanup sites.
Even then, we'd still be able to breathe without helmets. Mars is already like the Gobi, except without the oceans, the precipitation, or the breathable air that Earth has. It's not close to being the same thing.
The e/acc stuff is the first thing that came to my mind as well. Well, it's our fault we let sociopaths lead the game, right? The people who glossed over the dystopian parts of the fiction, the dehumanizing parts, because the tech was shiny and the smell of opportunity for themselves was even shinier.
Just a nitpick: hallucinogens are probably something these people should do more of, as they generally increase empathy.
I like the accelerationist story, and I don't think I came to that opinion through manipulation. If I did, it was by the Club of Rome.
The way I see it is - you're in a car heading for a cliff. You can go all in on the brakes and hope you stop in time, or all in on the gas and hope you can jump the gap. We don't really have enough data to know which is better, it comes down to feeling.
And the time to slam the brakes was probably about 40 years ago. Since people back then elected Reagan instead, I'm of the opinion it's worth keeping the gas all the way down, just in case it works.
I think that's a very good metaphor, because, statistically speaking, there are very, very few cliffs where hitting the gas gives you a higher chance of survival than trying to brake and turn. And on top of that, "well, we're already aimed at the cliff and going too fast" is a very good way for the people who put us in that situation to force the issue in the direction they want.
"Well how bad can it be? You can't know it's going to be bad until we try it." -> "Ok, it's looking a little scary, but let's get a little closer." -> "Ok, this is probably a bad idea, but it's not an emergency yet, so let's chill out, ok?" -> "Ok there's still time to turn stop freaking out." -> "Ok now it's an emergency but it's too late to turn so we might as well go full speed ahead and hope it works out."
Yeah, with a car and a cliff hitting the brakes is a good decision. Civilizations have a lot more momentum than cars though.
There's an analysis saying that if the Titanic had hit the iceberg head on, it wouldn't have sunk. I like the car analogy because there are opportunities for passengers to bail out and thereby lighten the car for those who want to clear the cliff, but in terms of ability to turn or stop, an ocean liner is probably the better metaphor.
You should look into how Futurism, the cultural movement, preceded fascism.
Our era may seem frustrating. I guarantee that accelerating the causes of pain does not lessen the pain. Giving the reins to strongmen, to technology-enriched businessmen, to impersonal processes of capital investment and profit - there is no chance we end up anywhere good. I would rather not go through another World War to prove this idea wrong again.
The issue I take with this philosophy is it's all or nothing. Sam Bankman-Fried admitting to being willing to flip a coin that saves or kills humanity points to this. Either we give all of our hope and money to tech billionaires to save us, or we'll just _________ and then humanity will perish. Also we need to trust tech billionaires who have shown plenty of times that they shouldn't be trusted to save humanity, so I'm not inclined to do that.
Especially when it's these same tech billionaires who are informing us about the risks in the first place.
Tyler Cowen found the rationalist/EA people impressive, but was skeptical that anybody had enough information and wisdom to see thousands of years into the future. After the fall of FTX he pointed out that they couldn't even predict the consequences of their own actions one year into the future. Why should we trust longtermism (or TESCREAL or whatever we're calling it now)?
(I'm left-wing so I normally find Cowen's perspective rather dismal. But I think the best part of conservative philosophy is the skepticism that any small group of experts knows best or can direct the course of history for the long term. And here he is resoundingly correct.)
This is kind of hard to discuss since it seems like SBF didn't engage in EA "in good faith" and we can't really know if other billionaires are either. It does make it easy to discount EA because of the FTX debacle, but I guess if it was so easy to discount then maybe EA true believers weren't actually doing anything to offset that black eye.
More than anything it seems like billionaires are trying to convince us to let them hold future monopolies because we just should, ok?
>I think the best part of conservative philosophy is the skepticism that any small group of experts knows best or can direct the course of history for the long term
It's a double-edged sword for sure. On one hand, if "conservatism" means resisting change simply for its own sake, it can restrict our growth as a society. On the other hand, changing something just because the newest group of "experts" decided we should is something that DOES need pushback in many cases. It really just underscores how society needs all kinds of people coming to consensus to be functional.
As long as society attempts to come together and reach consensus on important things, I don't think we need to be saved by billionaires :)
> This is kind of hard to discuss since it seems like SBF didn't engage in EA "in good faith" and we can't really know if other billionaires are either.
I wonder if EA could ever be implemented in a way that proves or disproves it as a viable strategy. It seems to me that the inherent problem is that people will always behave in subtle and not-so-subtle self-interested ways that make "true" altruistic behavior devilishly difficult to carry out in the real world (especially under the conditions that grant you the billions to carry the philosophy out in the first place). And therefore almost impossible to falsify.
Sort of reminds me of the old adage, "Communism cannot fail, it can only be failed." With some people today exclaiming that true Marxism has never been tried. But I can't imagine what perfect conditions could exist that would allow either communism or EA to be carried out, without having to account for human nature in the end.
I think the best interpretation of EA is still "Effective altruism is a question" (which I believe is more or less the original interpretation): how can you do the most (in my opinion, reasonable) good (within a budget)? It's trying to separate feeling good about doing a small act from pausing to think about what is actually effective.
Sure, people will converge on claimed solutions to that question. But you can give your own solution[1] (I myself am an EA and disagree on some points, including giving locally in my third world country, and volunteering). The perspective is really valuable, I think.
Now, that said, don't try to make money at all costs in order to donate. First, that can easily fail and be a direct net negative; second, there are secondary effects like losing trust and unexpected impacts on other people. Being honest and trustworthy is a really good idea.
[1] Recently GiveDirectly dropped out of GiveWell's top charities, for probably understandable reasons; I still like GiveDirectly and still give. Just get informed and give well! (to GiveWell or not :P)
Galacta7 hinted at it with the discussion of failing Communism -- there can be plenty of excellent philosophies on paper, but it doesn't matter how altruistic a philosophy is once it enters the real world and gets twisted by a single person who gains control in some way.
I am not against the philosophy that there are optimal ways to help, and less optimal ways to help. I'm not against the philosophy that tries to weigh the best of the available options. I am against the philosophy that then arrogantly says "This is the best and only way to move forward for the best utility to humanity" as if they are able to see the future.
I don't doubt you have opinions on how to best help humanity, and that's great! As you said, the perspective that we only have a limited amount of utility we can provide, for ourselves or for the benefit of others, and that we must be wise in how we use it, is a good one to have. It's the same wisdom that helps me see that I can't give my rent money to another person and tell my family "tough luck" when we get evicted.
On the other hand I feel like utilitarianism can easily lead to decision overload when applied to everyday life. So it's a lens to view the world through but can't be a holistic principle that guides your entire life or you'd never get anything accomplished.
The fact that it's utilitarian is already a red flag for me, because you have to start making judgements about the expected utility of helping one person over another. Italy had to coldly adopt this mindset when prioritizing care during COVID-19. It has use cases, such as ensuring the future workforce and viability of a country in the face of limited healthcare, but nobody wants to hear that their expected utility is too low to be "worth" helping.
Believing that's the way we should view and calculate every aspect of life feels like a bad mixture of egotism (like Musk believing he's the only one who can save humanity long term) and, weirdly, a selfishness that the utilitarian's utility is sacred and should only be "spent wisely" and not "wasted."
I see similarities to how many businesses close because they aren't achieving infinite growth anymore, or scrap a mostly-complete 80% project (which could be used as-is) because that last 20% is hard.