Yeah, the whole community side to rationality is, at best, questionable.
But the tools of thought that the literature describes are invaluable with one very important caveat.
The moment you think something like "I am more correct than this other person because I am a rationalist" is the moment you fail as a rationalist.
It is an incredibly easy mistake to make. To make effective use of the tools, you need to become more humble than you were before using them, or you just turn into an asshole who can't be reasoned with.
If you're saying "well actually, I'm right" more often than "oh wow, maybe I'm wrong", you've failed as a rationalist.
> success is supposed to look exactly like actually being right more often.
I agree with this, and I don't think it's at odds with what I said. The point is to never stop sincerely believing you could be wrong. That you are right more often is exactly why it's such an easy trap to fall into. The tools of rationality only help as long as you are actively applying them, which requires a certain amount of humility, even in the face of success.
Also that the Art needs to be about something other than itself, and a dozen other things besides. This failure mode is well known in the community; Eliezer wrote about it to death, and so did others.
To no avail, alas. But this is why we now see a thought leader publish a piece saying this is a thing it's now permissible not to be, indeed never to have been at all.
This reminds me of undergrad philosophy courses. After the intro logic/critical thinking course, some students can't resist seeing affirming-the-consequent and post hoc fallacies everywhere (even if more of them are imagined than real).
I'm not affiliated with the rationalist community, but I always interpreted "Less Wrong" as word-play on how "being right" is an absolute binary: you can either be right, or not be right, while "being wrong" can cover a very large gradient.
I expect the community wanted to emphasize that people employing the specific kind of iterative Bayesian reasoning they were proselytizing would arrive at slightly lesser degrees of wrongness than the kinds of reasoning that "normal" people use.
If I'm right, your assertion wouldn't be totally inaccurate, but I think it might be missing the actual point.
> I always interpreted "Less Wrong" as word-play on how "being right" is an absolute binary
Specifically (AFAIK) a reference to Asimov’s description[1] of the idea:
> [W]hen people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.
"Less wrong" is a concept that has a lot of connotations that just automatically appear in your mind and help you. What you wrote "It's very telling that some of them went full "false modesty" by naming sites like "LessWrong", when you just know they actually mean "MoreRight"." isn't bad because of Asimov said so, or because you were unaware of a reference, but because it's just bad.
> I'm not affiliated with the rationalist community, but I always interpreted "Less Wrong" as word-play on how "being right" is an absolute binary: you can either be right, or not be right, while "being wrong" can cover a very large gradient.
I know that's what they mean at the surface level, but you just know it comes with a high degree of smugness and false modesty. "I only know that I know nothing" -- maybe, but they ain't no modern day Socrates, they are just a bunch of nerds going online with their thoughts.
Sometimes people enjoy being clever not because they want to rub it in your face that you're not, but because it's fun. I usually try not to take it personally when I don't get the joke and strive to do better next time.
Very rational of you, but that's the problem with the whole system.
If you want to avoid thinking you're right all the time, it doesn't help to be clever and say the logical opposite. "Rationally" it should work, but it's bad because you're still thinking about it! It's like the "don't think of a pink elephant" thing.
>If you want to avoid thinking you're right all the time, it doesn't help to be clever and say the logical opposite.
I don't understand how this is supposed to be relevant here. You seem to be falsely accusing me of doing such a thing, or of being motivated by simple contrarianism.
Again, your claim was:
> but you just know it comes with a high degree of smugness and false modesty
Why should I "just know" any such thing? What is your reason for "just knowing" it? It comes across that you have simply decided to assume the worst of people that you don't understand.
I don't think I'm more clever than the average person, nor have I made this my identity or created a whole tribe around it, nor do I attend or host conferences around my cleverness, rationality, or weird sexual fetishes.
Rationalism is not about trying to be clever; it's very much about trying to be a little less wrong. Most people are not even trying, myself included. I don't write down my predictions, I don't keep a list of my errors. I just show up to work like everyone else and don't worry about it.
I really don't understand all the claims that they are intellectually smug and overconfident when they are the one group of people actually trying to do better. It really seems like all the hatred is aimed at the hubris of even trying to do better.