28 were things that require you to believe in our particular formulation of transhuman singularitarianism before the premise even makes sense
This. This is the thing that has to be fixed before LessWrong can claim to primarily be about “refining the art of human rationality.”
AI is interesting, but not about rationality. Cryonics is interesting, but not about rationality. Nanotechnology is interesting, but not about rationality.
Ways of thinking are about rationality. Ways of being smart are about rationality. Ways of being stupid are about rationality. Stories of failing spectacularly are about rationality.
The first group may of course be highly relevant to the second. But being relevant to rationality is not the same as being about it, and requiring readers of a site advertised as being on the topic of “rationality” to buy into the standard transhumanist belief cluster is a failure of signaling rationality, and thus for these purposes a failure of instrumental rationality.
Before posting, don’t just think “Is this interesting to the LW readership?” but also “and is it on topic? Is my actual point about rationality?”
Honestly, I’m surprised we’re getting new people at the rate we are.
It’s an interesting site full of really smart people, and (and this is a real plus point) the comment quality is consistently high because people buy into the moderation system, i.e. “mod up if you want to see more comments like this.” That’s a BIG WIN. Keeps me reading.
Really. I strongly suggest you just write a main section post requesting people stay on-topic in posts, and that not clearly being on-topic about rationality is a reason to downvote. See what the community thinks.
I felt a bit out of place until I started reading MoR; what was all this cryonics/decision theory stuff?
A couple chapters in I thought, “THAT’S the kind of stuff I’m interested in talking about! Now I feel like I’m in the right place.”