Okay. This may not be the kind of thing you had in mind, but the way I personally think about things:

- is probably not focused enough on emotions. I'm not very good at dealing with emotions, either my own or other people's, and I imagine that someone who was better would have very different thoughts about how to deal with people both on the small scale (e.g. interpersonal relationships) and on the large scale (e.g. politics).
- may overestimate the value of individuals (e.g. in their capacity to affect the world) relative to organizations.

The way this community thinks about things:

- is biased too strongly in directions that Eliezer finds interesting, which I suppose is somewhat unavoidable but unfortunate in a few respects. For example, Eliezer doesn't seem to think that computational complexity is relevant to friendly AI and I think this is a strong claim.
- is biased towards epistemic rationality when I think it should be more focused on instrumental rationality. This is a corollary of the first bullet point: most of the Sequences are about epistemic rationality.
- is biased towards what I'll call "cool ideas," e.g. cryonics or the many-worlds interpretation of quantum mechanics. I've been meaning to write a post about this.
- is hampered by a lack of demographic diversity that is probably bad for cognitive diversity (my impression is that LW is overwhelmingly male, white, 18-24 years old, etc.).

Atheism and skepticism in general:

- are likely to be another form of belief as attire in practice. As in, I think many people who identify very strongly as atheists or skeptics are doing it to signal tribal affiliation more than anything else.
It takes incredible strength to recognize flaws in your entire way of thinking

Eh, does it? I think it just requires a cultural meme about criticism being a good thing. LW has this, maybe too much of it, and my impression is that so does Judaism (based on e.g. "Avoiding Your Belief's Real Weak Points"). This is some evidence that you are thinking reasonably, but it isn't extremely strong evidence.
Eliezer doesn't seem to think that computational complexity is relevant to friendly AI and I think this is a strong claim.

Could you elaborate?
On why Eliezer doesn't seem to think that, or why I think this is a strong claim? We had a brief discussion about this here.
That usually gets you a culture of inconsequential criticism, where you can be as loudly contrarian as you want as long as you don’t challenge any of the central shibboleths. This is basically what Eliezer was describing in “Real Weak Points”, but it shows up in a lot of places; many branches of the modern social sciences work that way, for example. It gets particularly toxic when you mix it up with a cult of personality and the criticism starts being all about how you or others are failing to live up to the Great Founder’s sacrosanct ideals.
I’m starting to think it might not be possible to advocate for a coherent culture that’s open to changing identity-level facts about itself; you can do it by throwing out self-consistency, but that’s a cure that’s arguably worse than the proverbial disease. I don’t think strength of will is what’s missing, though, if anything is.
Yes. And that’s what I’m unrealistically looking for—not just disagreement, but fundamental disagreement. And by fundamental I don’t mean the nature of the Singularity, as central as that is to some. I mean things like “rational thought is better than irrational thought” or “religion is not consistent with rational thought.” Even if they’re not spoken, they’re important and they’re there, which means they ought to be up for debate. I mean “ought to” in the sense that the very best, most intellectually open society imaginable would have already debated these and come to a clear conclusion, but would be willing to debate them again at any time if there was reason to do so.
What, on your view, constitutes a reason to debate issues about which a community has come to a conclusion?
Relatedly, on your view, can the question of whether a reason to debate an issue actually exists ever be settled? That is, shouldn't the very best, most intellectually open society imaginable, on your account, continue to debate everything, no matter how settled it seems, since the fact that none of its members can currently think of a reason to do so is insufficient grounds not to?
I think it’s safe to end a debate when it’s clear to outside observers (these are important) that it’s not going anywhere new. An optimal society listens to outsiders as well.
OK. Thanks for answering my question.
These are good, thank you.
About epistemic vs. instrumental rationality, though: I hadn't heard those terms before, but it seems like a pretty simple distinction about what rationality is being used for. The way I understand it, Less Wrong is quite instrumentally focused. There are many posts and sequences (and all of HPMOR) about how to apply rationality to your everyday life, in addition to those dealing only with technical probabilities (like Pascal's Mugging, which isn't realistic).
Personally, I'm more interested in the epistemic side of things, and I'm not a fan of assurances that the Sequences will substantially improve your relationships or anything like that. But that's just me.