As a rule, any mention of religion on an online forum degenerates into a religious argument. Why? Why does this happen with religion and not with Javascript or baking or other topics people talk about on forums?
What’s different about religion is that people don’t feel they need to have any particular expertise to have opinions about it. All they need is strongly held beliefs, and anyone can have those. No thread about Javascript will grow as fast as one about religion, because people feel they have to be over some threshold of expertise to post comments about that. But on religion everyone’s an expert.
Then it struck me: this is the problem with politics too. Politics, like religion, is a topic where there’s no threshold of expertise for expressing an opinion. All you need is strong convictions.
Do religion and politics have something in common that explains this similarity? One possible explanation is that they deal with questions that have no definite answers, so there’s no back pressure on people’s opinions. Since no one can be proven wrong, every opinion is equally valid, and sensing this, everyone lets fly with theirs.
It occurred to me that on this forum QM/MWI discussions are a mind-killer, for the same reasons that religion and politics are.
The part in bold is due to people having read the QM sequence and believing that they are now experts in the ontology of quantum physics.
Not particularly. To the extent that it is a mind-killer, it is a mind-killer in the way discussions of FAI, SIAI capabilities, cryonics, Bayesianism, or similar theories are. Whenever any keyword suitably similar to one of these subjects appears, one of the same group of people can be expected to leap in and launch into an attack on LessWrong, its members, Eliezer, SingInst, or all of the above; they may even try to include something on the subject matter as well.
The thing is, most people here aren’t particularly interested in talking about those subjects; at least, they aren’t interested in rehashing the same old tired arguments and posturing yet again. They have moved on to more interesting topics. This leads to the same abysmal quality of discussion, and the same belligerent and antisocial interactions, every time.
Any FAI discussion is mind-killing unless it is explicitly conditional on “assuming FOOM is logically possible.” After all, we don’t have enough evidence to bridge the difference in priors, and neither side (AI is a risk / AI is not a risk) explicitly acknowledges that fact (and this problem makes them sides rather than partners).
I’m not sure I agree with Graham on the exact mechanics there. There are a number of mind-killing topics where empirically supportable answers should in principle exist: the health effects of obesity, for example; the effects of illegal drugs; the expected outcomes of certain childrearing practices.
Expertise exists on all these topics, and you can prove people wrong pretty conclusively with the right data set, but people—at least within certain subcultures, and often in general—usually feel free to ignore the data and expertise and expound their own theories. This is clearly not because these questions lack definite answers. I think it’s more because social approval rides on the answers, and because of the importance of the social proof heuristic and its relatives.
QM interpretation may or may not fall into that category around here.
Graham actually agrees with you; the essay quoted above continues:
But this isn’t true. There are certainly some political questions that have definite answers, like how much a new government policy will cost. But the more precise political questions suffer the same fate as the vaguer ones. I think what religion and politics have in common is that they become part of people’s identity, and people can never have a fruitful argument about something that’s part of their identity. By definition they’re partisan.