I think I’m more on your side of the argument, but I don’t find the arguments you bring convincing. You use an example like the moon-landing, where there’s no value in believing in it, and, given that you don’t have the skills to simply change your beliefs by choice, you take it as an assumption that nobody has those skills.
I’m planning to write a post about this sometime in the future, but the gist of why I believe that adopting useful but wrong beliefs is harmful is that it makes you shy away from situations that might disprove those wrong beliefs.
One interesting thing about dealing with spirituality is that when you tell people without any spiritual experiences about advanced spiritual concepts that are based on actual experiences, the lay person necessarily forms a wrong belief about the topic, because having a correct belief about it requires some basic experiences as reference points.
That’s one of the reasons why most traditional forms of spirituality don’t tell beginners or lay people about advanced concepts. If one isn’t careful when talking in that domain, it’s quite easy for the beginner to adopt wrong beliefs that get in the way of progress.
From my perspective, this is one of the reasons why a lot of New Age spirituality has relatively little to show in the way of spiritual experiences for the people in those communities.
it makes you shy away from situations that might disprove those wrong beliefs
This is another good reason. I was gesturing roughly in that direction when talking about the Christian convert being blocked from learning about new religions.
I think that there’s a general concept of being “truth aligned”, and being truth aligned is the right choice. Truth-seeking things reinforce each other, and things like lying, bullshitting, learning wrong things, avoiding disconfirming evidence, etc. also reinforce each other. Being able to convince yourself of an arbitrary belief is an anti-truth skill, and Eliezer suggests you should dis-cultivate it by telling yourself you can’t do it.
Your point about spirituality is a major source of conflict about those topics, with non-believers saying “tell us what it is” and the enlightened saying “if I did, you’d misunderstand”. I do think that it’s at least fair to expect that the spiritual teachers understand the minds of beginners, if not vice versa. This is why I’m much more interested in Val’s enlightenment than in Vinay Gupta’s.
Being able to convince yourself of an arbitrary belief is an anti-truth skill, and Eliezer suggests you should dis-cultivate it by telling yourself you can’t do it.
That’s an interesting example in this context. You seem to say you want to believe that “you can’t do it” because it’s useful to hold that belief and not necessarily because it’s true.
Practically, I don’t think convincing yourself of a belief because the belief is useful is the same thing as convincing yourself of an arbitrary belief. I don’t think that the people I know whom I consider particularly skilled at adopting beliefs because they consider them useful practiced on arbitrary beliefs.
To use an NLP term (given that’s the community where I know most people with the relevant skill set), behavior change is much easier when the belief change is ecological than when it’s random.
You use an example like the moon-landing where there’s no value in believing in it
There’s some value in believing in it. If you don’t believe in it and it comes up, people might look at you funny.
One interesting difference between what we may as well call “epistemicists” and “ultra-instrumentalists” is that ultra-instrumentalists generally weight social capital as more important, and individual intellectual ability as less important, than epistemicists do. See here: most of the reputed benefits of belief in Mormonism-the-religion are facets of access to Mormonism-the-social-network.
Another interesting feature of ultra-instrumentalists is that their political beliefs are often outside their local Overton windows. Presumably they have some idea of how much social capital these beliefs have cost them.