It’s certainly possible to think that way, but it’s suspicious.
If I was talking with a skeptic, I’d say things like, “No, you don’t need to think of the Tarot as a mystical force to think that it works—it’s designed to work, the cards are designed to be about human experience, it’s just a useful hook to hang a conversation on.” But if I was talking with a fellow believer, I’d say things like, “The cards don’t lie.”
-- Greta Christina, When anyone is watching: metaphors and the slipperiness of religion
An excellent point. Follow-up question: do you think there is any good way to guard against or even notice this sort of thing happening in yourself? In a way I guess Less Wrong is kind of an answer to that and closely related questions.
Another thing worth pointing out is that things which begin with little to no structural similarity to religion can come to resemble it over time. Cults spring up in unlikely places, which is not all that surprising if you think about it. Given the deep and powerful grooves which religions fill so well, it’s only natural that unrelated entities might settle into them over time. Loyalty to philosophical ideas, sports teams, etc., can reach a deeply irrational fervor, which makes more sense when considered in the light of the tribal EEA, when loyalty was a matter of life and death. I can’t help but think of the cultishness of Ayn Rand’s later life. (Yes, I’m aware that’s been written about elsewhere on LW.)
Sunlight, I suppose. Write down the things you say in various circumstances, reread them in others, and make them public. Hash out your beliefs in deep detail, so that that kind of slipperiness would require deeper changes.
Or even run through how you would explain it to your more rational friends, even if you don’t get a handy opportunity to actually explain your views on tarot or the eucharist or whatever to them.
These are both good suggestions. I already make a habit of explaining things in my head to an imaginary rationalist friend, and I frequently go public with opinions so that they may be savaged by those who disagree with me. Even these two things must be handled with care, however: having survived arguments and various degrees of perfunctory research, you may assign the label of “rational” to an irrational idea, feeling enormous confidence that you’ve done the necessary thinking when in fact you’ve only engaged in rationalist hand-waving.
Trying to find truth while avoiding your own biases is akin to climbing a steep hill, if that hill is covered in a sheet of ice, fraught with land mines, and has packs of velociraptors patrolling its base.