It’s valid to be worried about the introduction of rituals producing death spirals. That is their express purpose, after all: to produce and reinforce whatever death spirals the community has defined as essential.
Ritualism is a mind hack invented by early humanity to reinforce the group worldview and build/maintain group cohesion. And in the intervening thousands of years, either we or ritualism itself has evolved to the point where it is deeply ingrained in our cognitive makeup. At this point, it’s how our brains are wired, and I don’t think it’s feasible to simply ignore it. Instead, we have to do exactly what Raemon is attempting: co-opt its techniques and replace the ones that propagate untruth and suboptimal behavior with ones that propagate truth and optimal behavior.
But rituals are a fundamentally irrational business; there’s no way around it. The solution, I think, lies in thinking of rituals as a mnemonic device, understanding that they’re not really a way of arriving at new truth, but reinforcing what we’re reasonably sure is settled truth. Mandating constant and arbitrary change is the wrong track, since a huge part of ritual is simple reinforcement. To limit that is to cut the whole thing off at the knees.
Instead, I suggest only including the very settled science of rationality, and being very conservative about what gets defined as such. For the inaugural core tenet, I would suggest the Litany of Tarski and the idea that if it’s wrong it gets discarded, no matter what, with an appropriately weighty ritual to accompany it. So even if you did have something that was part of the canon for ten years that must then be discarded, you can still fall back on this ability to acknowledge mistakes and self-modify. Everyone performs a ritual expunging the obsolete piece from the canon, and it’s forever removed. Thus, we’re still taking advantage of the ritualism mind hack while building in appropriate safeguards to keep a death spiral from going on forever and to allow for future self-modification.
I agree with much of your assessment of what rituals are. Rituals are a very powerful, fundamentally irrational force on our minds. However, I don’t think that our minds’ known weakness to rituals is something we should be trying to solve with, well, rituals.
First:
The solution, I think, lies in thinking of rituals as a mnemonic device, understanding that they’re not really a way of arriving at new truth, but reinforcing what we’re reasonably sure is settled truth.
“What we’re reasonably sure is settled truth” does not necessarily equal truth. Nor does it necessarily equal “what we will want to believe once we know more”.
Secondly, I think that a skilled rationalist should be able to avoid acquiring incorrect beliefs through rituals. If, for any reason, I have to participate in a ritual, I would like to have acquired the skills necessary to avoid getting caught up in it. This is a bias I would like to defeat, or at least reduce, just like any other. And I really don’t think we can teach that skill through rituals. I’m also rather disinclined to try, since I suspect that would make us weaker to this form of manipulation.
Bottom line: I think we should try to be, well, less wrong, rather than wrong-in-opposite-directions-so-they-cancel-out.
“What we’re reasonably sure is settled truth” does not necessarily equal truth. Nor does it necessarily equal “what we will want to believe once we know more”.
Absolutely, which is what makes building in the ability to self-modify so intrinsically important. The function of any ritual-like activity shouldn’t be anywhere near the vicinity of the “research arm” of the rationality community. Nothing should be acquired within rituals, nor determined through them. They should be about reinforcing the settled science, to minimize the amount of falseness that enters the canon (to be clear, I’m using that term tongue in cheek). And for whatever falseness does enter, something built around the Litany of Tarski still allows for self-modification.
And yes, any and all rationalists should be far enough along that they’ve developed a certain immunity to the process. That in and of itself makes no difference. Doing these types of things does measurable things to the brain, just as prayer and meditation do. The details are arbitrary; it doesn’t matter whether you’re sacrificing a virgin, eating a wafer, or lighting a candle. What matters is doing the same thing as your fellow tribe members to build/maintain a sense of community. The proposition here is simply to replace incorrect proclamations of how the universe works with correct ones. Instead of proclaiming Jesus Lord and Savior, you’re proclaiming that the map is not the territory and that you desire to believe what is actually true (so if it turns out that the map IS the territory, then out it goes from the hymn book).
And the rationalist has the added (and important) benefit that no matter how much they give themselves over to the emotions of whatever ceremony, once they walk back out to the parking lot, their level-headedness will return. The rationalist can walk out and think, “That sure was fun, but I understand what was happening and can safely put that suspension of rationality back on the shelf,” in a way the Catholic can’t (consciously) do when walking out of Mass.
So I disagree: I think these kinds of things, with effective substitutions of content, won’t make us weaker to this form of manipulation, but rather stronger. Ultimately, once we cross the Singularity, we probably won’t need these kinds of mind hacks anymore, but in the interim, I think they’ll end up being quite important.
I actually think you are a bit overconfident in the ability of self-described rationalists to walk away from this unchanged. I think this is valuable, and yes, I even agree that rationality training should help reduce the negative side effects. But I don’t think for a second that our level-headedness will automatically return the instant we step out of the ritual room.
Hm, perhaps you’re right. It would depend largely on the composition of the ritual(s). Certainly, extraordinary care must be taken when intentionally playing with any kind of death spiral. A generous dose of tongue-in-cheek self-deprecation would probably be a good idea.
I very much agree.