Reading this evoked an emotional reaction of stress and anxiety, the nature of which I am uncertain, so take that into consideration as you read my response.
I’m not sure what “pain is steering the ship” means, but it’s probably true. I am motivated almost entirely by fear—and of course, by ecstasy, which is perhaps its cousin. In particular, I desperately fear being seen as a lunatic. I have to hold back, hard, in order to appear as sane as I do. Or, at least, I believe that I have to, and I do in fact hold back.
symbiotic memes basically never push themselves onto potential hosts.
I grew up in an area surrounded by fundamentalist religiosity. I do not see people converting to belief in science and rationality. I do not see people independently recognizing that the ex-president they voted for, and still revere almost as a god, lied to them. If the truth were pushed on them effectively, in a way that took into account their pre-existing views, biases, and emotional tendencies, they would recognize it. But the truth does not win by default—it has no teeth.
Only people actively optimizing to modify one another’s minds enable the spread of any egregore. The reason science succeeded is that it produces results that cannot be denied—but most truths are far subtler and easier to deny. Rationalists will fail to save the world if they cannot lie, manipulate, simplify, and use rhetoric to harness the manpower of those not yet rational enough to see through it, all while keeping their own internal perception of truth unscathed. A devotion to truth above pragmatism will kill the entire human race.
The convergent evolutionary strategy for symbiotic egregores is to make the truth easy to see. Rather than rhetoric, the emphasis is clarity.
This sounds reasonable but I don’t see it happening in real life. Can you point me to some examples of this actually working that don’t involve physically demonstrating something before people’s senses as science does (and remember, there are still many, many people who believe in neither evolution nor global warming)?
They spread by offering real value to their potential hosts and making this true fact vividly clear to their potential hosts.
Christ (that is, whichever variant of the Christianity egregore has possessed them) offers real value to his believers, from their own perspective, and it is indeed vividly clear. It is also based partly on lies. How can someone distinguish that true-feeling-ness from actual truth? What makes your symbiotic egregores better at earning people’s trust than their opponents, who have a larger toolbox to use because they are not constrained by ethics or anti-rhetoric cultural standards?
A lot of what you’re saying is meant to evoke strong emotions, which bypasses conscious reflection in order to influence
I do not actually consciously optimize my speech to do this. I feel strong emotions, and I do not like going to the effort of pretending otherwise as a social signal that people ought to trust me. If they dislike or distrust emotional people, they deserve to know I am one of them so that they can keep away, after all. It just so happens, I guess, that emotion is contagious under some circumstances.
(Note: To be honest, I find people who put no emotion into their speech, like most people on LessWrong, off-putting and uncomfortable to be around. Part of why I feel like I don’t belong on this site is the terrible dryness of it. I am never as neutral about anything, not even math, as most people here are when talking about things like the survival of the human race!)
Basically, the crux here seems to be—and I mentioned this on another of your posts as well, the one about the non-signalling thing—that I don’t believe the truth is strong enough to overcome active optimization away from truth. We have to fight rhetoric with rhetoric, or we will lose.
Beyond that, I am skeptical that truth matters in and of itself. I care about beauty. I care about feelings. I care about the way I feel when my spirits are possessing me, the divine presence that instantly seemed more important than any “real” thing when I first experienced it as a young teen, and has continued to feel that way ever since—like something I would be willing to sacrifice anything for except my own life. I care about the kind of stuff that is epistemically toxic.
Reality, to me, is just raw materials to reshape into an artwork. Stuff for us to eat and make into parts of our body. Plenty of “lies” are actually self-fulfilling prophecies—hell, agency is all about making self-fulfilling prophecies. I think that optimizing for truth or clarity for its own sake is perverse. What we want is a good world, not truth. Truth is just the set of obstacles in the way of goodness that we need to be aware of so we can knock them down. Rationality is the art of finding out what those obstacles are and how to knock them down, so that you can get what you want, and make the world beautiful.
To me it feels like, to the extent that something which makes the world uglier cannot be knocked down at all, you ought to stop perceiving it, so that you can see more beauty. But there should always be someone who looks directly at it, because it’s never 100% certain that an obstacle can’t be knocked down. The people who do that, who look for ways to attack the as-yet seemingly infallible horror-truths (like the Second Law of Thermodynamics, which promises we all shall die), would be the Beisutsukai, I guess, and would be revered as martyrs: cursed to perceive reality instead of being happy, and doing so in order to spare everyone else from having to. But most people should not do that. Most people should be as innocent as they can safely be. Part of me regrets that I must be one of the ones who isn’t. (And part of me takes egotistical pride in it, for which I ritualistically admonish myself without actually doing anything about it.)