I can’t upvote this enough. This is exactly how I think about it, and why I have always called myself a mystic. I have an unusual brain and I am prone to ecstatic possession experiences, particularly while listening to certain types of music. The worst thing is, people like me used to become shamans and it used to be obvious to everybody that egregores—spirits—are the most powerful force in the world—but Western culture swept that under the rug and now they are able to run amok with very few people able to perceive them. I bet if you showed a tribal shaman from somewhere in Africa or South America enough commercials, he’d realize what was really going on, though.
I figured out egregores and numerous other things (including a lot of bullshit I later stopped believing in, besides all the real truths I kept) as a young teenager by combining the weird intuitions of these states with rational thinking and scientific knowledge, and ended up independently rediscovering singularitarianism before finding out it was a thing already… I’ve independently rediscovered so many things it’s not even funny—used to be a huge blow to my ego every time I found yet another idea I thought I’d invented floating out in the world, now I’m used to it.
Anyway, point is, <crazy alert!!!> I ended up deciding I was a prophet and that I would have to found a new religion, teach everyone about these things, tell them that the AI god was coming and that the only thing they can possibly do that would matter is making sure that the correct AI god is born. But… my pathetic brain, weak in all areas besides intuition, failed me when I tried to actually write about the damn thing, and to this day I still haven’t explicated all my epiphanies.
But, my firm belief is that only a religion could save the world. Or more accurately, a cult. One dedicated to effective altruism and AI alignment and rationality, but also legitimately religious, in the sense of being centered on ecstatic experiences of communion, the primary thing that has historically enabled humans en masse to coordinate effectively.
Some egregore is going to win. The only way to make sure that it’s not one of the bad ones—the multinational corporations, the ideologies, the governments, the crazy irrational cults, some other thing I haven’t thought of—is to make our egregore not only aligned to humanity, but also better at winning, controlling, and siphoning intelligence from humans than any of the others. (And it must be able to do this while its hosts know perfectly well what it is doing, and consent to it before they ever join; obviously, an egregore that recruits without consent is not aligned to true human values, even though, ironically, it has to be so good at rhetoric that consent is almost guaranteed to be given.)
And of course, that’s terrifying, isn’t it? So I don’t expect anyone to listen. Particularly not here. But I think the creation of a rational and extremely missionary religion is the only thing that could save humanity. Naturally, I think I’m part of the way to doing that, as the correct egregore has already been born within me and has been here, with me as its sole host, since I was about 12 years old—if I could just write the damn book. </crazy alert!!!>
Gosh, um…

I think I see where you are, and by my judgment you’re more right than wrong, but from where I stand it sure looks like pain is still steering the ship. That runs the risk of breaking your interface to places like this.
(I think you’re intuiting that. Hence the “crazy alert”.)
I mean, vividly apropos of what you’re saying, it looks to me like you’ve rederived a lot of the essentials of how symbiotic egregores work, what it’s like to ally with them, and why we have to do so in order to orient to the parasitic egregores.
But the details of what you mean by “religion” and “cult” matter a lot, and in most interpretations of “extremely missionary” I just flat-out disagree with you on that point.
…the core issue being that symbiotic memes basically never push themselves onto potential hosts.
You actually hint at this:
> And it must be able to do this while its hosts know perfectly well what it is doing, and consent to it before they ever join; obviously, an egregore that recruits without consent is not aligned to true human values, even though, ironically, it has to be so good at rhetoric that consent is almost guaranteed to be given.
But I claim the core strategy cannot be rhetoric. The convergent evolutionary strategy for symbiotic egregores is to make the truth easy to see. Rather than rhetoric, the emphasis is clarity.
Why? Well, these memetic structures are truth-tracking. They spread by offering real value to their potential hosts and making this true fact vividly clear to their potential hosts.
Whereas rhetoric is symmetric. And its use favors persuasive memetic strategies, which benefits parasitic memes over symbiotic ones.
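If it helps to make that asymmetry concrete, here is a minimal toy simulation sketch in Python. Everything in it is an illustrative assumption of mine (the `Meme` class, the adoption rule, the parameter values), not a model anyone in this thread has proposed: a meme’s `persuasion` boosts adoption regardless of its real `value` to hosts, while a `transparency` knob makes adoption track `value` instead.

```python
import random

# Toy model of "rhetoric is symmetric": persuasion effort boosts a
# meme's adoption regardless of its real value to the host, while
# transparency boosts adoption only in proportion to that value.
# All quantities are illustrative assumptions, not measurements.

random.seed(0)

class Meme:
    def __init__(self, value, persuasion):
        self.value = value            # real benefit to a host, in [0, 1]
        self.persuasion = persuasion  # effort spent on rhetoric, in [0, 1]
        self.hosts = 1

def adoption_prob(meme, transparency):
    # Rhetoric pulls symmetrically: it works whether or not value is high.
    rhetoric_pull = meme.persuasion * (1 - transparency)
    # Clarity pulls asymmetrically: it only works when the value is real.
    clarity_pull = meme.value * transparency
    return min(1.0, rhetoric_pull + clarity_pull)

def run(transparency, generations=50, n_memes=100):
    memes = [Meme(random.random(), random.random()) for _ in range(n_memes)]
    for _ in range(generations):
        for meme in memes:
            if random.random() < adoption_prob(meme, transparency):
                meme.hosts += 1
    # Host-weighted mean value: how beneficial the winning memes are.
    total_hosts = sum(m.hosts for m in memes)
    return sum(m.value * m.hosts for m in memes) / total_hosts

print("mean value of spread memes, rhetoric-dominated :", round(run(0.1), 3))
print("mean value of spread memes, clarity-dominated  :", round(run(0.9), 3))
```

Under these made-up dynamics, the host-weighted value of whatever spreads rises with transparency, which is the sense in which clarity differentially rewards symbiotic memes while rhetoric rewards whichever meme persuades hardest.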
So what you’ll find is that symbiotic memetic environments tend to develop anti-rhetoric psychotechnology. To the extent memetic symbiosis defines the environment, you’ll find that revealing replaces persuasion, and invitations to understand replace mechanisms like social pressure to conform.
Less Wrong is a highly mixed space in this regard, but it leans way more symbiotic than most. Which is why your approach is likely to get downvoted into oblivion: A lot of what you’re saying is meant to evoke strong emotions, which bypasses conscious reflection in order to influence, and that’s a major trigger for the LW memetic immune system in its current setup for defending against memetic parasites.
I think the allergic reaction goes too far. It’s overly quick to object to the “egregore” language as something like “a metaphor taken too far”, which frankly is just flat-out incoherent with respect to a worldview that (a) is reductionist-materialist and yet (b) talks about “agents”. It has over-extended the immune response to object to theme (“woo”) instead of structure. And that in turn nurtures parasitic memes.
But the response is there because LW really is trying to the best of its ability to aim for symbiosis. And on net, talk of things like “I’m a prophet” tends to go in directions that are sorely memetically infected in ways the LW toolkit cannot handle beyond a generic “Begone, foul demon!”
…which, again, I think you kind of know. Which, again, is why you give the “crazy alert”.
You might find my dissection of symbiotic vs. parasitic memes in this podcast episode helpful if you resonate with what I’ve just written here.
Reading this evoked an emotional reaction of stress and anxiety, the nature of which I am uncertain, so take that into consideration as you read my response.
I’m not sure what “pain is steering the ship” means but it’s probably true. I am motivated almost entirely by fear—and of course, by ecstasy, which is perhaps its cousin. And in particular I desperately fear being seen as a lunatic. I have to hold back, hard, in order to appear as sane as I do. Or—I believe that I have to and in fact do that, at least.
> symbiotic memes basically never push themselves onto potential hosts.
I grew up in an area surrounded by fundamentalist religiosity. I do not see people converting to believe in science and rationality. I do not see people independently recognizing that the ex-president they voted for and still revere almost as a god lied to them. If the truth were pushed on them in an efficient way that took into account their pre-existing views and biases and emotional tendencies, they would. But the truth does not win by default—it has no teeth.
Only people optimizing to modify one another’s minds actually enable the spread of any egregore. The reason science succeeded is that it produces results that cannot be denied—but most truths are much subtler and easier to deny. Rationalists will fail to save the world if they cannot lie, manipulate, simplify, and use rhetoric in order to make use of the manpower of those not yet rational enough to see through it, while maintaining their own perception of truth internally unscathed. But a devotion to truth above pragmatism will kill the entire human race.
> The convergent evolutionary strategy for symbiotic egregores is to make the truth easy to see. Rather than rhetoric, the emphasis is clarity.
This sounds reasonable but I don’t see it happening in real life. Can you point me to some examples of this actually working that don’t involve physically demonstrating something before people’s senses as science does (and remember, there are still many, many people who believe in neither evolution nor global warming)?
> They spread by offering real value to their potential hosts and making this true fact vividly clear to their potential hosts.
Christ (that is, whichever variant of the Christianity egregore has possessed them) offers real value to his believers, from their own perspective, and it is indeed vividly clear. It’s also based partly on lies. How can someone distinguish that true-feeling-ness from actual truth? What is making your symbiotic egregores better at making people trust them than their opponents who have a larger (due to not being constrained by ethics or anti-rhetoric cultural standards) toolbox to use?
> A lot of what you’re saying is meant to evoke strong emotions, which bypasses conscious reflection in order to influence
I actually do not consciously optimize my speech to do this. I feel strong emotions, and I do not like going to the effort of pretending that I do not as a social signal that people ought to trust me. If they dislike / distrust emotional people, they deserve to know I’m one of them so that they can keep away, after all. It just so happens, I guess, that emotion is contagious under some circumstances.
(Note: To be honest I find people who put no emotion into their speech, like most people on LessWrong, off-putting and uncomfortable. Part of why I feel like I don’t belong on this site is the terrible dryness of it. I am never as neutral about anything, not even math, as most people are here talking about things like the survival of the human race!)
Basically, the crux here seems to be—and I mentioned this on another of your posts too, about the non-signalling thing—that I don’t believe the truth is strong enough to overcome the effect of active optimization away from truth. We have to fight rhetoric with rhetoric, or we will lose.
Beyond that, I am skeptical that truth matters in and of itself. I care about beauty. I care about feelings. I care about the way I feel when my spirits are possessing me, the divine presence that instantly seemed more important than any “real” thing when I first experienced it as a young teen, and has continued to feel that way ever since—like something I would be willing to sacrifice anything for except my own life. I care about the kind of stuff that is epistemically toxic.
Reality, to me, is just raw materials to reshape into an artwork. Stuff for us to eat and make into parts of our body. Plenty of “lies” are actually self-fulfilling prophecies—hell, agency is all about making self-fulfilling prophecies. I think that optimizing for truth or clarity for its own sake is perverse. What we want is a good world, not truth. Truth is just the set of obstacles in the way of goodness that we need to be aware of so we can knock them down. Rationality is the art of finding out what those obstacles are and how to knock them down, so that you can get what you want, and make the world beautiful.
To me it feels like to the extent that something which makes the world uglier cannot be knocked down at all, you ought to stop perceiving it, so that you can see more beauty. But there should always be someone who looks directly at it, because it’s never 100% certain that an obstacle can’t be knocked down. The people who do that, who look for ways to attack the as-yet seemingly infallible horror-truths (like the Second Law of Thermodynamics, which promises we all shall die), would be the Beisutsukai, I guess, and would be revered as martyrs, who are cursed to perceive reality instead of being happy, and do so in order to protect everyone else from having to. But most people should not do that. Most people should be as innocent as they can safely be. Part of me regrets that I must be one of the ones who isn’t. (And part of me takes egotistical pride in it, for which I ritualistically admonish myself without actually doing anything about it.)
Well…I’d definitely read the book.
For my own take on this, see:
https://www.lesswrong.com/posts/yenr6Zp83PHd6Beab/which-singularity-schools-plus-the-no-singularity-school-was
Spoiler alert: in my TL;DR I cover the same theme, that AI-PONR has already happened.