Great article. However, there’s one more item that needs to be added to the list to really differentiate us from religious institutions, and it needs to become the new #1: offer Transformation that actually works. That is, make being alive a significantly more meaningful experience for people who participate. Do this as well as items #1 through #10 and we’ll be more popular than any existing religion. The kind of transformation religion offers used to work reasonably well, but not so much anymore, because the religious worldview hasn’t been updated to reflect current knowledge. People sense that religion doesn’t speak very well to the world as we now know it through science, but they don’t know where else to go.
Most UU (Unitarian Universalist) congregations already practice items #2 through #10, and many are working hard at getting better at #1. Maybe that explains why UUs have 1,094 groups, far more than any other class of skeptics. But so far, UUs don’t have a good plan for Transformation either (I’ve been a UU Humanist for 25 years).
Do we really need to simulate drugs to be taken seriously?
logi said, “Do we really need to simulate drugs to be taken seriously?”
Yes, so long as the competition offers something like it. I accepted Jesus Christ as my personal Lord and Savior at age 12, and it was a lasting emotional high that only recently, at age 50, have I been able to reproduce. At age 12 my behavior and life really did improve considerably (as did the lives of those around me) for a long period of time, until I finally backslid. I became rational at age 20 (and was quite ashamed of the personal Lord and Savior thing), and wasn’t able to reproduce the positive effects I experienced at age 12 until very recently. Fortunately, I was able to do so without giving up rationality. In fact, rationality is key to a lasting Transformation.
Interesting. I rejected belief in God and the afterlife also at age 12, because I realized that the only reason I believed those things were true was that it made me feel good to believe them. It was a lasting intellectual high knowing that I had overcome my base emotions with some clear thought.
Emotional manipulation is like propaganda. One of the things I found most striking about OB was the high signal-to-noise ratio. No base emotional pleas, no Giant Implicit Moral Framework silently guiding the discourse. I’d hate to see this community lose that by stooping to emotionally drugging its members in an attempt to “replace church”.
Can’t we just tell people the truth? You want transformation? Take psilocybin. You want to feel like things are more meaningful? Smoke marijuana. You want willpower and IQ? Take amphetamines. These all carry risks, just as religious emotional manipulation does. At least they don’t soak up precious time and words.
I guess it comes down to the epistemic/instrumental divide. I’m here for the truth, not for a bag of naive attempts to stimulate dopamine release.
logi’s comments are in quotes.
“Emotional manipulation is like propaganda.”
I’m not talking about emotional manipulation. I’m talking about a healthy emotional life, one that provides authentic happiness. Emotions are wonderful so long as they’re guided by reason.
“Can’t we just tell people the truth?” Yes, absolutely!
Apparently logi has given up on the idea that transformation, meaning, and willpower are achievable without doing long-term damage to oneself. I totally disagree; it is possible if we just open up our thinking a little bit.
“I’m here for the truth” So am I!
I don’t think I know what you mean by “healthy emotional life” or “authentic happiness” here.
But earlier, you said to the question of simulating drugs:
Drugs are a direct form of emotional and cognitive manipulation, so what I mean by “simulating drugs” is to achieve something similar.
I never claimed otherwise. I actually even value some emotions that aren’t “guided by reason”. But I certainly try not to let any of them in turn guide my reason.
No, I think you miss my points. One, I’m saying this stuff is totally orthogonal to what I find valuable in a rational community. This in the end must be a personal objection, but I am probably not alone in it.
Two, I’m as skeptical of the content of these “transformations” and “meanings” as I am of their drug-induced counterparts, regardless of their long-term harm or other drawbacks.
We probably share a concept of “willpower”, and it’s probably true that there are generally effective and sustainable techniques superior to drugs.
I don’t know what this means.
Even if all you want is to improve your own accuracy of belief, the more that people around you want the same thing, the better you can correct for your mistakes (cf. Aumann). To that end, it is helpful to find ways to make rationalism more approachable and welcoming to people who are unlikely to experience such an intellectual high, a group that the evidence suggests encompasses most people.
Do they really want the same thing? It’s hard to tell when we’re also offering other tasty treats. Prostitutes will do things for cocaine that they wouldn’t do otherwise, and pretend to enjoy themselves.
As for Aumann, Robin Hanson points out that we’d often do better to update toward “average beliefs” than the beliefs of our chosen in-group. So it appears I can already maximize my “Aumann benefit” by conversing with random strangers. It seems to me that the benefits of a rationalist group are precisely the opposite: Discover good arguments and important data that we were previously unaware of, that we find convincing regardless of source. If we value our own mere opinions too highly, we’ve already lost.
Maybe they would, maybe they wouldn’t. But if rationalism doesn’t at least offer something comparable to other options, many people won’t even try.
Robin’s argument in that link seems to be that taking pleasure in disagreement with average beliefs is, all else equal, a bad thing; it’s certainly not an argument in favor of updating toward average beliefs. Aumann agreement only strictly applies to ideal rationalists with shared assumptions, but as a rule of thumb one should update toward other agents’ beliefs based on the demonstrated rationality of their belief-forming process.
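That rule of thumb can be sketched numerically. Here is a minimal toy model of it, purely illustrative: the `reliability` weight and the choice to interpolate in log-odds space are my own assumptions for the sketch, not anything the thread specifies.

```python
import math

def log_odds(p):
    """Convert a probability in (0, 1) to log-odds."""
    return math.log(p / (1 - p))

def from_log_odds(l):
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-l))

def update_toward(my_p, their_p, reliability):
    """Move my belief toward another agent's belief in log-odds space,
    by a fraction (0 to 1) reflecting how rational I judge their
    belief-forming process to be."""
    mine, theirs = log_odds(my_p), log_odds(their_p)
    return from_log_odds(mine + reliability * (theirs - mine))

# If I judge the other agent highly reliable, I move most of the way:
print(round(update_toward(0.2, 0.8, 0.9), 3))  # → 0.752
# If I judge them barely reliable, I barely move:
print(round(update_toward(0.2, 0.8, 0.1), 3))  # → 0.248
```

The interesting question the thread raises is exactly the part this sketch takes as given: where the `reliability` number comes from, and how hard such a judgment is to make in practice.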
“Maybe they would, maybe they wouldn’t. But if rationalism doesn’t at least offer something comparable to other options, many people won’t even try.”
So why should we want to attract such people?
We know why cult groups usually try to attract as many people as possible: they’re just raw material to them, explicitly or implicitly.
How is it to our benefit to adopt an r-strategy rather than a K-strategy?
Among other reasons, because mass opinion often influences decisions, e.g. politics, in ways that impact everyone, including us. The greater the average rationality of the masses, the better those decisions are likely to be.
Rational arguments, being restricted to sanity, are un-optimized for swaying masses for political gain.
It’s not a good idea to fight irrationality’s strengths with rationality’s weaknesses.
True. I think the goal here is a bit more complex than “maximize number of self-proclaimed rationalists”, though.
I was presenting it as an argument favoring updates toward average beliefs over doing so for in-group beliefs, but you’re still right that it’s really making an unrelated point.
I find such demonstrations quite difficult to identify. Doing so requires both confidence in the correctness of their conclusion and, to a lesser extent, confidence that the beliefs you observe aren’t being selected for by other rationalists.