Why Our Kind Can’t Cooperate
From when I was still forced to attend, I remember our synagogue’s annual fundraising appeal. It was a simple enough format, if I recall correctly. The rabbi and the treasurer talked about the shul’s expenses and how vital this annual fundraiser was, and then the synagogue’s members called out their pledges from their seats.
Straightforward, yes?
Let me tell you about a different annual fundraising appeal. One that I ran, in fact, during the early years of a nonprofit organization that may not be named. One difference was that the appeal was conducted over the Internet. And another difference was that the audience was largely drawn from the atheist/libertarian/technophile/sf-fan/early-adopter/programmer/etc crowd. (To point in the rough direction of an empirical cluster in personspace. If you understood the phrase “empirical cluster in personspace” then you know who I’m talking about.)
I crafted the fundraising appeal with care. By my nature I’m too proud to ask other people for help; but I’ve gotten over around 60% of that reluctance over the years. The nonprofit needed money and was growing too slowly, so I put some force and poetry into that year’s annual appeal. I sent it out to several mailing lists that covered most of our potential support base.
And almost immediately, people started posting to the mailing lists about why they weren’t going to donate. Some of them raised basic questions about the nonprofit’s philosophy and mission. Others talked about their brilliant ideas for all the other sources that the nonprofit could get funding from, instead of them. (They didn’t volunteer to contact any of those sources themselves, they just had ideas for how we could do it.)
Now you might say, “Well, maybe your mission and philosophy did have basic problems—you wouldn’t want to censor that discussion, would you?”
Hold on to that thought.
Because people were donating. We started getting donations right away, via PayPal. We even got congratulatory notes saying how the appeal had finally gotten them to start moving. A donation of $111.11 was accompanied by a message saying, “I decided to give **** a little bit more. One more hundred, one more ten, one more single, one more dime, and one more penny. All may not be for one, but this one is trying to be for all.”
But none of those donors posted their agreement to the mailing list. Not one.
So far as any of those donors knew, they were alone. And when they tuned in the next day, they discovered not thanks, but arguments for why they shouldn’t have donated. The criticisms, the justifications for not donating—only those were displayed proudly in the open.
As though the treasurer had finished his annual appeal, and everyone not making a pledge had proudly stood up to call out justifications for refusing; while those making pledges whispered them quietly, so that no one could hear.
I know someone with a rationalist cause who goes around plaintively asking, “How come the Raelian flying-saucer cult can get tens of thousands of members [probably around 40,000] interested in complete nonsense, but we can’t even get a thousand people working on this?”
The obvious wrong way to finish this thought is to say, “Let’s do what the Raelians do! Let’s add some nonsense to this meme!” For the benefit of those not immediately stopped by their ethical inhibitions, I will observe that there may be a hundred failed flying-saucer cults for every one that becomes famous. And the Dark Side may require non-obvious skills, which you, yes you, do not have: Not everyone can be a Sith Lord. In particular, if you talk about your planned lies on the public Internet, you fail. I’m no master criminal, but even I can tell certain people are not cut out to be crooks.
So it’s probably not a good idea to cultivate a sense of violated entitlement at the thought that some other group, who you think ought to be inferior to you, has more money and followers. That path leads to—pardon the expression—the Dark Side.
But it probably does make sense to start asking ourselves some pointed questions, if supposed “rationalists” can’t manage to coordinate as well as a flying-saucer cult.
How do things work on the Dark Side?
The respected leader speaks, and there comes a chorus of pure agreement: if there are any who harbor inward doubts, they keep them to themselves. So all the individual members of the audience see this atmosphere of pure agreement, and they feel more confident in the ideas presented—even if they, personally, harbored inward doubts, why, everyone else seems to agree with it.
(“Pluralistic ignorance” is the standard label for this.)
If anyone is still unpersuaded after that, they leave the group (or in some places, are executed)—and the remainder are more in agreement, and reinforce each other with less interference.
(I call that “evaporative cooling of groups”.)
The ideas themselves, not just the leader, generate unbounded enthusiasm and praise. The halo effect is that perceptions of all positive qualities correlate—e.g., telling subjects about the benefits of a food preservative made them judge it as lower-risk, even though benefit and risk are logically distinct quantities. This can create a positive feedback effect that makes an idea seem better and better and better, especially if criticism is perceived as traitorous or sinful.
(Which I term the “affective death spiral”.)
So these are all examples of strong Dark Side forces that can bind groups together.
And presumably we would not go so far as to dirty our hands with such...
Therefore, as a group, the Light Side will always be divided and weak. Atheists, libertarians, technophiles, nerds, science-fiction fans, scientists, or even non-fundamentalist religions, will never be capable of acting with the fanatic unity that animates radical Islam. Technological advantage can only go so far; your tools can be copied or stolen, and used against you. In the end the Light Side will always lose in any group conflict, and the future inevitably belongs to the Dark.
I think that one’s reaction to this prospect says a lot about one’s attitude towards “rationality”.
Some “Clash of Civilizations” writers seem to accept that the Enlightenment is destined to lose out in the long run to radical Islam, and sigh, and shake their heads sadly. I suppose they’re trying to signal their cynical sophistication or something.
For myself, I always thought—call me loony—that a true rationalist ought to be effective in the real world.
So I have a problem with the idea that the Dark Side, thanks to their pluralistic ignorance and affective death spirals, will always win because they are better coordinated than us.
You would think, perhaps, that real rationalists ought to be more coordinated? Surely all that unreason must have its disadvantages? That mode can’t be optimal, can it?
And if current “rationalist” groups cannot coordinate—if they can’t support group projects so well as a single synagogue draws donations from its members—well, I leave it to you to finish that syllogism.
There’s a saying I sometimes use: “It is dangerous to be half a rationalist.”
For example, I can think of ways to sabotage someone’s intelligence by selectively teaching them certain methods of rationality. Suppose you taught someone a long list of logical fallacies and cognitive biases, and trained them to spot those fallacies and biases in other people’s arguments. But you were careful to pick the fallacies and biases that are easiest to accuse others of, the most general ones that can most easily be misapplied. And you did not warn them to scrutinize arguments they agree with just as hard as they scrutinize incongruent arguments for flaws. So they would have acquired a great repertoire of flaws of which to accuse only the arguments and arguers they don’t like. This, I suspect, is one of the primary ways that smart people end up stupid. (And note, by the way, that I have just given you another Fully General Counterargument against smart people whose arguments you don’t like.)
Similarly, if you wanted to ensure that a group of “rationalists” never accomplished any task requiring more than one person, you could teach them only techniques of individual rationality, without mentioning anything about techniques of coordinated group rationality.
I’ll write more later (tomorrow?) on how I think rationalists might be able to coordinate better. But today I want to focus on what you might call the culture of disagreement, or even the culture of objections, which is one of the two major forces preventing the atheist/libertarian/technophile crowd from coordinating.
Imagine that you’re at a conference, and the speaker gives a 30-minute talk. Afterward, people line up at the microphones for questions. The first questioner objects to the logarithmic scale used for the graph on slide 14, quoting Tufte’s The Visual Display of Quantitative Information. The second questioner disputes a claim made in slide 3. The third questioner suggests an alternative hypothesis that seems to explain the same data...
Perfectly normal, right? Now imagine that you’re at a conference, and the speaker gives a 30-minute talk. People line up at the microphone.
The first person says, “I agree with everything you said in your talk, and I think you’re brilliant.” Then steps aside.
The second person says, “Slide 14 was beautiful, I learned a lot from it. You’re awesome.” Steps aside.
The third person—
Well, you’ll never know what the third person at the microphone had to say, because by this time, you’ve fled screaming out of the room, propelled by a bone-deep terror as if Cthulhu had erupted from the podium, the fear of the impossibly unnatural phenomenon that has invaded your conference.
Yes, a group which can’t tolerate disagreement is not rational. But if you tolerate only disagreement—if you tolerate disagreement but not agreement—then you also are not rational. You’re only willing to hear some honest thoughts, but not others. You are a dangerous half-a-rationalist.
We are as uncomfortable together as flying-saucer cult members are uncomfortable apart. That can’t be right either. Reversed stupidity is not intelligence.
Let’s say we have two groups of soldiers. In group 1, the privates are ignorant of tactics and strategy; only the sergeants know anything about tactics and only the officers know anything about strategy. In group 2, everyone at all levels knows all about tactics and strategy.
Should we expect group 1 to defeat group 2, because group 1 will follow orders, while everyone in group 2 comes up with better ideas than whatever orders they were given?
In this case I have to question how much group 2 really understands about military theory, because it is an elementary proposition that an uncoordinated mob gets slaughtered.
Doing worse with more knowledge means you are doing something very wrong. You should always be able to at least implement the same strategy you would use if you were ignorant, and preferably do better. You definitely should not do worse. If you find yourself regretting your “rationality”, then you should reconsider what is rational.
On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge. I recall a lovely experiment which showed that politically opinionated students with more knowledge of the issues reacted less to incongruent evidence, because they had more ammunition with which to counter-argue only incongruent evidence.
We would seem to be stuck in an awful valley of partial rationality where we end up more poorly coordinated than religious fundamentalists, able to put forth less effort than flying-saucer cultists. True, what little effort we do manage to put forth may be better-targeted at helping people rather than the reverse—but that is not an acceptable excuse.
If I were setting forth to systematically train rationalists, there would be lessons on how to disagree and lessons on how to agree, lessons intended to make the trainee more comfortable with dissent, and lessons intended to make them more comfortable with conformity. One day everyone shows up dressed differently, another day they all show up in uniform. You’ve got to cover both sides, or you’re only half a rationalist.
Can you imagine training prospective rationalists to wear a uniform and march in lockstep, and practice sessions where they agree with each other and applaud everything a speaker on a podium says? It sounds like unspeakable horror, doesn’t it, like the whole thing has admitted outright to being an evil cult? But why is it not okay to practice that, while it is okay to practice disagreeing with everyone else in the crowd? Are you never going to have to agree with the majority?
Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus. We signal our superior intelligence and our membership in the nonconformist community by inventing clever objections to others’ arguments. Perhaps that is why the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd stays marginalized, losing battles with less nonconformist factions in larger society. No, we’re not losing because we’re so superior, we’re losing because our exclusively individualist traditions sabotage our ability to cooperate.
The other major component that I think sabotages group efforts in the atheist/libertarian/technophile/etcetera community is being ashamed of strong feelings. We still have the Spock archetype of rationality stuck in our heads, rationality as dispassion. Or perhaps a related mistake, rationality as cynicism—trying to signal your superior world-weary sophistication by showing that you care less than others. Being careful to ostentatiously, publicly look down on those so naive as to show they care strongly about anything.
Wouldn’t it make you feel uncomfortable if the speaker at the podium said that he cared so strongly about, say, fighting aging, that he would willingly die for the cause?
But it is nowhere written in either probability theory or decision theory that a rationalist should not care. I’ve looked over those equations and, really, it’s not in there.
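(To make that concrete: below is a minimal sketch of the standard expected-utility rule, in generic notation supplied here purely for illustration, with $A$ the available actions, $O$ the possible outcomes, $P$ the agent’s beliefs, and $U$ the agent’s values. The formalism disciplines the beliefs and the maximization; it places no cap on how much $U$ is allowed to care about anything.)

$$
a^{*} \;=\; \arg\max_{a \in A} \sum_{o \in O} P(o \mid a)\, U(o)
$$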
The best informal definition I’ve ever heard of rationality is “That which can be destroyed by the truth should be.” We should aspire to feel the emotions that fit the facts, not aspire to feel no emotion. If an emotion can be destroyed by truth, we should relinquish it. But if a cause is worth striving for, then let us by all means feel fully its importance.
Some things are worth dying for. Yes, really! And if we can’t get comfortable with admitting it and hearing others say it, then we’re going to have trouble caring enough—as well as coordinating enough—to put some effort into group projects. You’ve got to teach both sides of it, “That which can be destroyed by the truth should be,” and “That which the truth nourishes should thrive.”
I’ve heard it argued that the taboo against emotional language in, say, science papers, is an important part of letting the facts fight it out without distraction. That doesn’t mean the taboo should apply everywhere. I think that there are parts of life where we should learn to applaud strong emotional language, eloquence, and poetry. When there’s something that needs doing, poetic appeals help get it done, and, therefore, are themselves to be applauded.
We need to keep our efforts to expose counterproductive causes and unjustified appeals from stomping on tasks that genuinely need doing. You need both sides of it—the willingness to turn away from counterproductive causes, and the willingness to praise productive ones; the strength to be unswayed by ungrounded appeals, and the strength to be swayed by grounded ones.
I think the synagogue at their annual appeal had it right, really. They weren’t going down row by row and putting individuals on the spot, staring at them and saying, “How much will you donate, Mr. Schwartz?” People simply announced their pledges—not with grand drama and pride, just simple announcements—and that encouraged others to do the same. Those who had nothing to give stayed silent; those who had objections chose some later or earlier time to voice them. That’s probably about the way things should be in a sane human community—taking into account that people often have trouble getting as motivated as they wish they were, and can be helped by social encouragement to overcome this weakness of will.
But even if you disagree with that part, then let us say that both supporting and countersupporting opinions should have been publicly voiced. Supporters being faced by an apparently solid wall of objections and disagreements—even if it resulted from their own uncomfortable self-censorship—is not group rationality. It is the mere mirror image of what Dark Side groups do to keep their followers. Reversed stupidity is not intelligence.