I agree that charisma is important, but within the EA movement in particular, you won’t get far without intelligence. Intelligence is a necessary qualifier for leadership in EA, in other words.
I have a pretty strong confidence level on that one. I’m ready to make a $100 bet that if you ask EA people whether intelligence is a necessary qualifier for leadership in the EA movement, 9 out of 10 will say yes. Want to take me up on this?
Within the current EA movement, or within the EA movement you propose to create by filling your ranks with people who don’t share your culture or values?
Within the EA movement currently, but I disagree with the second presumption. So are you taking that bet?
I don’t think you understand how bets stake out certainty: an even $100-against-$100 bet implies your confidence that you won’t destroy the EA movement is only ~50%.
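The implied-probability arithmetic in that comment can be sketched as follows. This is an illustrative aside rather than part of the thread, and the function name is my own: the break-even probability of a bet is the bettor's stake divided by the total pot, so an even-money bet is only rational for someone who believes the event is at least ~50% likely.

```python
def break_even_probability(my_stake: float, opponent_stake: float) -> float:
    """Probability at which a bet has zero expected value for the bettor.

    I risk `my_stake` to win `opponent_stake`; expected value is
    p * opponent_stake - (1 - p) * my_stake, which is zero when
    p = my_stake / (my_stake + opponent_stake).
    """
    return my_stake / (my_stake + opponent_stake)

# Even money, $100 vs $100: the bettor needs more than 50% confidence
# for the bet to have positive expected value.
print(break_even_probability(100, 100))  # 0.5

# By contrast, someone claiming "9 out of 10" confidence could rationally
# offer up to $900 against $100 at fair odds.
print(break_even_probability(900, 100))  # 0.9
```

Under this framing, an even-stakes offer signals a confidence floor of about 50%, not the ~90% confidence the original "9 out of 10" claim asserts.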
The bet is not on the question of whether he destroys the EA movement but about whether people say intelligence is important.
And he’ll find little disagreement from me about what the current group of EA members will say. That says nothing at all about what the group of EA members *after* a successful advertising campaign aimed at substantially increasing membership will say, at which point the EA movement is, by my view, effectively destroyed, even if it doesn’t know it yet.
My certainty is against your claim of emotionally-oriented people becoming EA leaders, not the destruction of the EA movement. Please avoid shifting goalposts :-) So are you taking the bet, or taking back your claim?
My unwillingness to accept a strawman in place of the positions I have actually stated does not constitute shifting goalposts. But that’s irrelevant, compared to the larger mistake you’re making, in trying to utilize this technique.
A lesson in Dark Arts: I am Nobody. I could delete this account right now, start over from scratch, and lose nothing but some karma points I don’t care about. You, however, are a Somebody. Your account is linked to your identity. Anybody who cares to know who you are can know who you are. Anybody who already knows who you are can find out what you say here.
As a Somebody, you have credibility. As a Nobody, I have none. So in a war of discrediting—you discredit me, I discredit you—I lose nothing. What do you lose?
Your identity gives you credibility, but it also gives you something to lose. My lack of identity means any credit I gain or lose here is meaningless, but it also means I have nothing to lose. That means our credibility disparity is precisely mirrored by a power disparity: one in your favor, the other in mine. But the credibility disparity lasts only until you let yourself be mired in credibility-destroying tactics.
You really shouldn’t engage anybody, much less me, in a Dark Arts competition. Indeed, it’s vaguely foolish of you to have admitted to the practice of Dark Arts in the first place.
I agree that I have something significant to lose, as my account is tied to my public identity.
However, I do not share your belief that my having acknowledged engaging in Dark Arts is foolish. I am comfortable being publicly identified as someone willing to use light forms of Dark Arts, the kind that Less Wrongers generally do not perceive as crossing into real Dark Arts, to promote rationality and Effective Altruism. In fact, I explored this question in an earlier Less Wrong discussion post. I want to be open and transparent, to help myself and Intentional Insights make good decisions and update our beliefs.
I think you go wrong if you assume that “emotionally-oriented” people are automatically stupid.
I agree that emotionally-oriented people are not automatically stupid; my point was about what EAs value. If an emotionally-oriented person also happens to be intelligent, that of course has certain benefits for the EA movement.
A person who cares about playing status games might be intelligent but still harmful to the EA movement.
I have a strong probabilistic estimate that there are currently a substantial number of people in the EA movement who care about status games. I’m willing to take a bet on that.