Invoking the conjunction fallacy looks disingenuous when the conjuncts have strong dependencies.
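(A minimal sketch of why, using made-up numbers: by the product rule, P(A ∧ B) = P(A) · P(B|A), so a conjunction is only heavily penalized relative to its conjuncts when those conjuncts are close to independent. When one conjunct nearly implies the other, adding it costs almost nothing.)

```python
# Minimal sketch with made-up probabilities: how much a conjunction
# "costs" depends on the dependency between its conjuncts.
# Product rule: P(A and B) = P(A) * P(B | A).

p_a = 0.2  # hypothetical P(A)

# Nearly independent conjuncts: P(B|A) is small, so the conjunction
# is much less probable than A alone.
p_b_given_a = 0.2
print(p_a * p_b_given_a)   # ~0.04

# Strongly dependent conjuncts: A nearly implies B, so the conjunction
# is barely less probable than A alone.
p_b_given_a = 0.95
print(p_a * p_b_given_a)   # ~0.19
```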
I am sick of being accused of being disingenuous, of using dark arts, and of countless other things, like asking “rhetorical questions”. Are people really so incapable of seeing that I might simply lack the necessary training? It is the conclusion that everything I say must therefore be wrong that drives me to use emotionally loaded language.
All those accusations rather look incredibly sad. As if those people are just pissed off that someone tried to criticize their most cherished ideas but don’t know what to say, other than to ridicule the opponent for his inexperience.
Sorry, I didn’t mean to ridicule you. I’m not annoyed by the fact that you’re criticizing, if I’m annoyed at all (does ‘someone is WRONG on the Internet!’ syndrome count as annoyance?). I wasn’t bothered by your criticisms of SI when you started posting them. But since then you’ve kept going at it, repeating the same arguments over and over again.
You’re trying to create something out of nothing here. The currently available arguments about intelligence explosion are simple. There’s no deep math in them (and that’s a problem for sure, but it cuts both ways: SIAI doesn’t have a mathy model of intelligence explosion, and you don’t have mathy arguments that recursive self-improvement will run into fundamental limitations).
People are moved by those arguments to various extents. And that’s it; we’re done until someone comes up with a novel insight that sheds additional light on the issue. Until then, people won’t change their minds by being exposed to the same arguments, even if they come with brand-new rhetorical packaging, a heretofore unseen decomposition into bullet points, and a sprinkling of as-yet-unseen cool quotations.
People will change their minds by being exposed to new background knowledge that isn’t directly about intelligence explosion but causes them to see the existing arguments in a new light; the Sequences are a likely example of that. They will also change their minds for epistemologically insane reasons, like social pressure. Both of those factors are hard to affect, and writing posts on LessWrong seems like one of the worst ways to go about it.
No one likes being told the same thing over and over again in an insistent tone of voice. If you do that, people will get frustrated and want to criticize you. And if you give in to your intuitive feeling that you only need to rephrase a little, and that this time they will surely see the light, then you will eventually rephrase your way into bullshit and give those frustrated people ample opportunity to poke holes in your arguments.
I am sick of being accused of being disingenuous, of using dark arts, and of countless other things, like asking “rhetorical questions”.
Using somewhat different language, this is exactly what you declare about yourself. Those things which you describe so casually as your own preferred behaviors are seen by those with a LessWrong mindset as disingenuousness and the abuse of the dark arts. That isn’t necessarily a bad thing (you’d fit right in at Mensa, for example, aside from the entry requirement, I suppose); it just isn’t received well on LessWrong.
That isn’t necessarily a bad thing (you’d fit right in at Mensa, for example, aside from the entry requirement, I suppose); it just isn’t received well on LessWrong.
Was that some sort of a dig at Mensa, or XiXiDu, or both? I know next to nothing about Mensa, so I feel like I’m missing the context here… Aren’t they just a bunch of guys who solve IQ tests as a hobby?
Neither; more of a mild compliment, combined with an acknowledgement that the LessWrong way is not the only way, or even a particularly common one.
Those things which you describe so casually as your own preferred behaviors are seen by those with a LessWrong mindset as disingenuousness and the abuse of the dark arts.
It is not dark arts if you are honest about what you are doing.
What I am often doing is exploring a viewpoint by taking the position of someone who is emotionally attached to it and convinced of it. I also use an opponent’s arguments against him when doing so shows that they cut both ways. I don’t see why that would be a problem, especially since I have always admitted that I am doing it. See, for example, this comment from 2010.
It is not dark arts if you are honest about what you are doing.
That’s absolutely false. The terror management theory researchers, for example, found that mortality salience still kicks in even when you tell people in advance that you’re going to expose them to something meant to provoke their sense of their own mortality.
EDIT: The paper I wanted to cite is still paywalled, afaik, but the relevant references are mostly linked in this section of the Wikipedia article. The relevant study is the one where the threat was writing about one’s feelings on death.
It is not dark arts if you are honest about what you are doing.
That’s absolutely false.
Okay. I may have mistakenly assumed that the only way I could get answers was to challenge people directly and emotionally. I didn’t expect that I could simply ask how people associated with SI/LW could possibly believe what they believe and actually get answers. I tried, but it didn’t work.