Presuming it’s not entirely rhetorical, that sounds more than a little overblown. I’d buy “foolish” or “dangerous”, but this seems pretty ubiquitous and generally doesn’t lead to more than the usual amount of disaster. In particular, I hardly think this is unique to nerds or uniquely horrible in their hands; best I can tell, pretty much everyone is under the impression that they’re substantially free of ideological bias, whether they wear a blue collar or a pocket protector, and their attitude toward ideological foes is very likely to be informed by that.
With regard to the OP, I think I broadly accept the theory that technically minded folks are less inclined than average to tolerate fuzziness or internal contradiction in systems, and that this tends to attract them to totalizing systems in the absence of suitable countervailing influences: a set which, unfortunately, includes quite a lot of fundamentalist nastiness.
best I can tell, pretty much everyone is under the impression that they’re substantially free of ideological bias, whether they wear a blue collar or a pocket protector
In far mode most people think in terms of good and evil first, correct and incorrect second. They might think that their enemies are evil mutants, but most sense, underneath it all, that their enemies still have their own unique truth (evil mutant truth). This leads to hatred and aggression, but it’s less bad than an impersonal, clinical, mechanistic approach.
The people I’m so afraid of are the ones who look for some “objective position” first and simply feel that they’re technically correct in the Engineering Challenge of Life, while others are “making mistakes”. Thinking that you’re fixing others’ mistakes all day (like the mistake of allowing Jews to “contaminate” a nation) promotes a much more simplified picture of the world than thinking you’re opposing a dread and cunning evil, as Catholics do.
In far mode most people think in terms of good and evil first, correct and incorrect second. They might think that their enemies are evil mutants, but most sense that their enemies still have their own unique truth (evil mutant truth). This leads to hatred and aggression, but it’s less bad than an impersonal, clinical, mechanistic approach.
I agree with the first sentence, but not with the second. Good and evil, for most people, implies correct and incorrect—ideological enemies are both wrong and evil, and they’re wrong because they’re evil. Also evil because they’re wrong, if you back them into a corner on that one. Christian conceptions of sin are tied pretty closely to correctness, for example—the etymology implies “missing the mark”.
I’m honestly not sure that unemotional, subjectively “objective” hatred exists in neurotypical folks, human psychology being what it is. I’ve gotten pretty angry at software bugs before.
Might be mind projection on my part, true. However, it genuinely looks to me as though many people do feel this way, for example about the trolley problem: the math might say it’s more “correct” to end up with four net lives saved (five spared at the cost of one), yet pulling the lever still feels like an “evil” act to them. They’d say a solution can be the only technically correct one and still less moral than the alternatives.