If short timelines advocates were seeking out people with personalities that predisposed them toward apocalyptic terror, would you find it similarly unobjectionable? My guess is no. It seems to me that a neutral observer who didn’t care about any of the object-level arguments would say that seeking out high-psychoticism people is more analogous to seeking out high-apocalypticism people than it is to seeking out programmers, transhumanists, reductionists, or people who think machine learning / deep learning are important.
The way I can make sense of seeking high-psychoticism people being morally equivalent to seeking high-IQ systematizers is if I drain any normative valence from “psychotic” and imagine a spectrum running from autistic to psychotic. On this spectrum, the extreme autistic is exclusively focused on exactly one thing at a time and is incapable of cognition that has to take context into account, especially context they aren’t already primed to have in mind, while the extreme psychotic can only see the globally interconnected context where everything means/is connected to everything else. Obviously neither extreme state is desirable, but leaning one way or the other could be very helpful in different contexts.
See also: indexicality.
On the other hand, back in my reflective beliefs, I think psychosis is a much scarier failure mode than “autism” on this scale, and I would not personally pursue any actions that pushed people toward it without, among other things, some kind of supporting infrastructure for processing the psychotic state without losing the plot (social or cultural infrastructure would work, but whatever).
I wouldn’t find it objectionable. I’m not really sure what morally relevant distinction is being pointed at here; apocalyptic beliefs might make the inferential distance to specific apocalyptic hypotheses lower.
Well, I don’t think it’s obviously objectionable, and I’d have trouble putting my finger on the exact criterion for objectionability we should be using here. Something like “we’d all be better off in the presence of a norm against encouraging people to think in ways that might be valid in the particular case where we’re talking to them, but whose appeal comes from emotional predispositions we sought out in them that aren’t generally truth-tracking or good for them” seems plausible to me. But I think it’s not as obviously unobjectionable as Zack seemed to be suggesting in his last few sentences, which was what moved me to comment.
I don’t have well-formed thoughts on this topic, but one factor that seems relevant to me has a core that might be verbalized as “susceptibility to invalid methods of persuasion”, which seems notably higher in the case of people with high “apocalypticism” than people with the other attributes described in the grandparent. (A similar argument applies in the case of people with high “psychoticism”.)
That might be relevant in some cases, but it seems unobjectionable in both the psychoticism case and the apocalypse case. I would predict that LW people cluster together in personality measurements like OCEAN and Eysenck; it’s by default easier to write for people of a personality similar to your own. Also, people notice high rates of Asperger’s-like characteristics around here, which are correlated with Jewish ethnicity and transgenderism (both also frequent around here).