I’m curious, do you also think that a singleton is a desirable outcome? It’s possible my thinking is biased because I view this outcome as a dystopia and so underestimate its probability due to motivated cognition.
Funny you should mention it; that’s exactly what I was thinking. I have a friend (also named Matt, incidentally) who I strongly believe is guilty of motivated cognition about the desirability of a singleton AI (he thinks it is likely, and is therefore biased toward thinking it would be good), so I leaped naturally to the ad hominem attack you level against yourself. :-)
I’m curious, do you also think that a singleton is a desirable outcome? It’s possible my thinking is biased because I view this outcome as a dystopia and so underestimate its probability due to motivated cognition.
Most of them, no. Some, yes. Particularly since the alternative is the inevitable loss of everything that is valuable to me in the universe.
This is incredibly tangential, but I was talking to a friend earlier and I realized how difficult it is to instill in someone the desire for altruism. Her reasoning was basically, “Yeah… I feel like I should care about cancer, and I do care a little, but honestly, I don’t really care.” This sort of off-hand egoism is something I wasn’t used to; most smart people try to rationalize selfishness with crazy beliefs. But it’s hard to argue with “I just don’t care” other than to say “I bet you will have wanted to have cared”, which is grammatically horrible and a pretty terrible argument.
I guess I’m playing the game right then :)
I respect blatant apathy a whole hell of a lot more than masked apathy, which is how I would qualify the average person’s altruism.