I feel like this could be related to a problem of rationalists, and people in general, binarizing too much: either there’s extreme action 1 or extreme action 2, with no middle ground. And in general, this relates to one intuitive reason to be skeptical of extreme utopia or dystopia, even if my models say otherwise: we underestimate the chances of a middle-ground situation.
There’s also the problem of ignoring non-extreme outcomes, even though they require less optimization power than extreme outcomes and so should be easier to reach.
Hmm, I’m not sure that’s the right way to look at it, because the way I would have seen those three scenarios at the time would be:
Doing something quick and drastic is best
Not doing anything is bad
Doing this weird middle ground thing is worst
So it’s not that I gravitated toward expecting either the best or the worst; if anything, I wasn’t pessimistic enough!