I don’t agree with your reasoning for the misalignment.
If everyone did nothing, no group would lose out proportionally. So the groups are fine with this, because it is relative proportion that is selected for.
What? Individuals and (some) groups definitely care about more than relative position. Most care about relative position AS WELL, but they also care about their own existence and absolute-valued satisfaction. This does not invalidate your thesis, which is that the median human doesn’t put much effort into x-risk avoidance. But assuming that this is for selection or tragedy-of-the-commons reasons is just wrong. It’s for much more prosaic reasons of scope insensitivity and construal-level issues (near-mode actions vs. far-mode values).
I’m not saying LessWrong are the only misaligned ones, although we might be more so than others. I’m saying any group that wants humanity to survive is misaligned with respect to the optimization process that created that group.
Luckily, at least a little bit of this misalignment is common! I’m just pointing out that we were never optimized for this; the only reason humans care about humanity as a whole is that our society isn’t the optimum of the optimization process that created it. And it’s not random either; surviving is an instrumental value that any optimization process has to deal with when creating intelligences.