As a thought experiment this is interesting, and I’m sure informative, but there is one crucial thing the post neglects to examine: whether the inscription under the hood actually reads “humanity maximizer.” The post gives the impression that this is already established.
But has anybody established, or even stopped to consider, whether avoiding the loss of 10^46 potential lives per century is really what we value? If so, I see no evidence of it here, and I see no reason even to suspect that enabling that many lives in the distant future carries anything like proportional value for us.
Does Nick Bostrom believe that sentient beings living worthwhile lives in the future are the ultimate value structures? If so, whose values does he have in mind, ours or theirs? If theirs, then he is chasing something non-existent: they cannot reach back in time to us, so there can be no social contract with them.
No matter how clever we are at devising ways to maximize our colonization of the universe, if that is not actually what we desire, then it isn’t rational to pursue it. Surely we must decide what we want before arguing over the best way to get it.