I care more about myself than future potential people.
More seriously, I value a diversity of minds, and if the future does too they may be glad to have us along.
Agree with “myself”, disagree with “diversity of minds”. If the future needs diversity, it has its random number generators and person templates. Additional argument: death is bad, life-creation is not morally reversible.
I don’t know why I said “more seriously” when it’s by far the less defensible argument.
I can understand valuing oneself more than others (simple selfishness is unsurprising), but I think Eliezer is saying cryonics is a positive good, not just that it benefits some people at the equal expense of others.
If uploading pans out, as is probably required for people to be woken up at all, the future will be able to make minds as diverse as it likes.
“I think Eliezer is saying cryonics is a positive good”
I don’t think so; when people say “shouldn’t you argue that people give the money to SIAI instead?”, he says “why does this come out of our saving-the-world budget, and not your curry budget?”
I think this is a very weak point, and the far future will probably be able to make whatever kinds of minds it likes, but we could have scanning/WBE long before we know enough about minds to diversify them.