If I conducted a sequence of experiments that could kill me with high probability and remained alive, I would become pretty convinced that some form of MWI is right, but I would not expect my survival to convince you of this.
Probably, but I’m having trouble thinking of this experiment as scientifically useful if you cannot convince anyone else of your findings. Maybe there is a way to gather some statistics from so-called “miracle survival stories” and see if there is an excess that can be attributed to the MWI, but I doubt that there is such an excess to begin with.
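As a rough illustration of the kind of check described above (not something from the original exchange): one could compare the number of survivals in a collection of documented near-fatal incidents against what an assumed baseline per-incident survival rate predicts, for example with a one-sided binomial test. All of the counts and the baseline rate below are made-up assumptions for the sketch.

```python
# Illustrative sketch only: all counts and rates below are invented for the
# example; real "miracle survival" data would be needed to run this seriously.
from scipy.stats import binomtest

n_incidents = 10_000             # assumed number of documented near-fatal incidents
n_survivals = 160                # assumed number of survivals among them
expected_survival_rate = 0.015   # assumed baseline per-incident survival probability

# One-sided test: is the observed survival count larger than the baseline predicts?
result = binomtest(n_survivals, n_incidents, p=expected_survival_rate,
                   alternative="greater")

print(f"observed survival rate: {n_survivals / n_incidents:.4f}")
print(f"p-value for an excess over the baseline: {result.pvalue:.3g}")
```

Even a statistically significant excess in such a test would be hard to pin on the MWI specifically, since a misestimated baseline or selection effects in which stories get reported could produce the same signal.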
Presumably Eliezer wants to maximize the wave-function mass associated with humanity surviving.
Why? The only ones that matter are those where he survives.
This seems like a pretty controversial ethical position. I disagree and I’m pretty sure Eliezer does as well. To analogize, I’m pretty confident that I won’t be alive a thousand years from now, but I wouldn’t be indifferent about actions that would lead to the extinction of all life at that time.
I’m pretty confident that I won’t be alive a thousand years from now, but I wouldn’t be ambivalent about actions that would lead to the extinction of all life at that time.
Indifferent. Ambivalent means, more or less, that you have reasons for wanting it either way as opposed to not caring at all.
If they don’t matter to you, that still doesn’t necessitate that they don’t matter to him. Each person’s utility function may care about whatever it pleases.
Only if he doesn’t care about anyone else at all. This doesn’t seem likely.
Well, presumably he would be neither ambivalent nor indifferent about performing or not performing those actions.
Thanks. Corrected.