Mm. No, still not quite clear. I mean, I agree that all of us not dying is better than all of us dying (I guess… it’s actually more complicated than that, but I don’t think it matters), but that seems beside the point.
Suppose I endorse the New World Order the AI is going to create (nobody dies, etc.), and I’m given a choice between starting the New World Order at time T1 or at a later time T2.
In general, I’d prefer it start at T1. Why not? Waiting seems pointless at best, if not actively harmful.
I can imagine situations where I’d prefer it start at T2, I guess. For example, if the expected value of my making further improvements on the AI before I turn it on is high enough, I might prefer to wait. Or if by some coincidence all the people I value are going to live past T2 regardless of the NWO, and all the people I anti-value are going to die on or before T2, then the world would be better if the NWO begins at T2 than at T1. (I’m not sure whether I’d actually choose that, but I guess I agree that I ought to, in the same way that I ought to prefer that the AI extrapolate my values rather than all of humanity’s.)
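To make the “wait and improve” case concrete, here’s a toy expected-value comparison. Everything in it is made up for illustration (the numbers, the `ev_launch` helper, the single-number value of a life); it’s just a sketch of the tradeoff, not a real calculation:

```python
# Toy model: when is launching the NWO at T2 better than at T1?
# All quantities here are hypothetical, chosen only to illustrate the tradeoff.

def ev_launch(value_if_success, p_success, deaths_while_waiting, value_per_life):
    """Expected value of launching: the success payoff weighted by its
    probability, minus the cost of everyone who dies while we wait."""
    return p_success * value_if_success - deaths_while_waiting * value_per_life

# Launch at T1: the current AI, and no one dies waiting.
ev_t1 = ev_launch(value_if_success=1e6, p_success=0.90,
                  deaths_while_waiting=0, value_per_life=1.0)

# Launch at T2: further improvements raise the success probability,
# but people die in the interim.
ev_t2 = ev_launch(value_if_success=1e6, p_success=0.99,
                  deaths_while_waiting=50_000, value_per_life=1.0)

print(f"EV(T1) = {ev_t1:,.0f}, EV(T2) = {ev_t2:,.0f}")
# EV(T1) = 900,000, EV(T2) = 940,000: waiting wins here only because
# the probability gain outweighs the interim deaths; shrink the gain
# and T1 dominates again.
```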
But either way, it doesn’t seem to matter at what point I’m given that choice. If I would choose T1 over T2 at T1, then if I create a time-traveling AI at T2 and it gives me that choice, it seems I should choose T1 over T2 at T2 as well. If I would not choose T1 over T2 at T2, it’s not clear to me why I’m endorsing the NWO at all.
Don’t disagree. You must have caught the comment that I took down five seconds later when I realized the specific falsehood I rejected was intended as the ‘Q’ in a modus tollens.