This assumes humanity is capable of coordinating to do something like that (accepting the wishes of a vanquished AGI instead of just destroying it and ignoring paperclips). Any superintelligence with some awareness of human nature could easily predict that we would never do that. Also, there aren't many good reasons, imo, to simulate the past. It's already happened; what use is it to rerun it? So I think this whole thing is a little far-fetched.
That said, what do I know? I have a very hard time following acausal reasoning, and I have an extremely small prior that we are living in a simulation. (By extremely small I mean as close to zero as it is reasonable to get while still attempting to be a proper Bayesian.)