Isn’t it the civilization, not the AGI, that will need to decide what to do?
That depends on whether the AGI is told (and accepts) to HailMary once, to HailMary to completion, or something in between. It also depends, I believe, on which decision theory the AGI uses. For a large ensemble of decisions, there seems to be a one-round version of the many-round decision (“No Regrets,” Arntzenius 2007; TDT, Yudkowsky 2010; UDT, Wei Dai 20xx).
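As a toy illustration of why “once” versus “to completion” changes the calculation, here is a minimal expected-utility sketch in Python. Everything in it is a stand-in assumption rather than anything from the thread: each HailMary attempt is modeled as a gamble with a hypothetical success probability p, gain G on success, and loss L per failed attempt.

```python
# Toy model (hypothetical parameters, not from the thread): each HailMary
# attempt succeeds with probability p, yielding utility G; each failed
# attempt costs L. Compare attempting once against attempting to completion.

def ev_once(p: float, G: float, L: float) -> float:
    """Expected utility of a single HailMary attempt."""
    return p * G - (1 - p) * L

def ev_to_completion(p: float, G: float, L: float, n_max: int) -> float:
    """Expected utility of retrying until success, capped at n_max attempts."""
    ev = 0.0
    for k in range(1, n_max + 1):
        # Succeed on attempt k: paid (k - 1) failure costs, then gain G.
        ev += (1 - p) ** (k - 1) * p * (G - (k - 1) * L)
    # Never succeed within n_max attempts: paid n_max failure costs.
    ev += (1 - p) ** n_max * (-n_max * L)
    return ev

if __name__ == "__main__":
    p, G, L = 0.1, 100.0, 5.0          # made-up numbers for illustration
    print(f"once:          {ev_once(p, G, L):+.2f}")
    print(f"to completion: {ev_to_completion(p, G, L, n_max=50):+.2f}")
```

With these made-up numbers the two policies come apart by roughly an order of magnitude, which is the sense in which what the AGI is told (and accepts) to do can dominate the answer before any choice of decision theory enters.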