I don’t trust humanity to make it through the invention of nuclear weapons again, so let’s not go back too far. Within the last few decades, though, you could try a reroll on the alignment problem. Collect a selection of safety papers, and try to excise hints at such facts as “throwing enough money at simple known architectures produces AGI”. Wait to jump back until waiting longer carries a bigger risk of surprise UFAI than it’s worth, or until the local intel agency knocks on your door asking about your time machine. You could build a reverse box: a Faraday bunker that sends you back the moment it’s breached, leaving only a communication channel for new papers, X-risk alerts, and UFAI hackers; some UFAIs may not care enough whether you make it out of their timeline. Balance acquiring researchers’ recognition codes against the threat of other people starting to take the possibility of time travel seriously.