Thanks! I think your tag of @avturchin didn’t work, so just pinging them here to see if they think I missed important and probable scenarios.
Taking the Doomsday argument seriously, the “Futures without AGI because we go extinct in another way” and the “Futures with AGI in which we die” seem most probable. In futures with conscious AGI agents, it will depend a lot on how experience gets sampled (e.g. one agent vs many).