For my prediction, like those of others, I basically just took my AGI timeline, multiplied it by 50% (representing my uncertainty about how dangerous AGI is; if I thought a lot more about it, the number could go up to 90% or down to 10%), and then added a small background risk rate from everything else combined (nuclear war, bio stuff, etc.).
I didn't spend long on this, so my distribution probably isn't an exact reflection of my views, but it's mostly right.
Note that I’m using a definition of existential catastrophe where the date it happens is the date it becomes too late to stop it happening, not the date when the last human dies.
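For concreteness, here's a minimal sketch of the kind of calculation I mean. All the numbers and the placeholder AGI timeline below are illustrative, not my actual estimates:

```python
# Rough sketch: P(existential catastrophe by year t)
#   ≈ P(AGI by t) * P(catastrophe | AGI) + accumulated background risk.
# All numbers here are illustrative placeholders, not my actual estimates.

P_CATASTROPHE_GIVEN_AGI = 0.5    # the ~50% factor (could plausibly be 10%-90%)
BACKGROUND_ANNUAL_RISK = 0.001   # nuclear war, bio stuff, etc., per year

# Placeholder AGI timeline: cumulative probability of AGI by each year.
agi_cdf = {2030: 0.2, 2040: 0.4, 2050: 0.6, 2100: 0.8}

def p_catastrophe_by(year, start_year=2021):
    p_agi = agi_cdf.get(year, 0.0)
    # Constant annual background hazard, compounded over elapsed years.
    p_background = 1 - (1 - BACKGROUND_ANNUAL_RISK) ** (year - start_year)
    # For small probabilities, simple addition is a fine approximation.
    return p_agi * P_CATASTROPHE_GIVEN_AGI + p_background

for y in sorted(agi_cdf):
    print(y, round(p_catastrophe_by(y), 3))
```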
For some reason I can’t drag-and-drop images into here; when I do it just opens up a new window.
(Just a heads up that the link leads back to this thread, rather than to your Elicit snapshot :) )
Oops, thanks!