That was a surprisingly good summary of Roko’s basilisk. Thanks for the link.
In case anyone’s wondering, here’s the standard answer I give to people who are unsure whether to worry about the basilisk: the AI won’t adopt the awful strategy if adopting it hurts the AI overall rather than helping it, and that is something you can affect by (conditionally) refusing to donate. If you precommit not to give in to the threat, punishing you gains the AI nothing and only costs it, so it has no reason to adopt the strategy in the first place. Of course this answer doesn’t come with a guarantee of correctness, but feel free to accept it if it works for you.
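If it helps to see the shape of the argument, here is a minimal sketch in Python with made-up numbers. The function name `blackmail_expected_value` and all of its parameters are illustrative stand-ins for whatever the AI would actually be weighing, not a model anyone has actually proposed.

```python
# Illustrative sketch of the incentive argument above (made-up numbers).
# The AI weighs the expected value of adopting the punishment strategy;
# your conditional refusal pushes the "gives in" probability toward zero.

def blackmail_expected_value(p_give_in: float,
                             donation_value: float,
                             punishment_cost: float) -> float:
    """Expected value to the AI of adopting the punishment strategy.

    p_give_in: chance a threatened person donates because of the threat.
    donation_value: what a coerced donation is worth to the AI.
    punishment_cost: what carrying out the punishment costs the AI
        (resources, or just being the kind of agent people refuse to deal with).
    """
    return p_give_in * donation_value - (1 - p_give_in) * punishment_cost

# If people can be scared into donating, the strategy pays:
print(blackmail_expected_value(p_give_in=0.9, donation_value=100, punishment_cost=10))  # 89.0 > 0

# If (conditionally) refusing to donate is the norm, it doesn't:
print(blackmail_expected_value(p_give_in=0.0, donation_value=100, punishment_cost=10))  # -10.0 < 0
```

The point isn't the particular numbers, just that the sign of the expected value depends on how people respond to the threat, which is the part you control.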