More parents might let their toddler get hit by a car if they could fix the toddler afterwards.
There are an awful lot of types of Buddhism. Some allow mind annihilation, and some even claim it should be our goal. Some strains of Epicureanism hold that mind annihilation is a) neutral, and b) better than what all the religions believed in. Some ancient religions seemed to believe in the same awful universal fate as quantum immortality believers do, e.g. eternal degeneration, progressively advanced Alzheimer’s forever, more or less. Adam Smith suggests that this is what most people secretly believe in.
It would take quite a black swan tech to undo all the good from tech up to this point. UFAI probably wouldn’t pass the test, since without tech humans would go extinct with a smaller total population of lives lived anyway. Hell worlds seem unlikely. 1984 or Brave New World (roughly) are a bit more likely, but are they worse than extinction? I don’t generally feel that way, though I’m not sure.
Eight years late reply, but oh well.
I think one of the problems with UFAI isn’t just human extinction, or even future human suffering. It’s that some kinds of UFAI (the paperclip maximizer comes to mind) could take over our entire future light cone, preventing any future intelligent life (Earth-originating or otherwise) from evolving and finding a better path.