The thing about the Superhappies is that, well, people want to be able to be sad in certain situations. It’s like Huxley’s Brave New World—people are “happier” in that society, but they’ve sacrificed something fundamental to being human in the course of achieving that happiness. (Personally, I don’t think refusing to wait the eight hours it would take to evacuate the system is the right call: the gap between the “compromise” the Superhappies are offering and humanity’s actual values, weighed against the very real possibility that the Superhappies will indeed take more than eight hours to return in force, just doesn’t seem large enough to justify not waiting.)
And as for the story with CelestAI in it, as far as I can tell, what it’s doing might not be perfect but it’s close enough not to matter… at least, as long as we don’t have to start worrying about the ethics of what it might do if it encounters aliens.
Well, that is quite horrific. Poor non-humanlike alien minds...
I don’t think the Superhappies’ plan was anything like Huxley’s BNW (which is about numbing people into docility). Saying pain should be maintained reminds me of that analogy Yudkowsky made about a world where people get truncheoned over the head daily, can’t avoid it, and keep making up reasons why getting truncheoned is full of benefits; but if you ask someone outside that culture whether they’d like to start getting truncheoned in exchange for all those wonderful benefits...