What’s wrong with embracing foreign cultures, uploadings, upliftings, and so on?
Maybe I am biased by my personal history, having embraced what, as far as I can tell, is the very cutting edge of Western Culture (i.e. the less-wrong brand of secular humanism), and feeling rather impatient for my origin cultures to follow a similar path, which they violently resist. Maybe I’ve got a huge blind spot of some other sort.
But when the Superhappies demand that we let them eradicate suffering forever, or when CelestAI offers us each our own personal paradise on the sole condition that it be pony-flavoured, I don’t just feel like enthusiastically jumping in and abandoning all caution. I feel like it’s a moral imperative to take them up on their offer, and that getting in their way is a crime potentially on the same level as genocide or mass torture.
Yet in both of the stories these examples come from, and in the authors’ own commentary, this is treated as a Bad Thing… but I don’t recall coming across an explanation that would satisfy me as to why.
Again, please warn me if I’m mixing things up; my purpose is to correct any flaws my stance may have by consulting minds that I expect understand the problem better than I do, and that might see the flaws in how I frame it.
The thing about the Superhappies is that people want to be able to be sad in certain situations. It’s like Huxley’s Brave New World: people are “happier” in that society, but they’ve sacrificed something fundamental to being human in order to achieve that happiness. (Personally, I think refusing to wait the eight hours it would take to evacuate the system was the wrong call: the gap between the “compromise” the Superhappies are offering and humanity’s actual values, combined with the very real possibility that the Superhappies would indeed take more than eight hours to return in force, just doesn’t seem big enough to justify not waiting.)
And as for the story with CelestAI in it, as far as I can tell, what it’s doing might not be perfect, but it’s close enough not to matter… at least, as long as we don’t have to start worrying about the ethics of what it might do if it encounters aliens.
Well, that is quite horrific. Poor non-humanlike alien minds...
I don’t think the Superhappies’ plan was anything like Huxley’s Brave New World (which is about numbing people into docility). Saying pain should be maintained reminds me of that analogy Yudkowsky made about a world where people get truncheoned over the head daily, can’t help it, and keep making up reasons why getting truncheoned is full of benefits; but if you ask someone outside that culture whether they’d like to start getting truncheoned in exchange for all those wonderful benefits...