I tried to work through the story again, and I noticed a perspective on it that I don’t think I caught on my first read-through. To start off with:
A: Almost everyone is viciously upset and wants the AI’s death, very, very quickly.
B: Even the AI is well aware that it failed.
C: 89.8% of the human species necessarily includes people who aren’t even CLOSE to any kind of romantic/decision-making age (at least according to http://populationpyramid.net/ ).
D: Yet the AI has to have failed so horribly that the implied statistic is that almost every human being remaining alive who is capable of expressing the thought “I want you dead” wants it dead.
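(To make the arithmetic explicit, using rough figures of the sort that site shows: if something like a quarter of the world’s population is under 15, then even if literally every person 15 and over wished the AI dead, that only gets you to about 100% − 25% = 75%, well short of 89.8%. Hitting 89.8% means either that a sizable share of young children also want it dead, or that the figure only counts people old enough to express the wish.)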
Now if the AI actually did something like this:
1: Terraform Mars and Venus
2: Relocate all Heterosexual Cisgender Adult Males to Mars, boost health.
3: Relocate all Heterosexual Cisgender Adult Females to Venus, boost health.
4: Make Complementary Partners on Mars and Venus.
Then the story seems to imply, but never says:
5a: A large number of minor children have been abandoned to their deaths, since any remaining adults still on Earth can’t possibly take care of 100% of the remaining minor children in the wake of the massive societal disruption of being left behind. Oh, and neither the remaining adults nor the children get boosted health, either. So, all those people in #2 and #3? You’ll probably outlive your minor children even if they DID survive, and you get no say in it.
5b: Everyone in 5a was just killed very fast, possibly by being teleported to the Moon.
Either of those might be horrible enough to explain such a monumentally bad approval rating and a near-total death wish arising so quickly. But little else would.
I love my wife, A LOT, but I don’t think the two of us being moved to separate planets, forcibly given an amicable divorce, and compensated for not being able to see several of our family members for years would make me start hurling death wishes at the only thing that could hypothetically reverse the situation and that obviously has an enormous amount of power. And even if it did, applying that to 89.8% of people doesn’t seem likely. I think a lot of them would spend much more time just being in shock until they got used to it.
On the other hand, if you kill my baby nephew, my young cousins, and a shitload of other people, then I can EASILY see myself hurling death wishes at you, whether or not I really mean them, and hitting 89.8% feels much more likely.
If there are no implied deaths, then it seems like a vast portion of humanity is being excruciatingly dumb and reactionary for no reason, much like Stephen Grass, unless Stephen Grass DID realize the implied deaths and that’s why he vomited when he did.
This seems to be sort of left up to the reader, since all Yudkowsky said in http://lesswrong.com/lw/xu/failed_utopia_42/qia was:
Indeed. It’s not clear from the story what happened to them, not to mention everyone who isn’t heterosexual. Maybe they’re on a moon somewhere?
Whether or not that moon has been Terraformed/Paradised or is still a death trap makes a rather huge difference. Although, I may just be reading too much into a plot hole, since he also said, elsewhere in the thread (http://lesswrong.com/lw/xu/failed_utopia_42/t4d):
I assume that the children were forcibly separated from their families and placed with people (or “people”) who will be “better” for them in the long run.
That may have been the case (since it is unclear in the story), but from my perspective that still doesn’t seem bad enough to cause a near species-wide death rage, particularly since if the children are still alive, they might count for AI voting rights as a member of the human species. It seems the AI would have to have done something currently almost universally regarded as utterly horrible and beyond the pale.
There are a lot of possible alternatives, though. One further alternative: all of Earth’s children were sold to the Baby Eating Aliens (http://lesswrong.com/lw/y5/the_babyeating_aliens_18/) for terraforming technology.
particularly since if the children are still alive, they might count for AI voting rights as a member of the human species.
In the short run I imagine the kids are quite upset about being separated from their families and being told they’ll never see them again. I don’t have, or work around, kids, so I don’t know whether this would translate into wishing the AI dead, but it feels plausible-ish.