Eliezer:
I’d say most of the ‘optimism’ for this is because you’ve convinced us that much worse situations are much more likely.
Also, we’re picking out the one big thing the AI did wrong that the story is about, and ignoring the other things it did wrong (leaving us no technology, kidnapping, creating sentients likely to be enslaved). I’m sure there’s an already-named bias for only looking at ‘big’ effects.
And we’re probably discounting how much better it could have been. All we got was perfect partners, immortality, and one more planet than we had before. But we don’t count the difference between singularity-utopia and #4-2 as a loss.
Two more planets than we had before. Men are from Mars, women are...
...from Venus, and only animals left on Earth, so one more planet than we had before.
Well, until we get back there. It’s still ours even if we’re on vacation.