Sweet! I’d lost a copy of my exact syllogistic wording.
Damien_R._S.
I hate that Planck quote. It’s full of “truthiness”. I think it is in fact falsified by the histories of relativity, quantum mechanics, and continental drift/plate tectonics. I’m pretty confident about the last; for the first two I’m mostly trusting Hofstadter’s class lectures.
The story has problems, and it’s not clear how it’s meant to be taken.
Way 1: we should believe the SAI, since it is an SAI, and accept that everyone will in fact be happier within a week. This creates cognitive dissonance: the scenario seems flawed to us, yet we’re put in the position of rejecting a scenario that would make us happier.
Way 2: we should trust our reason and evaluate the scenario on its own merits. This creates the cognitive dissonance of the SAI being really stupid. Yeah, being immortal and having a nice companion, good life support, and protection is good, but it’s a failed utopia because it’s trivially improvable. The fridge logic is strong in this one, and much has been pointed out already: gays, opposite-sex friends, family. More specific than family: children. What happened to the five-year-olds in this scenario?
The AI was apparently programmed by a man who had no close female friends, no children, and was not close to his mother. Otherwise the idea that either catgirls or Belldandies should lead to a natural separation of the sexes would not have occurred to him. (Is the moral that such people should not be allowed to define gods? Duh.) If I had a catgirl/non-sentient sexbot, that would not make me spend less time with true female friends, or stop calling my mother (were she still alive). A catgirl doesn’t play Settlers of Catan or D&D or talk about politics. A Belldandy might, in the sense that finding a perfect mate often leads to spending less time with friends, but that still needn’t mean being happy to have them cut off, or being unreceptive to meeting new friends of either sex.
So yeah, it’s a pretty bad utopia, defensible only in the “hey, at least no one’s dying or physically starving” way. But it’s implausibly bad, because it could be so much better with less work: immortalize people on Earth, angelnet Earth, give people the option of summoning an Idealized Companion. Your AI went to more effort for less result, and wouldn’t have followed this path if it had consulted remotely normal people. (Where are the children?)