Of course, it’s all a matter of degree: some people channel their love to pets alone, some to partners and pets but not children, and so on. I was simplifying.
I don’t think this affects the high-level points I’m making: widespread AI partners will have a rather catastrophic effect on society, unless we bet on a relatively quick transformation into even weirder societal states, with AGIs as full members of society (including as romantic partners), BCI, mind uploads, Chalmers’ experience machines, etc.
However, AI partners don’t appear net positive without assuming all these downstream changes, and there would be no problem with introducing AI partners only once these downstream advances become available. (There is a counterargument that there is some benefit to letting society “adjust” to new arrangements, but it doesn’t hold in this context, given the expected net negativity of that adjustment and perhaps even a “nuclear energy effect” from bad first experiences.) Therefore, introducing future civilisational transformations into the argument doesn’t bail out AI partners as permissible businesses, as of 2023.