I think it’s worth noting that truly unlimited power means being able to undo anything. But is it wrong to rewind when things go south? If you rewind far enough, you’ll be erasing lives and conjuring up new, different ones. Is rewinding back to before an AI explodes into a zillion copies morally equivalent to destroying them in this direction of time? Unlimited power is unlimited ability to direct the future. Are the lives on every path you don’t choose “on your shoulders,” so to speak?
I’m pretty sure that “rewinding” is different from choosing now not to create lives.
I often think about a rewound reality, where the only difference is the data in my brain… and the biggest problem I have with this is all the people that are born after the time I’d go back to that I don’t want to unmake.
Of course, my attention span is terrible, so I never follow one of these scenarios long or thoroughly enough to simulate how I’d try to avert such issues… and then chaos theory would screw it up in spite of all that. The point is that I concur.
A superintelligent AI doesn’t have truly unlimited power. It can’t even violate the laws of physics, let alone morality. If your moral system says that death is inherently bad, then undoing the creation of a child is bad.
It does seem intuitively right to say that killing something already existing is worse than not creating it in the first place.
(Though, formalizing this intuition is murder. Literally.)
No, murder requires that you kill someone (there are extra moral judgements necessary, but the killing itself is rather unambiguous).
I read that quote as saying “if you formalize this intuition, you wind up with the definition of murder”. While not entirely true, that statement does meet the “kill” requirement.
… it is?