why suffer a huge utility hit to preserve a blackbox which, at its best, is still much worse than your best, and at its worst could be truly astronomically dreadful?
the reason i disagree with this is “killing people is bad”; i.e. i care more about satisfying the values of currently existing moral patients than those of potential alternate moral patients, and those values can include “continuing to exist”. so if possible, even at some reasonable compute-waste factor, i’d want moral patients currently existing past event horizons to have their values satisfied.
as for the blackbox and universal abhorrence thing, i think that argument smuggles in two assumptions: “civilizations will tend to have roughly similar values”, and “a civilization’s fate (such as being in an HEC without decryption keys) can be taken as representative of most of its inhabitants’ wills, if not all”. that latter assumption in particular is undercut by the current expected fate of our own civilization (getting clipped).