To me, the Prime Directive approach makes more sense.
As an example outside of sci-fi, if you see an abusive husband and a brainwashed battered wife, the Prime Directive tells you to ignore the whole situation, because they both think it’s more or less okay that way. Would you accept this consequence?
Would it make a moral difference if the husband and wife were members of a different culture; if they were humans living on a different planet; or if they belonged to a different sapient species?
The idea behind the PD is that, for foreign enough cultures:
you can’t predict the consequences of your intervention with reasonable certainty;
you can’t trust your moral instincts to guide you to the “right” thing;
and the space of favorable outcomes is likely much smaller than the space of possible outcomes, as in the literal-genie case;
so, more likely than not, you end up acting like a UFAI.
Hence non-intervention has a higher expected utility than an intervention based on your personal deontology or virtue ethics. This is not true for sufficiently well-analyzed cases, like abuse in your own society. The farther you stray from known territory, the greater the chance that your intervention will be a net negative. Human history is rife with examples of this.
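To make the expected-utility comparison concrete, here is a toy sketch in Python with entirely made-up numbers; it is not a real model of intervention, it only shows why shrinking the favorable slice of outcome space (and losing the ability to aim at it) flips the sign of the calculation.

```python
# Toy expected-value sketch of the argument above. All numbers are invented
# for illustration; only the shape of the comparison matters.

def expected_utility(p_good: float, u_good: float, u_bad: float) -> float:
    """Expected utility of intervening when a favorable outcome has probability p_good."""
    return p_good * u_good + (1 - p_good) * u_bad

u_status_quo = 0.0  # baseline: leave the situation alone

# In your own, well-understood society you can aim at the favorable outcomes:
print(expected_utility(p_good=0.8, u_good=10, u_bad=-10))   # 6.0 > 0 -> intervene

# In a sufficiently alien culture the favorable region is a tiny slice of
# outcome space, and your moral instincts don't help you hit it:
print(expected_utility(p_good=0.05, u_good=10, u_bad=-10))  # -9.0 < 0 -> keep out
```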
So, unless you can do a full consequentialist analysis of applying your morals to an alien culture, keep the hell out.