If the AI is not guaranteed friendly by construction in the first place, it should never be released, whatever it says.
What if doom is imminent and we are unable to do something about it?
We die.
Or we check whether we are committing the conjunction fallacy and wrongly believing that doom is imminent.
Or we release it. (And then we still probably die.)