[Please read the OP before voting. Special voting rules apply.]
The dangers of UFAI are minimal.
Do you think that it is unlikely for a UFAI to be created, that if a UFAI is created it will not be dangerous, or both?
I think humans will become sufficiently powerful, before any UFAI is created, that UFAI will not represent a threat to them.
“Dangers” being defined as probability times disutility, right?
With the caveat that I’m treating unbounded negative utility as invalid, sure.
Please do elaborate!