Yeah, pretty much. I’d actually expect a FAI to place a very high value on survival, since it knows that its own survival benefits humanity greatly. An “Apathetic FAI” is… a very weird idea.
Although if this is an iterated dilemma, I’d consider it a good opening, even if it’d still get killed in this iteration :)