Nah, I’m fine with replacing “goals” with “terminal values” in my argument.
I still see no law of nature or logic that would prevent an AI from changing its terminal values as it develops.