People here sometimes say that a rational agent should never change its terminal values.
That’s simply mistaken. There are well-known cases where it is rational to change your “terminal” values.
Consider what might happen if you meet another agent of similar power but with different values: reaching a stable bargain may require each side to adopt a compromise between the two value systems. For more, look into “vicarious selection”, or read Steve Omohundro.