There’s something I’ve seen some rationalists strive for, which I think Eliezer might be aiming at here: to be a truly robust agent.
I really like this phrasing. Previously, I’d only had a vague sense that there are ways to have much more integrity, such that many situations get a huge utility bump, and that this can only be achieved through very clear thinking. Your comment helped make that a lot more concrete for me.