And with that in mind, how would it have affected the sanity waterline if Tony had donated that $135 to an institution that’s pursuing the improvement of human rationality?
Look, sometimes you’ve just got to do things because they’re awesome.
But would you feel comfortable with that maxim encoded in an AI’s utility function?
For a sufficiently rigorous definition of “awesome”, why not?
If it's a terminal value, then CEV should converge to it.