For example, if I get smarter, would I stop loving my family because I applied too much optimization pressure to my own values? I think not.
This seems more likely to me than you might imagine. Not certain, nor even highly probable, but probable enough that you should take it into consideration.