So you might reason, “I’m doing martial arts for the exercise and self-defense benefits… but I could purchase both of those things for less time investment by jogging to work and carrying Mace.” If you listened to your emotional reaction to that proposal, however, you might notice that you still feel sad about giving up martial arts even if you were getting the same amount of exercise and self-defense benefit some other way.
Which probably means you’ve got other reasons for doing martial arts that you haven’t yet explicitly acknowledged—for example, maybe you just think it’s cool. If so, that’s important, and deserves a place in your decision-making. Listening for those emotional cues that your explicit reasoning has missed something is a crucial step in that decision-making process.
This is a great example of how human value is complicated. Optimizing for stated or obvious values can miss unstated or subtler values. Before we can figure out how to get what we want, we have to know what we want. I’m glad CFAR is taking this into account.
I’ve been wondering whether utilitarians should be more explicit about what they’re screening off. For example, trying to maximize QALYs might mean doing less to support your own social network.