Imagine a planet with aliens living on it. Some of those aliens are having what we would consider morally valuable experiences. Some are suffering a lot. Suppose we now find that their planet has been vaporized. By tuning the relative amounts of joy and suffering, we can make it so that the vaporization is exactly neutral under our morality. This feels like a big deal, even if the aliens were in an alternate reality that we could watch but not affect.
Our intuitive feeling of impact is a proxy for how much something affects our values and our ability to achieve them. You can set up contrived situations where an event doesn't actually affect our ability to achieve our values, but still triggers the proxy.
Would the technical definition you are looking for be value of information? That is, feeling something to be impactful would mean that a bunch of mental heuristics think it has a large value of information.
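For concreteness, here is the standard expected-value-of-information calculation in a toy decision problem (a sketch only; all names, probabilities, and payoffs below are made up for illustration):

```python
# Toy value-of-information calculation with made-up numbers.
# VOI = expected utility of acting optimally *after* seeing the signal,
# minus expected utility of the best single action chosen *before* it.

prior = {"planet_intact": 0.5, "planet_vaporized": 0.5}

# utility[action][state] -- hypothetical payoffs
utility = {
    "travel_there": {"planet_intact": 10, "planet_vaporized": -5},
    "stay_home":    {"planet_intact": 0,  "planet_vaporized": 0},
}

def expected_utility(action, belief):
    return sum(p * utility[action][state] for state, p in belief.items())

# Best we can do without learning which state obtains.
eu_without_info = max(expected_utility(a, prior) for a in utility)

# With a perfect signal we pick the best action separately for each state,
# weighting by how likely that state was a priori.
eu_with_info = sum(
    p * max(utility[a][state] for a in utility) for state, p in prior.items()
)

print("VOI of learning the planet's fate:", eu_with_info - eu_without_info)
# -> 2.5 here; the news matters exactly insofar as it would change your plans.
```

On this reading, the news feels impactful to the extent that the heuristics expect it to change what you would do.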
Can you elaborate on the situation further? I'm not sure I follow where the proxy comes apart, but I'm interested in hearing more.
An alien planet contains joy and suffering in a ratio that makes them exactly cancel out according to your morality. You are exactly indifferent to the alien planet blowing up. The alien planet can't be changed by your actions, so when you find out that it blew up, you don't need to cancel any plans to go there and reduce the suffering; say the aliens existed long ago. In general, we are setting up the situation so that the planet blowing up changes neither your expected utility nor the best action for you to take. We set this up by a pile of contrivances. This still feels impactful.
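A minimal sketch of that pile of contrivances, with made-up numbers (everything in the snippet is purely illustrative):

```python
# Contrived setup (hypothetical numbers): the planet's joy and suffering are
# tuned to cancel exactly, and none of your actions can affect the planet,
# so learning that it blew up leaves both your expected utility and your
# best action unchanged.

joy, suffering = 100, -100
planet_value_intact = joy + suffering   # 0 by construction
planet_value_vaporized = 0              # nothing left there to value

# Payoffs of the actions actually available to you, none of which reach the planet.
actions = {"work_on_project_a": 7, "work_on_project_b": 3}

for label, planet_value in [("before the news", planet_value_intact),
                            ("after the news", planet_value_vaporized)]:
    best = max(actions, key=lambda a: actions[a] + planet_value)
    print(label, "-> best action:", best, "| total utility:", actions[best] + planet_value)

# Both lines come out identical, so a value-of-information heuristic reports
# zero impact -- and yet the vaporization still feels like a big deal.
```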
That doesn't feel at all impactful to me, under those assumptions. It feels like I've learned a new fact about the world, which isn't the same feeling. ETA: Another example of this was mentioned by Slider: if you're a taxi driver indifferent between different destinations, and the client announces where they want to go, it feels like you've learned something, but it doesn't feel impactful (in the way I'm trying to point at).
I think an issue we might run into here is that I don’t exist in your mind, and I’ve tried to extensionally define for you what I’m pointing at. So if you try to find edge cases according to your understanding of exactly which emotion I’m pointing to, then you’ll probably be able to, and it could be difficult for me to clarify without access to your emotions. That said, I’m still happy to try, and I welcome this exploration of how what I’ve claimed lines up with others’ experiences.