I think that you need to look at the generators of the instruction to go west. For example:

"Travel west. I want you to maximize the total westward distance you travel." Getting on a westward-orbiting space station would be really good.

"Travel west. My utility is linear in your longitude." You need to figure out where the longitude scale cuts the map and move to just east of that line.

"Travel west. There is a pot of gold a few miles west and I want you to be rich." Walking a few miles west is all that is called for.

The information needed to distinguish between these interpretations is not contained in the request to travel west. You need to look at why you were asked to travel west.
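To make the divergence concrete, here is a rough sketch (Python; the utility functions, plans, and numbers are hypothetical illustrations I made up, not anything from the conversation) of how three different generators behind "travel west" all endorse a westward step while disagreeing about the optimal plan:

```python
GOLD_LONGITUDE = -5.0  # assumption: the pot of gold sits a few degrees west of the start


def wrap(lon):
    """Map an unwrapped longitude onto the standard [-180, 180) range."""
    return (lon + 180.0) % 360.0 - 180.0


def total_westward_distance(path):
    """Generator 1: reward every westward degree ever travelled (orbiting forever wins)."""
    return sum(max(0.0, a - b) for a, b in zip(path, path[1:]))


def linear_in_longitude(path):
    """Generator 2: utility linear in final longitude (assumed slope: more negative is
    better), so the optimum is to stop just east of wherever the map is cut."""
    return -wrap(path[-1])


def reach_the_gold(path):
    """Generator 3: all that matters is ending up at the pot of gold."""
    return 1.0 if abs(path[-1] - GOLD_LONGITUDE) < 0.1 else 0.0


# Three candidate plans, expressed as sequences of (unwrapped) longitudes in degrees.
plans = {
    "small step west": [0.0, -5.0],                                # walk a few miles west and stop
    "endless orbit": [0.0] + [-360.0 * k for k in range(1, 100)],  # keep circling the globe westward
    "stop at the cut": [0.0, -179.9],                              # stop just east of the -180 line
}

for label, utility in [("total westward distance", total_westward_distance),
                       ("linear in longitude", linear_in_longitude),
                       ("pot of gold", reach_the_gold)]:
    best = max(plans, key=lambda name: utility(plans[name]))
    print(f"{label:>24}: best plan is '{best}'")
```

All three generators agree that the first westward step is good; they only come apart once the agent is capable enough to optimize the generator rather than the surface instruction.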
So it’s important to note that the “dissolving” process also generally involves discarding a portion of our values, those that don’t fit neatly on the new map we have.
I don’t think that those values are being discarded; I think they are being broken down into more basic parts.
Yes, but I’d argue that most moral preferences are similarly underdefined when the various interpretations behind them come apart (e.g. purity).