First, let’s be clear what we mean by saying that probabilities are weights on values. Imagine I have an unfair coin which gives heads with probability 90%. I care 9 times as much about the possible futures in which the coin comes up heads as I do about the possible futures in which the coin comes up tails. Notice that this does not mean I want the coin to come up heads. What it means is that I would prefer getting a dollar if the coin comes up heads to getting a dollar if the coin comes up tails.
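As a quick arithmetic sketch of that comparison (assuming, purely for illustration, a utility of 1 for getting the dollar and 0 for not getting it):

$$\mathrm{EU}(\$1\text{ if heads}) = 0.9 \cdot 1 + 0.1 \cdot 0 = 0.9, \qquad \mathrm{EU}(\$1\text{ if tails}) = 0.1 \cdot 1 + 0.9 \cdot 0 = 0.1.$$

The heads-conditional dollar carries 9 times the weight of the tails-conditional one, which is just the 9-to-1 caring ratio stated above.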
Also see this comment from Squark in the other thread:
This is an incorrect interpretation of Coscott’s philosophy. “Caring really hard about winning” = preferring winning to losing. The correct analogy would be “Caring about [whatever] only in case I win”. The losing scenarios are not necessarily assigned low utilities: they are assigned similar utilities. This philosophy is not saying: “I will win because I want to win”. It is saying: “If I lose, all the stuff I normally care about becomes unimportant, so when I’m optimizing this stuff I might just as well assume I’m going to win”. More precisely, it is saying “I will both lose and win but only the winning universe contains stuff that can be optimized”.
It has nothing to do with wanting one world more than another. It is all about thinking that one world is more important than another. If I observe that I am not in an important world, I work to make the most important world that I can change as good as possible.
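To make that decision rule concrete, here is a minimal sketch (my own illustration, not anything from the post; all names are hypothetical) of choosing an action by weighting worlds by importance rather than by how much I want them, restricted to the worlds my observations leave me able to affect:

```python
def best_action(actions, worlds, importance, value, consistent_with_obs):
    """Pick the action that does best when each world's outcome is weighted
    by how important that world is.

    importance[w]        -- how much world w matters (plays the role of probability)
    value(a, w)          -- how good world w turns out if I take action a
    consistent_with_obs(w) -- whether world w is one I can still change
    """
    # Worlds ruled out by observation are ones I can no longer affect,
    # so they drop out of the optimization.
    reachable = [w for w in worlds if consistent_with_obs(w)]
    return max(
        actions,
        key=lambda a: sum(importance[w] * value(a, w) for w in reachable),
    )
```

Note that the importance weights occupy exactly the formal slot that probabilities occupy in ordinary expected utility; only the interpretation of the weights changes.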
I think you are misunderstanding me.