There is no such excuse as ‘but I did everything I was supposed to do’.
Huh?
Imagine a lottery with a $500 prize, 100 tickets sold for a dollar each. The rational thing to do is buy every ticket you can. But you get to the sales office too late, and one ticket has already been sold. You buy the remainder, but don’t win the lottery. You ended up losing money, but you did everything right, didn’t you?
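To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch, using only the numbers from the example above (the variable names are just illustrative):

```python
# Expected value of buying the 99 remaining tickets in the example above.
prize = 500          # dollars
tickets_total = 100
ticket_price = 1     # dollar
tickets_bought = 99  # one ticket was already sold, you buy the rest

cost = tickets_bought * ticket_price       # $99
p_win = tickets_bought / tickets_total     # 0.99
expected_profit = p_win * prize - cost     # 0.99 * 500 - 99 = $396

print(f"cost = ${cost}, P(win) = {p_win}, expected profit = ${expected_profit:.2f}")
# Expected profit is +$396, so buying is clearly the right decision --
# even though 1% of the time you walk away $99 poorer, as in the story.
```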
Well, rationalists should end up “winning” insofar as winning means “doing better than non-rationalists ON AVERAGE”.
Then again, it doesn’t mean all rationalists end up living to 120 and extremely rich. If you are a non-rationalist born with a billion dollars in your bank account, you’ll probably end up richer than a rationalist born into a poor family in North Korea with no legs and no arms.
But on the other hand, if you cannot trace the causes of your defeats to factors completely independent of yourself, it probably means you are doing something wrong, or at least not optimally.
In the lottery example above, there are 99 other worlds where the rationalist who bought the tickets is better off than the man who did not (unless the lottery is rigged, in which case the rationalist is the one who realises something smells funny and doesn’t buy tickets). Or more intuitively, if there are a lot of such lotteries, the rationalist who buys the tickets every time will end up richer than the man who doesn’t.
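To put a number on “a lot of such lotteries”, here is a quick Monte Carlo sketch under the same assumptions (the 10,000 repetitions are an arbitrary choice, not anything from the original example):

```python
# Repeat the lottery many times: the buyer takes the 99 remaining tickets
# every time, the abstainer never plays. Numbers match the example above.
import random

random.seed(0)
prize, ticket_price, tickets_bought, tickets_total = 500, 1, 99, 100
n_lotteries = 10_000

buyer_balance = 0
for _ in range(n_lotteries):
    buyer_balance -= tickets_bought * ticket_price
    if random.random() < tickets_bought / tickets_total:  # wins with p = 0.99
        buyer_balance += prize

abstainer_balance = 0  # never plays, so never gains or loses anything

print(f"buyer after {n_lotteries:,} lotteries:     ${buyer_balance:,}")
print(f"abstainer after {n_lotteries:,} lotteries: ${abstainer_balance:,}")
# The buyer ends up around 10,000 * $396 ahead despite losing ~1% of the draws.
```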
IN YOUR LIFE, there are probably enough such “lotteries” for you to end up better off if you are a rationalist than if you are not, and reliably so.
(And “you did everything right”? Maybe the right thing to do would have been to arrive at the sales office earlier.)
On the other hand, you should expect such a loss to happen only 1% of the time on average, so if you’re consistently unlucky for a long period of time, odds are you’re doing something wrong.
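If you want the arithmetic behind “consistently unlucky probably means something is wrong”: assuming the lotteries are independent and each one is lost only 1% of the time, a streak of losses gets implausible very fast.

```python
# Probability of losing k independent lotteries in a row when each
# individual loss has probability 0.01 (same 99/100 setup as above).
p_lose = 0.01
for streak in (1, 2, 3, 5):
    print(f"P(losing {streak} in a row) = {p_lose ** streak:.0e}")
# Two losses in a row is already a 1-in-10,000 event; at that point
# "the lottery is rigged" or "I mis-modelled it" beats "I'm just unlucky".
```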
It is not possible, even in principle, to do better in that scenario. It is possible in principle to do better than, say, two-boxing on Newcomb’s problem, even though a CDT agent always two-boxes.
If I randomly get hit by a meteor, there isn’t a lot I could have done to avoid it. If I willingly drive faster than the speed limit and get myself killed in an accident, there isn’t much of an excuse: I could have stayed within the limit and survived.