If it turns out that the techniques we advocate predictably lose, even though we thought they were reasonable, even though they came from our best mathematical investigation into what a rational agent should do, then we will conclude that those techniques are not actually rational, and we should figure out something else.

From Newcomb’s Problem and Regret of Rationality:

Don’t mistake me, and think that I’m talking about the Hollywood Rationality stereotype that rationalists should be selfish or shortsighted. If your utility function has a term in it for others, then win their happiness. If your utility function has a term in it for a million years hence, then win the eon.

But at any rate, WIN. Don’t lose reasonably, WIN.