Lies involving ‘good’ consequences are heavily dependent upon your utility function. If you define utility in a way that lets your cult membership come out net-positive, then sure, you might get a happily-ever-after cult future. Whether or not this indicates a flaw in your utility function is a matter of personal choice; rationality cannot tell you what to protect.
That said, we are dealing with Omega, who is serious about those optimals. This really is a falsehood with optimal net long-term utility for you. It might be something like a false belief about lottery odds, which leads to you spending the next couple of years wasting large sums of money on lottery tickets… only to win a jackpot of hundreds of millions of dollars and retire young, able to donate huge sums to the charities you consider important. You don’t know what the falsehood will be, but it is, by definition, the best thing that could possibly happen to you as the result of believing a lie, as you define ‘best thing’.
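For concreteness, here’s a rough back-of-the-envelope sketch of why that scenario is so strange: under ordinary circumstances the deluded lottery player comes out behind in expectation, and it’s only Omega’s foreknowledge that flips it. The ticket price, odds, jackpot, and spending rate below are made-up illustrative numbers, not anything from the thought experiment itself.

```python
# Rough expected-value sketch of the "deluded lottery player" scenario.
# All figures (ticket price, odds, jackpot, spending rate) are illustrative
# assumptions, not numbers from the discussion above.

ticket_price = 2.0             # dollars per ticket
odds_per_ticket = 1 / 300e6    # roughly jackpot-lottery-scale odds
jackpot = 300e6                # dollars
tickets_per_year = 1000        # a heavy player, deluded about the odds
years = 2

tickets = tickets_per_year * years
spend = tickets * ticket_price

# Normal case: expected winnings are (odds per ticket) * jackpot * tickets.
expected_winnings = tickets * odds_per_ticket * jackpot

print(f"Spent over two years:       {spend:,.0f}")
print(f"Expected winnings:          {expected_winnings:,.0f}")
print(f"Expected net (normal case): {expected_winnings - spend:,.0f}")  # negative

# Omega's case: it already knows you hit the jackpot, so the 'expected'
# value collapses to the certain outcome.
print(f"Net if Omega knows you win: {jackpot - spend:,.0f}")
```

With those made-up numbers the normal expectation is a loss of a couple thousand dollars; the only thing that rescues the delusion is Omega already knowing how the draw comes out.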
If that’s what you meant, then the choice is really between “best thing in life” and “worst thing in life”; whatever belief leads you there is of little consequence. Say the truth option leads to an erudite you eradicating all present, past, and future sentient life, and the falsehood option leads to an ignorant you stumbling upon the nirvana-space that grants everyone infinite super-intelligent bliss and Dr. Manhattan-like superpowers (ironically enough):
What you believed is of little consequence to the resulting state of the verse(s).
I’d say that this is too optimistic. Omega checks the future, and if it turns out you would eventually win the lottery were you to start playing, then deluding you about lotteries might be a good strategy. But for most people that Omega talks to, this wouldn’t work.
It’s possible that the number of falsehoods with one-in-a-million odds of helping you far exceeds a million, in which case it’s very likely that Omega (being omniscient) can choose one that turns out to be helpful. But it’s more interesting to ask whether there are falsehoods with at least a reasonably large probability of helping you.
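As a rough sanity check on that first point (treating the falsehoods’ chances as independent, which is an extra assumption added purely for illustration): if each falsehood helps with probability p and Omega has N of them to choose from, the chance that at least one turns out helpful is 1 − (1 − p)^N, which for p = one in a million is already about 63% at a million candidates and effectively certain at a hundred million.

```python
# Chance that at least one of N candidate falsehoods turns out helpful,
# assuming (purely for illustration) that each independently helps with
# one-in-a-million probability.
p = 1e-6

for n in (10**6, 10**7, 10**8):
    at_least_one = 1 - (1 - p) ** n
    print(f"N = {n:>11,}: P(at least one helps) ≈ {at_least_one:.6f}")
```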
True; being deluded about lotteries is unlikely to have positive consequences normally. So unless something weird is going to happen in the future (e.g., the lottery machine’s random-number function predictably malfunctioning at some expected time, producing a predictable set of numbers, which Omega then imposes on your consciousness as being ‘lucky’), that’s not a belief with positive long-term consequences. That’s not an impossible set of circumstances, but it is an easy-to-specify one, so in terms of discussing ‘a false belief which would be long-term beneficial’, it leaps readily to mind.
I didn’t have any other good examples on tap when I originally conceived of the idea, but come to think of it...
Truth: A scientific formula, seemingly trivial at first, but whose consequences, when investigated, lead to some terrible disaster, like the sun going nova. Oops.