The Singularity Institute’s budget grows much faster than linearly with cash. … sunk all its income into triple-rollover lottery tickets
I had the same idea of buying very risky investments. Intuitively, it seems that world-saving probability is superlinear in cash. But I think that the intuition is probably incorrect, though I’ll have to rethink now that someone else has had it.
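To make the superlinearity intuition concrete, here is a minimal sketch with entirely hypothetical numbers and a made-up convex p_save function (nothing here reflects actual SIAI figures): if world-saving probability really is convex in cash, a high-variance ticket can beat holding the cash even at a negative expected dollar value.

```python
# Hypothetical illustration of the superlinearity intuition.
# All numbers and the shape of p_save are assumptions for this sketch only.

def p_save(cash):
    # Assumed convex "world-saving probability": quadratic in cash, capped at 1.
    return min(1.0, (cash / 1e9) ** 2)

ticket_price = 1.0   # dollars spent on the ticket
jackpot = 100e6      # triple-rollover jackpot
p_win = 0.8e-8       # win probability, so the ticket loses 20% in expected dollars

# Option A: keep the dollar as cash.
ev_hold = p_save(ticket_price)

# Option B: buy the ticket; world-saving probability averaged over outcomes.
ev_ticket = p_win * p_save(jackpot) + (1 - p_win) * p_save(0.0)

print(f"hold cash : {ev_hold:.3e}")   # ~1e-18
print(f"buy ticket: {ev_ticket:.3e}") # ~8e-11
# With this convex p_save, the ticket wins despite the negative expected dollar value.
```

Whether that conclusion survives depends entirely on whether p_save is actually convex over the relevant range, which is the part of the intuition I now doubt.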
Another advantage of buying triple rollover tickets is that if you adhere to quantum immortality plus the belief that uFAI reliably kills the world, then you’ll win the lottery in all the worlds that you care about.
If you had such an attitude, then the lottery is irrelevant. You don’t care what the ‘world-saving probability’ is, so you don’t need to manipulate it.
Yes, but you can manipulate whether the world getting saved had anything to do with you, and you can influence what kind of world you survive into.
If you make a low-probability, high-reward bet and really commit to donating the money to an x-risk organization, you may find yourself winning that bet more often than you would probabilistically expect.
In general, QI means that you care about the nature of your survival, but not whether you survive.