You should weight the importance of your choices more heavily.
This doesn’t mean the future doesn’t matter, but it does make favoring it a much less obvious choice.
Suppose you do something that has a chance of saving the world, and suppose there have been 100 billion (10^11) people so far. The expected amount of good you’d do is ∫ k/n dn = k ln(n_2/n_1), integrating from n_1 = 10^11 up to some cap n_2 on the total number of people. If there will be fewer than 200 billion people in total, that’s k ln 2. If there will be fewer than 2·10^40, that’s k ln(2·10^29). The latter works out to about 100 times the former. That seems like a lot, but differences between charities tend to be measured in orders of magnitude anyway.
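To make the arithmetic concrete, here’s a quick check of those numbers (a minimal sketch; k is left out since it cancels in the ratio, and the caps 2·10^11 and 2·10^40 are just the ones used above):

```python
import math

# The weight is proportional to the integral of k/n dn from n1 (people
# born so far) to n2 (cap on total people), i.e. k * ln(n2 / n1).
# k cancels when comparing the two cases, so it is omitted here.
n1 = 1e11                        # ~100 billion people so far

near_cap = math.log(2e11 / n1)   # cap of 200 billion -> ln 2        ~= 0.69
far_cap  = math.log(2e40 / n1)   # cap of 2*10^40     -> ln(2*10^29) ~= 67.5

print(near_cap, far_cap, far_cap / near_cap)   # ratio ~= 97, i.e. roughly 100x
```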
I’m not sure how good a value 10^40 is, but I think the exponent is right to within a factor of two, and since the result only depends on the logarithm of the cap, the predicted value should be within roughly that factor as well.