On a personal level, it seems quite unlikely that any individual can meaningfully alter the risk of an existential catastrophe enough for their own efforts to be justified selfishly.
I think this depends a lot on 1) time discounting and 2) whether you think there will be anything like impact certificates or rewards for helping in the future. That is, it may be that increasing the chance of a positive singularity by one in a million is worth more than your natural lifespan in EV terms (while, of course, mattering very little under most discount rates). And if you think the continued existence of Earth is currently worth something like $2 quadrillion (annual world GDP × 20), you can increase the probability of survival by a millionth, and you'll be compensated something like a thousandth of the value you provided, then you're looking at roughly $2M in present value.
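The back-of-envelope arithmetic above can be checked in a few lines. All figures here are the commenter's illustrative assumptions (world GDP of ~$100T, a 20× multiplier, a one-in-a-million probability shift, a one-in-a-thousand compensation share), not independent estimates:

```python
# Sketch of the EV calculation in the comment, using its stated assumptions.
world_gdp = 100e12           # assumed annual world GDP, ~$100 trillion
earth_value = world_gdp * 20 # "annual world GDP * 20" -> ~$2 quadrillion

delta_p = 1e-6               # assumed increase in probability of survival
comp_share = 1e-3            # assumed fraction of value you are compensated

value_created = earth_value * delta_p   # ~$2 billion in EV terms
payout = value_created * comp_share     # ~$2 million in present value

print(f"value created: ${value_created:,.0f}")
print(f"personal payout: ${payout:,.0f}")
```

Note that the conclusion scales linearly in each assumption, so an order-of-magnitude change in any one input (e.g. a 10⁻⁷ probability shift instead of 10⁻⁶) shifts the payout by the same factor.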