Well, curing cancer might be more important than finding a cure for the common cold, but that doesn’t necessarily mean you should be trying to cure cancer instead of trying to get rid of the common cold, unless of course you have some inner quality that makes you uniquely capable of curing cancer. There are other considerations.
Reducing existential risk is important. But suppose it is not as important as ending world poverty. There’s also a lot of uncertainty: it may be that no matter how hard we try, something will come out of the blue and kill us all (three hours from now). Still, if you are the only one doing something about existential risk, and you are capable of reducing it even a tiny bit, your work is very valuable.
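To make that concrete, here is a toy expected-value calculation. The numbers are made up purely for illustration; the point is only that a huge stake multiplied by a tiny probability shift can still be a large number.

```python
# Toy Fermi estimate: the value of a tiny reduction in extinction risk.
# Both inputs below are illustrative assumptions, not real estimates.

future_lives_at_stake = 1e16  # assumed number of future lives if we survive
risk_reduction = 1e-9         # assumed absolute drop in extinction probability
                              # attributable to your work alone

expected_lives_saved = future_lives_at_stake * risk_reduction
print(f"Expected lives saved: {expected_lives_saved:,.0f}")
# Expected lives saved: 10,000,000
```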
The thing is, outside a few communities like this one, hardly anyone really cares about existential risk (even global warming is a political issue for most people rather than a scientific one, and other existential risks are treated as movie plots where blue-collar oil drillers fly to space and blow up asteroids).
I don’t think that everyone working on x-risk should quit x-risk. I also don’t think that no one should go into x-risk. Obviously, we need some people working on x-risk, even if only for value-of-information considerations.
~
Still, if you are the only one doing something about existential risk, and you are capable of reducing it even a tiny bit, your work is very valuable.
How would you know if you’re capable of reducing it a tiny bit?