As an aside, if your goal is earning to give, I think it currently looks unwise to go into distributed systems or cryptography rather than AI
True, though the other two are much less likely to cause an increase in existential risk. I would pretty strongly prefer we don’t randomly send people who are interested in helping the world to work on the single most dangerous technology that humanity is developing, without a plan for how to make it go better. My current guess is that it’s worth it to just buy out marginal people, especially people with some interest in AGI, from working in AI at all, so going into AI seems pretty net-negative to me, even if you make a good amount of money.
(I think one could make an argument that general non-AGI-related AI work isn’t that bad for the world, but my sense is that a lot of the highest-paying AI jobs have a pretty substantial impact on AI capabilities, either by directly involving research, or by significantly increasing the amount of commercialization and therefore funding and talent that goes into AI. I bet there are some AI jobs you can find that won’t make any substantial difference to AGI, so if you keep that in mind and are pretty conservative with regard to improving commercialization of AI technologies or attracting more talent to the field, you might be fine.)