Here are some arguments that seem good to me against entering the field, depending on what you count as the field.
We may live in a world where AI safety is either easy or almost impossible to solve. In either case it may be better to work on, e.g., global coordination or the rationality of leaders.
It may be that the “near-term” issues with AI will transform the world in a profound way, or are big enough to pose catastrophic risks; given their shorter timelines and better tractability, they would then be higher priority. (For example, you can imagine technological unemployment, addictive narrow-AI-aided VR environments, and the decay of shared epistemology combining to unravel society. Or narrow AI accelerating biorisk.)
It may be that useful work on reducing AI risk requires rare talent, judgment calibrated in particular ways, etc., and that the many people who want to enter the field will mostly harm it, because the people who should start working on it will be drowned out by the noise created by the larger mass.
(Note: I do not endorse these arguments. Also, they do not answer the part about worrying.)