I guess it’s a legit argument, but it doesn’t have the research aspect and it’s a sample size of one.
(Un)luckily we don’t have many examples of potentially world-destroying arms races. We might have to adopt the inside view: look at how much mutual trust and co-operation currently exists in various domains. That’s beyond my current knowledge.
On the research aspect, I think research can be done without the public having a good understanding of the problems, e.g. CERN or CRISPR. I can also think of other bad outcomes of the public having an understanding of AI risk. It might be used as another stick to take away freedoms; see the wars on terrorism and drugs for examples of how the public’s fears get exploited.
Convincing the general public of AI risk seems like shouting fire in a crowded movie theatre: it is bound to have a large and chaotic impact on society.
This is the best steelman of this argument that I can think of at the moment. I’m not sure I’m convinced, but I do think we should put more brainpower into this question.
That sounds dangerously like justifying inaction.
Literally speaking, I don’t disagree. It’s possible that spreading awareness has a net negative outcome; it’s just not likely. I don’t discourage looking into the question, and if the facts start pointing the other way I can be convinced. But while we’re still vaguely uncertain, we should act on what seems more likely right now.
I would never argue for inaction. I think this line of thinking would argue for making efforts to ensure that AGI researchers were educated, while making no effort to educate anyone else (in the most extreme case).
But yep, we may as well carry on as we are for the moment.