The reason I mentioned that was to make the point that these problems are a function of progress in general and aren’t specific to AI; they are just exacerbated by AI. I think that’s a weak reason to expect that solutions are likely to come from outside of AI.
This doesn’t make much sense to me. Why is this any kind of reason to expect that solutions are likely to come from outside of AI? Can you give me an analogy where this kind of reasoning more obviously makes sense?
Just to make sure I’m not misunderstanding, this was meant to be an observation, and not meant to argue that I personally should prioritize this, right?
Right, this argument wasn’t targeted to you, but I think there are other reasons for you to personally prioritize this. See my comment in the parallel thread.