Global coordination problems
I’ve said before that I tentatively think “foster global coordination” might be a good cause area in its own right, because it benefits so many other cause areas. It might therefore be useful to have a term for the cause areas that global coordination would help. More specifically, a term for the concept “(reasonably significant) problem that requires global coordination to solve, or that global coordination would significantly help to solve.” I propose “global coordination problem” (though I’m open to other suggestions). You may object that “coordination problem” already has a meaning in game theory, and that this term is likely to get confused with it. But global coordination problems are coordination problems in precisely the game-theoretic sense (I think; feel free to correct me), so the terminological overlap is a benefit rather than a source of confusion.
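To make the game-theoretic sense concrete, here is a minimal sketch in Python of the stag hunt, the textbook coordination game. The specific payoff numbers are my own illustrative choices, not anything canonical; the point is just that the game has two pure-strategy Nash equilibria, and both players prefer the equilibrium that requires them to coordinate.

```python
# A minimal sketch of a two-player coordination game (the stag hunt).
# Payoff numbers are illustrative assumptions, chosen only to exhibit
# the structure: two equilibria, one of which is better for everyone.

STRATEGIES = ["stag", "hare"]

# PAYOFFS[(row, col)] = (row player's payoff, column player's payoff)
PAYOFFS = {
    ("stag", "stag"): (4, 4),  # both coordinate: best outcome for both
    ("stag", "hare"): (0, 3),  # lone stag hunter gets nothing
    ("hare", "stag"): (3, 0),
    ("hare", "hare"): (3, 3),  # safe but worse equilibrium
}

def is_nash(row, col):
    """A profile is a pure-strategy Nash equilibrium if neither player
    can improve their payoff by unilaterally switching strategies."""
    r_pay, c_pay = PAYOFFS[(row, col)]
    row_ok = all(PAYOFFS[(alt, col)][0] <= r_pay for alt in STRATEGIES)
    col_ok = all(PAYOFFS[(row, alt)][1] <= c_pay for alt in STRATEGIES)
    return row_ok and col_ok

equilibria = [(r, c) for r in STRATEGIES for c in STRATEGIES if is_nash(r, c)]
print(equilibria)  # [('stag', 'stag'), ('hare', 'hare')]
```

The analogy to global problems, as I understand it: “everyone cooperates on the problem” is the equilibrium everyone prefers, but no single actor can move the world there unilaterally, which is exactly why coordination mechanisms are valuable.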
What are some examples of global coordination problems? Certain x-risks and global catastrophic risks (such as risks from AI, bioterrorism, pandemics, and asteroids), climate change, some of the problems mentioned in The Possibility of an Ongoing Moral Catastrophe, the general problem of ferreting out and fixing moral catastrophes, and almost certainly others.
In fact, it may be useful to think about a spectrum of problems, similar to the scope dimension of Bostrom’s classification of global catastrophic risks, organized by how much coordination is required to solve them. Analogous to Bostrom’s spectrum, we could have: personal coordination problems (i.e. problems requiring no coordination with others, or perhaps only coordination among parts of oneself), local coordination problems, national coordination problems, global coordination problems, and transgenerational coordination problems.
Nuclear arms control and anti-proliferation efforts are a big example here. Other forms of arms control are important too.