I’m reposting Carl’s Facebook comments to LW, for convenience. Carl’s comments were:
Economic growth makes the world more peaceful and cooperative in multiple ways, reducing the temptation to take big risks with dangerous technologies to get ahead, the risk of arms races, the mistrust that blocks international coordination, the chance of a nuclear war brutalizing society, and more. Link 1
Economic growth also makes people care more about long-term problems like global warming, and be more inclusive of and friendly towards foreigners and other neglected groups. Link 2
Then there’s the fact that Moore’s law is much faster than economic growth, and software spending is also growing as a share of the economy. So an overall stagnant economy does not mean stagnant technology.
Plus the model of serial-intensive FAI action you are using to drive the benefits of moving early relies on a lot of extreme predictions relative to the distribution of expert opinion, without a good predictive track record to back them up, and with a plausible bias explanation. So, in the likely event that it is not dispositive, other factors predominate: the effect is too small relative to all the other things that get affected.
[So] generally I think a uniform worldwide increase in per capita incomes improves global odds of good long-run futures.
Eliezer replied to Carl:
The most powerful mechanisms for this, in your model, are that (a) wealth transmits to international cooperation, which improves FAI vs. UFAI somehow, and (b) wealth transmits to concern about global tidiness, which you think successfully transmits more to FAI vs. UFAI? Neither of these forces has very much effectual power at all in my visualization—I wouldn’t mention them in the same breath as Moore’s Law or total funding for AI. They’re both double-fragile mechanisms.
It’s worth noting that the relationship between economic growth and the expected quality of global outcomes is not necessarily monotonic. The optimal speed of economic growth may be neither super-slow nor super-fast, but some “just right” value in between that makes peace, cooperation, and long-term thinking commonplace while avoiding technological advancement substantially faster than what we see today.