Another analysis is that there are at least two types of possible problem:
One is the “runaway superintelligence” problem, which the SIAI seems focused on;
The other involves the preferences of only a small subset of humans being respected.
The former problem has potentially more severe consequences (astronomical waste), but an engineering error like that seems pretty unlikely—at least to me.
The latter problem could still have some pretty bad consequences for many people, and seems much more probable—at least to me.
In a resource-limited world, too much attention on the first problem could easily contribute to running into the second problem.