It’s a legitimate possibility that FAI is just too hard for the human race to achieve from anything like our current state, so that (barring some fantastic luck) we’re either doomed to an extinction event, or to a “cosmic locust” future, or to something completely different.
In fact, I’d bet 20 karma against 10 that Eliezer would assign a probability of at least 1% to this being the case, and I’d bet 50 against 10 that Robin assigns a probability of 50% or greater to it.
However, if FAI is in fact too difficult, then the SIAI program seems to do no harm; and if it’s not too hard, it could do a world of good. (This is one benefit of the “provably Friendly” requirement, IMO.)