Is it actually more crippled than a wish-fulfilling FAI? Either sort of AI has to leave resources for people.
However, your point makes me realize that a big-threat-only FAI (where one such threat is the AI itself taking too much from people) will need a model of, and respect for, human desires so that we aren’t left on a minimal reservation.