Do you expect a “best effort” FAI to be worse than UFAI or human extinction? Also, could there be a solution to the problem of philosophical principles being “fixed forever as governing principles for a new reality”?