Worse, probably, since an ideal FAI would just optimize hard for utility, rather than actively trying to set a moral example.