Yeah, I’ve wondered this for a while without getting any closer to an understanding.
It seems that anything some human “really wants” (and therefore could potentially be included in the CEV target definition) falls into one of two categories: either it’s something that, if I were sufficiently well-informed about it, I would want for that human (in which case my CEV, properly unpacked by a superintelligence, includes it for them), or it’s something that, no matter how well-informed I was, I would not want for that human (in which case it’s not at all clear that I ought to endorse implementing it).
If CEV-humanity makes any sense at all (which I’m not sure it does), it seems that CEV-arbitrary-subset-of-humanity leads to results that are just as good by the standards of anyone whose standards are worth respecting.
My working answer is therefore that it’s valuable to signal the willingness to include everyone in the CEV target (so nobody feels left out), and one effective way to signal that willingness consistently and compellingly is to precommit to actually doing it.