ADT cares about the copy of me that will have the biggest impact on future utility, and thus favors a kind of utility monster; this is an analogue of the Doomsday argument for ADT.
Example: under ADT I should assume that I can create FAI and save the world, since in that case I would have the biggest impact on future utility.