...a FAI gains you as much difference as available, minus the opportunity cost of FAI’s development...
Exactly. So for building FAI to be a good idea we need to expect its benefits to outweigh the opportunity cost (we can spend the remaining time “partying” rather than developing FAI).
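Stated as a rough decision criterion (a sketch only; the symbols are illustrative placeholders, not anything from the exchange itself): build FAI just in case

$$ \mathbb{E}[V_{\mathrm{FAI}}] \;>\; \mathbb{E}[V_{\mathrm{party}}], $$

where $V_{\mathrm{FAI}}$ is the value realized if the FAI project succeeds and $V_{\mathrm{party}}$ is the value of spending the remaining time on other things, i.e. the opportunity cost.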
For example, one possible effect is a few years of a strongly optimized world, which might outweigh all the moral value of past human history.
Neat. One way it might work is the FAI running much-faster-than-realtime WBEs, so that we gain a huge number of subjective years of life. This works for any inevitable impending disaster.
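As a back-of-envelope illustration (the numbers are purely hypothetical and not from the conversation): with an emulation speedup factor $s$ and $t$ wall-clock years remaining before the disaster, the subjective time gained is roughly

$$ t_{\mathrm{subjective}} \approx s \cdot t, \qquad \text{e.g. } s = 10^{6},\ t = 5 \;\Rightarrow\; t_{\mathrm{subjective}} \approx 5 \times 10^{6} \text{ subjective years.} $$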