That doesn’t work, because the expected value of things you actually should do, e.g. donating to an effective charity, is far lower than the expected value of a Pascal’s mugging.
I expect an FAI to have at least a 10% probability of acquiring infinite computational power. If so, donations to MIRI have infinite expected utility.
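The arithmetic behind this exchange can be made concrete. A minimal sketch with made-up numbers (the probabilities and payoffs below are illustrative assumptions, not figures from the discussion): a wager with a tiny probability of an astronomically large payoff dominates an ordinary donation under naive expected value, and any nonzero probability of an infinite payoff yields infinite expected utility.

```python
import math

# Hypothetical numbers, chosen only to illustrate the structure of the argument.
p_charity, payoff_charity = 0.9, 1_000    # a reliable, ordinary good outcome
p_mugging, payoff_mugging = 1e-20, 1e30   # a Pascal's-mugging-style wager

ev_charity = p_charity * payoff_charity   # 900.0
ev_mugging = p_mugging * payoff_mugging   # 1e10: the mugging "wins" on naive EV

print(ev_mugging > ev_charity)            # True

# With an infinite payoff, any nonzero probability gives infinite expected value:
ev_fai = 0.1 * math.inf
print(ev_fai)                             # inf
```

This is exactly why comparing options by raw expected value breaks down here: once one option's expected value is infinite, no finite alternative can ever compare favorably, regardless of how implausible the infinite option is.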