You don’t have to kill anyone; you merely have to imply that they will be killed, such that the probability of future utility being equal to or higher than past/present utility is lower than the probability of it being lower. 20-30 years is a lot of people: manipulate events such that, in the infinite years that follow, there is never a higher probability of there being more people who exist and are aware than existed and were aware in those 20-30 years.
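To make that condition concrete, here is a minimal sketch in my own notation (the labels U_p, U_f, N_aware, and t_0 are mine, not part of the original argument):

```latex
% Sketch only: U_p, U_f, N_aware(t), t_0 are labels introduced here for illustration.
% U_p = utility recoverable from the people who existed and were aware in the 20-30 year window
% U_f = utility recoverable from any population after that window
% The condition described above:
P(U_f \ge U_p) < P(U_f < U_p) \iff P(U_f \ge U_p) < \tfrac{1}{2}
% "Manipulate events" so that this keeps holding at every later time t:
\forall t > t_0 : \quad P\big(N_{\text{aware}}(t) > N_{\text{aware}}(t_0)\big) < \tfrac{1}{2}
```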
An interesting point I’d add is that you don’t need this probability to be true; you merely have to believe it to be true. You can only be blackmailed if the threat is credibly believed. If you honestly believe the probability discussed above is in your favour, i.e. that more people know and don’t contribute than would ever exist, know, and contribute, then blackmail offers no benefit, as you truthfully believe yourself safe from it. You can protect yourself further by having one person deceive everyone else about the true probability, so that they honestly believe it to be in their favour. The probability is false in this case, but one man sacrifices himself to protect the many: very utilitarian. (An act of utilitarian goodness that I’m sure an AI could never reason deserves punishment, since it allows for the creation of the AI as well as the protection of people from punishment, resulting in higher overall utility than would occur from creation with punishment.)
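The "credibly believed" point can be put as a rough decision-theoretic sketch, again in notation I am introducing here rather than anything from the original argument:

```latex
% Sketch only: p_belief, C, and B are labels introduced here for illustration.
% p_belief = your subjective probability that the threatened punishment is real and will reach you
% C       = the cost to you of that punishment
% B       = the cost to you of complying (contributing)
% By your own lights, the threat only moves you when
p_{\text{belief}} \cdot C > B
% so if honest reasoning (or the deception described above) drives p_belief toward 0,
% the blackmail has no leverage, whatever the true probability happens to be.
```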
As for acausal trade, I can again only conceive of it working to the extent that one believes in it. (“I do believe in fairies”: if you don’t like fairies, stop believing in them and they disappear. How can an AI or God reasonably punish you if you honestly didn’t believe in it? Does anyone truly condemn the men who reject the man who has seen the sun after escaping the cave? No, we reject those who know the truth but try to suppress it.) The less seriously you take it, the lower the probability of it working, and I’m fairly convinced there is a lot of reason not to take it seriously. The best reason, I think, is pure in-the-moment selfishness, an attitude that comes very easily to even the most educated of people. So with regard to the acausal trade issue, I think we are in agreement that it is amusingly unlikely at best.