I don’t see that “being jerked around by unlikely gods” is necessarily a problem. Doesn’t the good sense in donating to SIAI basically boil down to betting on the most-plausible god?
We constantly face a car that is about to run us over. We can see it and are very confident that the collision will be lethal if we do not jump out of its way. But there are people standing on the curb yelling, “ignore the car, better to think about how to solve this math problem”, or, “you can save a galactic civilization if you sacrifice your life”, or, “it’s just a simulated car, just ignore it”. Those people might be really smart and their arguments convincing, and their predictions might outweigh the utility of your own life.
The problem is that we are really bad at making risk estimates under uncertainty, given limited resources and limited time in which to do so. Those people on the curb might be right, and the expected utility formula does suggest that on average we are better off believing them, but we also know that they might be wrong: that in no possible branch of the multiverse will we actually receive the payoff, or will our actions achieve what they claimed.
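A rough sketch of the arithmetic, with entirely made-up numbers, shows how the formula can end up favoring the people on the curb: any sufficiently large assumed payoff swamps a near-certain but ordinary-sized loss.

```python
# Illustrative only: all probabilities and payoffs below are made up
# to show how a tiny probability of a huge payoff can dominate a
# near-certain loss in a naive expected utility comparison.
p_car_kills_you = 0.99                  # chance the car kills you if you don't jump
value_of_your_life = 1.0                # normalize the value of your own life to 1

p_claim_is_true = 1e-12                 # chance the person on the curb is right
value_of_galactic_civilization = 1e20   # assumed payoff if they are right

eu_jump = p_car_kills_you * value_of_your_life                 # expected value of jumping
eu_listen = p_claim_is_true * value_of_galactic_civilization   # expected value of believing them

print(eu_jump, eu_listen)  # 0.99 vs. 1e8: the formula says "ignore the car"
```

Which is exactly why the formula feels untrustworthy here: the conclusion is driven almost entirely by numbers we have no reliable way to estimate.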
On one side we have our intuitions, which tell us to ignore those people and jump; on the other side we have rules and heuristics endorsed by our higher cognition, which tell us to ignore the car.