I think you’ve just perfectly illustrated how some Scope Insensitivity can be a good thing.
Because a mind with perfect scope sensitivity will be diverted into chasing impossibly tiny probabilities of impossibly large rewards. If a rationalist must win, then a good rationalist should commit to avoiding any supposed rationality that makes him lose like that.
So, here’s a solution. If an event’s probability is too tiny for it to be reasonably likely to occur within your lifespan, treat its bait as actually impossible. If you don’t, you’ll inevitably crash into effective ineffectiveness.
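Roughly, something like this sketch, where the cutoff comes from a guessed count of decisions made over a lifetime (the numbers are illustrative assumptions, nothing principled):

```python
# A minimal sketch of the thresholding heuristic above. The cutoff is
# derived from a guessed lifetime decision count -- a free parameter,
# chosen here purely for illustration.

LIFETIME_DECISIONS = 10**9              # rough guess at decisions in a lifetime
NEGLIGIBLE = 1.0 / LIFETIME_DECISIONS   # events rarer than this likely never occur

def expected_value(p: float, reward: float) -> float:
    """Naive expected value: a tiny p times a huge reward can still dominate."""
    return p * reward

def thresholded_expected_value(p: float, reward: float) -> float:
    """Treat probabilities below the lifetime cutoff as actually impossible."""
    return 0.0 if p < NEGLIGIBLE else p * reward

# An offer of a 1e30 payoff at probability 1e-20:
print(expected_value(1e-20, 1e30))              # 1e10 -- dominates the naive agent
print(thresholded_expected_value(1e-20, 1e30))  # 0.0  -- ignored after thresholding
```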
This seems to suggest a fuzzily-defined hack.
If you don’t have a precise mathematical criterion for what counts as “reasonably likely”, then I’m afraid this doesn’t get us anywhere.