Well, as someone else suggested, you could just ignore all probabilities below a certain noise floor. You don’t necessarily have to assign 0 probability to those outcomes; you could simply treat ignoring them as a heuristic.
All that does is adopt a different decision theory without calling it that, sidestepping the requirement to formalise and justify it. It’s a patch, not a solution, like solving FAI by saying we can just keep the AI in a box.
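For concreteness, here is a minimal sketch of what the noise-floor heuristic from the first comment might look like as a decision rule. The threshold value, the renormalisation step, and the example numbers are all illustrative assumptions, not something proposed in the thread.

```python
# A minimal sketch (not anyone's endorsed formalisation) of the "noise floor"
# heuristic: compute expected utility, but drop any outcome whose probability
# falls below a chosen threshold. The cutoff and the renormalisation step are
# illustrative assumptions.

NOISE_FLOOR = 1e-9  # hypothetical cutoff; choosing it is the unformalised part

def truncated_expected_utility(outcomes, floor=NOISE_FLOOR):
    """outcomes: list of (probability, utility) pairs."""
    kept = [(p, u) for p, u in outcomes if p >= floor]
    if not kept:
        return 0.0
    total_p = sum(p for p, _ in kept)
    # Renormalise the surviving probabilities so they sum to 1 again.
    return sum((p / total_p) * u for p, u in kept)

# A Pascal's-mugging-style gamble: a tiny chance of an astronomically large payoff.
gamble = [(1e-20, 1e30), (1.0 - 1e-20, -1.0)]
print(truncated_expected_utility(gamble))  # the tiny-probability branch is dropped: -1.0
```

The reply's objection applies directly to this sketch: where the cutoff sits and whether (and how) to renormalise are exactly the choices a decision theory would have to formalise and justify.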