To refine that a bit, KU affects the absolute probability of your hypotheses, but not the relative probability. So it shouldn’t affect what you believe, but it should affect how strongly you believe it. How strongly you believe something can easily have an impact on what you do: for instance, you should not start WWIII unless you are very certain it is the best course of action.
You can incorporate KU into Bayes by reserving probability mass for things you don’t know or haven’t thought of, but you are not forced to. (As a corollary, Bayesians who offer very high probabilities evidently aren’t reserving any such mass.) If the arguments for KU are correct, you should reserve it. Probability estimates are less useful than claims of certainty, but Bayesians use them because their epistemology tells them that certainty isn’t available; by the same token, absolute probabilities should be abandoned if they are not available.
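A minimal sketch of what reserving mass might look like (the hypothesis names and the 0.3 reserve are illustrative assumptions, not anything from the above):

```python
def normalize(weights, total=1.0):
    """Scale a dict of non-negative weights so they sum to `total`."""
    s = sum(weights.values())
    return {k: total * v / s for k, v in weights.items()}

known = {"hypothesis_A": 4.0, "hypothesis_B": 1.0}   # relative plausibilities

# Ordinary closed-world Bayesian: all mass goes to the hypotheses in hand.
closed_world = normalize(known)                       # A: 0.8, B: 0.2

# KU-flavoured: hold back some mass for the unknown catch-all.
reserve = 0.3                                         # arbitrary illustrative figure
open_world = normalize(known, total=1.0 - reserve)    # A: 0.56, B: 0.14
open_world["something_i_havent_thought_of"] = reserve

# Relative probabilities among the known hypotheses are unchanged...
ratio_closed = closed_world["hypothesis_A"] / closed_world["hypothesis_B"]
ratio_open = open_world["hypothesis_A"] / open_world["hypothesis_B"]
assert abs(ratio_closed - ratio_open) < 1e-9          # both 4.0

# ...but every absolute probability is lower, which matters when an action
# needs P(best course of action) above some threshold.
print(closed_world, open_world, sep="\n")
```

The point is just that dividing the known hypotheses’ mass by a common factor leaves every ratio intact, so KU changes how strongly you believe, not which hypothesis you rank first.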