Well, the thing about probabilities (in Bayesian statistics) is that they represent the amount of evidence you have about the true state of reality. In general, being 50% certain means you have no evidence for your belief, less than 50% means you have evidence against it, and greater than 50% means you have evidence for it. You’ll get to this as you read more of An Intuitive Explanation.
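To make that concrete, here's a minimal sketch (in Python; the function name is my own) of one standard way to quantify it, using log odds: 50% maps to zero bits of evidence, and probabilities above or below 50% map to positive or negative evidence respectively.

```python
import math

def evidence_in_bits(p):
    """Log odds of a probability, in bits: 0 at 50%, positive above, negative below."""
    return math.log2(p / (1 - p))

for p in (0.2, 0.5, 0.99):
    print(f"P = {p:.2f} -> {evidence_in_bits(p):+.2f} bits of evidence")
```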
The important thing to note is that, as a rationalist, being 99% certain something is true requires actually having evidence that it is true. Rather than just feeling 99% certain, Bayes’ theorem lets you see how much evidence you actually have in a purely quantitative way. That’s why there’s so much talk of “calibration” here: it’s an attempt to align the feeling of how certain you are with how certain the evidence says you should be.
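As a rough illustration (my own sketch, not anything from An Intuitive Explanation itself), here is Bayes’ theorem in odds form. Starting from 50% (no evidence either way), you need evidence that is about 99 times likelier under your hypothesis than under its negation before you can honestly claim 99% certainty.

```python
def posterior(prior, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Starting from a 50% prior, a 99:1 likelihood ratio gets you to 99% certainty.
print(posterior(0.5, 99))  # -> 0.99
```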
You can also work out the expected cost of your actions if you turn out to be wrong. For Hitler, if he thought there was a 1% chance of being wrong, he could work out the expected number of wasted lives as 0.01 * 11,000,000, which is 110,000 (and that’s using the lower bound of people killed during the Holocaust). Hence, if I were Hitler, I wouldn’t risk instigating the Holocaust until I had much more information/evidence. Being rational is about looking at the way the world actually is and acting based on that.
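For what it’s worth, the arithmetic above is just an expected-value calculation over the branch where the belief is wrong; a minimal sketch using the figures from the paragraph above:

```python
def expected_cost_if_wrong(p_wrong, cost_if_wrong):
    """Expected cost of acting, counting only the branch where the belief is wrong."""
    return p_wrong * cost_if_wrong

# 1% chance of being wrong times the lower-bound figure of eleven million lives.
print(expected_cost_if_wrong(0.01, 11_000_000))  # -> 110000.0
```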
The point is, the most moral thing to do is the thing most likely to be moral given your evidence. If God turns out to exist (although there are masses of evidence against that) and he asks you why you weren’t a religious fundamentalist, you’ll have a damn good answer.
Glad to be of help!