So 200:1 is your prior? Then where’s the rest of the calculation? Also, how exactly did you come up with the prior? How did you decide that 200:1 is the right place to stop? Or in other words, can you claim that if a completely rational agent had the same information that you have right now, then that agent would also come up with a prior of 200:1? What you have described is just a way of measuring how much you believe in something. But what I am asking is how do you decide how strong your belief should be.
It’s just the numerical expression of how likely I feel a nuclear attack is. (ETA: I didn’t just pick it out of thin air. I can give reasons, but they aren’t mathematically exact. But we could work up to that by considering information about geopolitics, proliferation, etc.)
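For concreteness, a figure like 200:1 can be read as odds against the event, which pins down the implied probability. A minimal sketch of that conversion, assuming the "odds against" reading (the numbers are only the ones already mentioned above):

```python
# Convert odds against an event into a probability.
# Assumes "200:1" means 200 to 1 against the event occurring.
def odds_against_to_probability(against: float, in_favor: float = 1.0) -> float:
    return in_favor / (against + in_favor)

p = odds_against_to_probability(200)
print(f"P(event) = {p:.4f}")  # 1/201, roughly 0.5%
```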
Or in other words, can you claim that if a completely rational agent had the same information that you have right now, then that agent would also come up with a prior of 200:1?
No, I absolutely can’t claim that.
What you have described is just a way of measuring how much you believe in something. But what I am asking is how do you decide how strong your belief should be.
By making a lot of predictions and, hopefully, getting good at it, while paying attention to known biases and discussing the proposition with others to catch your errors and gather new information. If you were hoping there was a perfect method for relating information about extremely complex propositions to their probabilities… I don’t have that. If anyone here does, please share; I have missed it!
But theoretically, if we’re even a little bit rational, the more updating we do, the closer we should get to the right answer (though I’m not actually sure we’re even this rational). So we pick priors and go from there.
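The "rest of the calculation" being asked about is just Bayesian updating, which in odds form is: posterior odds = prior odds × likelihood ratio, applied once per piece of evidence. A minimal sketch of the mechanics, where the likelihood ratios are hypothetical placeholders rather than anything from the discussion:

```python
# Bayesian updating in odds form: posterior odds = prior odds * likelihood ratio.
# The likelihood ratios below are hypothetical, purely to illustrate the mechanics.

def update_odds(prior_odds_for: float, likelihood_ratios: list[float]) -> float:
    """Multiply prior odds (in favor of the hypothesis) by each likelihood ratio."""
    odds = prior_odds_for
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_probability(odds_for: float) -> float:
    return odds_for / (1.0 + odds_for)

prior_odds_for = 1 / 200        # 200:1 against, i.e. 1:200 in favor
evidence_lrs = [3.0, 0.5, 2.0]  # hypothetical P(evidence | true) / P(evidence | false)

posterior_odds = update_odds(prior_odds_for, evidence_lrs)
print(f"posterior probability = {odds_to_probability(posterior_odds):.4f}")
```

The prior only sets the starting point; each new observation moves the odds by its likelihood ratio, which is why repeated updating should pull even differently chosen priors toward one another.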