OK, thanks, but then one of my additional questions is: what is the reasonable threshold for the probability of my belief A given all available evidence B1, B2, ..., Bn? And why?
Are you suggesting that beliefs must be binary? Either believed or not? E.g. if the probability of truth is over 50% then you believe it and don’t believe if it’s under 50%? Dispense with the binary and use the probability as your degree of belief. You can act with degrees of uncertainty. Hedge your bets, for example.
Ok, thanks. This is very interesting and, I guess, correct in theory, and I would be glad to apply it. But before taking my first steps on my own by trial and error, I would like to know some best practices for doing so, if any are available. I strongly doubt this is a common practice in the general population, and I slightly doubt it is the common practice even for a “common” attendee of this forum, but I would still like to make it my usual habit.
And the greatest issue I see is how to talk to ordinary people around me about ordinary uncertain things that are probabilistic, when they actually treat those things as certain. Should I try to gradually and unnoticeably change their paradigm? Or should I use double language: probabilistic inside, but confident outside?
(I am aware that these questions might be difficult, and I don’t necessarily expect direct answers.)
I’m not sure what to say besides “Bayesian thinking” here. This doesn’t necessarily mean plugging in numbers (although that can help), but rather developing habits like not neglecting priors or base rates, considering how consistent the supposed evidence is with the negation of the hypothesis, and so forth. I think normal, non-rationalist people reason in a Bayesian way at least some of the time. People mostly don’t object to good epistemology; they just use a lot of bad epistemology too. Normal people understand words like “likely” or “uncertain”. These are not alien concepts, just underutilized.
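For concreteness, here is a minimal sketch of that kind of update with made-up numbers: even evidence that is much more consistent with the hypothesis than with its negation can leave the posterior modest if the base rate is low.

```python
# Toy Bayes update. All numbers here are illustrative, not from the discussion above.
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | E) given the prior and the likelihood of the evidence under H and not-H."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Evidence 4x as likely under H as under not-H, but a low base rate of 0.1:
posterior = bayes_update(prior=0.1, p_evidence_given_h=0.8, p_evidence_given_not_h=0.2)
print(round(posterior, 3))  # 0.308 -- the evidence helps, but the base rate still dominates
```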
I’m not sure what you mean by “threshold for the probability of belief in A.”
Say A is “I currently have a nose on my face.” You could assign that .99 or .99999, and either expresses a lot of certainty that it’s true; there’s not really a threshold involved.
Say A is “It will snow in Denver on or before October 31st 2021.” Right now, I would assign that a .65 based on my history of living in Denver for 41 years (it seems like it usually does).
But I could go back and look at weather data and see how often that actually happens. Maybe it’s been 39 out of the last 41 years, in which case I should update. Or maybe there’s an El Niño-like weather pattern this year or something like that… so I would adjust up or down accordingly.
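As a rough sketch of that update, assuming the hypothetical 39-out-of-41 count and treating each year as an independent yes/no observation with a uniform prior:

```python
# Beta-Binomial sketch of the Denver snow example. The 39/41 count is the
# hypothetical figure from the comment above, not real weather data.
snow_years, total_years = 39, 41
alpha, beta = 1 + snow_years, 1 + (total_years - snow_years)  # Beta(1, 1) prior updated with the counts
posterior_mean = alpha / (alpha + beta)
print(round(posterior_mean, 2))  # ~0.93, well above the off-the-cuff 0.65
```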
The idea being that, over time, by encountering evidence and learning to evaluate its quality, you would get closer to the “true probability” of whatever A is.
Maybe you’re asking more about how certain kinds of evidence should change the probability of a belief being true? Like how much to update based on the evidence presented?
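If that is the question, one standard answer is the odds form of Bayes’ rule: multiply your prior odds by the likelihood ratio of the evidence. A toy sketch, again with made-up numbers:

```python
# Odds-form update: posterior odds = prior odds * likelihood ratio.
def update_odds(prior_prob, likelihood_ratio):
    """likelihood_ratio = P(evidence | A) / P(evidence | not A)."""
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

print(round(update_odds(0.5, 3.0), 2))   # 0.75: evidence 3x likelier under A
print(round(update_odds(0.5, 0.33), 2))  # 0.25: evidence ~3x likelier under not-A
```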