One of the most difficult arguments I’ve ever had to make is convincing people that they can be more rational. Some people have told me that they’re simply incapable of assigning numbers and probabilities to beliefs, even though they acknowledge that doing so is superior for decision making.
This. I’m skeptical of almost every numerical probability estimate I hear unless the steps are outlined to me.
No joke intended, but how much more skeptical are you, percentage-wise, of numerical probability estimates than vague, natural language probability estimates? Please disguise your intuitive sense of your feelings as a form of math.
Ideally, deliver your answer in a C-3PO voice.
40 percent.
This may be one reason why people are reluctant to assign numbers to beliefs in the first place. People equate numbers with certainty and authority, whereas a probability is just a way of saying how uncertain you are about something.
When giving a number for a subjective probability, I often feel like it should be a two-dimensional quantity: probability and authority. The “authority” figure would be an estimate of: “if you disagree with me now but we manage to come to an agreement in the next 5 minutes, what are the chances that I’m the one who has to update my beliefs, rather than you?”
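Purely as a sketch of what I mean (the names Credence, probability, and authority are my own invention, nothing standard), the two-dimensional estimate might look like this as a Python data structure:

    from dataclasses import dataclass

    @dataclass
    class Credence:
        """A two-dimensional subjective estimate: probability plus "authority"."""
        probability: float  # how likely I think the claim is, in [0, 1]
        authority: float    # P(the other person updates, not me) after a short exchange, in [0, 1]

        def __post_init__(self) -> None:
            # Sanity-check that both components are valid probabilities.
            for name in ("probability", "authority"):
                value = getattr(self, name)
                if not 0.0 <= value <= 1.0:
                    raise ValueError(f"{name} must lie in [0, 1], got {value}")

    # "70% that it rains tomorrow, but only even odds that my number
    # survives five minutes of disagreement with a meteorologist."
    rain_tomorrow = Credence(probability=0.7, authority=0.5)

The point of keeping the two numbers separate is that “how likely is this?” and “how much should you trust my number over yours?” are different questions, and collapsing them into one figure is part of what makes bare probability estimates feel overconfident.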
“Techniques for probability estimates” by Yvain is the best we have.
I agree that it can be difficult to convince people that they can be more rational. But I think starting new people off with the idea of assigning probabilities to their beliefs is the wrong tactic. It’s like trying to get someone who doesn’t know how to walk to run a marathon.
What do you think about starting people off with the more accessible ideas on Less Wrong? I can think of things like the Sunk Cost Fallacy, not arguing things “by definition,” and admitting to a certain level of uncertainty. I’m sure you can think of others.
I would bet that pointing people to a more specific idea, like those listed above, would make them more likely to feel like there are actual concepts on LW that they personally can learn and apply. It’s sort of like the “Shock Level” theory, but instead it’s “Rationality Level”:
Rationality Level 0- I don’t think being rational is at all a good thing. I believe 100% in my intuitions!
Rationality Level 1- I see how being rational could help me, but I doubt my personal ability to apply these techniques.
Rationality Level 2- I am trying to be rational, but rarely succeed. (This is where I would place myself.)
Rationality Level 3- I am pretty good at this whole “rationality” thing!
Rationality Level 4- I Win At Life!
I bet that, with some thought, someone else could come up with a better set of “Rationality Levels”.