I am using ‘inaccurate’ as equivalent to ‘badly calibrated’ here.
To determine whether or not a person is well calibrated, you have to look at multiple predictions that person has made. Calibration is an attribute of the predictor, and a useful heuristic for decision-making.
On the other hand, a single statement such as “Alice is 1.60 m tall” might be inaccurate. Being inaccurate is a property of the statement itself, and not just a property of how the statement was generated.
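To make that distinction concrete, here is a minimal Python sketch of how calibration is checked across many predictions rather than a single one. The prediction history and confidence buckets are invented purely for illustration:

```python
# Sketch: checking calibration from a record of probabilistic predictions.
# Each prediction is (stated probability, whether it came true).
# Explicit bucket edges avoid floating-point drift at the boundaries.

def calibration_report(predictions, edges=((0.5, 0.7), (0.7, 0.9), (0.9, 1.01))):
    """Group predictions by stated confidence and compare the average
    stated probability in each bucket with the actual frequency of being right."""
    report = {}
    for low, high in edges:
        group = [(p, hit) for p, hit in predictions if low <= p < high]
        if group:
            avg_stated = sum(p for p, _ in group) / len(group)
            hit_rate = sum(hit for _, hit in group) / len(group)
            report[low] = (round(avg_stated, 2), round(hit_rate, 2), len(group))
    return report

# A well-calibrated person's 70% predictions come true about 70% of the time.
history = [(0.9, True), (0.9, True), (0.9, False), (0.7, True),
           (0.7, False), (0.7, True), (0.5, True), (0.5, False)]
print(calibration_report(history))
```

No single entry in `history` is well or badly calibrated on its own; only the pattern across many predictions is.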
Assigning probabilities to events takes effort. As such, it’s not something you can do for two thousand statements in a day. To be able to assign probabilities, it’s also important to precisely define the belief.
If I take a belief like “All people who clicked on ‘Going’ will come to the event tonight”, I can assign a probability. The exercise of assigning that probability makes me think more clearly about the likelihood of it happening.
Thanks for the clarifications. One last question, as I am sure all of these will come up again and again as I interact with the community.
Can you give me a concrete example of a complex, real-life problem or decision where you used the assignment of probabilities to your beliefs to an extent you found satisfactory for making the decision? I am curious to see the mental process of really using this way of thinking. I assume it is a process happening through sound in the imagination, and more specifically through language (the internal dialogue). Could you reproduce it for me in writing?
I applied for a job. There was uncertainty around whether or not I would get it. Having an accurate view of the probability of getting the job informed the decision of how much additional effort it was worth spending.
I basically came up with a number and then asked myself whether I would be surprised if the event happened or didn’t happen.
I currently don’t have a more systematic process.
I remember a conversation with a CFAR trainer. I said, “I think X is a key skill.” They responded with: “I think it is likely that X is a key skill, but I don’t know that it has to be a key skill.” We didn’t put numbers on it, but having probabilities in the background meant we were able to discuss our disagreement even though we both thought “X is likely a key skill.”
I have never had someone outside of this community tell me, “You are likely right, but I don’t see why I should believe that what you are saying is certain.”
The kind of mindset that produces a statement like this is about taking different probabilities seriously.
My thought is:
‘I have reached this mindset through studying views of assumptions and beliefs from other sources. Maybe this is another way to make the realisation.’
My doubt is:
‘Maybe I am missing something that the use of probabilities adds to this realisation’
Hope to continue the discussion in the future.
It’s more than just a mindset. In this case the result was a concrete discursive practice. Quite a lot of people profess to have a mindset that separates shades of gray. The number of people who will actually voice disagreement when you tell them something that they themselves believe is likely true, and that matters to them, is much lower.
Can you think of the last time you cared about an issue, someone professed to believe something you likely believed to be true, and you still disagreed with them? Could you sketch out the example?
Do I need to express it in numbers? In my mind I follow and practice, among others, the saying: “Study the assumptions behind your actions. Then study the assumptions behind your assumptions.”
Having said that, I cannot think of an example of applying that in a situation where I was in agreement. I am thinking, “I would not be in agreement without a reason regarding a belief that I have examined,” but I might be rationalising here. I will try to observe myself on that. Thanks!
We both had reasons for believing it to be true. On the other hand, humans believe things that are wrong. If you ask a Republican and a Democrat whether Trump is good for America, they might both have reasons for their beliefs, but they still disagree. That means for each of them there’s a chance of being wrong despite having reasons for their beliefs.
The reasons he had in his mind pointed to the belief being true, but they didn’t provide him certainty that it’s true.
It was a belief that was important enough for him to want to actually be right, and not merely to have reasons for holding it.
The practice of putting numbers on a belief forces you to be precise about what you believe.
Let’s say that you believe: “It’s likely that Trump will get impeached.” If Trump actually gets impeached, you will tell yourself, “I correctly predicted it; I was right.” If he doesn’t get impeached, you are likely to think, “When I said ‘likely’, I meant that there was a decent chance that he gets impeached, but I didn’t mean to say that the chance was more than 50%.”
The number forces precision. The practice of forcing yourself to be precise allows the development of more mental categories.
When Elon Musk started SpaceX, he reportedly thought that it had a 10% chance of success. Many people would read a 10% chance of success as “it’s highly unlikely that the company succeeds.” Elon, on the other hand, thought that given the high stakes, a 10% chance of success was enough to found SpaceX.
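The reasoning above is essentially an expected-value calculation. A minimal sketch, with entirely made-up payoff and cost figures (the original only gives the 10% probability):

```python
# Hypothetical expected-value sketch of the SpaceX-style reasoning.
# The payoff and cost numbers are invented purely for illustration.
p_success = 0.10
payoff_if_success = 10_000  # value created on success (arbitrary units)
cost_of_trying = 500        # resources spent either way (arbitrary units)

# Even a 10% chance of a large payoff can exceed the certain cost of trying.
expected_value = p_success * payoff_if_success - cost_of_trying
print(expected_value)
```

The point is that “10%” is not a verdict of “don’t do it”; combined with the stakes, it can justify acting anyway.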
I will have to explore this further. At the moment the method seems to me to give just an illusion of precision, and I am not sure that is effective. I could say that I assign a 5% probability to the belief that the practice is useful. I will now keep interacting with the community and update my belief according to the evidence I see from people who are using it. Is this the right approach?
The word “useful” itself isn’t precise, and as such a number like 5% might be more precise than the underlying belief warrants.
Otherwise, coming up with your number and then updating it according to what you see from people using the practice is indeed the Bayesian way.
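That updating process can be sketched in odds form. The prior matches the 5% mentioned above, while the likelihood ratios for each observation are made up for illustration:

```python
# Minimal sketch of Bayesian updating in odds form.
# Each observation multiplies the odds by a likelihood ratio: how much more
# likely that observation is if the practice is useful than if it isn't.

def update(prior, likelihood_ratios):
    """Convert a prior probability to odds, apply each likelihood ratio,
    and convert back to a posterior probability."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Two observations that each favour "useful" 3:1, one that disfavours it 2:1.
posterior = update(0.05, [3.0, 3.0, 0.5])
print(round(posterior, 3))  # the 5% prior moves to roughly 19%
```

Each piece of evidence shifts the number; what matters is that the belief was stated precisely enough to be shifted at all.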
How would you express the belief?