Super unoriginal observation, but I’ve only now found a concise way of putting this:
What’s weird about the vast majority of people is that they (a) would never claim to be among the smartest ∼0.1% of people in the world, but (b) behave as though they are among the best ∼0.1% in the world when it comes to forming accurate beliefs, as expressed by the confidence they place in their beliefs. (Otherwise, being highly confident in something that lots of smart people disagree with is illogical.)
Someone (Tyler Cowen?) said that most people ought to assign much lower confidences to their beliefs, like 52% instead of 99% or whatever. While this comes from the same observation, it has never sat right with me. I think it’s because I wouldn’t diagnose the problem as overconfidence but as [not realizing or ignoring] the implication I’m confident ⟹ I must be way better than almost everyone else at this process.
I realize you’re not exactly saying it outright, but some parts of your comment seem to be gesturing at the idea that smart people should adopt a “modesty norm” among themselves. I think this is a very bad idea for reasons EY already articulated, so I’d just like to clarify whether this is what you believe?
Thanks for making that question explicit! That’s not my position at all. I think many people who read Inadequate Equilibria are, in fact, among the top ∼ 0.1% of people when it comes to forming accurate beliefs. (If you buy into the rationality project at all, then this is much easier than being among the 0.1% most intelligent people.) As such, they can outperform most people and be justified in having reasonably confident beliefs.
This is also how I remember EY’s argument. He was saying that we shouldn’t apply modesty precisely because it is possible to know better than the vast majority of people.
A very relevant observation here is that there is real convergence happening among those people. If I take the set of my ~8 favorite public intellectuals, they tend to agree, with close to zero exceptions, on many of [the issues that I consider not that hard even though tons of people disagree about them]. Even in LW surveys, the answers were very different from the population mean.
Anyway, I don’t think this is in any conflict with my original point. If you ask the average person with super confident beliefs, I’m pretty sure they don’t hold an explicit belief that they are among the top ∼0.1% when it comes to forming accurate beliefs (and of course, they aren’t), and there’s your inconsistency.
I think there’s a common confusion (and perhaps, below a certain cognitive ability, an inability) to recognize the difference between belief, policy, and action. For an even-money bet (losing costs the same utility as winning gains), your policy should be to bet on the most probable outcome, and your action, given a 52% chance of red, is to bet red.
There are other kinds of bets where your response should be proportionate to the probability, but a surprising number of actions end up being binary in their result, even if they’re highly uncertain at the time you take them.
This leads to vastly overstating one’s confidence, both when justifying decisions and when advising others about policy and actions.
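A minimal sketch of the belief/policy/action split in the even-money case (the numbers are my own toy illustration, not anything from the thread): the belief is only 52%, but the expected-value policy pushes the action all the way to one side.

```python
# Toy illustration with assumed numbers: an even-money bet with P(red) = 0.52.
p_red = 0.52
stake = 1.0  # even money: win +stake or lose -stake

ev_red = p_red * stake + (1 - p_red) * (-stake)      # +0.04
ev_black = (1 - p_red) * stake + p_red * (-stake)    # -0.04

# Policy: bet on whichever colour has the higher expected value.
# Action: the whole stake goes on red, even though the belief is barely above 50%.
action = "bet red" if ev_red > ev_black else "bet black"
print(f"EV(red) = {ev_red:+.2f}, EV(black) = {ev_black:+.2f} -> {action}")
```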
Is that really a relevant phenomenon? Many of the beliefs I was thinking about (say, your opinion on immigration) don’t affect real-life choices at all, or at least not in a way that provides feedback on whether the belief was true.
Depends on the belief/claim in question. Agreed that many statements aren’t really “beliefs” in the sense of propositional credence about expected experience, but rather “positions” taken in not-very-relevant discussions and debates.
Is it really that simple? I’ve seen a lot of ways in which people strongly express beliefs different from those expressed by a large majority of smart people. Most of the apparent reasons do not seem to boil down to overconfidence of any sort, but are related to the fact that expressions of belief are social acts with many consequences. Personally I have a reputation as a “fence-sitter” (apparently this is socially undesirable) since I often present evidence for and against various positions instead of showing “courage of convictions”.
I wouldn’t go so far as to say that expressed beliefs are nothing but tokens in a social game and don’t matter to how people actually think and act, but I’m pretty sure they matter a lot less than the form and strength of the expression indicates. People do seem to really believe what they say in the moment, but then continue with life without examining the consequences of that belief for their lives.
I am not excluding myself from this assessment, but I would expect anyone reading or posting on this site to want to examine consequences of their expressed and unexpressed beliefs substantially more than most.
Someone (Tyler Cowen?) said that most people ought to assign much lower confidences to their beliefs, like 52% instead of 99% or whatever.
oops I have just gained the foundational insight for allowing myself to be converted to (explicit probability-tracking-style) Bayesianism; thank you for that
I always thought “belief is when you think something is significantly more likely than not; like 90%, or 75%, or 66%.” No; even just having 2% more confidence is a huge difference given how weak existing evidence is.
If one really rational debate-enjoyer thinks A is 2% more likely than its negation (51% vs. 49%), that’s better than a hundred million people shouting that the negation of A is 100% likely.
To me, 0.02 is a comparatively tiny difference between likelihood of a proposition and its negation.
If P(A) = 0.51 and P(~A) = 0.49, then almost every decision I make based on A will give almost equal weight to whether it is true or false, and the cognitive process of working through the implications on either side is essentially identical to the case P(A) = 0.49 and P(~A) = 0.51. The outcome of the decision will also very frequently be the same, since the outcomes are usually unbalanced.
It takes quite a bit of contriving to arrange a situation where there is any meaningful difference between P(A) = 0.51 and P(A) = 0.49 for some real-world proposition A.
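To make the 0.51-vs-0.49 point concrete, here is a toy expected-value calculation with made-up, deliberately unbalanced payoffs; the chosen action is identical in both cases, which is the sense in which the two probabilities make no practical difference.

```python
# Toy sketch with hypothetical payoffs: acting pays +10 if A is true and -1 if
# A is false; not acting pays 0. The decision is the same at P(A) = 0.51 and 0.49.
def best_action(p_a, gain_if_true=10.0, loss_if_false=-1.0):
    ev_act = p_a * gain_if_true + (1 - p_a) * loss_if_false
    return "act" if ev_act > 0 else "don't act"

for p in (0.51, 0.49):
    print(p, "->", best_action(p))   # both print: act
```

Only when the payoffs are nearly as balanced as the probabilities does the 0.02 gap start to change the decision.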
Yeah, and this may get at another reason why the proposal doesn’t seem right to me. There’s no doubt that most people would be better calibrated if they adopted it, but 52% and 48% are the same for the average person, so it’s completely impractical.
If anything, the proposal should be ‘if you don’t think you’re particularly smart, your position on almost every controversial topic should be “I have no idea”’. Which still might not be good advice because there is disproportionate overlap between the set of people likely to take the advice and the set of people for whom it doesn’t apply.
If you think it’s very important to consider all the plausible adjacent interpretations of a proposition as stated before making up your mind, it can be useful to register your initial agreement with each interpretation as only a small divergence from total uncertainty (the uncertainty representing your uncertainty about whether you’ll come up with better interpretations of the thing you think you’re confident about), spread across however many interpretations there are, before you consider more ambitious numbers like 90%.
If you always do this and you still wind up being wrong about some belief, then at least it’s possible to locate the error: you failed to list enough sufficiently specific adjacent possibilities before asking yourself more seriously what their true probabilities were. Making distinctions is a really important part of knowing the truth; don’t pin the hopes of every A-adjacent possibility on just one proposition in the set of A-adjacent possibilities. Two A-adjacent propositions can differ greatly, or at least meaningfully, in likelihood; thinking only about A can mislead you about things that merely sound synonymous with A.
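As a toy illustration of spreading credence over adjacent interpretations (the specific readings and numbers below are invented for the example): you can be fairly confident that something in the A-adjacent cluster is true while no single precise reading gets anywhere near 90%.

```python
# Hypothetical credences over several specific readings of a proposition A.
credences = {
    "A, read strictly": 0.30,
    "A, with a weaker quantifier": 0.25,
    "A, true only in typical cases": 0.20,
    "none of the A-adjacent readings": 0.25,
}
assert abs(sum(credences.values()) - 1.0) < 1e-9

# Confidence that *some* A-adjacent reading is true can be high...
p_cluster = 1.0 - credences["none of the A-adjacent readings"]
print(f"P(some A-adjacent reading) = {p_cluster:.2f}")  # 0.75

# ...while the single most likely precise reading stays modest.
best = max(credences, key=credences.get)
print(f"Most likely single reading: {best!r} at {credences[best]:.2f}")  # 0.30
```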