However, I think this error undermines a significant part of Yudkowsky’s thesis. This example was one of two major anecdotes that Yudkowsky presented to show that he can often know better than experts, and he cited it repeatedly throughout the book. Yet, I think he got it wrong.
It does seem plausible that the Bank of Japan thing was an error. However, I don’t think that would undermine his thesis.
It’s been some time since I read the book so I could be misremembering, but I remember the big idea here to basically be “consider the incentives”. If there are strong incentives to be correct about something (like the price of a stock), you should be wary of disagreeing. On the other hand, if there aren’t strong incentives to be correct about something, you shouldn’t be too hesitant to disagree. More generally, your hesitance should be in proportion to the strength of the incentives.
I think the Bank of Japan example was meant to be illustrative of this idea, not strong evidence for it. As I understand it, the bulk of the evidence is more gears-level/theoretical/deductive. Maybe something like this:
We know that people respond to incentives.
Therefore, the more heavily you incentivize people to be correct about something, the more likely they are to actually be correct.
Therefore, a thing where people are heavily incentivized to be correct is likely to actually be correct.
Therefore, you should be hesitant to disagree in places where people are heavily incentivized to be correct.
It does seem plausible that the Bank of Japan thing was an error. However, I don’t think that would undermine his thesis.
I agree that this error does not substantially undermine the entire book, much less prove its central thesis false. I still broadly agree with most of the main claims of the book, as I understand them.
Which thesis were you referring to?
I think the claims of the book along the lines of the following quote were definitely undermined in light of this factual error,
We have a picture of the world where it is perfectly plausible for an econblogger to write up a good analysis of what the Bank of Japan is doing wrong, and for a sophisticated reader to reasonably agree that the analysis seems decisive, without a deep agonizing episode of Dunning-Kruger-inspired self-doubt playing any important role in the analysis.
In particular, I think this error highlights that even sophisticated observers can make basic errors in reasoning that demonstrate that they don’t understand a situation well.
In general, I think it’s correct for non-expert observers to be skeptical that they can identify the correct blogger in a complex debate about a topic they don’t fully understand. Moreover, even if sophisticated observers can identify the correct blogger, that doesn’t mean they’re necessarily correct in their interpretation of what that blogger is saying. Both of these points are important.
I see, thanks for clarifying.
Here is how I am thinking about it. Consider the claim “the Bank of Japan is being way too tight with its monetary policy”, and two reasons why that claim might be wrong:
1. The Bank of Japan pursued a tight monetary policy, and they probably know what they’re doing.
2. Monetary policy is a complex topic that is difficult to reason about.
My read is that Eliezer was only making points about 1, not 2.
It sounds like you are saying that he was making claims about 2. Something like, “don’t be too hesitant to trust your reasoning about complex topics”. But even if he was making this claim about 2, I still don’t think that the Bank of Japan example matters much. It would still just be illustrative, not strong evidence. And as something that is merely illustrative, if it turned out to be wrong it wouldn’t be reason to change one’s mind about the original claim.
It sounds like you are saying that he was making claims about 2.
No, I think he was also wrong about the Bank of Japan’s relative competence. I didn’t argue this next point directly in the post because it would have been harder to argue than the other points I made, but I think Eliezer is just straight up wrong that the Bank of Japan was pursuing a policy prior to 2013 that cost Japan trillions of dollars in forgone economic growth.
To be clear, I don’t think that the Bank of Japan was following the optimal monetary policy by any means, and I currently think Scott Sumner is right to think they should have printed more money. But Eliezer didn’t just say that the Bank of Japan could switch to a slightly better policy on the margin. He repeated multiple times that it was a major policy failure that cost Japan trillions of dollars in real value, and he even went on to repeat that claim in the context of Europe. In my opinion that’s a major difference. I think he’s simply wrong about the cost of the policy, even if he was right about which blogger has the correct theory.
It would still just be illustrative, not strong evidence. And as something that is merely illustrative, if it turned out to be wrong it wouldn’t be reason to change one’s mind about the original claim.
It matters what examples you use to illustrate an argument. Presumably Eliezer tried to remember times in his life when he was able to know better than a bunch of experts, and this example came to mind as especially salient (hence its use as his first example). The fact that he turned out to be mistaken about the example provides significant evidence about how often he’s able to know better than experts about their domain of expertise.
Hm, I think I’m still confused about what thesis you’re pointing to and where, if anywhere, you and I disagree. I think we agree:
1. That you should be more hesitant to disagree in places where the incentives are strong for others to get things right (like stocks).
2. That you should be more hesitant to disagree with people who seem smart.
3. That you should be more hesitant to disagree about topics you are less knowledgeable about.
4. That you should be more hesitant to disagree about topics that are complex.
5. That the above is not an exhaustive list of things to consider when thinking about how hesitant you should be to disagree. There’s a lot more to it.
I think Eliezer, as well as most reasonable people, would agree with the above. The difficulty comes when you start considering a specific example and getting concrete. How smart do we think the people who run the Bank of Japan are? Are they incentivized to do what is best for the country, or are other incentives driving their policy? How complex is the topic?
For the Bank of Japan example, it sounds like you (as well as most others) think that Eliezer was too confident. Personally I am pretty agnostic about that point and don’t have much of an opinion.
I just don’t see why it matters. Regardless of whether Eliezer himself is prone to being overconfident, points 1 through 5 still stand, right? Or do you think Eliezer’s thesis, or some part of it, goes beyond them?
Why is blogging being sold as something superior to boring old books and lectures? Blogs have the advantage of speed, right enough, but the BoJ thing has dragged on for decades. And, as you say, if you try to read blogs without having the boring groundwork in place, you might not even understand them.
I don’t think that it is. In this particular case Eliezer was saying he was swayed by some bloggers but not that blogging is superior in general.
It’s not clear that he was correctly swayed, and it’s not clear that his favourite bloggers are reliable across the board.
Even if he had been correctly swayed, it’s still not clear that his favourite bloggers are reliable across the board.
The idea that there is a reliably correct contrarianism remains unproven.
I’m not buying the simple distinction between correct and incorrect. The price of a stock is just an aggregate of everyone else’s opinion. If that aggregate opinion was always correct, market crashes wouldn’t occur.