[...] creates an expectation that more context about the referenced authority figure would help us better assess the truth-content of the claim.
[...] creates an expectation that further context would make the quote feel just as stupid as it seemed before.
I want to add a #3:
[...] contains enough information that you already know the quote is stupid
You implied Yang’s quote is #2, I would say it’s #1. But (more importantly?) I would draw the bigger distinction between #2 and #3, and I don’t see any reason to choose something in #2 as an example of irrationality when examples in #3 are available. This, I think, is still my main argument.
I agree with everything else in your post; in particular, with the importance of context which I hadn’t considered. I concede that my behavior isn’t analogous to that of Yang in this example.
Glad we are converging on some sort of agreement! I find this helpful to talk out with you.
I’m not clear on the distinction between #2 and #3. What’s the difference between predicting the quote will still seem stupid after further research, and finding the quote to be stupid now? By conservation of expected evidence, aren’t they the same thing?
One difference is just uncertainty. Conservation of expected evidence only makes the two equivalent if you’re certain. I think I was assuming that your probability on Yang being unable to expand on his claim meaningfully is like ~67%, not like ~98%.
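The arithmetic behind this can be sketched with made-up numbers (every probability below is an illustrative assumption for the sake of the example, not anything from the actual exchange):

```python
# Illustrative numbers only; none of these probabilities come from the
# actual discussion. H = "the quote is vapid", E = "further research
# turns up nothing behind it".
p_E = 0.67              # assumed chance research finds no substance
p_H_given_E = 0.95      # if nothing turns up, almost certainly vapid
p_H_given_not_E = 0.10  # if real substance turns up, probably not vapid

# Conservation of expected evidence: today's credence must equal the
# probability-weighted average of the possible post-research credences.
prior = p_E * p_H_given_E + (1 - p_E) * p_H_given_not_E
# 0.67 * 0.95 + 0.33 * 0.10 = 0.6695

# At ~67% the two framings come apart in practice: you'd bet the quote
# is vapid, but a 1-in-3 chance remains that research forces a sharp
# upward update. At ~98% that escape hatch is nearly gone.
prior_confident = 0.98 * p_H_given_E + 0.02 * p_H_given_not_E
# 0.98 * 0.95 + 0.02 * 0.10 = 0.933
```

So the point estimates match, but the ~67% version leaves real room for the quote to turn out fine, which is the difference between #2 and #3 here.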
The other difference is that you can’t demonstrate it. The leading question wasn’t about whether one should update from the quote, it was whether it’s a good idea to choose the quote as a negative example in a book about rationality. Even if Steven Pinker were 100% sure that there is no reason to update here, it’s still not a good example if he can’t prove it. I mean, if you accept that the quote could plausibly be non-stupid, then the honest way to talk about it would have been: “Here is Yang saying {quote}. If it turns out that ‘some of the smartest people in the world’ is made up or refers to people who aren’t actually impressive, this will have been an example of a vapid appeal-to-authority. I can’t prove that this is so, but here are my reasons for suspecting it.” And again, I can’t imagine it’s hard to find examples of appeals-to-authority that are clear-cut fallacies.
What I suspect happened is that he talked to various big names in tech, including the CEOs of companies who make decisions about automation, and they were bullish on the timelines. Would that kind of scenario qualify as being non-stupid?
No. That’s a good point. It seems like “fallacious argument from authority” lends itself to black-and-white thinking that’s just not appropriate in many cases. Reading the tea leaves has its value. If I had to guess, Pinker was looking for a timely quote by a politician his readers are likely to be sympathetic to, and this one was convenient.
I still think that there are many times when it’s best as a rule to just dismiss statements with the form of “arguments from authority.” This fits the criteria, and it might be that sometimes you throw out the baby with the bathwater this way. Then again, there could be equal value in becoming sensitive to when it’s appropriate to “tune in” to this sort of evidence. That probably depends on the individual and their goals.