Breaking my usual lurking habit to explain my downvote. I travel around a lot and compete in various debating competitions, so this topic is close to my heart. I read this as an attempt to raise the epistemic water level.
That intent is acknowledged, but I still find that this post veers wildly off-topic about halfway through and extraneously bashes Ramaswamy in a way I’m not sure is constructive.
The second point harps on something valid that also irks me, but I think Scott beat you to the punch. Even granting that, though, I don’t think any of these things as given are particularly potent defences against the dark arts as presented—either in debates or in life. I think unwillingness, apathy, or lack of capacity is a much bigger barrier to further academic reading than any failure to recognise that subject matter experts are more accurate than random YouTube punters.
I wrote a lot more here, but I’m deleting it to instead say that this post lacks focus and depth—I think it is simultaneously too shallow in the advice it gives (read primary sources, be educated, don’t fall prey to the Dunning–Kruger effect) and too specific and mindkilling in the examples it chooses (a long explanation of why Ramaswamy is Super Wrong about this one thing he said) to be pedagogic.
Thanks so much for your feedback!
Hm… right. I did get feedback warning that the Ramaswamy example was quite distracting (my beta reader recommended flat-eartherism or anti-vaxxing instead). In hindsight one of those may have been a better choice, but I’m not too familiar with geology or medicine, so I didn’t think I could do the proper rebuttal justice. The example was meant to show how proper understanding of a subject can act as a very strong rebuttal against intuitive bullshit, but I think I may not have succeeded in making that point. This was a case of the sunk cost fallacy at work: I had already written a good part of it and opted not to get rid of it.
Oh? I hadn’t seen this article before; thank you for linking it.
Even granting that, though, I don’t think any of these things as given are particularly potent defences against the dark arts as presented—either in debates or in life.
Persuasive bullshit stands because it is intuitive. Explanation can help preserve that intuitiveness in the face of conflicting arguments. Effective persuasive bullshit is that which requires more work to be rendered unintuitive than to be said.
I think an effective defence against the dark arts in debates, when facing them in an opponent, is learning how to bring things back to the null hypothesis of uncertainty as efficiently as possible.
I think effective defence against the dark arts in life in general, in my experience, usually comes down to recognising motivated beliefs and the rhetorical frames attached to them that are used to privilege a hypothesis. What distinguishes the dark arts from just saying things is the element of deceit.
Hm… not sure I understand what you mean. Would you mind illustrating with a few examples of more ‘potent defences’, as you see them? I’m always open to having more tools in my toolbox. The methods I presented in my post are just some heuristics that work for me, not an exhaustive list, so I’d be grateful if you could share a few.
I did get feedback warning that the Ramaswamy example was quite distracting (my beta reader recommended flat-eartherism or anti-vaxxing instead). In hindsight one of those may have been a better choice, but I’m not too familiar with geology or medicine, so I didn’t think I could do the proper rebuttal justice.
My response to your Ramaswamy example was to skip ahead without reading it to see if you would conclude with “My counterarguments were bullshit, did you catch it?”.
After going back and skimming a bit, it’s still not clear to me that they’re not.
The uninformed judge cannot tell him from someone with a genuine understanding of geopolitics.
The thing is, this applies to you as well. Looking at this bit, for example:
What about Ukraine? Ukrainians have died in the hundreds of thousands to defend their country. Civil society has mobilized for a total war. Zelensky retains overwhelming popular support, and by and large the populace is committed to a long war.
Is this the picture of a people about to give up? I think not.
This sure sounds like something a bullshit debater would say. Hundreds of thousands of people dying doesn’t really mean a country isn’t about to give up. Maybe it’s the reason they are about to give up; there’s always a line, and who’s to say it isn’t in the hundreds of thousands? Zelensky having popular support does seem to support your point, and I could go check primary sources on that, but even if I did your point about “selecting the right facts and omitting others” still stands, and there’s no easy way to find out if you’re full of shit here or not.
So it’s kinda weird to see it presented as if we’re supposed to take your arguments at face value… in a piece purportedly teaching us to defend against the dark art of bullshit. It’s not clear to me how this section helps even if we do take it at face value. Okay, so Ramaswamy said something you disagree with, and you might even be right and maybe his thoughts don’t hold up to scrutiny? But even if so, that doesn’t mean he’s “using dark arts” rather than just not thinking things through well enough to get to the right answer, and I don’t see what that teaches us about how to avoid BS besides “Don’t trust Ramaswamy”.
To be clear, this isn’t at all “your post sucks, feel bad”. It’s partly genuine curiosity about where you were trying to go with that part, and mostly that you seem to genuinely appreciate feedback.
My own answer to “how to defend against bullshit” is to notice when I don’t know enough on the object level to be able to know for sure when arguments are misleading, and in those cases refrain from pretending that I know more than I do. In order to determine who to take how seriously, I track how much people are able to engage with other worldviews, and which worldviews hold up and don’t require avoidance techniques in order to preserve the worldview.
On phone, don’t know how to format block quotes but:
My response to your Ramaswamy example was to skip ahead without reading it to see if you would conclude with “My counterarguments were bullshit, did you catch it?”.
This was exactly what I did, such a missed opportunity!!
I also agree with other things you said, and to contribute a useful phrase, your response to BS:
“…is to notice when I don’t know enough on the object level to be able to know for sure when arguments are misleading, and in those cases refrain from pretending that I know more than I do. In order to determine who to take how seriously, I track how much people are able to engage with other worldviews, and which worldviews hold up and don’t require avoidance techniques in order to preserve the worldview.”
Sounds a bit like Epistemic Learned Helplessness by Scott:
https://slatestarcodex.com/2019/06/03/repost-epistemic-learned-helplessness/
Which I think is good when you are not in a live debate—saying “I dunno, maybe” and then later spending time thinking about it and researching it to see if the argument is true or not, meanwhile not updating.
The difference between what I strive for (and would advocate) and “epistemic learned helplessness” is that it’s not helpless. I do trust myself to figure out the answers to these kinds of things when I need to—or at least, to be able to come to a perspective that is worth contending with.
The solution I’m pointing at is simply humility. If you pretend that you know things you don’t know, you’re setting yourself up for failure. If you don’t wanna say “I dunno, maybe” and can’t say “Definitely not, and here’s why” (or “That’s irrelevant and here’s why” or “Probably not, and here’s why I suspect this despite not having dived into the details”), then you were committing arrogance by getting into a “debate” in the first place.
Easier said than done, of course.
Very nice! Now… here’s the catch. Some of my arguments relied on dark arts techniques. Others very much didn’t. I can support a generally valid claim with an invalid or weak argument, and I can do the same with an obviously invalid claim. Can you tell me what specifically I did? No status points for partially correct answers!
Now, regarding learned helplessness: yes, it’s similar, though I’d add an important caveat. I consider discerning reliable sources and trusting them to be a rational decision, so I wouldn’t go as far as calling the whole ordeal of finding out what is true a lost cause. But in general I’m taking a similar position to Scott’s.
edit: oops, my bad, this was meant to be a response to the comment above; I saw it pop up in the message feed without context
Finding reliable sources is 99% of the battle, and I have yet to find one which would for sure pass the “too good to check” test: https://www.astralcodexten.com/p/too-good-to-check-a-play-in-three
Some people on this website manage that for some topics, the ACOUP blog does it for history, etc., but it’s really rare, and mostly you end up with “listen to Radio Liberty and Pravda and figure out the truth if you can.”
On the style side, I agree with other commenters that you have selected an example where, even after all the reading, I am still not convinced your criticism is correct under every possible frame. Picking something like a politician talking up the good they have done despite actually being corrupt, or something else much narrower in focus and more black-and-white, would have left you much less surface to defend. Here it took a lot of text, and I am unsure what techniques I have learned, since your criticisms require yet more effort to check for validity. You explained that the sunk cost fallacy pushed you towards this example, but it’s still not too late to add a different one: put this one into a Google Doc, make it optional reading, and note your edit. People may read this in the future, and there’s no reason not to ease the concept for them!
Completely fair. Maybe I should share a few, then?
I find Money & Macro (an economics YouTuber with a Ph.D. in the field) to be a highly reliable source capable of informed and nuanced reporting. Here is, for instance, his take on the Argentine dollarization plan, which I found much more comprehensive than most media sources:
Argentina’s Radical Plan to End Inflation, Explained—YouTube
In terms of Ukraine reporting, I rely pretty heavily on Perun, who likewise provides very informative takes with a strong emphasis on research and prevailing defense theories:
All Bling, no Basics—Why Ukraine has embarrassed the Russian Military (youtube.com)
See here, for instance, his initial reaction to the invasion and his predictions of many of the war’s eventual dynamics (acute manpower shortages on the Russian side, the effects of graft and corruption, a close match of capabilities, and a tendency to devolve into a longer war).
I consider these sources highly reliable, based on their ability to make concrete, verifiable predictions, steer clear of political biases, and provide coherent worldview models. Would you like to check them out and share your thoughts?
You explained that the sunk cost fallacy pushed you towards this example, but it’s still not too late to add a different one: put this one into a Google Doc, make it optional reading, and note your edit. People may read this in the future, and there’s no reason not to ease the concept for them!
Maybe a good idea. It depends on whether I can muster the energy for a separate edit, and on whether I can find a good, relevant example. Do you have any suggestions in that regard? I know that unless I stumble across something very good I’m unlikely to make an edit.
Right, about this. The overall point of the Ramaswamy example was to illustrate how subject-specific knowledge is helpful in formulating a rebuttal and in distinguishing between bullshit and non-bullshit claims.
See, for example, this comment:
This sure sounds like something a bullshit debater would say. Hundreds of thousands of people dying doesn’t really mean a country isn’t about to give up. Maybe it’s the reason they are about to give up; there’s always a line, and who’s to say it isn’t in the hundreds of thousands? Zelensky having popular support does seem to support your point, and I could go check primary sources on that, but even if I did your point about “selecting the right facts and omitting others” still stands, and there’s no easy way to find out if you’re full of shit here or not.
Yes, that’s the whole point. I didn’t think it was a problem before, but now… well...
I think I’m starting to realize the dilemma I’m in. I aimed to explain something in fully object-level terms so I could properly show why subject matter knowledge helps discern between a true and a false claim… but actually discerning what’s true and what’s false requires subject matter knowledge I can’t properly distill in the span of a few thousand words. Catch-22, oops.
I could bring out the factual evidence and analyze it if you like, but I don’t think that was your intention. In any case, feedback appreciated! Yes, this was definitely an issue; I’ll take more care with future examples.
I think “subject-specific knowledge is helpful in distinguishing between bullshit and non-bullshit claims” is pretty clear on its own, and if you want to add an example it’d be sufficient to do something simple and vague like “If someone cites scientific studies you haven’t had time to read, it can sound like they’ve actually done their research. Except sometimes, when you do read the study, you’ll find it doesn’t actually support their claim”.
“How to formulate a rebuttal” sounds like a very different thing, depending on what your social goals are with the rebuttal.
I think I’m starting to realize the dilemma I’m in.
Yeah, you’re kinda stuck between “That’s too obvious of a problem for me to fall into!” and “I don’t see a problem here! I don’t believe you!”. I’d personally err on the side of the obvious, while highlighting why the examples I’m picking are so obvious.
I could bring out the factual evidence and analyze it if you like, but I don’t think that was your intention
Yeah, I think that’d require a pretty big conversation and I already agree with the point you’re trying to use it to make.
I think most of the best posts on this website about the dark arts are deep analyses of one particular rhetorical trick and the effect it has on a discussion. For example, Setting the Zero Point and The noncentral fallacy—the worst argument in the world? both discuss forms of hypothesis privilege that rely on unstated premises. I think reading these genuinely made me better at recognising and responding to the Dark Arts in the real world. Frame Control and its response, Tabooing “Frame Control”, are also excellent reads in my opinion.
Hm… right. I think your critiques are pretty on point in that regard. I may have diluted the focus too much and sacrificed insight for a broad overview. Focusing on a more specific technique is probably better.
I have a few ideas in mind, but I thought I’d get your opinion first. Do you think there’s any part of this post that warrants more detailed explanation/exploration with greater focus?