There are a few dark arts strategies that you can use, depending on the anonymity and epistemic hygiene of the forum.
Strategy 1: If someone asks if you are an expert, say you are. Assuming you are talking about a subject you actually know about, there is undoubtedly an expert, and probably a bunch of experts, who agree with you. I happen to think you are wrong about printing money leading to inflation, and I have some expertise in economics, but I also know that there are definitely people considered experts who agree with you (not a large proportion of Ph.D. economists agree with your statement, but there are some, and many more self-styled economics experts working at various financial institutions).
I think this can be in service of good argumentation so long as you actually present an argument rather than simply relying on your claimed credentials to win the day. It can force your interlocutor to actually engage with your views, which keeps them from making a fallacious inference. Use this with caution, obviously: you should only claim credentials that some actual expert would endorse. I also don’t recommend claiming excessive credentials (“yes, I am a climate scientist from MIT and I was on the IPCC”), because that gives your argument undeserved weight. The goal should be to clear their fallacious hurdle and get them to engage with your beliefs on their own merits.
Strategy 2: Say you agree with the hive mind on a related, yet different topic. I admit that I am guilty of this all the time on Reddit and Hacker News, and I am more than willing to use it preemptively as part of my initial post on a topic (generally as a reply to someone else’s point). People will generally not carefully consider arguments that they view as coming from someone from the other team, as we all know so well. But they will possibly consider arguments about tangential topics which they see as coming from within their own team.
My thoughts on this strategy are threefold. First, you can’t argue every point in every argument, or else you will spend your entire life arguing. Choose what tiny thing you want to convince someone of, and feel free to let the world change the rest of their opinions in future arguments. It’s better to try to get someone to take a baby step and hopefully later realize that all of their opinions are built on a foundation of silly putty; even that much is usually impossible. Second, in an ideal world your arguments on a particular topic shouldn’t be evaluated based on your opinions on other topics, assuming you provide enough argumentation to screen off that information (which is a minimal amount of information to provide in any case). Third, often you don’t even really have to agree; it can be enough to utter the right shibboleth to signal group membership, and that shibboleth doesn’t even need to point to a particular coherent belief to be at least a little bit disarming.
Depending on the anonymity of the forum, however, I would adjust how much preemptive agreement I signal. If I’m using my real name, I would only profess agreement with things I actually agree with that are irrelevant to the dispute, or that I kinda sorta agree with in their most uncontroversial form. This strategy doesn’t actually work in some extremely factious communities, though, because people use it too frequently. If you tell a Tumblr SJW, “I’m totally a feminist, but that particular statistic you are using is actually made up,” it’s just as likely that their memetic antibodies will flag you as an anti-feminist terrorist as that they’ll actually consider your evidence. Of course, if these strategies didn’t lead to broken communities, they wouldn’t be called “dark arts.”
Strategy 3: Provide a wall of text with links. On Reddit you will sometimes see posts that get bestof’d with a title like “/u/PollyDoodlePants OBLITERATES climate denier with tons of FACTS” that are actually just posts with minimal argumentation but 30 links to standard arguments from people on their side, ideally without subtlety or nuance. In this form they are more likely to earn accolades from people who already agree than to actually change the mind of the debate partner or anyone in the audience. I’ve never actually done this and consider it kind of pointless and annoying, so I don’t recommend it, but if it’s just your emotional state you’re concerned about, it will probably make you feel better.
One thing I think you should consider, in general, is what Robin Hanson calls pulling policy ropes sideways. Argumentation is more fun and more enlightening for counterparties if you have a bunch of opinions you care about that aren’t part of the established lines of war. Look for places where your views offer a relevant third option, or at least something for other readers to consider.
If the argument you’re making has already been made in the New York Times, it’s probably too late. You probably aren’t bringing anything to the table that hasn’t been argued at greater length and more effectively elsewhere. If you feel passionate about something that others don’t care about (yet), your comments bring greater diversity of thought and are the true gems of Internet communities. Part of the fascination of Less Wrong is in these kinds of ideas: AI risk, cryonics, decision theory. They aren’t part of the culture war, so when people like us first read the site, we get a blast of new ideas that aren’t discussed elsewhere. This isn’t a dark art; this is the daylight art that makes communities better, stronger, and more interesting. Just, please, for the love of God, don’t try to shoehorn your pet belief into every conversation you have (this is the most common failure mode of this strategy, and what we sometimes see on Less Wrong). And of course, you may think you’re pulling the rope sideways, but sometimes the battle lines actually have been drawn, even if just in your little community (neoreaction on LW), and then this no longer works.
After reading some of the articles suggested here and learning about the dark arts, the light arts, the goal of seeking truth rather than status, debate strategies, and more, I see that this is a whole world I didn’t know existed, at least not at this depth.
It is frustrating to realize that I have spent my life up to now in the dark, but it is obviously extremely rewarding to discover a space where honesty and truth are the priority rather than personal advantage and the abuse of others.
Strategy 1: If someone asks if you are an expert, say you are.
I can do this since it doesn’t require formal credentials (I am a dropout); through dedication and work I have gained enough depth in certain topics to qualify as an “expert”.
Strategy 2: Say you agree with the hive mind on a related, yet different topic.
Strategy 3: Provide a wall of text with links.
These are useful when the counterpart really expects authority from you. I think many “discreditors” on the internet just have the objective of destroying you, no matter what you bring.
Moral: Participate in communities and try to identify the members who are really seeking answers and truth!
The likely reputational consequences don’t look too good.
Such a wall of text is unlikely to help you in making friends and influencing people, not to mention again the likely reputational consequences.
Thanks for your comment and for taking the time!