Conspiracy Theories as Agency Fictions
Related to: Consider Conspiracies, What causes people to believe in conspiracy theories?
Here I consider in some detail a failure mode that classical rationality often recognizes. Unfortunately, nearly all the heuristics normally used to detect it seem remarkably vulnerable to misfiring or to being exploited by others. I advocate an approach where we do our best to account for the key bias, seeing agency where there is none, while minimizing the risk of being tricked into dismissing claims because of boo lights.
What does calling something a “conspiracy theory” tell us?
What is a conspiracy theory? Explanations that invoke plots orchestrated by covert groups are easily labelled or thought of as such. In the more legal sense, a conspiracy is an agreement between persons to mislead or defraud others. This simple story gets complicated because people aren’t very clear on what they consider a conspiracy.
To give an example, is explicit negotiation or agreement really necessary to call something a conspiracy? Does silent cooperation in a Prisoner’s Dilemma count? What if the players are deceiving themselves into thinking they are really pursuing a different goal, and the resulting cooperation is just a side effect? How could we tell the difference, and would it matter? The latter question is especially interesting if one applies the anthropic principle to social attitudes and norms.
The phrase is also a convenient tool for marking an opponent’s tale as low status and unworthy of further investigation: a boo light easily applied to anything that involves people acting in what can be framed as self-interest and that happens to be a few inferential jumps away from the audience. Not only is this use well known, it is arguably the primary meaning of calling an argument a conspiracy theory.
We have plenty of historical examples of high-stakes conspiracies, so we know they can be the right answer. Noting this, and putting aside the misuse of the label, people do engage in crafting conspiracy theories where they just aren’t needed. Entire communities can fixate on them or fail to call such bad thinking out. Why does this happen? Humans being the social animals that we are, the group dynamics at work probably need an article or sequence of their own. It should suffice for now to point to belief as attire, the bandwagon effect and Robin Hanson’s take on status. Let us rather consider the question of why individuals may be biased towards such explanations. Why do they privilege the hypothesis?
When do they seem more likely than they are?
First off, we have a hard time understanding that coordination is hard. Seeing a large payoff available and thinking it easily within reach if “we could just get along” seems like a classic failing. Our pro-social sentiments lead us to downplay such barriers in our future plans. Motivated cognition when assessing the threat potential of perceived enemies or strangers likely shares this problem. Even if we avoid this, we may still be lost, since the second big relevant factor is our tendency to anthropomorphize things that had better not be anthropomorphized. Ours is a paranoid brain, seeing agency in every shadow or strange sound. The cost of a false positive was once reasonably low, while the cost of a false negative was very high.
Our minds are also just plain lazy. We are pretty good at modelling other human minds, and considering just how hard the task really is, we do a remarkable job of it. If you are stuck in relative ignorance on a subject, say the weather, dancing to appease the sky spirits makes sense. After all, the weather is pretty capricious, and angry sky spirits are a model that makes as much or more sense than any other model you know. Unlike some other models, this one is at least cheap to run on your brain! The modern world is remarkably complex. Do we see ghosts in it?
Our Dunbarian minds probably just plain can’t grasp how a society can be that complex and unpredictable without it being “planned” by a cabal of Satan or Heterosexual White Males or the Illuminati (but I repeat myself twice) scheming to make weird things happen in our oblivious small stone age tribe. Learning about useful models helps people stop anthropomorphizing human society, the economy or government. The latter is particularly salient. I think most people slip up occasionally in assuming that something like, say, the United States government can be successfully modelled as a single agent to explain most of its “actions”. To make matters worse, this is a common literary device used by pundits.
A mysterious malignant agency, or someone keeping a secret, playing the role of the villain makes a good story. Humans love stories. It’s fun to think in stories. Any real conspiracy revealed will probably be widely publicized. Peter Knight in his 2003 book cites historians who have put forward the idea that the United States is something of a home for popular conspiracy theories because so many high-level ones have been undertaken and uncovered since the 1960s. We are more likely to hear about real confirmed conspiracies today than ever before.
Wishful thinking also plays a role. A universe where bad things happen because bad people make them happen is appealing. Getting rid of bad people, even very bad people, is easy compared to all the different things one has to do to make sure bad things don’t happen in a universe that doesn’t care about us and where really bad things are allowed to happen. Finding bad people, whether or not they exist, is a problematic tendency. The sad thing is that this may also be how we often manage to coordinate. Do all theories of legitimacy also perhaps rest on the same cognitive failings that conspiracy theories do? The difference between a shadowy cabal we need to get rid of and an institution worthy of respect may be just some bad luck.
How this misleads us
Putting aside such wild speculation, what should we take away from this? When do conspiracy theories seem more likely than they are?
The phenomenon is unpredictable or can’t be modelled very well
Models used by others are hard to understand or are very counter-intuitive
Thinking about the subject significantly strains cognitive resources
The theory explains why bad things happen or why something went wrong
The theory requires coordination
When you see these features you probably find the theory more plausible than it is.
But how many here are likely to accept “conspiracy theories”? Accepting something that actually gets called a conspiracy theory doesn’t fit our tribal attire. Reverse stupidity may be particularly problematic for us on this topic. Being open to considering conspiracy as an explanation is recommended. Just remember to compare how probable it is in relation to other explanations. It is also important to call out people who misuse the tag for rhetorical gain.
This applies to debunking as well. Don’t go wildly contrarian. But remember that even things that are tagged conspiracy theories are surprisingly popular. How popular might false theories that avoid that tag be? History shows us we don’t have the luxury of hoping that kind of thing just doesn’t happen in human societies. When assessing an explanation sharing the key features that make conspiracy theories seem more plausible than they are, compensate as you would with a conspiracy theory.
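To make “compare how probable it is in relation to other explanations” concrete, here is a minimal sketch of a posterior-odds calculation in Python. The two hypotheses and all the numbers are invented purely for illustration, not taken from any real case.

# Toy comparison of a conspiracy explanation against a mundane one.
# All priors and likelihoods below are made-up illustrative numbers.
prior_conspiracy = 0.01   # conspiracies of this scope are rare
prior_mundane = 0.99

p_evidence_given_conspiracy = 0.9   # the theory "explains everything"
p_evidence_given_mundane = 0.3      # the boring story fits less neatly

posterior_odds = (prior_conspiracy * p_evidence_given_conspiracy) / (
    prior_mundane * p_evidence_given_mundane)

print(posterior_odds)  # ~0.03: still roughly 33:1 against the conspiracy,
                       # because the low prior dominates the better "fit"

The point is only that a conspiracy explanation has to be weighed against the mundane alternative, rather than judged in isolation by how neatly it fits the evidence.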
But don’t listen to me, I’m talking conspiracy theories.
Note: This article started out as a public draft, feedback to other such drafts is always welcomed. Special thanks to user Villiam_Bur for his commentary and user copt for proofreading and suggestions. Also thanks to the LessWrong IRC chatroom for last minute corrections and stylistic tips.
For anyone interested in conspiracy theories, Umberto Eco’s Foucault’s Pendulum is required reading. As the TvTropes article on the book says:
The plot is basically that the narrator and a couple of friends, bored, skeptical intellectuals who work in a publishing company where they deal daily with crackpot conspiracy theorists, decide one day to invent just for kicks the ultimate conspiracy theory, a Plan that explains the whole history of the world. Not spoiling much when I say it doesn’t end well.
Near the end there are some nice reflections on what lies behind the impulse to invent such theories. An excerpt:
I disagree that fictional evidence is a good source of information on a controversial topic. Yes, psychology is always a controversial topic and fiction is actually a good source of information about it. But fiction is generally more reliable on topics it is not explicitly addressing.
Note that Italy is the country in which the Freemasons really did have a plan to take control over the press and then the government.* The country whose future prime minister claimed to have learned from a Ouija board the location of the kidnapped former PM. The publication of the book is closer in time to these events than to the present day.
* Large groups of Italians can’t keep secrets for decades, but secrecy turned out not to be necessary.
Another excerpt concerns the Kabbalic interpretation of automobiles; I find it hysterical.
Funny, indeed, but I had to fish your website out of the Google cache to read it!
I think the problem is on your end; I can access it fine via Firefox or elinks, neither Pingdom nor Cloudflare has sent me any reports, downforeveryoneorjustme says it’s up, and my Google Analytics is reporting the usual amount of traffic to my domain.
I second this recommendation.
As an interesting (to me, at least) aside, Gene Sharp’s research on nonviolent resistance indicates that successful nonviolent resistance invariably involves taking to heart this little idea—that governments are not single agents but systems of many agents pursuing their own ends—and exploiting it to the max.
And of course this doesn’t just apply to governments, but to any organisation. Corporations, religious organisations, and even charities are all (to varying extents) vulnerable to this approach.
Sure. Any time you hear a claim like “The Catholic Church believes this …” or “Microsoft wants thus-and-so …”, you’re hearing someone anthropomorphize an organization.
This needs a lot more work before it’s a great essay. The way I would write it is basically ‘a classic evolutionary explanation of religion is overactive false-positive agent-detection; here’s quotes from several books by the likes of Daniel Dennett etc and also here’s a few studies courtesy of gwern about kids; now that we understand the idea and find it plausible, let me extend it to… conspiracy theories! and so on.’
Aha! So you clearly see the potential for greatness. ;)
I will try to improve it.
Excellent observation, I didn’t think of the obvious parallels. I think someone just considering religion could easily stumble upon these conclusions, but that wasn’t the road I travelled. I spent a lot of time comparing various different conspiracy theories and researching the psychology behind them.
I see rather a lot of typos and incomplete sentences.
This jumped out at me. There were several others.
This is filled with incomplete sentences.
Overall, I liked the ideas here, but the writing made them hard to follow. I’m also troubled by the lack of examples (see Douglas_Knight’s comment below).
Based on feedback I’ve changed the above paragraph into:
“Putting aside such wild speculation, what should we take away from this? When do conspiracy theories seem more likely than they are?
The phenomena is unpredictable or can’t be modelled very well
Models used by others are hard to understand or are very counter-intuitive
Thinking about it significantly strains cognitive resources
Explains why bad things happen or why something went wrong
Requires coordination
When you see these features you probably find the theory more plausible than it is. ”
Is this an improvement?
Somewhat.
First bullet: join the two phrases with either “and” or “or”. Also, you seem to have at least two (possibly three) antecedents for “it” in those bullets. I suspect removing all four instances would be clearer.
Great suggestions, thank you. I will try to avoid such mistakes in future writing. I’m just wondering however, how I can get rid of it in this sentence:
“Thinking about it significantly strains cognitive resources”
Don’t remove the sentence; replace “it” with its antecedent. In other words, answer the question “thinking about what?”. Thinking about the conspiracy theory? The actual sequence of events that happened? Or the non-conspiracy explanation for those events? That’s what I meant for all four bullet points.
As a general rule, “it” is fine when the intended antecedent is in the same sentence, and there is only one such antecedent for all instances of “it” in a single sentence. Multiple distinct instances in one sentence, or an unambiguous antecedent earlier in the same paragraph, can often be fine, but should be scrutinized more closely. Antecedents that don’t appear in the same paragraph are generally a bad idea. (As always, there are exceptions and details. But that’s a good starting point.)
Thank you very much for your patience; thinking about language really isn’t my thing. I think the OP is now much better due to your advice.
Please PM me so I can fix them! I’ve been very grateful to the proofreaders so far. :)
By the time I read your comment this sentence was already complete. Are you sure you didn’t misread it? I did several corrections and minor edits since posting the original article so maybe I fixed it and forgot about it.
That paragraph was originally one long sentence; after gwern’s comment I broke it up to make it more readable. Would a list be better?
This was intended as a feature rather than a bug. But if many people are bothered by this, maybe I should make a follow-up post that analyses several examples to see where they do or don’t conform to the features described here.
“Our” is incorrect here. I think you mean “ours”.
Ah! Fixed.
I think starting with religion would be a mistake, for standard ‘politics is the mindkiller’ reasons.
Examples (or links) would be good here, and would calibrate your point.
A very good point! Thank you for the feedback.
I will build up a list in this comment and then link to it from the original article. To keep the list from bloating, I’m going to stick to well known examples:
Hundreds if not thousands of attempted and successful plots to assassinate political leaders from Julius Caesar to King James I.
Many coups d’état
Project MKULTRA
The COINTELPRO operations
Communist infiltration of the United States during WW2 and the Cold War, reaching as far as its most closely guarded secrets
Modern art as a Potemkin village of creativity
Operation Snow White carried out by the Church of Scientology
Nearly all operations of any intelligence agency ever.
What would really be useful would be examples of conspiracy theories that were accepted by a fringe group, rejected by the mainstream (for at least a decade, say), and ultimately found to be true. Maybe COINTELPRO would qualify for this—were some of the targeted groups complaining about the FBI targeting them?
An interesting edge case is the whole McCarthy stuff—he was right that there was quite a bit of communist spying, but he appeared to have no evidence whatsoever for this (and his specific accusations were mostly random). Does accidentally being correct count? Or is this more another case of “reverse stupidity isn’t intelligence”?
Speaking of COINTELPRO...
http://en.wikipedia.org/wiki/Citizens%27_Commission_to_Investigate_the_FBI
Basically, a fringe left-wing group had (mostly) done the job of American institutions when those failed, and revealed a conspiracy to the public. The “commies” were not (just) plotting against the U.S., but protecting it against its own government! That’s wilder than most conspiracy theories.
I’m pretty sure that McCarthy is an example of reverse stupidity. The Soviet Union had plenty of spies in America, just like we had spies in the Soviet Union, but my reading has mostly pointed me towards the hypothesis that McCarthy didn’t have any special knowledge of who the Soviet spies were.
McCarthy was not after spies as in “trained intelligence workers”; he was trying to ferret out a certain subset of his political opponents, whose goals he imagined to be aligned with those of the USSR. Nowadays, of course, most people consider his activities to be anti-American.
McCarthy was being fed info from J. Edgar Hoover, who did have access to the Venona transcripts. I don’t know if he was given the identities of known spies, but he was sent after Hoover’s bureaucratic rivals.
He was after communist agents of influence, not spies specifically, and his method was going after people visibly spreading communist memes.
That link is a glaring example of the intellectual decline of the American right wing. ESR has said so many dumb things! Radical/trend-setting Western intellectuals do not insist that the Western civilization is evil, held up by slave labour and must atone for its sins, etc. simply because they have some particular anti-Western agenda! This is an essential element of Western culture, I’d say—its self-abnegation, self-doubt, applying higher standards to itself, all that ostensibly “bleak”/”nihilistic”/”ultra-puritan” stuff. We have that relentless drive to fight a war with ourselves. What other culture can mourn and lament its flaws like ours? We even learned to, um, get off on it—in a way.
This is what has been present in it since Christianity’s inception: the radicalism, the urge to “immanentize the eschaton”, the denial of local boundaries and ties in favor of a global Logos which all would live under and by. A certain left-wing tendency is in our figurative (and maybe literal) blood.
McCarthy looks rather comical in this regard; he saw the tip of the iceberg, guessed that there must be more under the water, but overlooked the fact that R’lyeh itself is beneath and his grandparents were fish-people. (All of it for the better, in my honest opinion!)
True, however, Eric’s point was about why this particular element of Western civilization was elevated above all others over the past century.
That explanation is literally impossible. The memes I refer to are Clergy (or Brahmin as Moldbug calls them) memes first and foremost, and the Clergy is a decentralized, informal, horizontal network that operates totally in the open and has deep roots in the West but almost none in Russia.
See: The Open Conspiracy by H.G. Wells. See: Andrew Carnegie, the Carnegie Foundation and the Dodd Report. On the phenomena mentioned above, see: the Frankfurt School. (will add links later, too lazy)
Philosophy? Universalist. “Serious” literature? Universalist. Social sciences departments? Universalist. NGOs and advocacy organizations? Universalist. The Soviet intelligence agencies/KGB looked like rural thugs next to them. Those thugs were rightly seen as rabid and hostile by the Western intelligentsia from WW2 onwards, and it had no way of controlling them, but at no point did they exercise any serious intellectual influence of their own. The dog could hardly control the mutated tail, but the tail did not wag the dog.
I recommend adding the human radiation experiments to the list. That one is outstanding because it involved a very large number of personnel, and because of the utterly minuscule backlash when it was declassified. edit: this is a good starting point:
http://en.wikipedia.org/wiki/Human_radiation_experiments_in_the_United_States
It is basically a fact that a government can do anything like this, and you wouldn’t know. You can’t either dismiss or confirm the claims of conspiracies without relying on actual evidence; all attempts to do so are a fallacy of trying to come up with some solution when the correct solution is not available.
The funny thing (well, maybe not so funny) is that this can hold true for years even when clues are virtually sitting in plain sight, just waiting for someone to join the dots.
For example, of the experiments on that Wikipedia list, I’m most familiar with those Eugene Saenger & co-workers at the University of Cincinnati carried out: administering half-body & total-body irradiation to cancer patients between 1960 & 1971. Between those years Saenger et al. actually published 4 articles about what they did in scientific journals, the first of which appeared in Science! (The references are below to show I’m not making that up.)
The Science article mentions that its two subjects were “patients receiving therapeutic irradiation of the whole body for malignancies”, references a paper presented at a 1961 Conference on Total Body Irradiation at the university, and explicitly acknowledges support from “contract DA-49-146-XZ-029 of the Defense Atomic Support Agency”, a Department of Defense agency. However, this paper wouldn’t have raised red flags for contemporary readers. It implies (most likely falsely) that the irradiation was a normal treatment.
If a sufficiently knowledgeable & careful reader had read the next paper, however, they would’ve noticed something odd. That paper reported results from 7 patients, once again acknowledged support from a DASA grant, and referenced a 1963 technical report by Saenger for that DoD agency. The wrinkle is that all seven patients are listed as having either carcinoma or sarcoma, but (according to pages 66 & 73-75 of a later paper criticizing Saenger) existing research had already indicated that total body irradiation didn’t work adequately for solid tumours like carcinomas & sarcomas. This ought to have been a tip-off that the irradiation wasn’t really about treating patients. (Not to mention that neither the Science paper nor this newer paper discussed the therapeutic effect of the irradiation on the patients’ cancers. Instead they focused on finding potential biomarkers for acute radiation exposure.)
So the papers published about these particular experiments in 1963 & 1966 had enough information to show where they were taking place, who was running them, that they were (partly) funded by the DoD, that they involved irradiating cancer patients, and that this irradiation wasn’t for the subjects’ benefit. Yet the research wasn’t publicly exposed as dodgy until 1971, and then only because a reporter writing a book interviewed Saenger and Saenger’s tongue apparently proved too loose.
H. K. Berry, E. L. Saenger, H. Perry, B. I. Friedman, J. G. Kereiakes, Carolyn Scheel (October 1963). “Deoxycytidine in Urine of Humans after Whole-Body Irradiation”, Science, 142, 396-398.
A. J. Luzzio, B. I. Friedman, J. G. Kereiakes, E. L. Saenger (January 1966). “Specific proteins in serum of total-body irradiated humans”, The Journal of Immunology, 96, 64-67.
I-W. Chen, J. G. Kereiakes, B. I. Friedman, E. L. Saenger (August 1968). “Radiation-Induced Urinary Excretion of Deoxycytidine by Rats and Humans”, ″Radiology″, 91, 343-348.
L. A. Gottschalk, R. Kunkel, T. H. Wohl, E. L. Saenger, C. N. Winget (November 1969). “Total and Half Body Irradiation: Effect on Cognitive and Emotional Processes”, ″Archives of General Psychiatry″, 21, 574-575.
One thing that seems relevant here as a distinguishing factor which helps point out some but by no means all “conspiracy theories” is that often conspiracy theories as such are long term and overarching. So for example, the Bavarian Illuminati were founded in the 1700s. Thus, a conspiracy about the Illuminati will claim that they have been running things behind the scenes for a long time. That drastically reduces their chances. Moreover, while there are many such conspiracy theories, they often label the group behind the curtain differently. Heuristically, conspiracy theories satisfying such properties should be assigned a very low value.
This doesn’t really help though for quite a few conspiracy theories that are commonly ridiculed (e.g. Apollo hoax claims, and 9/11 Truther claims).
The Bavarian Illuminati are (rather ironically) an example of an actual political conspiracy whose beliefs would be pretty unremarkable today. They were liberal humanists; they believed in freedom of religion, reason, improving people’s morals by studying secular ethics, and republican government. Why were they secretive and conspiratorial? Because they were operating in 18th-century Bavaria, a conservative Catholic monarchy where religious dissent was illegal, the secret police investigated social groups to uncover political dissent, and republicanism would mean overthrowing the government.
It’s a silly counterfactual, but I can’t resist imagining that if Weishaupt lived today, he’d post on Less Wrong.
Some significant counterexamples to your heuristic are the criminal organizations with old historical roots, such as the Camorra or the Cosa Nostra. Their operations have been deeply conspiratorial and at the same time immensely influential, at least at the level of local politics, with institutional continuity of roughly the same vintage as that ascribed to the Illuminati.
That’s a really good point. But even then, no one has ever been in doubt about their continued existence, and they’ve never had control of whole continents or the like. But yes, the basic point is sound and does substantially undermine my statement.
A useful heuristic. I think this kind of “secret elite has been ruling since forever” theory is a product of far thinking.
I’d like you to explain how covertly supporting creative stuff, usually without the knowledge of its creators, can possibly make it less creative. What, do you think that because America’s cultural potential was trumpeted by this agency or that, it actually deceived the public about its merit or something? (For the record, I think that the CIA is a criminal conspiracy in most of its well-known aspects. And Hitler ate sugar!)
If that was not your intended meaning, please rephrase it.
Only the sources of money were hidden. The support itself was quite visible, see Konkvistador’s linked article. That attracts more wannabes, bringing the average quality down.
If the linked article is correct, the CIA did more than merely trumpet Abstract Expressionism. They arranged funding (e.g., for exhibitions) that would have otherwise not been present, which does indeed signal greater merit than was actually the case.
The apparent success of AE has been something of a mystery to me, but now I know part of the reason why it succeeded. TL;DR: “artistic merit” is signalling.
But, back then, there was a huge bias against all avant-garde/unconventional art present in the U.S.! Surely the CIA’s promotion effort could hardly outbalance the prevailing cultural attitudes of the time.
You’ve shifted the locus from “[the CIA] deceived the public about its merit...” to “Surely the CIA’s promotion effort could hardly outbalance the prevailing cultural attitudes of the time.”
There is definitely a thinking failure mode associated with conspiracy theories. The trouble is that lots of things have been rejected as “conspiracy theories” and turned out to be true: look at what the Leveson Inquiry is revealing about the Murdoch press’s association with politicians and police, for example. Or consider COINTELPRO.
I rather like this blog post on the subject alongside much of what that blogger has written.
Thank you for the interesting links! I’m well aware of this problem and tried to highlight it:
Perhaps I should have emphasised this point more.
I think I should have read more carefully before responding! Will re-read more carefully.
Could you substantiate the claim that those two examples were “rejected as ‘conspiracy theories’”?
Looking at the press association example, I think that one problem here is that similar ideas are being blurred, and given a single probability instead of separate ones.
A lot of the theories involving press/politician association involve conspiracy to conceal specific, high impact information from the public, or similar levels of dysfunction of the media. Most of these are low probability (I can’t think of any counterexamples offhand); as far as I know either no or a very small percentage of such theories have been demonstrated as true over time.
Different theories involving association have different probabilities. The Leveson Inquiry is providing reasonably strong evidence for influence and close social connections, so the proposition that that existed would seem to have been fairly accurate.
I don’t know what exactly you heard described as a conspiracy theory, in the fairly large space of possible theories, but it seems to me that that example is a good case where it is important to review the evidence for, and recognise fallacies (including overestimation of the probability of agency) in a specific theory, rather than decide what classification of theory it falls into, and judge it based on whether theories in that classification are generally “conspiracy theories”.
Bayes!
This needs examples.
When I think of “conspiracy theories” I think of ones connected to JFK’s assassination or 9/11. The official line is already that these “bad things happen because bad people cause them to happen.” In the case of 9/11, the official line is that it was a conspiracy theory—the disagreement is just about which bad people!
The lack of examples seems to me an extremely bad sign about the process you used to generate this essay.
I feared that if I used an example, it would be the only thing that people would remember. Or worse the only thing they would comment on.
When I think “conspiracy theories” I think the Illuminati are ruling the world and this is why my life sucks, or that chemtrails are sterilising us, or the notion that the Apollo Moon landing was faked, or that the US government is concealing information regarding extraterrestrial intelligent life, etc. etc. I generally think the “no! It was the other guy.” conspiracies are better explained by the group dynamics I didn’t want to tackle in this post.
If you are wondering about the process I used: well, I was procrastinating online and ended up reading a bunch of Wikipedia articles on conspiracy theories. I had some ideas about why people would find them appealing or plausible and I cross-checked that with some LW material. After that I set out to systematically read some more conspiracy theory summaries (again on Wikipedia). I did some thinking and a few days later jotted down my ideas in the linked comment. I then proceeded to uhm… acquire… a book on conspiracy theories. After reading it I mostly forgot about the subject until reading the notes from Peter Thiel’s class. They got me thinking about scapegoating. I wrote something up, edited it a bit, polished it and posted.
Then gwern told me it could be better and I started editing it to polish it up more.
This wasn’t serious academic research by any stretch of the imagination. Generally speaking posts generated by similar algorithms seemed well liked in discussion and comment section so I assumed it was ok if I approached this the same way I would writing a blog post.
Mental image: Some contrarian posting here claiming it was a navigational accident.
Oh, you mean airlines or aircraft manufacturers cultivating a culture of cutting costs on maintenance or QA, which leads to horrible accidents?
I heard an argument for spreading such a conspiracy theory: regardless of whether it is true, spreading such a theory helps to move making harder-to-crash planes and automatic safeguards up the priority list, above providing network access to passengers in-flight.
The problem with this argument is that there are costs to causing things to happen via spreading misinformation; you’re essentially biasing other people doing expected utility evaluations by providing inaccurate data to them. People drawing conclusions based on inaccurate data would have other effects; in this example, some people would avoid flying, suffering additional costs. People are also likely to continue to support the goals the conspiracy theory pushes towards past the point that they actually would have the greater expected utility without the conspiracy theory’s influence on probability estimates, causing bad decisions later.
It’s possible that after factoring all this in, it could be worthwhile in some cases. But given the costs involved I think, prior to any deeper study of the situation, it would be more likely harmful than beneficial in this specific example.
I actually agree with most of your argument and, probably, with a conclusion. I just wanted to show shades of gray omitted in the original post.
Actually, I can restate the argument I quoted to be technically true. Or I can restate it as a full-blown conspiracy theory. They can still be made quite close from the point of view of what actually happens, though. I think that the lightest reframings are net-positive perspective changes (but somewhat risky), by the way.
Scenario A. Aircraft manufacturers know full well what is needed to prevent most accidents—both the ones now classified as technical failures caused by bad maintenance and those claimed to be human error that are sometimes actually technical malfunctions (successfully covered up). They don’t implement many of the safety features known to them because of the cost, and sometimes deliberately omit cheap safety features to increase renewal of the aircraft fleet. They reduce robustness slowly over time in the hope that the public would get fed up with disasters and require “something to be done”. They already know how to implement the things that would be required by new statutes, but implementing state-mandated safety features will be a good excuse to increase prices a lot—with a big increase in profit margins.
Scenario B. Technically, the very ability of a plane to be turned onto a collision course with a well-known big non-moving object (be it a mountain, the WTC or anything else) is a failure of safety measures and navigation. It should be possible to deliver such protection, and if it is not possible yet, it should be the top priority, way above “Internet on board” or such things. If considering 9/11 a navigation failure makes you want not to fly—well, there are many causes that ultimately lead to risky manoeuvres. If they can still lead to a disaster in the XXI century, shifting blame doesn’t help—you either accept the risk or not.
Scenario C. http://en.wikipedia.org/wiki/2002_%C3%9Cberlingen_Mid-Air_Collision illustrates more than 9/11 does. There is a coordination problem: safety protocols that would work fine on their own sometimes lead to a disaster when mixed (the collision was related to a mistake by the dispatcher; the Tu-154 pilot knew that the automatic collision prevention system and the dispatcher’s commands contradicted each other, but Russian rules give precedence to the dispatcher and European rules give precedence to the automatic system). Also, the transportation market is such that a 1-in-N chance of death being replaced with 1-in-2N isn’t easy to prove, and it doesn’t lead to people readily paying $50 more for the flight.
I think part of the problem is that people (both conspiracy theorists and debunkers) tend to confuse prospiracies and conspiracies.
A decent heuristic might be to recognize that any given conspiracy theory is very unlikely to be true, but also that there are probably many things going on that would correctly be considered conspiracy theories if found out, even if you can never know what these are.
I don’t think that’s quite it. Huge progress has been made against infectious diseases in spite of the lack of a sentient enemy.
I think that, whether or not it’s easier to succeed against human or non-human problems, for many people it’s more fun to think about people who deserve to be punished than to think about the specific details needed to solve a problem that’s the result of an uncaring universe.
Pathogens can be modeled as a miniaturized, undisciplined army to useful effect.
Details?
Hiding in rough terrain (unsterilized objects) where it’s more cost-effective to wipe them out with fire or poison, attacking targets of opportunity within the body, scattering when opposed rather than taking the harder road to proactively pursue a strategy. Honey can be used as a disinfectant because bacteria gorge themselves on simple sugars until they literally burst, just like an armed encampment can be put off their guard with an oversupply of food and wine.
More generally, I think people can do just fine at modeling living creatures with intelligence equal to or less than their own. The breakdown is with modeling creatures of greater intelligence, who pursue goals that only make sense in hindsight, and cosmic forces which cannot be outwitted or negotiated with because they do not pursue goals at all.
Hm… When I asked on Wikipedia, I was told it was in part due to the hygroscopicity of honey: not enough water to support bacterial life.
These experimental findings on How honey kills bacteria attribute the bulk of the antibacterial action to a specific protein defensin-1 that the bees put into the honey.
And yet, someone who applied honey (or distilled alcohol!) to a festering wound thinking that it would disorient the encamped pathogens and thereby give the body’s defenders an opportunity to rally, without understanding the actual mechanism, could still be successful.
Sorry, but why are these being upvoted? It’s a cute analogy—but this is far from a “model” as was claimed. Try making some predictions from these explanations and see how horribly awry they go: why just honey? I should be shoving waffles into my wounds whenever I get a scratch, since everyone knows waffles are incredibly tasty. This analogy might work to convince an uncooperative child to take his antibiotics or get a shot, but it’s not going to cut it as a useful model. The only reason this is working is that you already know the answers!
That then boils down to wanting an interesting story doesn’t it?
This is the bit I was arguing with.
I was interpreting it as meaning “some bad things” and it’s possible that you meant “all bad things”.
Oh I do think that overall you are probably right that bad people problems can in real life be as challenging or more challenging than non-human ones.
I should have said that getting “rid of bad people” seems easy. In that context I was talking about an appealing world-view; I didn’t mean to say I share it. I was trying to describe what it feels like from the inside.
Is “agentic” a valid category of social explanations? I feel fairly confident that I can phrase almost any positive social explanation in agentic or mechanistic terms. This confusion may well exist at the level of reality: inanimate objects not arranged in very specific ways are not agentic, and those arranged in those very specific ways are, but large collections of agents just don’t behave in ways that are adequately captured by our agent/nonagent binary language.
Of course it may well be that my agency-modelling has failed and this is just your point.
The Heterosexual White Males example rubs me the wrong way. I haven’t heard of what I’d call conspiracy theories about that, and it doesn’t match the ridiculousness of Satan or the Illuminati. It reads like someone who wants to get back at feminists or whomever, you know. A politically motivated and sort of mean-spirited low blow. I mean, maybe there are a bunch of people that believe that on a level that matches the rest of the examples, but this is the vibe I got.
The article deals with scapegoating and seeing malignant agency where there is none.
The line was a joke alluding to acceptable targets. However, since you responded seriously and with concern, I think I should reply in kind.
I find this hard to believe. They aren’t really used in such theories exactly the way a devil would be (oh wait), but I dare say they are invoked in the same way Jews sometimes are. And surely a list of Satan, the Illuminati and the Jews makes intuitive sense? ;) Even the most ardent anti-semite in conversation assures you that while most Jews are annoying they probably aren’t all involved in plots to enslave mankind. The MacDonald-inspired anti-semite will further argue that because of their culture they can’t help but subconsciously sabotage wider society for the benefit of their ethnic group. He will even point out one or two good Jews, usually the kind that exposes the fiendish plots of other Jews.
Are anti-semites conspiracy theorists? Not all of them. One can have hatred or dislike for the Jewish or any other people and avoid spinning any such tales at all. But conspiracy theories used to support such positions are quite common among them. A different example of this would be the conspiracy theories regarding Armenians. The pattern even holds for Anti-American sentiments.
Keeping this in mind, I ask you to search for some conspiracy theories about the origins of AIDS. Mind you, these are quite popular in some circles. Are you really claiming you never heard of such tales? Don’t White Heterosexual males play the role of Satan or the Jews in them? It seems strange to deny that they indeed do. It also seems hard to dispute that the image evoked by The Man is such a male.
Even if you discount all these examples, what about theories such as that of Babylonian oppression?
Indeed, this one exactly fits the bill of my joke and is far from the only one of its kind.
Well, hmm. I’m not really sure that it was in good taste nonetheless. I understand that you’re joking, and that there are conspiracy theories like that. That Jews, the Illuminati, or Heterosexual White Males have a big conspiracy to rule the world is a pretty silly idea, that’s true. Here’s what I think the thing is. Straight white males are the least discriminated against and therefore probably most likely to be dismissive of the idea that racism, sexism, etc. still exist and such. People don’t really like hearing that their group has it good and that they’re ignorant, and can get defensive. As a reaction they might set themselves against that whole idea and dismiss it whenever possible. That’s why your comment came off that way to me, because that seemed a likely way for it to have come about. And even as just a joke, I don’t think it’s a good idea, because it’s a serious issue and joking about it makes it less serious, I guess? And even if you think that still isn’t reason enough, multiple other people seem to have gotten the same sort of vibe from it, so. That’s my two cents.
Oh, and your first comment, about scapegoating and seeing malignant agency where there is none- is that a jab at me supposedly doing that? Excuse me if it isn’t, I’m looking at it and having trouble coming up with other things it could be… other than maybe saying this is off-topic. But I thought I was pretty careful in the way I phrased things to say what it came off as to me and not what it is.
I don’t doubt they exist at all.
No. Though I must admit I’m not quite sure which comment you have in mind. I do think I mentioned something like that in the original form of this comment, but it was aimed at categorizing the kinds of conspiracies I linked to and didn’t have anything to do with this fork of the conversation.
Edit: Ugh, I’m so stupid, of course you were referring to the first comment in this exchange. I forgot about that line. No, it wasn’t targeted at you; I was setting up my explanation of why I thought it made a good joke/example. Sorry for the misunderstanding.
Well, I’m not trying to say that you personally doubt they exist.
What I’d meant by the first comment, excuse me if I’d caused confusion by saying comment, is this:
I am well aware that these prejudices exist. I even spot prejudice implicit in this very sentence.
Oh, huh. I didn’t mean to do that. Do you think you could point it out for me? I’m no expert.
And I’m not trying to say that such a large portion of straight white males aren’t aware of these prejudices that you’d need to provide anecdotal evidence to the contrary, haha. ?
That’s a conspiracy theory about whites, not “white heterosexual males”. Most focus on “white heterosexual males”, if anything, is an anti-conspiracy theory, since what is posited is not coordination but rather more oblivious people who just don’t realize that not everyone is in their position or has their viewpoints. For example, when people speak of “white male privilege” they don’t mean there’s a conspiracy to help white males, but rather that white males do have advantages in much of society and we often don’t realize it. Similarly, when people talk about heteronormativity, they are generally talking about people taking for granted certain types of sex and gender roles as universal.
The appropriate analogy might be that there are people who think the Illuminati created the banking crisis. That’s distinct from thinking that specific systemic problems and competence issues created the problem.
The white racist patriarchy is not male at least? I’m sure it will be very disappointed to hear that.
Again, in most forms it isn’t a conspiracy theory; the people advocating it don’t generally argue that there’s an overarching conspiracy as such. Some of them do move to the conspiratorial end, but even then they don’t approach full-blown conspiracy in the sense of deliberate hidden coordination.
I was specifically referencing the Rastafarian conspiracy theory I quoted previously.
Ah, yes that would fall into the conspiracy theory outright. There’s no question that there are quite a few conspiracies about “whites” as the explicit conspiracy group. I think my confusion in this context stemmed from your use of patriarchy- as far as I’m aware the Rastafarian conspiracy doesn’t make any point about patriarchy or heterosexuality, which are relevant in the original context.
Huh. That’s interesting. I’ve never seen an emphasis on patriarchy in the Rastafarian material I’ve come across. I’ll have to look into that in more detail. The sources that Wikipedia entry gives are a dead link and this, which doesn’t seem to mention a patriarchal aspect as far as I can tell.
I’ve had enough of your disingenuous assertions.
Gains Renegade Points
Downvoted. Both needling comments like Konkvistador’s and specifically bringing them up are poking a stick at a beehive, and it’s probably best to precommit to de-escalating whenever possible. Innuendo about forbidden topics aside, I think everyone who’s been posting here for some time knows where others stand, and also knows what’s liable to summon others’ inner toddler (which is why it can be so tempting).
Given that there is a forbidden topic, your strategy is to punish those who challenge raising the topic? This isn’t a strategy likely to decrease the frequency of the undesired behavior—it creates a large incentive to be the first to bring up the topic since your strategy noticeably lacks a threat of punishment for that act.
Well, my intuition was that there’s probably a Schelling point where people make needling, inessential asides in the context of something else, but that Stokes’ comment makes the subtext a text and so goes out of the Schelling point. But these issues are complex and I don’t have any strong argument to back my intuitions here against your reasons, and if I keep on following this train of thought I’ll have gone about ten levels meta deep on nerd drama, and that’s just embarrassing, so dedownvoted.
I think history makes a solid case that there are three Schelling points:
Don’t talk about certain subjects
Marketplace of ideas
Purge the unbelievers
“Keep it to a low roar” is not really a stable dynamic.
In context this has the unfortunate implication that Konkvistador’s ideas and manner of speech aren’t acceptable.
While he identifies himself as apolitical in some lost thread I don’t care to dig up, it is pretty clear that he at least entertains right wing ideas and is very unmoved by political correctness. Unlike me, he never seems to be rude about it though. Intelligent right wing people are a tiny minority here, and an even more banished one in the academia we often rely on. Note that we even have LW posters who have personally experienced discrimination and harassment in academia because of their right wing politics.
Considering this, shouldn’t we try not to make a spectacle out of them in this fashion? By picking on a joke line and discussing it so that we are “excluding them from intended audience”.
Funny how no one seems to think ideological diversity is a good idea if one wants to catch bad thinking. At least no one ever lets it show in their actions.
First, I’m not the one who wrote “Politics is the Mindkiller.” If we take the lesson seriously enough to establish a norm that we don’t discuss politics or political theory at all—Konkvistador’s jab at feminism is a violation of the norm. That’s one of the main points of the essay you linked.
Second, the no politics norm is not what I would prefer—in this discussion, I was a proponent of moving towards more open discussion of political theory. I was this close to making a thread, but it became clear that there was no consensus to change the community norm (at best, the community was split—which wasn’t enough to justify any change).
To the extent you assert Konkvistador’s right-wing views are persecuted here, the assertion is false. Consider just about any political conversation by Konkvistador. He’s able to start them with little pushback, and my perception is that he gets more upvotes than his interlocutors. This community is very interested in views like his. It’s not fair to hold me accountable for jerk moves by left-wing academics in the larger world, just like it’s not fair for me to pin the squickiest PUA stuff or Objectivist stuff on you.
Fourth, Konkvistador is a very clear writer and he’s generally very polite. But apolitical is a laughable label. He’s deeply suspicious of the idea “consent of the governed” and opposed to what he thinks feminism is. And he’s not afraid to say it—and he says it fairly well. But that’s not apolitical, no matter how often he asserts otherwise.
This debate might clarify what I mean when I identify myself as apolitical, as CS correctly notes I do. In short it amounts to not participating in the political process or commenting current political struggles, it also for me means not identifying with a political identity. Descriptively I don’t mind it that much if someone describes me “right winger” or some such, but I won’t refer to myself by such labels except in jest. Especially in my internal narrative.
In other words I don’t see myself as a “right winger” while accepting that I do currently hold some right wing ideas. The reason I make this perhaps seemingly trivial difference is because I don’t consider those ideas at the heart of who I am, but mostly hypotheses about how the world works. If I wake up tomorrow and realize they are bull I hope I will have enough virtue to be happy about realizing my mistake.
Also please note that I have a highly eclectic bunch of right wing ideas, mixed in with left wing ones. For example, I like the idea of a basic income guarantee (though people like Charles Murray support it as well), I think universal healthcare in my country works pretty well, and my stance on marriage (homosexual and otherwise) doesn’t neatly fit there either. I would have a hard time finding a political tribe or label I could identify with even if I wanted to.
Konkvistador,
I’m familiar with that conversation, since you mostly had it with me. :)
The fact that there is no political faction that supports your cluster of political ideas does not mean that you don’t have political opinions or that you don’t push them in this community. Your lack of mainstream partisan identification speaks well of your rationality. But the norm in the community is no political opinions rather than no partisan opinions. To be clear, I disagree with that norm and think that your contributions are a net benefit to the community. But as far as I can tell, the stated norm of the community conflicts with talking about the topics you discuss.
In short, your (deservedly) high status in this community is protecting you from pushback that a newcomer would receive if he posted substantially similar content to what you post.
In this particular case, I think there has been a bit of misunderstanding among your critics. Your reference to Heterosexual White Males was interpreted (by me and others) as a reference to feminism, when you intended to reference conspiracy theories like “CIA caused crack epidemic” or “CIA made AIDS”.
I would not agree that the existing community norm precludes all discussions of policy proposals, even those not affiliated with any partisan group.
I would agree that it precludes discussions of proposals affiliated with any partisan group, even if raised by individuals who don’t identify as members of that group.
That is, if I picked some example to talk about that was, strictly speaking, political, but that no significant political group had made into a partisan point of contention, I would not expect to be censured for it; if I were censured for it I would treat that as evidence that it was partisan in some way I hadn’t previously noticed.
On reflection, I think you explain the data better than I, but I maintain that the equilibrium you describe is not stable.
Specifically, it is not a neutral principle—one side of a substantive disagreement can be suppressed by the creation of a social norm that that side is too close to a live partisan political debate, while the other side is far enough from the live debate not to be suppressed.
(nods) Oh, absolutely.
Or, rather, I agree that some ideas violate the local norm more strongly than others (and, in particular, more than a given opposed idea) and that consequently the local norm isn’t ideologically neutral. There exist partisan positions that LW collectively implicitly supports, and partisan positions that LW collectively implicitly rejects.
Whether that makes the norm unstable in any practical sense, I’m not quite sure, though it seems intuitively plausible. (I agree that the norm is unstable in a technical sense, but I can’t see why anyone ought to care.)
I recognize that there are people here who would at least claim to disagree with you, on grounds I don’t entirely understand but which at least sometimes have to do with the idea that this community is “exceptionally rational” and that this renders us relatively immune to normal primate social dynamics. I’m not one of them. (I’m also not entirely convinced that anyone actually believes this.)
That . . . makes me feel a lot better, actually. I suppose that fragile is a better adjective than unstable for what I was trying to say.
Oh I know you are. It’s just that the casual reader might not be, and I felt a bit uncomfortable being quiet as other people debate me.
This reminds me of a discussion I had on naming the article. I was reluctant to give it a title and preferred “On Conspiracy Theories”, because giving a post a memorable title seems to cause the meaning of the article to converge with its title over time.
I suspect this is because we like linking articles, and while people may read a link the first time, they don’t tend to read it the second or third time it is linked. Eventually a phrase that is supposed to be a shorthand for a nuanced argument starts to mean exactly how it is used.
I cited precisely “Politics is the Mindkiller” as an example of this. In the original article Eliezer basically argues that gratuitous politics, political thinking that isn’t outweighed by its value to the art of rationality, is to be avoided. This soon came to mean that it is forbidden to discuss politics in Main and Discussion articles (though politics does live on in the comment sections).
Since the personal is the political, we pretty quickly started applying this kind of thinking to PUA and gender relations in general as well, though we may not cite it as often.
Is “marketplace of ideas” actually a Schelling point? It seems more like the temporary absence of a Schelling point (at least, once stripped of idealism).
I got the idea from Yvain here.
But I’m not totally comfortable with its inclusion on my list. For example, both Europe and America operate on the same basic free speech principle (allow speech unless it is too “dangerous”/“uncivil”), yet the two regimes are substantially different in practice. This discontinuity is a substantial challenge to the accuracy of the label “Schelling point” when applied to freedom of speech.
It also sends an unintended signal: “This community is more interested in putting up with core-demographic provincialism for the sake of avoiding flamewars between the majority; folks on the periphery are better off not even trying to point it out, analyze it, or correct it.” I think this is bad for LW in the long run; while it’s definitely beginning to change, the user base is still very homogenous, with some fairly big gaps in knowledge and skills.
Which knowledge? Which skills? Be explicit.
Pretty much anything relating to biology from anything other than a careful reading of pop-sci evolutionary theory, for a start (and even that is often misleading when you try to extrapolate from it to real biological systems, let alone complex things like ecosystems). Given the unabashedly transhumanist and pro-cryonics position of SIAI’s main figures present here, that’s kind of glaring—it comes off as a bit overconfident and a bit naive.
A lot of things that amount to context and particulars of the world we live in. It’s my perception that LWers in general know very little about stuff like ecology, infrastructure, history, culture, and downrate their importance when trying to understand how the world works, how a given pattern has developed, ways in which it might change in the future, or to what degree and how one might seek to deliberately change some facet of that.
At the very best of times it seems like, to the extent this gap is recognized at all, it’s considered a problem for FAI to solve. We don’t need to know any of this stuff or why it’s relevant to stuff like “raising the sanity waterline”, “mitigating global existential risk” or “extrapolating human value”; if it has any relevance at all, our future genie will surely determine that and implement it tidily.
If anything, user Konkvistador seems remarkably interested in and knowledgeable about history, culture and politics.
I also seem to recall several academically trained biologists, doctors and even ecologists being prominent members of the community. Are you really bothered by a lack of knowledge or skill, or are you bothered by how they are applied?
I would argue that you are actually bothered by LW not paying attention to them and discussing them as you think appropriate. At least that is what I get out of the quote here:
But again, Konkvistador doesn’t exactly shy away from the topics I mentioned. He has 6000+ karma, so he’s not exactly a pariah. He often discusses them at length. I even recall a debate about ecology now that I think about it. Do we need a smaller share of people like him?
Okay, but that’s not the demographic/experience gap implied by political matters like that which spurred the thread. Everyone would like more and different scientists here. But come on, that’s not what you mean. When you say the user base is homogeneous in the context of Estoke’s comment you’re not talking about how there are too many computer programmers and not enough ecologists.
I’m not completely sure what you mean by “putting up with core-demographic provincialism”—I assume it’s the “Yay hard sciences boo humanities!” subtext, no?
And I have no idea of what you mean by “flamewars between the majority”—flamewars dividing the majority? Flamewars between LessWrong and the rest of the world?
(For context, I’m French, so I may not have a clear idea of what kind of things signal what in an American context)
I believe the subtext is more about LW’s racial and gender makeup than our favorite parts of academia.
Edit: Though I suppose part of it would be a typical anti-humanities reaction to, say, departments of gender studies, African-American Studies, Queer-studies etc. and their manifestation in, e.g. English departments.
Oh, very much so. It’s a sort of offhand straw-man caricature of attitudes found in certain aspects of humanities scholarship, especially those that purport to advance “social justice”; its use here suggests to me that Konkvistador either doesn’t understand the arguments, which sit several chains of interpretation away, or has a fairly selective sample set to work from.
I have demonstrated that such conspiracy theories do exist. And many of them are quite well known and popular.
It was a joke (as clearly demonstrated when I equated Satan, the Illuminati, and white males).
In light of this I quite honestly now see this as politicking. I’m pretty sure nothing can save this thread from spiralling into gender/sexuality fail. A predictable failure mode of LessWrong I guess.
sigh
The original draft was upvoted to +16 and was sitting in the comment section for a month. I sent versions of the draft and links asking for private commentary and criticism to several LW and non-LW people. I wanted to improve my writing both conceptually and stylistically. I was aiming for responses in the spirit of Crocker’s rules and made it clear that I wanted thorough criticism of any kind, because this is my first original-content Main article ever. Many of them responded. Of those that responded to my requests some were women, some were homosexual. None commented on or scolded me for the statement.
Even now, much later, when the editing is done, if anyone had written me a PM asking me to remove the statement because it is hurtful, I would have complied. But this isn’t what happened, now is it?
But now that I think about it I’m not sure if anyone contacted was non-white or non-Asian (since they don’t count as diversity any more). Darn maybe Derb had a point. I need to acquire a black friend, lest I fail to atone in a future struggle session.
This wasn’t Derb’s point. There’s a clear distinction between asking someone from a minority racial group to look at an essay and an attempt to cultivate friends in racial groups. Moreover, part of what many found shocking about Derbyshire’s remark (well that specific part of the essay) was that for a very long time the “I’m not against X, some of my best friends are X” has been seen as such a transparent and self-serving defense that to seriously suggest it as useful implied a high degree of obliviousness about how race relations function in the US.
Quite honestly, I have a hard time imagining that a token friend speaking up for someone wouldn’t help them. Not speaking up for someone is not really the trait of a friend, after all. On the other hand, befriending someone because of the benefits is also less than virtuous.
Oh the paradoxes of modern living.
Friends can help only if they are present at the debate. If they are absent, and people are primed to see you as an X-hater, then it seems like you are talking about “imaginary friends”. That obviously does not help.
For example in a debate like this, only your friends active on LW would be relevant. And only if they had time to participate in this discussion now.
In real life, the best defense against being labeled as an X-hater is to actively label other people as X-haters, and to act offended every time someone speaks about X.
It’s all about signalling.
On second thought, having an X friend and making them a part of your identity (e.g. having a photo with them as your avatar, mentioning them often) would also help. That would give you the first move in the priming combat. (Though it would not work for “X = female”, because that could be reframed as you exploiting the given person.)
Mentioning X friends works best if I’m not perceived as doing so with the intention of establishing my credibility as a non-(X-hater). But with that proviso, it can work pretty well.
Also, for iterated discussions, it can sometimes help to establish a practice of preferentially using groups I’m actually in as examples of negative traits, and only using groups I’m not in as examples when I genuinely am claiming that my groups don’t have those traits. (It’s important when using this approach to avoid being seen as “self-hating” though.)
Of course, if I’m in the position of genuinely believing that group X is either generally inferior to my group, or inferior in certain specific ways that I genuinely consider more important to discuss than other group traits, that’s less available as an option.
I’m pretty sure a supporter of any of the other conspiracy theories Konkvistador casually dismissed in the post would make an analogous complaint.
Though it might be good to tack on “though it doesn’t mean it’s not a valid statement” to the beginning or something. Not that I’m trying to police the way you comment, haha, I’m just trying to say this in a way that doesn’t seem like aggression.
Are conspiracy groups local? Apparently conspiracies have importance only if they have a non-negligible chance of becoming a powerful influence.