[Link] What do conservatives know that liberals don’t (and vice versa)?
I am a PhD student currently conducting research on political polarization and persuasion. I am running an experiment that requires a database of trivia questions which conservatives are likely to get correct and liberals are likely to get wrong (and vice versa). Our pilot testing has shown, for example, that Democrats (but not Republicans) tend to overestimate the percentage of gun deaths that involve assault-style rifles, while Republicans (but not Democrats) tend to overestimate the proportion of illegal immigrants who commit violent crimes. Similarly, Democrats (but not Republicans) tend to overestimate the risks associated with nuclear power, while Republicans (but not Democrats) tend to underestimate the impact of race-based discrimination on hiring outcomes.
Actually designing these questions is challenging, however, because it’s difficult to know which of one’s own political beliefs are most likely to be ill-informed. As such, I am running a crowdsourcing contest in which we will pay $100 for any high-quality trivia question submitted (see contest details here: https://redbrainbluebrain.org/). The only requirements are that participants submit the question text, four multiple-choice answers, and a credible source. The deadline for submissions is October 15th, 2019 at 11:59 p.m.
My intuition is that the LessWrong community will be particularly good at generating these kinds of questions, given its commitment to belief updating and rationality. If you don’t have the time to participate in the contest, I welcome any ideas about potential topics that might be a fruitful source of these kinds of questions. I would also appreciate any clues as to what other communities may be particularly good at generating them.
One suggestion would be to datamine the GSS: look for items which most discriminate between partisan affiliation, which would reflect factual claims.
This is a good idea. Will work on this now. Thanks! For “Knowledge Desert” questions (non-political questions about which only one party will have a strong hunch), I looked at patterns of co-following activity on Twitter and Reddit. For example, people who followed conservative Senators/Representatives on Twitter also tended to follow certain kinds of sports (e.g. baseball and UFC) and certain kinds of restaurants (e.g. Bob Evan’s Steakhouse and Cracker Barrel). Similarly, people who subscribed to /r/TheDonald also followed stereotypically conservative lifestyle subreddits. I’ve been having trouble finding a similar proxy for “False Belief” questions (questions political in content, where both parties have strong hunches that they’re correct).
@gwern, I was going through GSS data the other day, but was not quite sure of what you had in mind. There are variables that track political affiliation (e.g. party), and I can find what other variables are strongly predicted by party (e.g. geographical area). Was the idea just to find demographic items that are tightly correlated with partisan affiliation, and brainstorm trivia questions based on those demographic characteristics?
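For what it’s worth, the “items which most discriminate between partisan affiliation” idea could be sketched as a per-item association test. This is a minimal illustration with synthetic data and made-up item names (`gun_item`, `neutral_item`); a real run would load an actual GSS extract and use its real variable names (e.g. `PARTYID`):

```python
# Rank survey items by how strongly they discriminate between
# partisan affiliations, using a chi-square test per item.
# The DataFrame below is synthetic stand-in data, not real GSS data.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n = 2000
party = rng.choice(["dem", "rep"], size=n)

df = pd.DataFrame({
    "party": party,
    # Engineered to be strongly associated with party (discriminative).
    "gun_item": np.where(
        party == "dem",
        rng.choice(["agree", "disagree"], size=n, p=[0.8, 0.2]),
        rng.choice(["agree", "disagree"], size=n, p=[0.3, 0.7]),
    ),
    # Engineered to be independent of party (non-discriminative).
    "neutral_item": rng.choice(["agree", "disagree"], size=n),
})

def discrimination_scores(df, party_col="party"):
    """Chi-square statistic of each item against partisan affiliation,
    sorted so the most party-discriminating items come first."""
    scores = {}
    for col in df.columns:
        if col == party_col:
            continue
        table = pd.crosstab(df[party_col], df[col])
        stat, _, _, _ = chi2_contingency(table)
        scores[col] = stat
    return sorted(scores.items(), key=lambda kv: -kv[1])

ranking = discrimination_scores(df)
print(ranking)  # most discriminative item first
```

The top-ranked items would then be starting points for brainstorming trivia questions; mutual information or a classifier’s feature importances would work as alternative scores.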
There’s prior art.
Stefan Schubert, together with Spencer Greenberg, created the Political Bias Test: https://programs.clearerthinking.org/political_bias_test.html. There was a LessWrong discussion beforehand that informed the questions on that test.
https://www.lesswrong.com/posts/3jeRSuLopjjFwFpoG/political-debiasing-and-the-political-bias-test is an article about the test.
He wrote a guide for creating new tests:
https://docs.google.com/document/d/1OIiIVrNwK6ftRZqLfEmhwXHj0tx-Uhg_N2kfBEFpuP0/edit
Thanks, Christian! I’ve actually been in communication with Spencer and Stefan—they’ve been an immensely helpful resource. We have both stumbled upon a similar, difficult problem. Namely, it’s very rare to be “certain” about the controversial questions (which is the point of the contest). By coincidence, many of the questions used in their political bias test are ones I had independently come up with for my intervention, but some of the others are actually less straightforwardly verifiable than the test lets on.
It seems hard to be certain about the true probability that global warming increases the frequency of floods. On the other hand, it’s easy to be certain about the probability that the latest IPCC report assigns to that event.
In a similar vein, you can ask: “According to the GSS, what percentage of Muslims in the UK believe that homosexuality should be punishable by death?”
When talking about deaths due to Chernobyl, you can argue about whether you should count those who became depressed and committed suicide afterwards, but you get a clear number from the UNSCEAR assessments of the Chernobyl accident.
If you word your questions as “What does authoritative source X say about Y?”, you can have certain answers.