This suggests that studies about partisan confusion about truth are overblown. I haven’t had a chance to look at the actual paper yet, but the upshot is that this study suggests that while there is a lot of prior evidence that people are likely to state strong factual errors supporting their own partisan positions, such errors are substantially less likely to occur when people are told they will be given money for correct statements. The suggestion is that people know (at some level) that their answers are false and are saying them more as signaling than anything else.
Edit: Clarify
Alternative explanation: They’re shutting up and multiplying.
Most people have gone through the education system. Most people know how to guess the teacher’s password. Most people have learned better than to assume their answers will be counted correct just because they have (in their opinions) good reasons for holding those answers.
Does putting an incentive on getting the answers “right” lead to “right” answers, or does it lead to people answering the way they expect you to treat as being right? My own educational history suggests the latter.
I can’t parse this bit:

that this study suggests that all the studies about where people are likely to make strong factual errors supporting their own partisan positions are less likely to occur when people are given money for correct answers
Going by the syntax, it seems like you’re saying “that this study suggests that all the studies [about certain things] are less likely to occur [under certain circumstances]”, i.e. the study you’re talking about was about the frequency of other types of studies. This doesn’t seem to make sense.
There are studies which show that people across the political spectrum answer many factual questions in ways that don’t reflect the factual data, and they do so in ways that would support their own political ideology if those answers were true. This research shows that this effect goes down a lot when people are told they will be paid for how many correct answers they give.
Nod. That doesn’t seem to be a possible interpretation of your original sentence.
Is the edited version better?
Yeah, the new version seems quite clear (except that this looks like a typo: “people are likely to make likely to state strong factual errors”).