An alternate explanation:
Maybe the years of public schooling that most of us receive cause us to trust papers so much that if we see something written down on one, we feel uncomfortable opposing it. If you're threatened with punishment for not regurgitating what is on an authority's papers, daily, for that many years of your life, you're bound to be conditioned to behave as if you agree with papers.
So maybe what’s going on is this:
1. You fill out a scientist's paper.
2. The paper tells you your point of view. It looks authoritative because it's in writing.
3. You feel uncomfortable disagreeing with the authority's paper. School taught you this was bad.
4. Now the authority wants you to support the opinion they think is yours.
5. You feel uncomfortable with the idea of failing to show the authority that you can support the opinion on the paper. (A teacher would not have approved, and you'd look stupid.)
6. You might want to tell the authority that it's not your opinion, but they have evidence that you believe it: it's in writing.
7. You behave according to your conditioning by agreeing with the paper, and do as expected by supporting what the researcher thinks your point of view is.
I think this might just be external behavior meant to maintain an authority's approval, not evidence that the subjects have truly changed their minds.
I wonder what would happen if the study were re-done in a really casual way, with, say, crayon-scrawled questions on scraps of napkins instead of authoritative-looking papers.
Also, I wonder how much embarrassment it caused when the subjects appeared to have filled out the answers all wrong, and how that embarrassment might have influenced their behavior. Imagine you're filling out a paper (reminiscent of taking a test in school) and it looks like you filled out the answers all wrong. Horrified by the huge number of mistakes you seem to have made, might you try to hide it by pretending you meant to fill them out that way?
It seems to me that this hypothesis is more of a mechanism for choice blindness than an alternate explanation: we already know that human beings will change their minds (and forget they've done so) in order to please authority.
(There’s nonfictional evidence for this, but I need to run, so I’ll just mention that we’ve always been at war with Oceania.)
What I'm saying is "Maybe they're only pretending to have an opinion that's not theirs," not "They've changed their minds for authority," so I still think it is an alternate explanation for the results.
IIRC, part of the debriefing protocol for the study involved explaining the actual purpose of the study to the subjects and asking them if there were any questions where they felt the answers had been swapped. If they at that point identified a question as having fallen into that category, it was marked as retrospectively corrected, rather than uncorrected.
Of course, they could still be pretending, perhaps out of embarrassment over having been rooked.
I'm having trouble interpreting what your point is. It seems like you're saying "because they were encouraged to look for swapped questions beforehand, Epiphany's point might not be valid." However, what I read stated: "After the experiment, the participants were fully debriefed about the true purpose of the experiment," so it may not even have occurred to most of them to wonder whether the questions had been swapped at the point when they were giving confabulated answers.
Does this clarify anything? It seems somebody got confused. Not sure who.
IIRC, questions that were scored as “uncorrected” were those that, even after debriefing, subjects did not identify as swapped.
So if Q1 is scored as uncorrected, part of what happened is this: I gave answer A to Q1; it was swapped for B; I explained why I believe B; I was afterwards informed that some answers had been swapped and asked whether there were any questions I thought that was true for, even if I hadn't volunteered that judgment at the time; and I did not report that this was true of Q1.
If I'm only pretending to have an opinion (B) about Q1 that's not mine, the question arises of why I don't say at that point, "Oh, yeah, I thought that was the case about Q1, since I actually believe A, but I didn't say anything at the time."
As I say, though, it’s certainly possible… I might continue the pretense of believing B.