That’s an idea, but you’d need to know how they started out. If generally nice people joined one religion and stayed the same, and generally horrible people joined the other and became better people, they might look the same on the bagel test.
True. You could control for that by seeing if established communities are more or less prone to stealing bagels than younger ones, but that would take a lot more data points.
Indeed. Or you could test the people themselves individually. What if you got a bunch of very new converts to various religions, possibly more than just Christianity and Wicca, and tested them on the bagels and gave them a questionnaire containing some questions about morals and some about their conversion and some decoys to throw them off, then called them back again every year for the same tests, repeating for several years?
I don’t really trust self-evaluation for questions like this, unfortunately—it’s too likely to be confounded by people’s moral self-image, which is exactly the sort of thing I’d expect to be affected by a religious conversion. Bagels would still work, though.
Actually, if I were designing a study like this I think I’d sign a bunch of people up ostensibly for longitudinal evaluation on a completely different topic—and leave a basket of bagels in the waiting room.
What about a study ostensibly of the health of people who convert to new religions? Bagels in the waiting room, new converts, random not-too-unpleasant medical tests for no real reason? Repeat yearly?
The moral questionnaire would be interesting because people’s own conscious ethics might reflect something cool, and if you’re gonna test it anyway… but on the other hand, yeah. I don’t trust them to evaluate how moral they are, either. But if people signal what they believe is right, then that means you do know what they think is good. You could use that to see a shift from no morals at all to believing morals are right and good to have. And just out of curiosity, I’d like to see if they shifted from deontological to consequentialist ethics, or vice versa.
Yeah, that all sounds good to me.
People don’t necessarily signal what they think is right; sometimes they signal attitudes they think other people want them to possess. Admittedly, in a homogeneous environment that can cause people to eventually endorse what they’ve been signaling.
Hm, you’d probably want the bagels to be off in a small side room so that the patients can feel alone while considering whether or not to steal one.
Yes, definitely. Or in a waiting room. “Oops, sorry, we’re running a little late. Wait here in this deserted waiting room till five minutes from now, bye. :)” Otherwise, they might not see them.