This seems like a very unlikely sort of phenomenon (reversed stupidity != intelligence, etc.). Why would you expect such people?
It’s common in certain types of polemic. People hold (or claim to hold) beliefs to signal group affiliation, and the more outlandishly improbable the beliefs become, the more effective they are as a signal.
It becomes a competition: Whoever professes beliefs which most strain credibility is the most loyal.
I think that most people who tell pollsters they believe conspiracy theories wouldn’t bet on them.
Data on that question would be an interesting thing to gather, though I might guess they would take attempts to measure their belief as somehow a manifestation of the conspiracy. (Everything is evidence for the conspiracy.)
The != operator suggests bidirectionality, but it’s really unidirectional. Intelligence can be reversed stupidity if it wants to be.
Possibly just an aesthetic preference. You probably have a point.
I think such people might exist when the possibilities for prediction are relatively constrained, but even then, some fraction of their consistent wrongness would be a matter of luck, and couldn’t be used for prediction.
In fact, when the possibilities for prediction are relatively constrained, but there are a lot of people making predictions, and the system is complicated enough that you can’t expect most people to be mostly right, we’d have some people being consistently wrong by chance alone.
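The arithmetic behind that last point can be sketched with a toy simulation (the predictor count and question count here are illustrative, not from the original comment): if 10,000 people each make 10 independent binary predictions no better than chance, we expect about 10,000 × (1/2)^10 ≈ 10 of them to be wrong on every single one, purely by luck.

```python
import random

random.seed(0)

N_PREDICTORS = 10_000  # many people making predictions
N_QUESTIONS = 10       # each makes 10 binary (yes/no) predictions

# Each prediction is right with probability 1/2 (no real skill,
# no real anti-skill). Count how many predictors miss every time.
consistently_wrong = 0
for _ in range(N_PREDICTORS):
    if all(random.random() < 0.5 for _ in range(N_QUESTIONS)):
        consistently_wrong += 1

# Expected count: N_PREDICTORS * (1/2)**N_QUESTIONS ~= 9.8,
# so a handful of "consistently wrong" predictors is the norm,
# not evidence of a reliable anti-oracle.
print(consistently_wrong)
```

Because these people are wrong by chance rather than by some stable disposition, their past record gives no edge for future predictions, which is the point of the comment above.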