I guess we can agree that the most rational response would be to enter a state of aporia until sufficient evidence is at hand.
Not really; consider how much effort it is worth putting into investigating the question of whether Barack Obama is secretly transgender, in different scenarios:
You just thought about it, but don’t have any special reason to privilege that hypothesis
Someone mentioned the idea as a thought experiment on LessWrong.com, but doesn’t seem to think it’s even remotely likely
Someone on the internet seems to honestly believe it (but may be a troll, or Time Cube guy-level crazy)
A vocal group on the internet seems to believe it
Several people you know in real life seem to believe it
If you think that even in the first case you should investigate, then you’re going to spend your life running over every hypothesis that catches your fancy, regardless of how likely or useful it is. If you believe that some cases deserve a bit of investigation, but not others, you’re going to need a few extra rules of thumb, even before looking at the evidence.
I definitely see your point. Couldn’t the problem be solved by dividing my convictions into two groups:
Those that I need in order to survive and prosper in my life.
Those that I don’t need in order to survive and prosper in my life.
Then I could go into aporia for all those that belong to group 2, while allowing more gut feeling for those in group 1.
The Charlie Hebdo question doesn’t affect my quality of life, so in that case I could afford the epistemological “luxury” of aporia.
The Charlie Hebdo question doesn’t affect my life quality
I would disagree with that. If you meet someone and they tell you about a bunch of conspiracy theories they believe, your estimate of their relative sanity will be dependent on how plausible you think those theories are, and that may impact your life. So, if the vast majority of conspiracy theories are false, but you believe that many have a chance and it’s impossible to know, you will accept as normal people who are in fact deluded. (I’m not saying that they deserve anything in particular because they’re wrong, but it seems better for you to know rather than not.)
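The point about plausibility can be made concrete with a toy Bayes calculation. All the numbers and the two-type model below are invented purely for illustration; the only claim it supports is the qualitative one in the paragraph above: the more "live" you consider a fringe theory, the less you learn about someone's reliability from hearing them endorse it.

```python
def p_reliable_given_endorsement(p_reliable, p_theory_true,
                                 p_endorse_if_reliable_and_false=0.02,
                                 p_endorse_if_unreliable=0.30):
    """P(reliable reasoner | endorses theory), by Bayes' rule.

    Toy model (made-up numbers): a reliable reasoner endorses the
    theory if it's true, and only rarely (2%) if it's false; an
    unreliable reasoner endorses it at a flat 30% rate regardless.
    """
    p_endorse_reliable = (p_theory_true
                          + (1 - p_theory_true) * p_endorse_if_reliable_and_false)
    num = p_reliable * p_endorse_reliable
    den = num + (1 - p_reliable) * p_endorse_if_unreliable
    return num / den

# If you think the theory is almost certainly false, endorsing it is
# strong evidence against the endorser's reliability...
low = p_reliable_given_endorsement(p_reliable=0.9, p_theory_true=0.001)

# ...but if you've suspended judgment ("maybe it has a chance"),
# the same endorsement barely moves you, and deluded people pass as normal.
agnostic = p_reliable_given_endorsement(p_reliable=0.9, p_theory_true=0.3)

print(round(low, 2), round(agnostic, 2))
```

Under these made-up rates, starting from a 90% prior that the person reasons reliably, the endorsement drags that estimate below 40% when you consider the theory near-certainly false, but leaves it around 90% when you treat the theory as a live possibility.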
This also allows you to dismiss certain opinions that do matter to you (USE WITH CAUTION!) when the holders also hold many other theories that don’t matter. Disclaimer: this refers to theories that you only hear from conspiracy theorists, and can’t find “normal” people who believe them. Just because a conspiracy theorist believes something does not make it false. But there are some things that I hear and say “well, all the major advocates also think 9/11 was an inside job/insert conspiracy theory here, so I can safely ignore that”, even though the conspiracy theories in question may not be relevant to me. I try to find at least one “clean” advocate for something before taking it seriously as an idea.
tl;dr “The Charlie Hebdo question” and the class of similar ones are relevant to assessing others’ rationality.