It’s unclear to me whether we currently have any internet institutions that are working on verification.
Sometimes it feels to me that publicly staying away from tribalism is impossible. The moment you disagree with a sacred belief of some tribe, its members will start calling you a member of the opposite tribe. Even worse, the members of the opposite tribe may take this as an encouragement to join you; if you depend on volunteers, this alone disrupts your balance. If you somehow manage to resist this, and also disagree with a sacred belief of the other tribe, both tribes will simply call you an idiot. And even that may not earn you an image of neutrality, because the first tribe may continue to claim that deep inside you are a fan of the second tribe, and the second tribe may continue to claim that deep inside you are a fan of the first tribe. (Because, from the inside of a mindkilled person, everyone who disagrees has to be, in some way, mindkilled for the enemy.)
I find myself thinking that I personally, somehow, am a great evaluator of truth.
So do I, but various people whom I consider mindkilled in return consider me mindkilled for the opposite side, and I am quite aware that from an outside view this just looks like two people accusing each other of the same thing, so why should I be the one who is right? I take some comfort in knowing that I am accused of many contradictory things, which is weak evidence that the accusations are bullshit, but that is a kind of reverse-stupidity reasoning.
At the bottom of fact-checking, you need to compare the map with the territory. Just comparing two maps won’t be enough. It can tell you which maps are more similar and which are less; perhaps you could do some kind of cluster analysis on them… but you would still need deep trust in some specific map to decide that its cluster is the correct one, or perhaps medium-level trust in a group of independent maps which turn out to belong to the same cluster, which would tell you that this cluster is the correct one. -- I am not sure whether something like this can literally be done, but it feels like a good metaphor for how I evaluate the truth of things where I can’t see the territory. Or at least that is what I tell myself.
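To make the metaphor slightly more concrete, here is a minimal sketch of the “cluster the maps” idea. Everything in it is hypothetical: the sources, their claims, the agreement threshold, and the trusted set are invented purely for illustration; this is the intuition written as code, not a real fact-checking method.

```python
# Hypothetical sketch: each "map" is a source's answers to the same yes/no claims.
from itertools import combinations

maps = {
    "source_A": {"claim1": True,  "claim2": True,  "claim3": False},
    "source_B": {"claim1": True,  "claim2": True,  "claim3": False},
    "source_C": {"claim1": False, "claim2": False, "claim3": True},
    "source_D": {"claim1": True,  "claim2": False, "claim3": False},
}

def agreement(a, b):
    """Fraction of shared claims on which two maps agree."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(a[c] == b[c] for c in shared) / len(shared)

def cluster(maps, threshold=0.8):
    """Greedy single-link clustering: two maps end up in the same cluster
    if a chain of pairwise agreements above the threshold connects them."""
    clusters = [{name} for name in maps]
    for x, y in combinations(maps, 2):
        if agreement(maps[x], maps[y]) >= threshold:
            cx = next(c for c in clusters if x in c)
            cy = next(c for c in clusters if y in c)
            if cx is not cy:
                clusters.remove(cy)
                cx.update(cy)
    return clusters

# Clustering alone only tells you which maps agree; to call a cluster
# "correct" you still need independent trust in at least one of its maps.
trusted = {"source_A"}   # hypothetical: a map we have checked against the territory

for c in cluster(maps):
    label = "contains a trusted map" if c & trusted else "unverified"
    print(sorted(c), "->", label)
```

The point the sketch tries to preserve is the one above: agreement between maps can only group them; deciding which group is the correct one still requires trusting at least one map (or several independent ones) on other grounds.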
Do you have personal habits of truth-seeking or evaluating information?
In political debates very rarely, because it’s time-consuming, and by the time I have enough data to draw a conclusion, no one cares anymore. And outside of the LW community, almost no one would care about the data anyway. This is something I believe contributes a lot to irrationality: changing topics too quickly, so that after you collect some data and do some verification, it’s no longer important because the debate has already moved on to a different topic. (One of the reasons I think having too much new content on LW is actually not a good thing. It becomes more important to respond quickly than to respond correctly.)
Also, some people do the annoying thing where, when you show them facts contradicting their beliefs, they tell you “okay, you are technically right about this detail, but you are still wrong, because...” and then comes a tone argument, or goalpost-shifting, or some kind of mind-reading… Simply put, even being able to prove facts beyond reasonable doubt doesn’t help you win a debate. Yes, winning a debate is not the most important thing, but it would feel nice to receive some reward for doing the fact-checking. Instead, the people who were factually wrong still provide each other emotional rewards for being the good guys, despite being technically wrong on some unimportant detail. Well, reinforcements matter; doing rationality alone is difficult.