I was re-reading Meditations on Moloch the other day, and it dawned on me that its framing is quite relevant to how information spreads on the internet.
The current state of competition on the internet seems quite clearly misaligned with what we deem good, like truth or insight. It feels like we are midway toward an equilibrium that is far worse. Unless something is done, we should expect the volume of fake narratives, fake news, and lies of all sorts to grow. On the “positive” side, we should expect far more information that confirms our group’s opinions and rattles our emotions, especially anger or awe.
And this seems serious to me. Beliefs matter. People act on what they think they know about the world. To bring the incentives back into alignment, we likely need some new institutions. It’s unclear to me whether we currently have any internet institutions working on verification. Wikipedia might count, but it is quite weak and easily subverted. The fact-checking websites have been helpful, but they seem overrun and, ironically, are mostly cited when they confirm a group’s existing beliefs.
On a website like Quora it feels like total entropy. It’s as if the entire internet is suffering a sort of Eternal September, and in some sense it is, with two of its three billion users coming online since 2009. And everywhere you turn, people despair about lies and not knowing whom to trust. There is a civilizational need (and perhaps also a market) for truth.
So I wanted to ask: how do we fundamentally confirm that something is true?
And if we had that method, what would an institution strong enough to actually alter the incentives of online publishers look like?
I can think of a few candidate themes:
Source-reputation: how would we go about analyzing and ranking websites for their reputation for truth?
The scientific method: conjecture, criticism, and testing seems viable. Can it be applied universally?
The Bayesian method: every time a claim is made, we update the probability of its truth, weighted by the source or the strength of the evidence. It’s unclear whether texts can be boiled down to the essence of a belief or claim, and even less clear whether claims can be compared at all.
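The Bayesian theme could at least be prototyped. Here is a minimal sketch of updating a claim’s probability as reports arrive from sources of varying reliability. Everything here is hypothetical: the `bayes_update` helper, the symmetric-reliability assumption, and the sample numbers are for illustration only.

```python
def bayes_update(prior: float, reliability: float, says_true: bool) -> float:
    """Update P(claim) after one source's report.

    reliability = P(source says 'true' | claim is true)
                = P(source says 'false' | claim is false)
    (a symmetric-accuracy assumption, purely illustrative).
    """
    # Likelihood of the observed report under each hypothesis
    p_report_if_true = reliability if says_true else 1 - reliability
    p_report_if_false = (1 - reliability) if says_true else reliability
    numerator = prior * p_report_if_true
    return numerator / (numerator + (1 - prior) * p_report_if_false)

# Start agnostic, then fold in three hypothetical reports
p = 0.5
for rel, says in [(0.9, True), (0.7, True), (0.6, False)]:
    p = bayes_update(p, rel, says)
```

With a 0.5 prior, a single 0.9-reliable source asserting the claim already pushes it to 0.9. The hard parts mentioned above, extracting a crisp claim from a text and estimating a source’s reliability in the first place, are exactly what this sketch leaves out.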
(PS: I find myself thinking that I personally, somehow, am a great evaluator of truth. If I really am, or you are, there should be some very simple habit to discover there that could perhaps be applied widely.
Yet what I do seems mundane. I curate my information sources: SSC, Marginal Revolution, WBW, Overcoming Bias, LW. (But also Reddit, Twitter, Quora.) And I even observe myself upvoting things I agree with and emotionally engage with on Reddit, without any source-checking. I sort of rely on previous knowledge, I think: I critique what I just read against what I already know and make a snap judgement.
Do you have personal habits of truth-seeking or evaluating information?)
It’s unclear to me whether we currently have any internet institutions working on verification.
Sometimes it feels to me that publicly staying away from tribalism is impossible. The moment you disagree with a sacred belief of some tribe, the tribe members will start calling you a member of the opposite tribe. Even worse, the members of the opposite tribe may take this as an encouragement to join you. If you depend on volunteers, this already disrupts your balance. If you somehow succeed in resisting this, and also disagree with a sacred belief of the other tribe, both tribes will simply call you an idiot. And even that may not help you achieve the image of neutrality, because the first tribe may continue to claim that you are deep down a fan of the second tribe; and the second tribe may continue to claim that you are deep down a fan of the first tribe. (Because, from inside a mindkilled person’s perspective, everyone who disagrees has to be, in some way, mindkilled for the enemy.)
I find myself thinking that I personally, somehow, am a great evaluator of truth.
So do I, but various people whom I consider mindkilled in return consider me mindkilled for the opposite side, and I am quite aware that from an outside view this just looks like two people accusing each other of the same thing, so why should I be the one who is right? I take some comfort in knowing that I am accused of many contradictory things, which is weak evidence that the accusations are bullshit, but this is a kind of reverse-stupidity reasoning.
At the bottom of fact-checking, you need to compare the map with the territory. Just comparing two maps won’t do. It can tell you which maps are more similar and which are less; perhaps you could do some kind of cluster analysis on them… but you would still need deep trust in some specific map to decide that its cluster is the correct one; or perhaps medium-level trust in a group of independent maps, which you would find belonging to the same cluster, which would tell you that this cluster is the correct one. I am not sure whether something like this can literally be done, but it feels like a good metaphor for how I evaluate the truth of things where I can’t see the territory. Or at least that is what I tell myself.
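For what it’s worth, the cluster-analysis version of this metaphor can be made literal in toy form. In the sketch below, a “map” is just a dict of claims to booleans; the `agreement` measure, the greedy grouping, and the sample maps are all made up for illustration.

```python
def agreement(a: dict, b: dict) -> float:
    """Fraction of shared claims on which two 'maps' (claim -> bool) agree."""
    shared = a.keys() & b.keys()
    return sum(a[k] == b[k] for k in shared) / len(shared) if shared else 0.0

def largest_cluster(maps: dict, threshold: float = 0.8) -> list:
    """Greedy single-link grouping: a map joins the first cluster whose
    representative it agrees with above the threshold. Crude and
    order-dependent; illustrative only."""
    clusters = []
    for name, m in maps.items():
        for cl in clusters:
            if agreement(m, maps[cl[0]]) >= threshold:
                cl.append(name)
                break
        else:
            clusters.append([name])
    return max(clusters, key=len)

# Three hypothetical maps making claims about the same territory
maps = {
    "A": {"x": True, "y": True},
    "B": {"x": True, "y": True},
    "C": {"x": False, "y": False},
}
```

Here `largest_cluster(maps)` picks out the agreeing pair A and B, which is exactly the point of the metaphor: the clustering alone tells you which maps agree, not which cluster matches the territory.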
Do you have personal habits of truth-seeking or evaluating information?
In political debates, very rarely, because it’s time-consuming, and by the time I have enough data to draw a conclusion, no one cares anymore. And outside of the LW community, almost no one would care about the data anyway. This, I believe, contributes a lot to irrationality: changing topics too quickly, so that after you collect some data and do some verification, it no longer matters because the debate has already moved on to a different topic. (One of the reasons I think having too much new content on LW is actually not a good thing. It becomes more important to respond quickly than to respond correctly.)
Also, some people do the annoying thing where, when you show them facts contradicting their beliefs, they tell you “okay, you are technically right about this detail, but you are still wrong, because...” and then comes a tone argument, or moving the goalposts, or some kind of mind-reading… Simply put, even being able to prove facts beyond reasonable doubt doesn’t help you win a debate. Yes, winning the debate is not the most important thing, but it would feel nice to receive some reward for doing the fact-checking. Instead, the people who were factually wrong still give each other emotional rewards for being the good guys, despite being technically wrong on some unimportant detail. Well, reinforcement matters; doing rationality alone is difficult.
You shouldn’t think of people (aka internet users) as an undifferentiated mass. There are multiple competitions on the ’net for different population segments. For example, SSC isn’t really competing with the see-Kylie-Jenner-naked people. There isn’t going to be one single equilibrium.
Like the entire internet is suffering a sort of Eternal September
Oh, dear. I hate to break it to you, but...
how do we fundamentally confirm that something is true
In the usual way. Do you imagine people selling bridges didn’t exist before the ’net? What do you think the whole science thing is about? The internet actually makes it a lot easier to check whether something you’re being told is a lie.
Yet what I do seems mundane.
Yes, and that’s fine. Information hygiene is mundane, like brushing your teeth—or resisting the urge to burrow into a hospital’s infectious-waste trash pile.
Saving the world from bad information is a… dangerous approach.