Solutions do not have to be perfect to be useful. Trust can be built up over time.
misinformation is impossible to combat
Take the US government, which tells Facebook not to delete the anti-vaccine misinformation that the US government itself is spreading, while at the same time telling Facebook to delete certain anti-vaccine misinformation that the US government doesn’t like. It’s obvious that such institutions aren’t trustworthy, and thus they have a hard time fighting misinformation.
If the US government stopped lying, it would find it a lot easier to fight misinformation.
Outside of the government, nobody funds, with sufficient capital, an organization that values truth and has a mission to fight misinformation. All the organizations with “fighting misinformation” on their banner are highly partisan and not centered on valuing truth.
The fact that nobody in the media asked Kamala Harris whether Joe Biden was wrong to spread antivax misinformation as commander in chief tells you a lot about how much the mainstream media cares about misinformation.
“Fighting misinformation” often means “rejecting views that go against one’s own political narrative”. Scientists care the most about truth, and they aren’t afraid of challenging and questioning what is known. People heavily invested in politics don’t actually care all that much about truth; they just pretend to because it sounds noble. The “truth-seeking” kind of person is a bit of a weirdo; not a lot of them exist.
It’s sad that the problem has gotten bad enough that even people on here have recognized it, but it’s nice not seeing comments which essentially say “Official sources are untrustworthy? That’s a bold claim. Give me evidence, from a source that I consider official and trustworthy, of course.”
But I really want to point out that it’s mathematically impossible to combat falsehood, and that it doesn’t matter whether you call it “misinformation” or “disinformation”. The very approach fundamentally misunderstands how knowledge works.
1: Science is about refining our understanding, which means challenging it rather than attacking anyone who disagrees. It must be “open to modification” rather than “closed”.
2: It’s impossible to know whether there are unknown unknowns that one is missing. Absolute certainty does not seem to exist in knowledge.
3: Many things depend on definitions, which are arbitrary. “Is X a mental illness?” is decided not by X, but by a person’s relationship to X.
4: Any conversation with some intellectual weight is going to be difficult, and unless you can understand what the other person is saying, you cannot know whether they’re wrong.
5: Language seems to have a lot of relativity and unspoken assumptions. If you say “Death is bad” you may mean “For a human, the idea of death is uncomfortable if it prevents them from something that they’re capable of doing”. Arguing against “Death is bad” is trivial: “If death didn’t exist, neither would modern man, for we evolved through Darwinian selection”.
6: People who know better are always outnumbered by those who have only superficial understandings. Consensuses favor quantity over quality, but those who are ahead of the rest must necessarily be a minority who possess obscure knowledge that is difficult to communicate. I’d go as far as saying that getting average people involved in science was a mistake. 99% of people aren’t knowledgeable enough to understand the vaccine, so their position on the matter depends on the political bias of the authority they trust, which makes everything they have to say about the topic worthless. Except, of course, statements like “The government stated X; now they’re saying Y, so they either lied in the past or they’re lying now”, which is just basic logic.
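For what it’s worth, that last bit of “basic logic” can be checked mechanically. This is just a toy truth-table sketch, with names of my own choosing, assuming the underlying fact itself did not change between the two statements:

```python
# Model the earlier statement as the claim "X is true" and the later
# statement as the claim "X is false", then check every possible actual
# truth value of X.
def at_least_one_false(fact: bool) -> bool:
    claim_then = True    # earlier: "X is true"
    claim_now = False    # later: "X is false"
    # A statement is false when its claim disagrees with the fact.
    false_then = claim_then != fact
    false_now = claim_now != fact
    return false_then or false_now

# In every possible world, at least one of the two statements was false.
assert all(at_least_one_false(fact) for fact in (True, False))
```

No appeal to trust in any authority is needed for this kind of inference; contradiction alone settles it.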
There’s no reason why you would need absolute certainty to make progress in fighting misinformation.
When the US government wanted Facebook not to delete the misinformation it spread to get people in the Philippines to oppose the Chinese vaccine, it did not argue to Facebook that its misinformation was truthful.
If it’s not about truth value, then it’s not about misinformation. It’s more about manipulation and the harmfulness of certain information, no?
My point is about the imperfections and limitations of language. If I say “the vaccine is safe”, how safe does it have to be for my statement to be true? Is a one-in-a-million risk a counterexample, or is it evidence of safety? Where’s the cut-off for “safety”?
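The cut-off problem can be made concrete with a toy sketch (the threshold values here are arbitrary, chosen only for illustration):

```python
# Whether a fixed risk counts as "safe" depends entirely on where the
# arbitrary cut-off is placed, not on the risk itself.
def is_safe(risk: float, threshold: float) -> bool:
    return risk <= threshold

risk = 1e-6  # a one-in-a-million risk

print(is_safe(risk, threshold=1e-5))  # True: "safe" under a lenient cut-off
print(is_safe(risk, threshold=1e-7))  # False: "unsafe" under a stricter one
```

The same datum flips between “safe” and “unsafe” purely as a function of a definition no measurement can supply.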
I do think fighting “bad-faith manipulation” is doable at times, but I don’t think you can label anything as true or false with certainty.
Another point, which I should have mentioned earlier, is that removing false information can be harmful. Better to let it stay alongside the counter-arguments that get posted, so that observers can read both sides and judge for themselves. Believing in something false is a human right. Imagine, for instance, if believing (or not believing) in god were actually illegal.
If you actually want to fight misinformation you need to do more than focus on single claims. You actually need to speak about a domain of knowledge in a trustworthy way instead of just making claims for propaganda purposes.
A list of people’s experiences with specific journalists doesn’t give you certainty about those journalists’ habits, but it’s better than nothing. Additionally, it can pressure the journalists into behaving better, because they don’t want to be shamed.