I like this post a lot, but there is one part I want to push back on/add nuance to: how the social web behaves when presented with “factionally inconsistent” true information. In the presented hypothetical world controlled by greens, correct blue observations are discounted and hidden (and the reverse holds in the reversed case). However, I don’t think the information environment of the current world resembles that very much: the faction boundaries are much less distinct and coherent, often only alliances of convenience, and the overall social reality field is less “static, enemy territory” than presented.
This is important because:
- freedom of speech means that, in practice, anyone can say anything
- saying factionally-unpopular things can be status-conferring, because the actual faction borders are unclear and people can flip sides
- sharing the other faction’s information in a way that makes them look bad can earn you status within your own faction
- the other faction can encode true information into what you think is clearly false, and when you then share it to dunk on them, you inadvertently pass that true information along to others
This all culminates in a sort of recursive societal waluigi effect: the more one faction tries to clamp down on a narrative, the more every other faction will inadvertently be represented within the structure of that clamped narrative, and all the partisan effects will replicate themselves inside that structure at every level of complexity.
If factional allegiances trump epistemic accuracy, then you will not have the epistemics to notice when your opponents are saying true things; so if you try to cherrypick false things to make them look worse, you will accidentally convey true things without realizing it.
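One way to make this concrete: if a curator’s true/false judgment is driven by allegiance rather than accuracy, then the set of opponent claims he shares “because they’re obviously false” contains true claims at the opponents’ ordinary base rate. A toy simulation (all names, rates, and parameters here are my own illustration, not anything from the post):

```python
import random

def shared_truth_rate(accuracy, base_truth_rate=0.6, n=100_000, seed=1):
    """A green curator shares only the blue claims he judges false.

    `accuracy` is how often his true/false judgment is correct;
    0.5 means factional allegiance has fully replaced epistemics.
    Returns the fraction of his shared "false" claims that are true.
    """
    rng = random.Random(seed)
    shared_true = shared_total = 0
    for _ in range(n):
        claim_is_true = rng.random() < base_truth_rate
        judged_correctly = rng.random() < accuracy
        # A correct judgment labels false claims false; an incorrect
        # judgment flips the label, so true claims get labeled false.
        judged_false = (not claim_is_true) if judged_correctly else claim_is_true
        if judged_false:  # he shares it to dunk on blues
            shared_total += 1
            shared_true += claim_is_true
    return shared_true / shared_total
```

At accuracy 0.5 the shared “look how wrong they are” set is exactly as true as blue claims overall, so the curator is rebroadcasting true information at full strength; only a filter that is genuinely accurate, i.e. real epistemics, keeps the truth out.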
Let’s give an example:
Say we have a biased green scientist who wants to “prove greens are always right”, and he has that three-sided die that comes up green 1/3 of the time. He wants to report “correct greens” and “incorrect blues” to prove his point. When a roll he expects to be green comes up green, he reports it; when a roll he expects to be green comes up blue, he also reports it, as evidence that blue is wrong, because it gives the “wrong answer” to his green-centric query. If he’s interpreting everything through a green-centric lens, he will not notice he is doing this.
“the sky is clearly blue-appearing to casual observation, which confirms my theory that the sky is green under these conditions I have specified; it merely appears blue for the same reason blues are always wrong”
But if you’re a green who cares about epistemics, or a blue who is looking for real evidence, that green just gave you a bunch of evidence without noticing he was doing it. There are enough people in the world who are just trying to cherrypick for their respective factions that they will not notice they’re leaking correct information where everyone else can see it. This waluigi effect goes in every direction: you can’t point to the other faction and describe how they’re wrong without describing them, which, if they’re right about something, will slip that in without you realizing it. This is part of why truth is an asymmetric weapon.
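The die example can be sketched as a toy simulation (the function names and report strings are my own illustration): because the scientist spins every roll rather than discarding any, a reader who ignores the spin and just counts outcomes recovers the die’s true green rate.

```python
import random

def biased_report(n_rolls, p_green=1/3, seed=0):
    """A 'green scientist' spins every roll but discards none:
    green rolls become 'greens confirmed', blue rolls become
    'blues wrong again'. The spin differs; the data don't."""
    rng = random.Random(seed)
    reports = []
    for _ in range(n_rolls):
        green = rng.random() < p_green
        reports.append("greens confirmed" if green else "blues wrong again")
    return reports

def recover_frequency(reports):
    """An epistemically careful reader ignores the spin and just
    counts outcomes: each report still encodes the actual roll."""
    greens = sum(r == "greens confirmed" for r in reports)
    return greens / len(reports)

reports = biased_report(10_000)
print(recover_frequency(reports))  # the 1/3 green rate survives the spin
```

The interesting failure mode is that suppressing the leak would require the scientist to stop reporting blue rolls at all, which means noticing they are evidence, which is exactly the epistemics the factional lens prevents him from having.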
The described “blue-green factions divided” world feels sort of “1984” to our world’s “Brave New World”. In a 1984-esque world, where saying “the sky is blue iff the sky is blue, and the sky is green iff the sky is green” would get you hanged as a traitor to the greens, the issues described in this thread would likely be more severe and closer to the presented description. But in our world, “getting hanged as a traitor” is, for most people outside of extremely adverse situations, “a bunch of angry people quote-tweeting and screenshotting you, posting about you, and repeating ‘lol look how wrong they are’ hundreds of times where everyone can see exactly what you’re saying”. That’s basically just free advertising for what you consider true information, and the people who care about truth will be looking for it, not for the color coding.