Accents are a good example. It’s easy to offend someone or to make incorrect predictions based on “has a British accent”, when all you really know is some patterns of pronunciation. In some contexts, that’s a fine compression: far easier to process, communicate, and remember. In other contexts, you’re better off acknowledging that your data supports many interpretations, and you should preserve that uncertainty in your communication and predictions.
“Casual” vs “precise” are themselves lossy compressions of fuzzy concepts; what I really mean is that compression is valid and helpful in some contexts, and harmful and misleading in others. My point is that the distinction is _NOT_ primarily about how tight the cluster is, or how closely it matches some dimensions of reality in the abstract. The acceptability of a compression depends on the context and uses of the compressed (or less-compressed) information, and on whether the lost details matter for the purpose of the communication or prediction. It’s about whether it meets the needs of the model, not how close it is to “reality”.
Note also that I recognize that no model and no communication is actually full-fidelity. Everything any agent knows is compressed and simplified from reality. The question is how much further compression is valuable for what purposes.
Essentialism is wrong. Conceptual compression and simplified modeling are always necessary, and sometimes even an extreme compaction is good enough for a purpose.