This seems related to something I’ve been thinking about recently: that the concept of “belief” would benefit from an analysis along the lines of How an Algorithm Feels from the Inside. What we describe as our “beliefs” are sometimes a map of the world (in the beliefs-paying-rent sense), sometimes a signal to our social group that we share their map of the world, sometimes a declaration of values, and probably sometimes other (often contradictory) things as well. Yet we act as if there’s a single mental concept underlying them all. The ambiguities are hard to shake out, I think, because the signal version is only useful if it pretends to be the map version.
(I feel sour about human nature whenever I start thinking about this, because it leaves me feeling like almost all communication is either speaking in bad faith, or displaying a complete lack of intellectual integrity, or both.)