Rationality Reading Group: Fake Beliefs (p43-77)
This is part of a semi-monthly reading group on Eliezer Yudkowsky’s ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.
Welcome to the Rationality reading group. This week we discuss the sequence Fake Beliefs, which introduces the concept of belief in belief and demonstrates the phenomenon in a number of contexts, most notably as it relates to religion. The sequence also foreshadows the mind-killing effects of tribalism and politics, introducing some of the language (e.g. Green vs. Blue) that will be used later.
This post summarizes each article of the sequence, linking to the original LessWrong posting where available, and offers a few relevant notes, thoughts, and ideas for further investigation. My own thoughts and questions for discussion are in the comments.
Reading: Sequence B: Fake Beliefs (p43-77)
B. Fake Beliefs
11. Making beliefs pay rent (in anticipated experiences). Belief networks which have no connection to anticipated experience are called “floating” beliefs. Floating beliefs provide no benefit, as they do not constrain predictions in any way. Ask of a belief what you expect to see if it is true, or better yet what you expect not to see: what evidence would falsify the belief. Every belief should flow into a specific anticipated experience, and should continue to pay rent in future anticipations. If a belief turns deadbeat, evict it. (See the toy scoring sketch after this list.) (p45-48)
12. A fable of science and politics. A narrative story cautioning against the dangers that come from emotional attachment to beliefs. Introduces the Greens vs. Blues, a fictional debate illustrating the biases which emerge from the tribalism of group politics. (p49-53)
13. Belief in belief. Through the story of someone who claims a dragon lives in his garage, an invisible, inaudible, impermeable dragon which defies all attempts at detection, we are introduced to the concept of belief in belief. The dragon claimant believes that there is a fire-breathing flying animal in his garage, yet simultaneously expects to make no observations that would confirm that belief. Belief in belief becomes a form of mental jujutsu in which mental models are transfigured in the face of experiment so as to predict whatever would be expected if the belief were not, in fact, true. (See the Bayesian gloss after this list.) (p54-58)
14. Bayesian judo. A humorous story illustrating the inconsistency of belief in belief, and the mental jujutsu required to maintain such beliefs. (p59-60)
15. Pretending to be wise. There’s a difference between: (1) passing neutral judgment; (2) declining to invest marginal resources in investigating the sides of a debate; and (3) pretending that either of the above is a mark of deep wisdom, maturity, and a superior vantage point. Propounding neutrality is just as attackable as propounding any particular side. (p61-64)
16. Religion’s claim to be non-disprovable. The idea that religion is something which cannot be proven or disproven is only a recent development in Western thought. Many examples are provided of falsifiable beliefs which were once the domain of religion. (p65-68)
17. Professing and cheering. Much of modern religion can be thought of as communal profession of belief – actions and words which signal your belief to others. (p69-71)
18. Belief as attire. It is very easy for a human being to genuinely, passionately, gut-level belong to a group. Identifying with a tribe is a very strong emotional force. And once you get people to identify with a tribe, the beliefs which are the attire of that tribe will be spoken with the full passion of belonging to it. (p72-73)
19. Applause lights. Sometimes statements take the form of proposals while themselves presenting no meaningful suggestion, e.g. “We need to balance the risks and opportunities of AI.” Such a statement is not so much a proposition as the equivalent of the “Applause” light that tells a studio audience when to clap. Most applause lights can be detected by a simple reversal test: “We shouldn’t balance the risks and opportunities of AI.” Since the reversal sounds abnormal, the unreversed statement is probably normal, implying it does not convey new information. (A rough information-theoretic gloss appears after this list.) (p74-77)
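As a toy illustration of item 11 (my own sketch, not from the book): model a belief as the probability it assigns to an observable outcome, and its “rent” as the log score earned when that outcome is checked against reality. The function and numbers below are invented for the example.

```python
import math

def log_score(p_assigned: float, outcome_happened: bool) -> float:
    """Log score ("rent") a belief earns on one checked observation.

    p_assigned is the probability the belief gave to the outcome
    occurring. A perfect prediction scores 0; the score grows more
    negative the more the belief is surprised by reality.
    """
    p = p_assigned if outcome_happened else 1.0 - p_assigned
    return math.log2(p)

# A belief that constrains anticipation: it assigned 0.95 to an
# outcome that then happened, so it pays cheap rent.
print(log_score(0.95, True))   # ~ -0.07 bits

# A "floating" belief constrains nothing, so on any observable
# yes/no question it can do no better than a uniform 0.5.
print(log_score(0.50, True))   # -1.0 bits: a deadbeat tenant
```

Under this scoring rule, a belief that never assigns sharper-than-uniform probabilities never outperforms having no belief at all, which is one way of cashing out “floating beliefs provide no benefit.”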
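Item 13 also admits a compact Bayesian gloss (my framing, not the essay’s): a hypothesis engineered to predict exactly the same observations as its negation can never gain or lose evidence. For any experimental result E, the posterior odds are

\[
\frac{P(\text{dragon} \mid E)}{P(\text{no dragon} \mid E)} = \frac{P(E \mid \text{dragon})}{P(E \mid \text{no dragon})} \cdot \frac{P(\text{dragon})}{P(\text{no dragon})}.
\]

Because the claimant retrofits his model so that P(E | dragon) = P(E | no dragon) for every test anyone proposes, the likelihood ratio is always 1 and the posterior odds always equal the prior odds; the “belief” does no predictive work at all.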
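Finally, a rough way to see why the reversal test in item 19 works (again my gloss): the information carried by asserting a statement s can be measured by its surprisal,

\[
I(s) = -\log_2 P(s),
\]

where P(s) is, loosely, the probability that a speaker in the given context would assert s. If the reversal of s is something nobody would ever say, then P(s) is close to 1, so I(s) is close to 0 bits: the statement is an applause light, not news.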
This has been a collection of notes on the assigned sequence for this week. The most important part of the reading group though is discussion, which is in the comments section. I pose some questions for you there, and I invite you to add your own. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!
The next reading will cover Sequence C: Noticing Confusion (p79-114). The discussion will go live on Wednesday, 20 May 2015 at or around 6pm PDT (hopefully), right here on the discussion forum of LessWrong.