Maybe the lowest-hanging fruit was already picked. If someone tried to write Sequences 2.0, what would it be about? Cognitive biases that Eliezer skipped?
Something I feel Yudkowsky doesn’t really talk about enough in the Sequences is how to be rational in a group, as part of a group and as a group. There is some material on this in there, and HPMOR also offers some, but there’s very little that is as formalized as the ideas around “Politics is the Mindkiller/Spiders/Hard Mode” or “the Typical Mind Fallacy.”
Something Yudkowsky also mentions is that what he writes about rationality is his path. Some things generalize (most people have the same cognitive biases, but in different amounts). From reading the final parts of the Sequences and the final moments of HPMOR, I get the vibe that Yudkowsky really wants people to develop their own path. Alicorn did this, and Yvain also did/does it to some extent (and, reading the early non-Sequence posts, I think MBlume also did this a bit), but it’s something more could be written about. Now, I agree that this is hard: the lowest fruit probably is already picked, and it’s not something everyone can do. But I find it hard to believe that there are just 3 or 4 people who can actually do this. The bonobo rationalists on tumblr are, in their own weird way, trying to find a good way to exist in the world in relation to other people. Some of this is formalized, but most of it exists in conversations on tumblr (which is an incredibly annoying medium, both to read and to share). Other people/places from the Map probably do stuff like that as well. I take this as evidence that there is still fruit low enough to pick without needing a ladder.
Something I feel Yudkowsky doesn’t really talk about enough in the Sequences is how to be rational in a group, as part of a group and as a group.
I’ve been working on a series of posts centered around this: social rationality, if you will. So far, the best source for such material remains Yvain’s writings on the topic on his blog; he really nails the art of having sane discussions. He popularised some ways of framing debate tactics, such as motte-and-bailey, steelmanning, bravery debates and so on, which have entered the SSC jargon.
I’m interested in expanding on that theme with topics such as emphasis fights (“yes, but”-ing) or arguing in bad faith, as examples of failure modes in collective truth-seeking. But in the end it all hinges on an ideally shared perception of morality, or of standards to hold oneself to. My approach relies heavily on motives and on my personal conception of morality, which is why it’s difficult to teach it without looking like I’m preaching it. (Eliezer, at least, didn’t seem too concerned about that, but not everyone has the fortune to be him.) Besides, it’s a very complex and murky field, one best learned through experience and examples.