I think this post is emblematic of the problem I have with most of Val’s writing: there are useful nuggets of insight here and there, but you’re meant to swallow them along with a metric ton of typical mind fallacy, projection, confirmation bias, and manipulative narrativemancy.
Elsewhere, Val has written words approximated by ~”I tried for years to fit my words into the shape the rationalists wanted me to, and now I’ve given up and I’m just going to speak my mind.”
This is what it sounds like when you are blind to an important distinction: trying to hedge-magic things that you do not grok, engaging in cargo culting. If it feels like tediously shuffling around words and phrases that all mean exactly the same thing, you’re missing the vast distances on the axis that you aren’t perceiving.
The core message of “hey, you might well be caught up in a false narrative that is doing emotional work for you via providing some sense of meaning or purpose and yanking you around by your panic systems, and recognizing that fact can allow you to do anything else” is a good one, and indeed it’s one that many LessWrongers need. It’s even the sort of message that needs some kind of shock along with it, to make readers go “oh shit, that might actually be me.”
But that message does not need to come along with a million little manipulations. That message isn’t improved by attempts to hypnotize the audience, or set up little narrative traps.
e.g. starting with “There’s a kind of game, here, and it’s rude to point out, and you’re not supposed to name it, but I’m going to.” <—I’m one of the cool ones who sees the Matrix! I’m brave and I’m gonna buck the rules! (Reminiscent of a right-wing radio host going “you get punished if you say X” and then going on to spend twenty minutes on X without being punished. It’s a cheap attempt to inflate the importance of the message and the messenger.)
e.g. “I really do respect the right for folk to keep playing it if they want” <—More delegitimization, more status moves. A strong implication along the lines of “the illusion that I, Val, have correctly identified is the only thing happening here.” Not even a token acknowledgement of the possibility that perhaps some of it is not this particular game; no thought given to the possibility that maybe Val is flawed in a way that is not true of all the other LWers. Like the Mythbusters leaping from “well, we couldn’t recreate it” to “therefore, it’s impossible and it never happened, myth BUSTED.”
(I’m really really really tired of the dynamic where someone notices that they’ve been making Mistake X for many years and then just presumes that everyone else is making it too, and is just as blind to it as they themselves were. It especially rankles when they’re magnanimous about it.)
e.g. “You have to live in a kind of mental illusion to be in terror of the end of the world.” <—More projection, more typical minding, more ~”I’ve comprehended all of the gears here and there’s no way anything else could lead to appropriate terror of the end of the world. The mistake I made is the mistake everyone’s making (but don’t worry, I’m here to guide you out with my superior wisdom, being as I am ahead of you on this one).” See also the actual quote “for what it’s worth, as someone who turned off the game and has reworked his body’s use of power quite a lot, it’s pretty obvious to me that this isn’t how it works,” which, like basically everything else here, is conspicuously missing a pretty damn important “for me.” The idea that other people might be doing something other than what Val comprehends seems literally not to occur to him.
e.g. “I mean this with respect and admiration. It’s very skillful. Eliezer has incredible mastery in how he weaves terror and insight together.” <—Look! See how I’m above it all, and in a position to evaluate what’s going on? Pay no attention to the fact that this incidentally raises my apparent status, btw.
e.g. “In case that was too opaque for you just yet, I basically just said ‘Your thoughts will do what they can to distract you from your true underlying fear.’ … This is slow work. Unfortunately your ‘drug’ supply is internal, so getting sober is quite a trick.” <—If your experience doesn’t match my predictions, it’s because you’re unskillful, and making [mistake]...but don’t worry, with my “yet” I will subtly imply that if you just keep on listening to my voice, you will eventually see the light. Pay no attention to the fully general counterevidence-dismissing system I’m setting up.
Again, it’s a shame, because bits like “If your body’s emergency mobilization systems are running in response to an issue, but your survival doesn’t actually depend on actions on a timescale of minutes, then you are not perceiving reality accurately” are well worth considering. But the essay sort of forces you to step into Val’s (broken, self-serving, overconfident) frame in order to catch those nuggets. And, among readers who are consciously wise or unconsciously allergic to the sort of manipulation he’s trying to pull, many of them will simply bounce off the thing entirely, and not catch those useful nuggets.
It didn’t have to be this way. It didn’t have to be arrogant and project-y and author-elevating and oh-so-cynical-and-aloof. There’s another version of this essay out there in possibility space that contains all of the good insights and none of the poison.
But that’s not what we got. Instead, we got a thing that (it seems to me (though I could be wrong)) had the net effect of marginally shifting LW’s discourse in the wrong direction, by virtue of being a popular performance piece wrapped around an actually useful insight or two. It normalizes a kind of sloppy failure-to-be-careful-and-clear that is antithetical to the mission of becoming less wrong. I think this essay lowered the quality of thinking on the site, even as it performed the genuinely useful service of opening some eyes to the problem Val has identified.
(Because no, of course Val was not alone in this issue, it really is a problem that affects Lots Of Humans, it’s just not the only thing going on. Some humans really do just … not have those particular flaws. When you’re colorblind, you can’t see that there are colors that you can’t see, and so it’s hard to account for them, especially if you’re not even bothering to try.)
I’ve been reflecting on this since it was posted. Coming back to it from time to time.
I just wanted to make a note saying: received. I believe I see your point, and I’ve been taking it in.
(I also disagree with some of the narrativemancy you employ here about me. It’s difficult for me to publicly agree with anything in your comment here because of a similar mechanism that you’re objecting to in the OP. I wish I didn’t need to add this note. I’d rather just say “I’m hearing something meaningful in what you’re saying and have been taking it in.” That’s the part that matters. But I can do so only if I also register that I very much disagree with your model of what I was trying to do, and in some cases I strongly disagree with your analysis of what I was in fact doing. Thankfully I can still learn from your message anyway. I just wanted to say — more to LW than to you, really — that I’ve heard your objection and have been taking the truth I can find in it seriously.)
Stumbled upon this. Duncan overlooked the fact that there are good reasons to use a very forceful framing when you’re doing an intervention to help an addict. He got offended by the generalizations because of his tendency to take all such generalizations personally. This distortion causes him to do things like claiming you made “not even a token acknowledgement of the possibility that perhaps some of it is not this particular game”, which is flatly, unambiguously false, because your entire last section of the post was practically filled with such token acknowledgements.
Why does Duncan do this? Because he has an elaborately constructed self-narrative that he is in love with, and when people disagree with his self-narrative, he feels profoundly invalidated and underestimated, and has a distinct habit of phrasing this as “you don’t exist, Duncan”. Actually it’s just a very particular form of vulnerable narcissism, and many aspects of his self-narrative are unambiguously untrue if you just look.
In short, you’re good, and he’s not actually making a point that’s worth taking in. If anything your intervention is not forceful enough. There are many people in this comment section behaving like abject drug addicts and completely failing to realize it.
Though I do feel I ought to add that Buddhism is itself another addiction of this sort, much like Christianity is. Where Christianity rejects worldliness, Buddhism rejects samsara. Both of these are actually rejections of embodiment, even if they do sometimes use embodiment instrumentally. If you’re interested in Eastern philosophical traditions, I strongly recommend Chinese chan over Japanese zen. And when it does come to zen, rinzai zen is a better choice than soto zen.
Edit: upon reflection I agree that the comment was too combative. I still qualitatively endorse most of the claims made, though I think the harshness is misleading. For example, I think the “vulnerable narcissism” thing is technically true, but misleading because it is mitigated by a sufficient level of principled virtue that its connotations are simply too harsh for the description to properly apply. In short, it’s a characterisation that is more technically accurate than emotively accurate.
Thank you.
and [Duncan is] not actually making a point that’s worth taking in.
I disagree. I learned something about what he’s been trying to say to me for years from his reply here.
One of my ongoing frustrations in my life has been that I’ve been very right about some things that really matter a lot, but because I was irritated or triggered or added elements that people could tell were off the mark, they dismissed the part that was actually important.
I think Duncan made a move like that here. He said something unskillfully that nonetheless had truth in it. In his unskillfulness he also added some framing effects that I frankly resent and that made it quite hard for me to be seen as taking it in. And yet, there’s still something to it. I suspect that at least some of the many people who upvoted his review were responding to that truth.
I can’t really control whether other people respectfully listen to the message underneath my difficulty in expressing it. But I can at least try to offer that listening to others. And benefit from the underlying truth they’re trying to express. And maybe even express it differently so that more people can hear it!
I like the world where that attitude is much more common. I can’t make it more common, but I can at least try to live by it myself, and possibly that’ll inspire something similar by example.
For clarification, I don’t think Duncan is actually playing the game your post was describing, but I think virtually all the other commenters who objected to your post were. I think this justifies the forceful framing of the overall intervention, all the more so because Duncan is averse to generalizations anyway and thus unlikely to be swayed by them when it comes to his own self-concept.
But also, Duncan Sabien’s psychology simply isn’t as rare as he seems to believe. I actually like him in a personal sense and find him interesting as an exemplar of a particular worldview that Duncan distills to an unusual purity. But the worldview is not particularly uncommon, and Duncan stands out only by the extent to which he takes it. Roughly speaking, the worldview comes from a merger of technocratic liberalism (think Keynesianism and Chicago school, both originating ultimately from Fabian socialism), itself tinged heavily by social justice (which traces partly to ecumenism via the social gospel movement and partly to the New Left), and a sort of hybrid of libertarianism and mainstream Republicanism that forms the “right wing” in Silicon Valley and to an extent California more broadly, and definitely forms the “right wing” in LessWrong and SSC circles. The mainstream Republicanism is basically the exoteric counterpart to neoconservatism, which was founded by the Trotskyites James Burnham and Irving Kristol, and the libertarianism comes from Rothbard’s alliance with the old right, who of course drew heavily upon Ayn Rand’s synthesis of Misesianism with a Marxian sociology that begat right-wing syndicalism, agorism, etc. — so basically most of it just comes from Karl Marx when you trace the lineages back.
But the point I’m getting at is not that most of the ideologyspace of LessWrong and SSC and their peripheries traces largely to Karl Marx; the point is that it’s a highly specific form of Marxism, i.e. there are identifiable ideological and cultural currents that have shaped Duncan Sabien’s way of thinking, which you can plainly see by how he still today straddles the line between the social justice tinged liberal wing of LessWrong/SSC culture and the neocon-paleolibertarian wing of LessWrong/SSC culture. Add to this a certain level of neurodivergence, but also a frankly wholesome libidinous love of movement, and you basically wind up with Duncan Sabien. Like, it’s a highly specific combination, but it’s also basically a hegemonic politics, so if we loosen up the category to include not just Duncan Sabien but also people who are broadly like him, then we wind up with hundreds of thousands of highly educated, well-connected people who possess a lot of cultural capital.
My main gripe with Duncan — keeping in mind that I actually like the guy — is that he does not know himself. He has undoubtedly reflected on his own intellectual background, in the sense of thinking about when and how he changed his mind about important things, etc., but he has not done the work of studying his own intellectual lineage in depth. If he did, he’d see that his habitual “you don’t exist, Duncan” framing is missing the mark entirely.
Are there any similar versions of this post on LW which express the same message, but without the patronising tone of Valentine? Would that be valuable?
ICYMI—Kaj mentioned https://www.lesswrong.com/posts/byewoxJiAfwE6zpep/reality-revealing-and-reality-masking-puzzles in a comment below. I think Anna does a pretty good job of that in that post.