[Note: somewhat taking you up on the Crocker’s rules]
Peterson’s truth-seeking and data-processing juice is in the super-heavyweight class, comparable to Eliezer etc. Please don’t make the mistake of lightly saying he’s “wrong on many things”.
At the level of analysis in your post and the linked Medium article, I don’t think you can safely say Peterson is “technically wrong” about anything; it’s overwhelmingly more likely you just didn’t understand what he means. [it’s possible to make more case-specific arguments here, but I think the outside-view meta-rationality argument should be enough...]
If you want me to accept JBP as an authority on technical truth (like Eliezer or Scott are), then I would like to actually see some case-specific arguments. Since I found the case-specific arguments to go against Peterson on the issues where I disagree, I’m not really going to change my mind on the basis of just your own authority backing Peterson’s authority.
For example: the main proof Peterson cites to show he was right about C-16 being the end of free speech is the Lindsay Shepherd fiasco. Except her case wasn’t even in the relevant jurisdiction, which the university itself admitted! The Shepherd case was about C-16, but no one thinks (anymore) that she was in any way in violation of C-16 or could be punished under it. I’ll admit JBP was right when Shepherd is dragged to jail by the secret police.
Where I think Peterson goes wrong most often is when he overgeneralizes from the small and biased sample of his own experience. Eating nothing but chicken and greens helped cure Peterson’s own rare autoimmune disease, so now everyone should stop eating carbs forever. He almost never qualifies his opinions or the advice he gives, or specifies that it only applies to a specific group. Here’s a good explanation by Scott of why this approach is a problem.
This leads me to the main issue where I’d really like to know if Peterson is technically right or wrong: how much of a threat are the “postmodernist neo-Marxists” to our civilization? Peterson’s answer is “100%, gulags on the way”, but he’s also a professor at a liberal university. That’s where the postmodernists are, but it’s not really representative of where civilization is. I think it would be very hard for anyone to extrapolate carefully about society at large from such an extreme situation, and I haven’t seen evidence that Peterson can be trusted to do so.
[Please delete this thread if you think this is getting out of hand. Because it might :)]
I’m not really going to change my mind on the basis of just your own authority backing Peterson’s authority.
See, right here, you haven’t listened. What I’m saying is that there is some fairly objective quality, which I called “truth-seeking juice”, about people like Peterson, Eliezer and Scott which you can evaluate by yourself. But you have just dug yourself into the same trap a little bit more. From what you write, your heuristics for evaluating sources seem to be a combination of authority and fact-checking isolated pieces (regardless of how much you understand the whole picture). Those are really bad heuristics!
The only reason why Eliezer and Scott seem trustworthy to you is that their big picture is similar to your default, so what they say is automatically parsed as true/sensible. They make tons of mistakes and might fairly be called “technically wrong on many things”. And yet you don’t care, because when you feel their big picture is right, those mistakes feel to you like not-really-mistakes.
Here’s an example of someone who doesn’t automatically get Eliezer’s big picture, and thinks very sensibly from their own perspective:
On a charitable interpretation of pop Bayesianism, its message is:
Everyone needs to understand basic probability theory!
That is a sentiment I agree with violently. I think most people could understand probability, and it should be taught in high school. It’s not really difficult, and it’s incredibly valuable. For instance, many public policy issues can’t properly be understood without probability theory.
Unfortunately, if this is the pop Bayesians’ agenda, they aren’t going at it right. They preach almost exclusively a formula called Bayes’ Rule. (The start of Julia Galef’s video features it in neon.) That is not a good way to teach probability.
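[For reference, the one-line formula in question relates the probability of a hypothesis H before and after seeing evidence E:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

A toy illustration, with made-up numbers: for a condition with a 1% base rate and a test that detects it 90% of the time while false-alarming 9% of the time, $P(\text{condition} \mid \text{positive}) = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.09 \times 0.99} \approx 9\%$. The rule itself is one line; understanding probability well enough to apply it is the hard part, which seems to be Chapman’s complaint.]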
How about you go read that and try to mentally swap places? The degree to which Chapman doesn’t get Eliezer’s big picture is probably similar to the degree to which you don’t get Peterson’s big picture, with similar results.
I’m worried we may be falling into an argument about definitions, which seems to happen a lot around JBP. Let me try to sharpen some distinctions.
In your quote, Chapman disagrees with Eliezer about his general approach, or perhaps about what Eliezer finds meaningful, but not about matters of fact. I disagree with JBP about matters of fact.
My best guess at what “truth-seeking juice” means comes in two parts: a desire to find the truth, and a methodology for doing so. All three of Eliezer/Scott/JBP have the first part down, but their methodologies are very different. Eliezer’s strength is overcoming bias, Scott’s is integrating scientific evidence, and I believe they’re very good at these because I’ve seen them do it a lot and be wrong about facts very, very rarely. In this post I actually disagree with Eliezer about a matter of fact (how many people before modernity were Biblical literalists), and I do so with some trepidation.
JBP’s methodology is optimized for finding his own truth, the metaphorical kind. Just as Scott has a track record of being right in science debates, JBP has a track record of all his ideas fitting into a coherent and inspirational worldview—his big picture. When I say he’s wrong I don’t mean his big picture is bad. I mean he’s wrong about facts, and that the Peterson mask is dangerous when one needs to get the facts right.
I notice that my default sense is that Jacob is making a reasonable ask here, but also that Squirrel seems to be trying to do something similar to what I just felt compelled to do on a different thread, so I feel obliged to lean into it a bit.
I’m not sure...
a) how to handle this sort of disagreeing on vantage points, where it’s hard to disentangle ‘person has an important frame that you’re not seeing that is worth at least having the ability to step inside’ vs ‘person is just wrong’ and ‘person is trying to help you step inside a frame’ vs ‘person is making an opaque-and-wrong appeal to authority’ (or various shades of similar issues).
or, on the meta level:
b) what reasonable norms/expectations on LessWrong for handling that sort of thing are. Err on one side and a lot of people miss important things, err on another side and people waste a lot of time on views that maybe have interesting frames but… just aren’t very good. (I like that Jacob set pretty good discussion norms on this thread but this is a thing I’m thinking a lot about right now in the general case).
As of now I have not read anything about Peterson besides this post and one friend’s Facebook review of his book, so I don’t have a horse in the object-level discussion.
Jacob: I think what Squirrel is saying is that your focus on the object-level claims from within your current frame is causing you to miss important insights you could grok if you were trying harder to step inside Jordan’s frame (as opposed to what you are currently doing, which looks more like “explaining his frame from inside your frame.”)
[To be clear, your frame, which I share, seems like a really super great way to see the world and possibly literally the best one, but I think the mental skill of deeply inhabiting other worldviews is important, albeit for reasons I’d probably need to spend 10 hours thinking about in order to fully justify]
[[Also, insofar as chains of authority are worth listening to and insofar as I get any authority cred, I think Squirrel is pretty worth listening to as a filter for directing your attention at things that might be nonsense or might be weirdly important]]
Squirrel: I’d tentatively guess that you’d make better headway trying to describe Jordan’s frame and what value you got out of it than the hard-to-tell-from-argument-by-authority thing you’re currently doing, although also I think it may have been correct to do the first two comments you did before getting to that point anyway, dunno.
Meta: I think it’s a reasonable norm on LW to expect people to acquire the “absorb weird frames you don’t understand” skill, but also reasonable to have the default frame be “the sort of approach outlined in the sequences”, and to try as best you can to make foreign frames legible within that paradigm.
Ray, are you 100% sure that’s what is actually going on?
Let’s introduce some notation, following the OP: there are (at least) two relevant frameworks of truth, the technical, which we’ll denote T, and the metaphorical, M. In this community we should be able to agree what T is, and I may or may not be confused about what M is and how it relates to T. I wrote this post specifically to talk about M, but I don’t think that’s where Squirrel and I are in disagreement.
My post explicitly said that I think that Peterson is M.right even though he’s T.wrong-on-many-things. Squirrel didn’t say they (he? she? ze?) “got some value” out of Peterson in the M-framework. They explicitly said that he’s not wrong-on-many-things in the T framework, the same way Eliezer is T.correct. Well, Eliezer told me how to assess whether someone is T.correct—I look at the evidence in the object-level claims.
If someone thinks I’m doing T wrong and misapplying rationality, I’m going to need specifics. Ditto if someone thinks that Eliezer is also T.wrong-on-many-things and I don’t notice that because I’m deluding myself. So far, I’m the only one who has come up with an example of where I think that Eliezer is T.wrong.
My point when talking about Squirrel’s authority isn’t to belittle them, but to say that changing my mind would require a bit more effort, if anyone feels up to it. It should be obvious that my own framework is such that saying “truth juice” is unlikely to move me. I want to be moved! I’ve been spelling out the details not because I want to fight over C-16 or low carb breakfasts, but to make it easier for people who want to convince me or change my framework to see where the handles are. And I’ve tried to introduce specific language so we don’t talk past each other (Rule 10: be precise in your speech).
Of course, that doesn’t make me entitled to people’s efforts. If you have something more fun to do on a Sunday, no hard feelings :)
Ray, are you 100% sure that’s what is actually going on?
Nope! (It was my best guess, which is why I used some words like “seems” and “I think that Squirrel is saying”)
But, it sounds from the other comment like I got it about right.
I agree that persuading someone to step harder into a frame requires a fair bit more effort than what Squirrel has done so far (I’ve never seen anyone convince someone of this sort of thing in one sitting; it always seems to require direct chains of trust, often over years, but I think the art of talking about this usefully has a lot of room for progress).
They explicitly said that he’s not wrong-on-many-things in the T framework, the same way Eliezer is T.correct.
Frustrating, that’s not what I said! Rule 10: be precise in your speech, Rule 10b: be precise in your reading and listening :P My wording was quite purposeful:
I don’t think you can safely say Peterson is “technically wrong” about anything
I think Raemon read my comments the way I intended them. I hoped to push on a frame people seem to be (according to my private, unjustified, wanton opinion) obviously too stuck in. See also my reply below.
I’m sorry if my phrasing seemed conflict-y to you. I think the fact that Eliezer has high status in the community and Peterson has low status is making people stupid about this issue, and this makes me write in a certain style in which I sort of intend to push on status because that’s what I think is actually stopping people from thinking here.
Your reply below says:
Yeah, these are issues outside of his cognitive expertise and it’s quite clear that he’s getting them wrong… you are mostly accusing him of getting things wrong about which he never cared in the first place.
What exactly did you think I meant when I said he’s “technically wrong about many things” and you told me to be careful? I meant something very close to what your quote says, I don’t even know if we’re disagreeing about anything.
And by the way, there is plenty of room for disagreement. alkjash just wrote what I thought you were going to, a detailed point-by-point argument for why Peterson isn’t, in fact, wrong. There’s a big difference between alkjash’s “Peterson doesn’t say what you think he says” and “Peterson says what you think and he’s wrong, but it’s not important to the big picture”. If Peterson really says “humans can’t do math without terminal values” that’s a very interesting statement, certainly not one that I can judge as obviously wrong.
I did in fact have something between those two in mind, and was even ready to defend it, but then I basically remembered that LW is status-crazy and gave up on fighting that uphill battle. Kudos to alkjash for the fighting spirit.
I think you should consider the possibility that the not-very-positive reaction your comments about Peterson here have received may have a cause other than status-fighting.
(LW is one of the less status-crazy places I’m familiar with. The complaints about Peterson in this discussion do not look to me as if they are primarily motivated by status concerns. Some of your comments about him seem needlessly status-defensive, though.)
Not to sound glib, but what good is LW status if you don’t use it to freely express your opinions and engage in discussion on LW?
The same is true of other things: blog/Twitter followers, Facebook likes etc. are important inasmuch as they give me the ability to spread my message to more people. If I never said anything controversial for fear of losing measurable status, I would be foregoing all the benefits of acquiring it in the first place.
Not to sound glib, but what good is LW status if you don’t use it to freely express your opinions and engage in discussion on LW?
Getting laid, for one thing.
And, you know, LW is a social group. Status is its own reward. High-status people probably feel better about themselves than low-status people do, and an increase in status will probably make people feel better about themselves than they used to.
Eric Hoffer was a longshoreman who just happened to write wildly popular philosophy books, but I think he’d agree that that’s not terribly usual.
Yeah, I thought it could be something like that. I don’t live in Berkeley, and no woman who has ever slept with me cared one jot about my LW karma.
With that said, the kind of status that can be gained or lost by debating the technical correctness of claims JBP makes with someone you don’t know personally seems too far removed from anyone’s actual social life to have an impact on getting laid one way or another.
Perhaps you can explain what Peterson really means when he says that he really believes that the double helix structure of DNA is being depicted in ancient Egyptian and Chinese art.
What does he really mean when he says, “Proof itself, of any sort, is impossible, without an axiom (as Godel proved). Thus faith in God is a prerequisite for all proof.”?
Why does he seem to believe in Jung’s paranormal concept of “synchronicity”?
Why does he think quantum mechanics means consciousness creates reality, and confuse the Copenhagen interpretation with Wheeler’s participatory anthropic principle?
Peterson gets many things wrong—not just technically wrong, but deeply wrong, wrong on the level of “ancient aliens built the pyramids”. He’s far too willing to indulge in mysticism, and has a fundamental lack of skepticism or anything approaching appropriate rigor when it comes to certain pet ideas.
He isn’t an intellectual super-heavyweight, he’s Deepak Chopra for people who know how to code. We can do better.
Rationalists have also been known to talk about some kooky-sounding stuff. Here’s Val from CFAR describing something that sounds a lot like Peterson’s “synchronicity”:
After a sequence of mythic exploration and omens, it seemed clear to me that I needed to visit New York City. I was actually ready to hop on a plane the day after we’d finished with a CFAR workshop… but a bunch of projects showed up as important for me to deal with over the following week. So I booked plane tickets for a week later.
When I arrived, it turned out that the Shaolin monk who teaches there was arriving back from a weeks-long trip from Argentina that day.
This is a kind of thing I’ve come to expect from mythic mode. I could have used murphyjitsu to hopefully notice that maybe the monk wouldn’t be there and then called to check, and then carefully timed my trip to coincide with when he’s there. But from inside mythic mode, that wouldn’t have mattered: either it would just work out (like it did); or it was fated within the script that it wouldn’t work out, in which case some problem I didn’t anticipate would appear anyway (e.g., I might have just failed to think of the monk possibly traveling). My landing the same day he returned, as a result of my just happening to need to wait a week… is the kind of coincidence one just gets used to after a while of operating mythically.
I would guess that the same people who objected to those paragraphs, also object to similar paragraphs by Peterson (at least I object to both on similar grounds).
Cool examples, thanks! Yeah, these are issues outside of his cognitive expertise and it’s quite clear that he’s getting them wrong.
Note that I never said that Peterson isn’t making mistakes (I’m quite careful with my wording!). I said that his truth-seeking power is in the same weight class, but obviously he has a different kind of power than LW-style. E.g. he’s less able to deal with cognitive bias.
But if you are doing “fact-checking” in LW style, you are mostly accusing him of getting things wrong about which he never cared in the first place.
Like when Eliezer uses phlogiston as an example in the Sequences and gets the historical facts wrong. But that doesn’t make Eliezer wrong in any meaningful sense, because that’s not what he was talking about.
There’s some basic courtesy in listening to someone’s message, not words.
Sorry, but I think that is a lame response. It really, really isn’t just lack of expertise—it’s a matter of Peterson’s abandonment of skepticism and scholarly integrity. I’m sorry, but you don’t need to be a historian to tell that the ancient Egyptians didn’t know about the structure of DNA. You don’t need to be a statistician to know that coincidences don’t disprove scientific materialism. Peterson is a PhD who knows from experience the level of due diligence needed to publish in peer-reviewed journals. He knows better but did it anyway.
But if you are doing “fact-checking” in LW style, you are mostly accusing him of getting things wrong about which he never cared in the first place.
He cares enough to tell his students, explicitly, that he “really does believe” that ancient art depicts DNA—repeatedly!—and to put it in public YouTube videos with his real name and face.
Like when Eliezer uses phlogiston as an example in the Sequences and gets the historical facts wrong.
It’s more like if Eliezer used the “ancient aliens built the pyramids” theory as an example in one of the sequences in a way that made it clear that he really does believe aliens built the pyramids. It’s stupid to believe it in the first place, and it’s stupid to use it as an example.
There’s some basic courtesy in listening to someone’s message, not words.
Then what makes Peterson so special? Why should I pay more attention to him than, say, Deepak Chopra? Or an Islamist cleric? Or a postmodernist gender studies professor who thinks western science is just a tool of patriarchal oppression? Might they also have messages that are “metaphorically true” even though their words are actually bunk? If Peterson gets the benefit of the doubt when he says stupid things, why shouldn’t everybody else? If one uses enough mental gymnastics, almost anything can be made to be “metaphorically true”.
Peterson’s fans are too emotionally invested in him to really consider what he’s saying rationally—akin to religious believers. Yes, he gives his audience motivation and meaning—much in the same way religion does for other demographics—but that can be a very powerful emotional blinder. If you really think that something gives your life meaning and motivation, you’ll overlook its flaws, even when it means weakening your epistemology.
It’s not surprising when religious believers retreat to the claim that their holy texts are “metaphorically true” when they’re confronted with evidence that their text is literally false—but it’s embarrassing to see a supposed rationalist do the same when someone criticizes their favorite guru. We’re supposed to know better.
This is what the whole discussion is about. You are setting boundaries that are convenient for you, and refuse to think further. But some people in that reference class you are now denigrating as a whole are different from others. Some actually know their stuff and are not charlatans. Throwing a tantrum about it doesn’t change it.
Then what the heck do you mean by “equal in truth-seeking ability”?
(I upvoted that comment, but:) Truth-seeking is more than avoiding bias, just as typing is more than not hitting the wrong keys and drawing is more than not making your lines crooked when you want them straight.
Someone might have deep insight into human nature; or outstanding skill in finding mathematical proofs; or a mind exceptionally fertile in generating new ideas, some of which turn out to be right; or an encyclopaedic knowledge of certain fields. Any of those would enhance their truth-seeking ability considerably. If they happen not to be particularly good at avoiding bias, that will worsen their truth-seeking ability. But they might still be better overall than someone with exceptional ability to avoid bias but without their particular skills.