If you want me to accept JBP as an authority on technical truth (like Eliezer or Scott are), then I would like to actually see some case-specific arguments. Since I found the case-specific arguments to go against Peterson on the issues where I disagree, I’m not really going to change my mind on the basis of just your own authority backing Peterson’s authority.
For example: the main proof Peterson cites to show he was right about C-16 being the end of free speech is the Lindsay Shepherd fiasco. Except her case wasn’t even in the relevant jurisdiction, which the university itself admitted! The Shepherd case was about C-16, but no one thinks (anymore) that she was in any way in violation of C-16 or could be punished under it. I’ll admit JBP was right if and when Shepherd is dragged to jail by the secret police.
Where I think Peterson goes wrong most often is when he overgeneralizes from the small and biased sample of his own experience. Eating nothing but chicken and greens helped cure Peterson’s own rare autoimmune disease, so now everyone should stop eating carbs forever. He almost never qualifies his opinions or the advice he gives, or specifies that it applies only to a specific group. Here’s a good explanation by Scott of why this approach is a problem.
This leads me to the main issue where I’d really like to know if Peterson is technically right or wrong: how much of a threat are the “postmodernist neo-Marxists” to our civilization? Peterson’s answer is “100%, gulags on the way”, but note that he’s a professor at a liberal university. That’s where the postmodernists are, but it’s not really representative of where civilization is. I think it would be very hard for anyone to extrapolate carefully about society at large from such an extreme situation, and I haven’t seen evidence that Peterson can be trusted to do so.
[Please delete this thread if you think this is getting out of hand. Because it might :)]
I’m not really going to change my mind on the basis of just your own authority backing Peterson’s authority.
See right here, you haven’t listened. What I’m saying is that there is some fairly objective quality, which I called “truth-seeking juice”, about people like Peterson, Eliezer and Scott which you can evaluate by yourself. But you have just dug yourself into the same trap a little bit more. From what you write, your heuristics for evaluating sources seem to be a combination of authority and fact-checking isolated pieces (regardless of how much you understand the whole picture). Those are really bad heuristics!
The only reason why Eliezer and Scott seem trustworthy to you is that their big picture is similar to your default, so what they say is automatically parsed as true/sensible. They make tons of mistakes and might fairly be called “technically wrong on many things”. And yet you don’t care, because when you feel their big picture is right, those mistakes feel to you like not-really-mistakes.
Here’s an example of someone who doesn’t automatically get Eliezer’s big picture, and thinks very sensibly from their own perspective:
On a charitable interpretation of pop Bayesianism, its message is:
Everyone needs to understand basic probability theory!
That is a sentiment I agree with violently. I think most people could understand probability, and it should be taught in high school. It’s not really difficult, and it’s incredibly valuable. For instance, many public policy issues can’t properly be understood without probability theory.
Unfortunately, if this is the pop Bayesians’ agenda, they aren’t going at it right. They preach almost exclusively a formula called Bayes’ Rule. (The start of Julia Galef’s video features it in neon.) That is not a good way to teach probability.
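[For reference, the formula in question is Bayes’ Rule, P(H|E) = P(E|H)·P(H) / P(E): the probability of hypothesis H given evidence E, computed from the prior P(H) and the likelihood P(E|H). That one identity is what Galef’s video features in neon.]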
How about you go read that and try to mentally swap places? The degree to which Chapman doesn’t get Eliezer’s big picture is probably similar to the degree to which you don’t get Peterson’s big picture, with similar results.
I’m worried we may be falling into an argument about definitions, which seems to happen a lot around JBP. Let me try to sharpen some distinctions.
In your quote, Chapman disagrees with Eliezer about his general approach, or perhaps about what Eliezer finds meaningful, but not about matters of fact. I disagree with JBP about matters of fact.
My best guess at what “truth-seeking juice” means comes in two parts: a desire to find the truth, and a methodology for doing so. All three of Eliezer/Scott/JBP have the first part down, but their methodologies are very different. Eliezer’s strength is overcoming bias, Scott’s is integrating scientific evidence, and I believe they’re both very good at it because I’ve seen them do it a lot and be wrong about facts very, very rarely. In this post I actually disagree with Eliezer about a matter of fact (how many people before modernity were Biblical literalists), and I do so with some trepidation.
JBP’s methodology is optimized for finding his own truth, the metaphorical kind. Just as Scott has a track record of being right in science debates, JBP has a track record of all his ideas fitting into a coherent and inspirational worldview—his big picture. When I say he’s wrong I don’t mean his big picture is bad. I mean he’s wrong about facts, and that the Peterson mask is dangerous when one needs to get the facts right.
I notice that my default sense is that Jacob is making a reasonable ask here, but also that Squirrel seems to be trying to do something similar to what I just felt compelled to do on a different thread, so I feel obliged to lean into it a bit.
I’m not sure...
a) how to handle this sort of disagreeing on vantage points, where it’s hard to disentangle ‘person has an important frame that you’re not seeing that is worth at least having the ability to step inside’ vs ‘person is just wrong’ and ‘person is trying to help you step inside a frame’ vs ‘person is making an opaque-and-wrong appeal to authority’ (or various shades of similar issues).
or, on the meta level:
b) what reasonable norms/expectations on LessWrong for handling that sort of thing are. Err on one side and a lot of people miss important things, err on another side and people waste a lot of time on views that maybe have interesting frames but… just aren’t very good. (I like that Jacob set pretty good discussion norms on this thread but this is a thing I’m thinking a lot about right now in the general case).
As of now I have not read anything about Peterson besides this post and one friend’s facebook review of his book, so I don’t have a horse in the object level discussion.
Jacob: I think what Squirrel is saying is that your focus on the object level claims from within your current frame is causing you to miss important insights you could grok if you were trying harder to step inside Jordan’s frame (as opposed to what you are currently doing, which looks more like “explaining his frame from inside your frame.”)
[To be clear, your frame, which I share, seems like a really super great way to see the world and possibly literally the best one, but I think the mental skill of deeply inhabiting other worldviews is important, albeit for reasons I’d probably need to spend 10 hours thinking about in order to fully justify]
[[Also, insofar as chains of authority are worth listening to and insofar as I get any authority cred, I think Squirrel is pretty worth listening to as a filter for directing your attention at things that might be nonsense or might be weirdly important]]
Squirrel: I’d tentatively guess that you’d make better headway trying to describe Jordan’s frame and what value you got out of it than the hard-to-tell-from-argument-by-authority thing you’re currently doing, although also I think it may have been correct to do the first two comments you did before getting to that point anyway, dunno.
Meta: I think it’s a reasonable norm on LW to expect people to acquire the “absorb weird frames you don’t understand” skill, but also reasonable to have the default frame be “the sort of approach outlined in the sequences”, and to try as best you can to make foreign frames legible within that paradigm.
Ray, are you 100% sure that’s what is actually going on?
Let’s introduce some notation, following the OP: there are (at least) two relevant frameworks of truth, the technical, which we’ll denote T, and the metaphorical, M. In this community we should be able to agree what T is, and I may or may not be confused about what M is and how it relates to T. I wrote this post specifically to talk about M, but I don’t think that’s where Squirrel and I are in disagreement.
My post explicitly said that I think that Peterson is M.right even though he’s T.wrong-on-many-things. Squirrel didn’t say they (he? she? ze?) “got some value” out of Peterson in the M-framework. They explicitly said that he’s not wrong-on-many-things in the T framework, the same way Eliezer is T.correct. Well, Eliezer told me how to assess whether someone is T.correct—I look at the evidence in the object-level claims.
If someone thinks I’m doing T wrong and misapplying rationality, I’m going to need specifics. Ditto if someone thinks that Eliezer is also T.wrong-on-many-things and I don’t notice that because I’m deluding myself. So far, I’m the only one who has come up with an example of where I think that Eliezer is T.wrong.
My point when talking about Squirrel’s authority isn’t to belittle them, but to say that changing my mind would require a bit more effort, if anyone feels up to it. It should be obvious that my own framework is such that saying “truth juice” is unlikely to move me. I want to be moved! I’ve been spelling out the details not because I want to fight over C-16 or low carb breakfasts, but to make it easier for people who want to convince me or change my framework to see where the handles are. And I’ve tried to introduce specific language so we don’t talk past each other (Rule 10: be precise in your speech).
Of course, that doesn’t make me entitled to people’s efforts. If you have something more fun to do on a Sunday, no hard feelings :)
Ray, are you 100% sure that’s what is actually going on?
Nope! (It was my best guess, which is why I used some words like “seems” and “I think that Squirrel is saying”)
But it sounds from the other comment like I got it about right.
I agree that persuading someone to step harder into a frame requires a fair bit more effort than what Squirrel has done so far (I’ve never seen anyone convince someone of this sort of thing in one sitting; it always seems to require direct chains of trust, often over years, but I think the art of talking about this usefully has a lot of room for progress).
They explicitly said that he’s not wrong-on-many-things in the T framework, the same way Eliezer is T.correct.
Frustrating, that’s not what I said! Rule 10: be precise in your speech, Rule 10b: be precise in your reading and listening :P My wording was quite purposeful:
I don’t think you can safely say Peterson is “technically wrong” about anything
I think Raemon read my comments the way I intended them. I hoped to push on a frame people seem to be (according to my private, unjustified, wanton opinion) obviously too stuck in. See also my reply below.
I’m sorry if my phrasing seemed conflict-y to you. I think the fact that Eliezer has high status in the community and Peterson has low status is making people stupid about this issue, and this makes me write in a certain style in which I sort of intend to push on status because that’s what I think is actually stopping people from thinking here.
Your reply below says:
Yeah, these are issues outside of his cognitive expertise and it’s quite clear that he’s getting them wrong… you are mostly accusing him of getting things wrong about which he never cared in the first place.
What exactly did you think I meant when I said he’s “technically wrong about many things” and you told me to be careful? I meant something very close to what your quote says, I don’t even know if we’re disagreeing about anything.
And by the way, there is plenty of room for disagreement. alkjash just wrote what I thought you were going to, a detailed point-by-point argument for why Peterson isn’t, in fact, wrong. There’s a big difference between alkjash’s “Peterson doesn’t say what you think he says” and “Peterson says what you think and he’s wrong, but it’s not important to the big picture”. If Peterson really says “humans can’t do math without terminal values” that’s a very interesting statement, certainly not one that I can judge as obviously wrong.
I did in fact have something between those two in mind, and was even ready to defend it, but then I basically remembered that LW is status-crazy and gave up on fighting that uphill battle. Kudos to alkjash for the fighting spirit.
I think you should consider the possibility that the not-very-positive reaction your comments about Peterson here have received may have a cause other than status-fighting.
(LW is one of the less status-crazy places I’m familiar with. The complaints about Peterson in this discussion do not look to me as if they are primarily motivated by status concerns. Some of your comments about him seem needlessly status-defensive, though.)
Not to sound glib, but what good is LW status if you don’t use it to freely express your opinions and engage in discussion on LW?
The same is true of other things: blog/Twitter followers, Facebook likes etc. are important inasmuch as they give me the ability to spread my message to more people. If I never said anything controversial for fear of losing measurable status, I would be foregoing all the benefits of acquiring it in the first place.
Not to sound glib, but what good is LW status if you don’t use it to freely express your opinions and engage in discussion on LW?
Getting laid, for one thing.
And, you know, LW is a social group. Status is its own reward. High-status people probably feel better about themselves than low-status people do, and an increase in status will probably make people feel better about themselves than they used to.
Eric Hoffer was a longshoreman who just happened to write wildly popular philosophy books, but I think he’d agree that that’s not terribly usual.
Yeah, I thought it could be something like that. I don’t live in Berkeley, and no woman who has ever slept with me cared one jot about my LW karma.
With that said, the kind of status that can be gained or lost by debating the technical correctness of claims JBP makes with someone you don’t know personally seems too far removed from anyone’s actual social life to have an impact on getting laid one way or another.