“I am the pearl-caster and you’re swine” arguments tend to go badly.
Domain expertise is a thing, and society possesses a general social norm in favor of being charitable to domain experts. He also doesn’t come across to me as particularly hostile.
Domain expertise is a thing, and society possesses a general social norm in favor of being charitable to domain experts.
I don’t think the norm is as general as this implies. Western society expects a great deal of charity toward the mentor in a mentor/student relationship, but that relationship is usually a consensual one—it can be assumed in some situations, such as between adults and children or within certain business relationships, but it isn’t automatically in effect in a casual context even if one person has far more subject-matter expertise than the other. It’s usually considered very rude to assume the mentor role without a willing student, unless you’re well-known as a public intellectual, which no one here is.
And the pattern’s weaker still online, where credentials are harder to verify and more egalitarian norms tend to prevail. Except in a venue specifically set up to foster such relationships (like a Reddit AMA), they’re quite rare—even people known as intellectual heavyweights in a certain context, like Scott or Eliezer around here, can usually expect to relate to people more in a first-among-equals kind of way. In fact it’s not uncommon for them to receive more criticism than the average poster.
Well, the issue is that JonahSinick doesn’t come across to me as arrogant, hostile, or assuming any kind of relationship of superiority in the first place. He’s sharing his domain knowledge with us for the sheer pleasure of doing so, and wants to be helpful to people who’ve gotten discouraged about learning mathematics. Given his motivations, his actions, and the context for all of them, I just don’t see the rudeness. It looks to me like some very conceited LW regulars are reading a preachiness into this article and JonahSinick’s comments that just isn’t there, in either action or intention.
even people known as intellectual heavyweights in a certain context, like Scott or Eliezer around here, can usually expect to relate to people more in a first-among-equals kind of way. In fact it’s not uncommon for them to receive more criticism than the average poster.
I usually don’t see that much vehement criticism of Scott; it helps that he behaves in a very egalitarian fashion. Eliezer tends to take somewhat heavy criticism, including sometimes from me, precisely because he adheres to the LW community norm of “We here at LW are smarter and know better than everyone else, and we don’t need your stinking domain knowledge.” Oh, and also because Eliezer is phenomenally bad at explaining his thoughts and intentions to people outside Bay Area techno-futurist circles, which probably comes from training himself to explain them to an incredibly narrow, self-selected, and psychologically unusual circle of people. Once you’ve been reading him long enough to have a clear idea of what he’s trying to say, even he’s really not that bad.
It’s funny: when I got here, I thought Eliezer’s Sequences were basically nothing special, just explaining some science and machine-learning stuff to people who apparently can’t be arsed to read the primary sources. But the longer I’m here, the more I sometimes want to say, exasperatedly, to one or another “aspiring rationalist” who thinks they’re being ever-so-clever: no, you are actually being a Straw Vulcan, read the fucking Sequences.
doesn’t come across to me as arrogant, hostile, or assuming any kind of relationship of superiority in the first place.
Hostile, no. Arrogant—a bit, but quite within the LW norm. But asserting superiority? Very much so. Here is a direct quote:
I outstrip all but a small handful of LWers in intellectual caliber by a very large margin.
And the problems arose not from claims about superior domain knowledge, but from claims about superior “crystallized intelligence” and “intellectual caliber”, which are much broader than “I’m really good at math”.
Eliezer’s Sequences contain a lot of science and machine-learning stuff, as you describe, and a few core bits that… aren’t. Going by volume, most of them are good. But the actual objections to them, of course, will be disproportionately around those few core bits. And sometimes not agreeing with something that is phrased like a scientific lecture can look an awful lot like refusal to listen, even when it’s not.
But the actual objections to them, of course, will be disproportionately around those few core bits.
Well yeah, but hey: take what’s useful and chuck out the rest.
Besides which, by bothering to try to come up with a wholly naturalistic worldview that never resorts to mysticism in the first place, Eliezer is massively ahead of the overwhelming majority of, for example, laypeople and philosophers. Practicing scientists are better at science, but often resort to mysticism themselves when confronted with questions outside their own domain (e.g., Roger Penrose and his quantum woo on consciousness).
I do dislike the degree of mathematical Platonism and Tegmarkism I occasionally see around here, but that’s just my personal extreme distaste for mysticism coming out.
Basically, it’s really nice to have a community where words like “irreducible” will get you lynched, and if I have to put up with a few old blog entries being kinda bad at conveying their intended point, or just plain being wrong, so be it.
Besides which, by bothering to try to come up with a wholly naturalistic worldview that never resorts to mysticism in the first place, Eliezer is massively ahead of the overwhelming majority of, for example, laypeople and philosophers.
That means that if I just say “space aliens have replaced the President”, I’m saying something bad, but if I copy a math textbook and add a footnote “also, space aliens have replaced the President”, I’m saying something good, because the sum total of what I’m saying (a lot of good math + one bad thing about aliens) is good. In one sense that’s correct; people could certainly learn lots of math from my footnoted math textbook. But we don’t generally evaluate claims by adding these kinds of things together.
Did you write the rest of the math textbook?