Hi everyone. Author here. I may reply in a more granular way later, but to quickly clear up a few things:
-I didn’t write the headlines. But of course they’re the first thing readers encounter, so I won’t expect you to assess my intentions without reference to them. That said, I especially wanted to get readers up to half-speed on a lot of complicated issues, so that we can have a more sophisticated discussion going forward.
-A lot fell out during editing. An outtake that will be posted online Monday concerns “normal startup culture”—for that piece I went to TechCrunch Disrupt. I don’t take LW/MIRI/CFAR to be typical of Silicon Valley culture; rather, I see them as part of the Bay Area memespace that is poorly understood or ignored but still important. Of course some readers will be put off. Others will explore more deeply, and things that seemed weird at first will come to seem more normal. That’s what happened with me, but it took months of exposure. And I still struggle with the coexistence of universalism and elitism in the community, but it’s not like I have a wholly satisfying solution; maybe by this time next year I’ll be a neoreactionary, who knows!!
-Regarding the statistics and summary of the LW survey. That section was much longer initially, and we kept cutting. I think the last thing to go was a sentence about the liberal/libertarian/socialist/conservative breakdown. We figured that the various “suggestive statistical irrelevancies” would imply the diversity of political opinion. Maybe we were overconfident. Anyway, after the few paragraphs about Thiel, I tried not to treat libertarianism until the final sections, and even there with some sympathy.
-“Overhygienic”: I can see how that might be confusing. I meant epistemic hygiene.
-letters@harpers.org for clarifying letters, please! And I’m sam@canopycanopycanopy.com.
-Thanks for showing up.
While I’m here, let me plug two novels I think LW readers might appreciate: Watt by Samuel Beckett (an obsessively logical, hilarious book) and The Man Without Qualities by Robert Musil, whose hero is a rationalist in abeyance (Musil was a former engineer, philosopher, and psychologist himself).
Good sociology yo, good sardonicism without sneering, best article I’ve seen about this subculture yet.
Thanks for showing up and clarifying, Sam!
I’d be curious to hear more about the ways in which you think CFAR is over-(epistemically) hygienic. Feel free to email me if you prefer, but I bet a lot of people here would also be interested to hear your critique.
“Almost everyone found politics to be tribal and viscerally upsetting.”
This is gold.
And I still struggle with the coexistence of universalism and elitism in the community, but it’s not like I have a wholly satisfying solution; maybe by this time next year I’ll be a neoreactionary, who knows!!
An interesting problem. There are a few things that can be said about this.
1) Neoreaction is not the only possible response to the combination of universalism and elitism—for that matter, it consistently rejects universalism, so it’s one way of resolving the tension you’re perceiving. Another way is to embrace both: this could be done by belief in a heritable factor of general intelligence (which strikes me as the rationalist thing to do, and which necessarily entails some degree of elitism), but that’s merely the most visible option. An alternative is to say that some cultures are superior to others (the North to the South for a common political example, aspiring-rationalist culture to the culture at large for a local one), which also necessarily entails elitism: at the very least, the inferiors must be uplifted.
2) The coexistence of universalism and elitism (and technocratic progressivism) is reminiscent of the later days of the British Empire. They believed that they could figure out a universal morality—and beyond that, a universally proper culture—but, of course, only the more developed and rational among even their own people could play a part in that. I suspect that LW draws disproportionately from communities that are ideological descendants of the British Empire, and that its surrounding baggage uncritically reflects that descent—in fact, this is provably true for one aspect of LW-rationality, unless utilitarianism was independently developed somewhere else. (The last line of the previous point sounds familiar.)
3) Neoreaction is probably partially an exercise in constructing new narratives and value-systems that are at least as plausible as the ones that are currently dominant. This isn’t incompatible with the generation of true insights—in fact, the point can’t be made with false ones. (Obviously false ones, at least, but if the epistemic sanity waterline isn’t high enough around here to make that almost as difficult a constraint, rationalism has probably failed.) There’s also some shock-jock countersignaling stuff, especially with Moldbug.
4) The comparative study of civilizations (at least when taken in conjunction with the point that technological progress and political progress cannot be assumed to be the same thing, or even to be driven by the same factors, except insofar as technology can make possible beneficial things that would not have been possible otherwise—though it can do the same for harmful things, like clickbait or the nuclear bomb) leads to two insights: first, that civilizations keep collapsing, and second, that they tend to think they’re right. No two fundamentally disagreeing civilizations can be right at the same time—so either value-systems cannot be compared (a position that is both easily dismissed and likely to contain a grain of truth, for the simple reason that, if any of our basic moral drives come neither from culture nor from facts about the outside world, what else could they be but innate? Even the higher animals show signs of a sense of morality in lab tests, I’ve heard), or one of them is wrong. It’s the same argument as the atheist one against religion, just fully generalized. (I don’t think the argument works for atheism, since, if you grant that the God or gods of divine-containing religions want humans to follow them, Christianity and the various paganisms can’t be seriously compared—but I digress.) Hence the utility of generating alternative narratives for the cause of seeking truth.
5) People concerned about civilizational risk would do well to take the possibility of collapse seriously, as per the fourth point. People who want to hurry up and solve morality and build it into a friendly AI, even more so. Those who believe that every civilization would come to the same moral solution should want there to be as many people as possible who are likely to support this goal and do good and useful work toward it, before a government or a business beats them to it. That seems to imply that they should either want there to be as many not-unfriendly and likely-to-be-useful civilizations as possible, or at least want Western civilization (i.e. the USA, Canada, and some amount of Europe, depending on who you talk to) not to collapse, since it has generated by far the highest proportion of people who take that task seriously. (IIRC the last part is close to the reasoning Anissimov went through, but I could be misremembering.)
(There’s likely to be at least one part of this that’s completely wrong, especially since it’s two in the morning and I’m rushing through this so I can sleep. A slice of virtual Stollen to anyone who finds one—it’s that time of the year.)
I’m not sure I see the contradiction. “We have found the way (elitism), and others should follow (universalism)” seems like a pretty coherent position, and one I’d expect to see throughout history, not just in the British Empire. Isn’t it implicit in the idea of missionary religion, and of much philosophy?
Granted, there’s a distinction you can make between “We found the way by luck” and “We found the way by virtue”. The former is less elitist than the latter, but it still entails that “our way is better than yours”.
...I think I’ve lost sight of what defines ‘elitism’ besides believing something.
Dammit! You win an entire virtual Stollen.
I still suspect there are differences in how this combination is enforced, but I’ll need to do a lot more research now. Anyone know of any good books on the French or Spanish Empires, or the Islamic conquests?
...oh, Islam is actually a good example: their thing seems to be directly manipulating the incentive structure, whether by the jizya or the sword. Did they force Christians to go to Islamic schools, or did they just tax the Christians more than the Muslims? (Or neither? Did Christians have to pay zakat? IIRC they didn’t, but it might have varied...?)
I’ve heard that at one point the authorities were discouraging conversion to Islam because of the effect on tax revenue.
According to the book “A Historical and Economic Geography of Ottoman Greece: The Southwestern Morea in the 18th Century” by Fariba Zarinebaf, John Bennet and Jack L. Davis:
To finance its war efforts, the Ottoman state relied heavily on revenues from the cizye (poll tax) collected directly by the central treasury. Therefore, it generally did not support forced conversion of the non-Muslim reaya. The social pressure to convert must have been considerable, however, in areas where the majority of the population was Muslim. Furthermore, an increase in the amount of the cizye must also have indirectly encouraged conversion in the second half of the 16th century. An imperial order issued to the kadi of the districts of Manafge and Modon on 19 Zilkade 978/March 1570 stated that there were illegal attempts by tax-farmers to collect cizye from converts who were timar-holders and who had been serving in the Ottoman army for fifteen years. From this report it is clear that local Christians converted to Islam to enter the ranks of the military to avoid the payment of taxes. But it is also obvious that tax collectors and tax-farmers resented the tax-exempt privileges of the converts.
Glossary:
cizye—Islamic poll tax imposed on a non-Muslim household
reaya—productive groups (peasants, merchants, artisans) subject to taxes, in contrast to askeri (q.v.) (military), who were tax-exempt
kadi—Muslim judge
Zilkade—Dhu al-Qi’dah, the eleventh month in the Islamic calendar. It is one of the four sacred months in Islam during which warfare is prohibited, hence the name ‘Master of Truce’.
timar—prebend in the form of state taxes in return for regular military service, conventionally less than 20,000 akçes (q.v.) in value
If there are leaders, and there are followers, then that’s not really one-size-fits-all. That’s more like two-sizes-fit-all… Biversalism.
No two fundamentally disagreeing civilizations can be right at the same time—so either value-systems cannot be compared … or one of them is wrong.
Think it’s a bit more complicated. The issue is that while value systems can be compared, there are many different criteria by which they can be measured against each other. In different comparison frameworks the answer as to which is superior is likely to be different, too.
Consider e.g. a tapir and a sloth. Both are animals which live in the same habitat. Can they be compared? They “fundamentally disagree” about whether it’s better to live on the ground or up in the trees—is one of them “right” and the other “wrong”?
This, by the way, probably argues for your point that generating alternative narratives is useful.
Good point—you have to take into account technological, genetic, geographic, economic, geopolitical, etc. conditions as well.
(Which poses an interesting question: what sort of thing should America, or any one of its component parts, be compared to? Or is there a more general rule—something with a structure similar to “if the vast majority of other civilizations would disagree up to their declining period, you’re probably wrong”?)
Steppe hordes, sea empires, and hill tribes may be alike enough that similar preconditions for civilization would be necessary. (cf. hbdchick’s inbreeding/outbreeding thing, esp. the part about the Semai: same effect, totally different place)
The comparative study of civilizations … leads to two insights: first, that civilizations keep collapsing, and second, that they tend to think they’re right.
I think this is the completely wrong part, in that it assumes that any living individual ever considers everything about their civilization to be Good and Right. By and large, even the ruling classes don’t get everything they want (for example, they wanted a Hayekian utopia along Peter Thiel’s lines, but what they got was the messiness of actually existing neoliberalism). And in fact, one of the chief causes for the repeated collapses is that institutional structures usually can’t take being pushed and pulled in too many contradictory directions at once without ceasing to act coherently for anything at all (they become “unagenty”, in our language).
The US Congress is a fairly good present-day example: it’s supposed to act for the people as a whole, for the districts, and for the “several States”; for the right of the majority to govern as they will and for the right of small ideological minorities to obstruct whatever they please; for the fair representation of the voters and for the institutionalization of the political parties. When these goals become contradictory instead of complementary, the institution stops functioning (i.e., it passes no legislation, not even routine matters, instead of merely passing legislation I disagree with), and society has to replace it or face decline.
I’m not talking about practice, but rather about ideals, value systems, that sort of thing. Tumblrites haven’t gotten what they want either—but they still want what they want, and what they want is determined by something, and whatever that something is, it varies.
I talked to one of my roommates, a Google scientist who worked on neural nets. The CFAR workshop was just a whim to him, a tourist weekend. “They’re the nicest people you’d ever meet,” he said, but then he qualified the compliment. “Look around. If they were effective, rational people, would they be here? Something a little weird, no?”
This is hilarious, in implying exactly the reason I go to LW meetups (there’s other ultra-nerds to socialize with!) and why I don’t go to CFAR workshops (they’re an untested self-help program that asks me to pay for the privilege of doing what I could do for free at LW meetups).
We figured that the various “suggestive statistical irrelevancies” would imply the diversity of political opinion. Maybe we were overconfident.
I think you were overconfident: the article definitely comes across as associating “cyberpunks, cypherpunks, extropians, transhumanists, and singularians” with right-libertarianism. As the survey confirms, LW and its “rationalists” and assorted nerds in each of those other categories vary across the entire spectrum of opinions commonly held by highly-educated and materially privileged white male Western technologists ;-).
Overall, brilliant article. If our group came across looking insane, that’s our fault, since we wave our meta-contrarian flags so emphatically and signal a lot of ego.
maybe by this time next year I’ll be a neoreactionary, who knows!!
Now, a small rebuke: I know you are trying to signal a humble openness to new knowledge, but to the best of my knowledge, neoreaction is incorrect. It’s not wise to be so open-minded that your brains fall out, like Michel Foucault praising the Iranian Revolution.
I was joking.
Thank God. I swear that group has an ideological black-hole nerd-sniping effect where otherwise decent people just get sucked down into the morass.
I liked the excerpts gwern quoted and see truth (and positive things) in most of them. “Hydra-headed” for EY’s writing seems inapt. If you refute one of his essays, will three more spring up in response?
Not sure what Vassar thinks is present in 3 in 1000 people—exploring and building boldly? Leadership?
Almost running a red light while buzzed and chatting. Hm. Well, I’m sure we all try to have a healthy respect for the dangers of killing and being killed while driving cars.