I decided to quit my job.
Still have various options for what to do next, but most likely I will spend at least a year trying to build a large rationality community in Amsterdam. I’m talking 5+ events a week, a dedicated space, membership program, website, retreats, etc.
The emphasis will be on developing applied rationality. My approach will be to cover many different paradigms of self-improvement; my hunch is that, in doing so, one will start noticing patterns these paradigms have in common.
I’m thinking authentic relating, radical honesty, CFAR-style applied rationality, shadow work, yoga/meditation, psychedelic therapy, street epistemology, tantra, body work, nonviolent communication, etc. If you know anything that would fit in this list, please comment!
This would be one pillar of the organisation, and the other one would be explicitly teaching an Effective Altruist ethos to justify working on rationality in the first place.
If this goes really well, I’m hoping this will develop into something like “the CFAR of Europe” at some point.
Much of the list is mutually exclusive/contradictory.
Or rather, there is good stuff that can be mined from all of the above, and incorporated into a general practice of rationality, but there’s a big difference between saying:
“rationality can be extracted from X, Y, and Z”
“X, Y, and Z are good places to look for rationality”
“X, Y, and Z are examples of rationality.”
The first is almost trivially true of all things, so it also seems straightforwardly true of the list “authentic relating, shadow work, yoga/meditation, psychedelic therapy, tantra, body work, etc.”
The second, applied to your list, is a gray area where the word “good” would be doing a lot of work. Searching for rationality in [authentic relating, shadow work, yoga/meditation, psychedelic therapy, tantra, body work, etc.] is not a straightforward process of separating the gold from the dross; it’s a process of painstakingly distinguishing the gold from the fool’s gold while breathing in an atmosphere specifically evolved to confuse and poison you. It’s not clear that these domains in particular should be promoted to the attention of someone seeking to develop rationality tech, over other plausible avenues of investigation.
The third, applied to your list, is substantially more false than true (though it still contains some truth).
I think this is important to note less because it matters to me what you’re doing off in Europe with a project you feel personally inspired by, and more because it matters to me how we allow the word “rationality” to be defined and used here on LessWrong.
As stated, you’ve said some stuff that’s compatible with how I think we ought to define and use the word “rationality,” but you’ve definitely not ruled out everything else.
Appreciate the criticism.
I agree with you that we need to separate the good stuff from the bad stuff and that there is a risk here that I end up diluting the brand of rationality by not doing this well enough.
My intuition is that I’m perfectly capable of doing this, but perhaps I’m not the best person to make that call, and I’m reminded that you’ve personally called me out in the past on being too lax in my thinking.
I feel like you have recently written a lot about the particular ways in which you think people on LW might be going off the rails, so I could spend some time reading your stuff and trying to pass your ITT.
Does that sound like a good plan to you?
I mean, that’s unusually generous of you. I don’t think [my objection] obligates you to do [that much work]. But if you’re down, I’m down.
It doesn’t, but I tend to go with the assumption that if one person voices an objection, there are 100 more with the same objection who don’t voice it.
I’ve put this on my to-do list; it might take a few weeks to come back to it, but I will come back to it.
Yep, I was going to write something similar to what Duncan did. Some topics seem so strongly connected with irrationality that if you mention them, you will almost inevitably attract people already interested in that topic, including its irrational parts, and those people will be coordinated about the irrational parts. While you will be pushing in the direction of rationality, they will be actively pushing in different directions; how sure are you that you will win all of these battles? I assume that creating a community that is 50% rational and 50% mystical would not make you happy. Maybe not even 80% rational and 20% mystical.
One of my pet peeves about the current rationalist community is what seems to me to be quite uncritical approval of Buddhism, especially when contrasted with our utter dismissal of Christianity, which only makes a contrarian exception for Chesterton. (I imagine that Chesterton himself might ask us whether we are literally atheists, or merely anti-Christians.) I am not saying that people here are buying Buddhism hook, line, and sinker; but they are strongly privileging a lot of stuff that comes from it. Introduce a concept from Buddhism, and people will write articles about how it actually matches the latest cognitive science or some kind of therapy. Do the same with a concept from Christianity, and you will get a strongly worded reprimand about how dangerous it is to directly import concepts from a poisonous memeplex, unless you carefully rederive it from first principles, in which case it is unlikely to be 100% the same, and it is better to use a different word to express it.
I will end my rant here, just saying that your plan sounds to me like the same thing might happen with Buddhism and Hinduism and New Age and whatever, all at the same time, so I would predict that you will lose some of these battles. And ironically, you may lose some potential rationalists, as they observe you losing these battles (or at least not winning them conclusively) and decide that they would prefer some place with stronger norms against mysticism.
(Then again, things are never easy; there is also the opposite error of Hollywood rationality, etc.)
I think it is worth someone pursuing this project, but maybe it’d make more sense to pursue it under the post-rationality brand instead? Then again, this might reduce the amount of criticism from rationality folk in exchange for increasing Viliam’s worries.
(The main issue I have with many post-rationalists is that they pay too little heed to Ken Wilber’s pre/trans fallacy.)