I wouldn’t mind reading if someone took a crack at a Sequences 2.0, or something completely different.
One way of looking at the failure mode of Scientology is that they lead with genuinely useful material, which hooks people and establishes them as a credible source of wisdom. They then have a progressive structure that convinces you new epiphanies are just around the corner, you just need to put in a little more effort / time / cash—but there is no epiphany waiting that will be as useful as the original epiphanies.
This happens lots of places. I recall reading about some Alexander Technique expert, who continued doing lessons in the hopes of recapturing the first moment when he experienced lightness in his body. He never could, because the thing that was shocking about the first time was the surprise, not the lightness, and no matter how light he got, he could not become as surprised by it.
The healthy approach is to have a purpose, to pursue a well of knowledge for as long as doing so enhances that purpose, and then to abandon that well of knowledge as soon as it no longer enhances that purpose.
But here we run into the issue that, while rationality may be the common interest of many causes, the “something new” is unlikely to be a specifically rationality thing. It’s more likely to be something that some people find interesting and some people find boring, and so the people split into different taskforces to solve different problems. (That is, the Craft and the Community sequence really does anticipate lots of these issues.)
I don’t mean to advocate an epiphany-driven model of discovery.
To use your Scientology example and terminology, what I am advocating is not that we find the “next big thing,” but that we pursue refinement of the original, “genuinely useful material.” Of course, it is much easier to advocate this than to put the work in, but that’s why I’m using the open thread.
There are some legitimate issues with some of the Sequences (both resolved and unresolved). The comments represent a very nice start, but there may be some serious philosophical work to be done. There is a well of knowledge about pursuing wells of knowledge, and I would find it purposeful to refine the effective pursuit of knowledge.
while rationality may be the common interest of many causes, the “something new” is unlikely to be a specifically rationality thing. It’s more likely to be something that some people find interesting and some people find boring, and so the people split into different taskforces to solve different problems.
The taskforces may be really necessary, and that may be really difficult to admit in near mode.
On some level, it feels wrong to try fragmenting the LW community. I mean, I am so happy that I have found such wonderful people… and now should my next step be to choose a topic that doesn’t interest most of them, and focus on that? So that some subgroup will be interested in that, and most will not?
Yes, exactly this. Because trying to make everyone interested means staying on the level of generalities, ignoring the virtue of narrowness. You conduct experiments with specific data, and only generalize later. And yet, focusing on the specific feels like deviating from the topic of this website, which is about rationality in general.
Historically, even Eliezer didn’t make everyone on Overcoming Bias happy. There were people who didn’t care about quantum physics. Actually, even today some people feel like the Sequences would be better without the quantum physics parts; like they ruin the otherwise good advice on rationality. Quantum physics is just a narrow specific topic; why couldn’t Eliezer just leave it out? Well, Eliezer had his reasons, but there is a meta-reason: if you start leaving out specific things because they are not central to the issue of rationality, and because some people may not be interested in them, then what remains? General pro-“rationality” applause lights? Topics too mathematical to have any obvious connection with everyday life?
On the other hand, it may be a pattern that Eliezer split from Overcoming Bias to follow his own topics; and now Scott is similarly trying to apply rationality to politics on SSC… something like it’s easier to focus fully on your mission when you have your own playground, so you really don’t have to care about what other people think about your approach. Like, if you have something to protect, it makes sense to bring it onto your own turf, where you can protect it better. Maybe it is necessary that the next big “lesswrongish” thing must happen outside of LW. We already have links to rationality blogs here. Maybe we should embrace that.
However, it seems like a waste of resources if everyone is trying to set up and maintain their own blog, and especially the debating software. Fragmentation of the community: good or bad? The readers of LW are not exactly the same as the readers of SSC. For those who are different, it is better to have the sites separated (so e.g. Scott’s readers don’t have to worry about LW karma assassination). For those who are the same, it would be more convenient to have the same user interface for everything; the same inbox for replies both on LW and on SSC.
How could we better encourage the creation of the taskforces? Do we even have them, explicitly? (I imagine something like “special interest groups” in Mensa: a group of people with a specific goal, that has a name and explicit membership.) Maybe that kind of explicit structure is psychologically necessary to make a taskforce function better. (The only examples of LW groups that have a name I could give here would be effective altruists and neoreactionaries, and I already feel bad about the latter example: I think it goes against the usual LW approach to politics.)
Maybe LW should explicitly rebrand as a platform for multiple taskforces. That does not mean that everyone here would have to choose one; it just means that the taskforces would be the officially recognized way the LW community works. If you want to read and talk, welcome! But if you want to do something important, join an existing taskforce or create a new one!
Also, this is how we could measure the effectiveness of multiple rationalist groups. We don’t all compete in the area of general rationality; how would one even measure that? But we could have multiple taskforces, and some of them would produce impressive results, and others would not. Even if different areas are not directly comparable, there would at least be a difference between getting things done and merely talking about getting things done.
To a degree this could be done even without changing software. We could announce the creation of the taskforces in regular discussion, and perhaps have a convention that when an article is published officially in the name of a taskforce, it will use the name of the taskforce in the title (whether it is an article on LW or a link to an article on a different website). Then, the individual taskforces could publish their plans, results, lessons learned, etc.
However, it seems like a waste of resources if everyone is trying to set up and maintain their own blog, and especially the debating software.
So, the thing that I think most exciting here would be some sort of LW comment API / easy WordPress plugin / whatever, so that one can trivially add LW commenting to their blog, and people can have one shared account and comment response inbox across all rationalist blogs.
How could we better encourage the creation of the taskforces? Do we even have them, explicitly?
So, I think one component of it being a “taskforce” instead of just a blog is that actual work is getting done. Yes, there’s stuff like IAFF that’s primarily discussion—because that discussion is leading to papers and constitutes “actual work.” But CFAR seems like it falls into the ‘task force’ category—it has a mission, but also employees, a budget, and so on.
And I think it makes sense to treat LW as a “forum,” in the ancient Roman sense. You’ll talk about your business in the marketplace and keep abreast of what’s going on elsewhere, but it’s not a good place to try to get your work done—that’s what your own building is for.
Unless “officially” encouraged by LW, people may be doing it away from LW, so it may not be visibly displayed here.
Somewhat related: “Why Our Kind Can’t Cooperate”:
people were donating. We started getting donations right away, via Paypal. We even got congratulatory notes saying how the appeal had finally gotten them to start moving. (...) But none of those donors posted their agreement to the mailing list. Not one. So far as any of those donors knew, they were alone. And when they tuned in the next day, they discovered not thanks, but arguments for why they shouldn’t have donated. The criticisms, the justifications for not donating—only those were displayed proudly in the open.
Maybe we are doing a similar thing here, only instead of “donating / not donating”, the choice is “doing / not doing”. People are doing stuff—in private. (At least sometimes we have a bragging thread. For individuals.) Publicly—we complain that LessWrong is dying, or that LessWrong is all procrastination and no action. And then we upvote the complaints for the courage, or perhaps hoping that it will make someone else do something.
I guess what I would like to see here is something like the bragging threads… but for groups of rationalists. Perhaps with posts instead of comments, although comments are still better than nothing. Do something awesome; post the results and what you learned. Doesn’t have to be rationality in general, or AI, or quantum physics; it could be anything you care about, as long as you are willing to use your reason while doing it. It could even be learning together, or solving Project Euler together. Something you decide to do, and kinda precommit to publishing your achievements and/or failures.
(Technically, organizing local meetups is also a project LessWrongers do in groups, so at least this kind of taskforce already provably exists.)
It seems to me like if you want to start a task force, start a task force. No rebranding needed.
This strikes me as close to right. LW might be a good place, though not the only good place, to announce starting a task force. If you attract people who are willing and able to do the work, then LW might or might not turn out to be a good place for primary discussion of what the task force is doing.
They then have a progressive structure that convinces you new epiphanies are just around the corner, you just need to put in a little more effort / time / cash—but there is no epiphany waiting that will be as useful as the original epiphanies.
I think “epiphany” isn’t a good way to think about Scientology. The advantages you get by not being emotionally reactive to triggers aren’t about “epiphanies”.
This happens lots of places. I recall reading about some Alexander Technique expert, who continued doing lessons in the hopes of recapturing the first moment when he experienced lightness in his body. He never could, because the thing that was shocking about the first time was the surprise, not the lightness, and no matter how light he got, he could not become surprised by it.
Could you source that story? To me that sounds like someone not practicing “beginner’s mind” and as a result getting things wrong.
But here we run into the issue that, while rationality may be the common interest of many causes, the “something new” is unlikely to be a specifically rationality thing.
Eliezer mostly wrote about his own thoughts on rationality at the beginning of LW, and I think there is no reason to assume that they cover everything meaningful there is to say about rationality.
Could you source that story? To me that sounds like someone not practicing “beginner’s mind” and as a result getting things wrong.
If I recall correctly (60%?), it was Frank Pierce Jones. I’ll have to do some digging to find the initial quote, and I remember reading him as healthily noticing that desire and acknowledging that it was impossible, rather than misspending his life in pursuit of it.
(I’ve also stuck an “as” in the quoted text to make it a little clearer what claim I’m making.)
Eliezer mostly wrote about his own thoughts on rationality at the beginning of LW, and I think there is no reason to assume that they cover everything meaningful there is to say about rationality.
I agree there are more ways out there than just Eliezer’s way, and people should be encouraged to discover theirs and post about it here. My hope was more to convey that some fruits can only be picked once.