This topic has been raised dozens of times before, but the stories are scattered. Here’s a sampling:
Louie’s What I’ve Learned from Less Wrong
cousin_it on how LW helps him notice bullshit
FrankAdamek on gains from LW
cata’s group rationality diary thread contains lots of stories of people benefiting from applying the lessons learned in rationality camps
A couple of people have posted about how LW deconverted them from their religions, but I can't recall where
But also see this comment from Carl Shulman.
That comment of mine was from 2010 and I disagree with it now. My current opinion is better expressed in the “Epiphany addiction” post and comments.
Are you saying you now don’t think LW is “useful for noticing bullshit and cutting it away from my thoughts”, or that the value of doing this isn’t as high as you thought?
Looking back today, the improvement seems smaller than I thought then, and LW seems to have played a smaller role in it.
I used to be very skeptical of Eliezer's ideas about improving rationality when he was posting the Sequences, but one result that's hard to deny is that all of a sudden there is a community of people I can discuss my decision theory ideas with. Before that, I seemingly couldn't get those ideas across to anyone except maybe one or two people, even though I had my own highly active mailing list.
I’d say that being able to achieve this kind of subtle collective improvement in philosophical ability is already quite impressive, even if the effect is not very dramatic in any given individual. (Of course ultimately the improvement has to be graded against what’s needed to solve FAI and not against my expectations, and it seems to still fall far short of that.)
It’s indeed nice to have a community that discusses decision-theoretic ideas, but a simpler explanation is that Eliezer’s writings attracted many smart folks and also happened to make these ideas salient, not that Eliezer’s writings improved people’s philosophical ability.
Attracting many smart folks and making some particular ideas salient to them is no mean feat in itself. But do you think that’s really all it took? That any group of smart people, if they get together and become interested in some philosophical topic, could likely make progress instead of getting trapped in a number of possible ways?
I think it’s always helpful when a community has a vernacular and a common library of references. It’s better if the references are unusually accurate, but even bland ones might still speed up progress on projects.
Eliezer’s writings were certainly the focus of my own philosophical development. The current me didn’t exist before processing them, and was historically caused by them, even though it might have formed on its own a few years later.
Hmm. Thanks for that update.
Earlier today I had been thinking that, since I started reading LessWrong, I've noticed a considerable increase in my ability to spot and see through bullshit and flawed arguments, but I hadn't paid much attention to really asking myself the right questions about it, because I favored other things I considered more important to think about.
Reading this made me realize that I'd drawn that conclusion too early. Perhaps I should re-read those "epiphany addiction" posts with this in mind.
Thanks. In most of those links, the author says that he gained some useful mental tools, and maybe that he feels better. That's good. But no one said that rationality helped them achieve any goal other than the goal of being rational.
For example:
Launch a successful startup
Get a prestigious job
Break out of a long-term abusive relationship
Lose weight (Diets are discussed, but I don’t see that a discussion driven by LW/SI-rationality is any more successful in this area than any random discussion of diets.)
Get lucky in love (and from what I can tell, the PUAs do have testimonials for their techniques)
Avoid akrasia (The techniques discussed are gathered from elsewhere; so to the extent that rationality means “reading up on the material,” the few successes attested in this area can count as confirmation.)
Break an addiction to drugs/gambling
… and so on.
Religious deconversion doesn’t count for the purpose of my query unless the testimonial describes some instrumental benefit.
Carl’s comment about the need for an experiment is good; but if someone can just give a testimonial, that would be a good start!
There’s also Zvi losing weight with TDT. :)
Losing weight is a core human value?
Thanks, I edited it.
I think LW-style thinking may have helped me persist better at going to the gym (which has been quite beneficial for me) than I otherwise would have, but obviously it’s hard to know for sure.
Or even better:
“I used to buy lottery tickets every day but now I understand the negative expectation of the gamble and the diminishing marginal utility of the ticket, so I don’t.”
A doctor says, "I now realize that I was giving my patients terrible advice about what it meant when a test came back positive for a disease. Now that I have been inducted into the Secret Order of Bayes, my advice on that is much better."
… etc.
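For readers who want the arithmetic behind those two hypothetical testimonials, here is a minimal sketch. All of the concrete figures (ticket price, jackpot odds, wealth level, disease prevalence, test accuracy) are illustrative assumptions, not numbers taken from the thread; the point is only the shape of the calculation.

```python
import math

# --- Lottery ticket (assumed numbers): $2 ticket, 1-in-300,000,000 chance of a $100M jackpot.
ticket_price = 2.0
p_win = 1 / 300_000_000
jackpot = 100_000_000.0

# Expected dollar value of buying the ticket: negative (about -$1.67 here).
expected_value = p_win * jackpot - ticket_price
print(f"Expected dollar value of the ticket: {expected_value:.2f}")

# Diminishing marginal utility makes it worse: with log utility and an assumed
# $50,000 of existing wealth, the expected change in utility is also negative.
wealth = 50_000.0
expected_utility_change = (
    p_win * math.log(wealth - ticket_price + jackpot)
    + (1 - p_win) * math.log(wealth - ticket_price)
    - math.log(wealth)
)
print(f"Expected change in log-utility: {expected_utility_change:.8f}")

# --- Diagnostic test (assumed numbers): why a positive result means less than it sounds.
# Assume a 1% base rate, 90% sensitivity, and a 9% false-positive rate.
prevalence = 0.01
sensitivity = 0.90
false_positive_rate = 0.09

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive).
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"P(disease | positive test): {p_disease_given_positive:.2%}")  # roughly 9% with these numbers
```

With these assumed figures, a patient who tests positive still has only about a 9% chance of having the disease, which is the kind of base-rate correction the doctor's testimonial alludes to.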