Rationality is a method for answering questions, not an answer itself. If you don’t have any pressing questions—in other words, you’re happy and content—you may not see much use for it yet.
When I first finished reading the Sequences, I thought, “Great! Now I’ll go through my beliefs and fix all the stupid ones! Okay, what do I believe that’s wrong?” My reply: “...” Obviously, it’s not that simple—if I had known a belief was wrong, I wouldn’t have held it in the first place. I could have tried to reevaluate everything I believe from the ground up, but that looked like a poor effort-to-reward proposition. I suspect you feel the same way.
So what am I getting out of Bayesian rationality, the study of biases, and the Less Wrong community?
A better understanding of my own motivations. For example: My job hunt post, Motivated Stopping.
A collection of effective life-hacks and a community dedicated to finding and sharing more. Examples: Learn from Textbooks, rejection therapy, Defeating Ugh fields.
A strategy for attacking questions that I really don’t know the answer to. Examples: What can my parents do to take care of their surviving elders without totally sacrificing their financial and mental health? What can I do to help my autistic younger brother, a college dropout? What should my wife and I do about her house in Florida that’s been on the market for nearly a year?
In addition to all that, I’m updating my beliefs in place. When I learn something that surprises me, I take a closer look at why I believe what I believe, looking for an unfounded assumption that led to the current error. That’s what I suggest for you: don’t expect what you’ve learned here to rewrite your entire worldview, but keep it handy for the next time life asks a Hard Question or throws you an utterly unanticipated datum.
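For readers who like to see the machinery, here is a minimal sketch of what "updating in place" means formally: a single application of Bayes' rule. The scenario and every number below are invented for illustration only, not taken from my actual reasoning.

```python
# A single Bayesian update: revise P(H) after observing surprising evidence E.
# Illustrative only; all probabilities below are made up for the example.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) from P(H), P(E | H), and P(E | not-H)."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# Hypothetical numbers: say I start 90% confident a house is priced
# fairly (H), then it sits unsold for a year (E). If a fairly priced
# house would sit that long 20% of the time, versus 80% for an
# overpriced one, the surprise forces a real revision:
posterior = bayes_update(prior=0.9, p_e_given_h=0.2, p_e_given_not_h=0.8)
print(round(posterior, 3))  # 0.9*0.2 / (0.9*0.2 + 0.1*0.8) ≈ 0.692
```

The point of the exercise isn't the arithmetic; it's that a surprising datum should move your confidence by a definite amount, rather than being explained away.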