Hello, Less Wrong.
Like some others, I eventually found this site after being directed by fellow nerds to HPMOR. I’ve been working haphazardly through the Sequences (getting neck-deep in cognitive science and philosophy before even getting past the preliminaries for quantum physics, and loving every bit of it).
I can’t point to a clear “aha!” moment when I decided to pursue the LW definition of rationality. I always remember being highly intelligent and interested in Science, but it’s hard for me to model how my brain actually processed information that long ago. Before high school (at the earliest), I was probably just as irrational as everyone else, only with bigger guns.
Sometime during college (B.S. in mechanical engineering), I can recall beginning an active effort to consider as many sides of an issue as possible. This was motivated less from a quest for scientific truth and more from a tendency to get into political discussions. Having been raised by parents who were fairly traditional American conservatives, I quickly found myself becoming some kind of libertarian. This seems to be a common occurrence, both in the welcome comments I’ve read here and elsewhere. I can’t say at this point how much of this change was the result of rational deliberation and how much was from mere social pressure, but on later review it still seems like a good idea regardless.
The first time I can recall actually thinking “I need to improve the way I think” was fairly recent, in graduate school. The primary motivation was still political. I wanted to make sure my beliefs were reasonable, and the first step seemed to be making sure they were self-consistent. Unfortunately, I still didn’t know the first thing about cognitive biases (aside from running head-on into confirmation bias on a regular basis without knowing the name). Concluding that the problem was intractable, I withdrew from all friendly political discussion except one in which my position seemed particularly well-supported and therefore easy to argue rationally. I never cared much for arguing in the first place, so if I’m going to do it I’d prefer to at least have the data on my side.
I’ve since lost even more interest in trying to figure out politics, and decided while reading this site that it would be more immediately important anyway to try figuring out myself. I’ve yet to identify that noble cause to fight for (although I have been interested in manned space exploration enough to get two engineering degrees), but I think a more rational me will be more effective at whatever that cause turns out to be.
Still reading and updating...
Welcome to LW!
I like the “just with bigger guns” metaphor a lot; the trouble with intelligence is its ability to produce smart-seeming arguments for nearly any silly idea.
Exactly. I also suspect that logical overconfidence, i.e. knowing a little bit about bias and thinking it no longer affects you, is magnified with higher intelligence.
I can’t help but remember that saying about great power and great responsibility.
Yes—see Knowing about biases can hurt people.
Thanks! I hadn’t read that article yet, but I became familiar with the concept when reading one of Eliezer Yudkowsky’s papers on existential risk assessment (either this one or this one). I did have a kind of “Oh Shit” moment when the context of the article hit me.