Seems you assume that most people’s beliefs are “improper.” Did LW offer you evidence for that conclusion?
Most of my evidence for this comes from my own observations. It’s pretty easy to see, just from looking at how people’s lives end up, that almost no one can make sound decisions over a time-frame of years. My working hypothesis is that most people can make what looks like an approximation of rational decisions on the order of hours or days, in situations where there’s enough at stake for them in the short term. But the errors compound over time, and people carry the scars of their worst failures (e.g. religion, drug abuse, bigotry, poverty, life-threatening obesity, illiteracy, etc.).
What Less Wrong helped me realize was that the problem was even worse for abstract reasoning. Almost no one can reason through abstract inference chains longer than 2 or 3 steps, so if people don’t have concepts big enough (or small enough) to explain everything with 1 or 2 steps of inference, they can never tell truth from falsehood in those domains. I think this is a big reason for the “10 year rule” for becoming an expert in any field. It takes that long to cash out all the mental boxes at the right size, so that experts without natural inference ability (most of them?) can turn everything into (obvious) one-step inferences.
The other thing Less Wrong taught me was that even among those with the ability to reason abstractly, most don’t feel bound to accept conclusions that follow from the premises they believe unless they like the conclusions they get. “Everyone is entitled to their own opinion” is an improvement on “Everyone is entitled to be Catholic,” but it’s a shame that the aftermath of religion accidentally turned being inconsistent into such a cherished personal freedom in our society. So that’s a big problem.
And of the few people left who can and do use reason, and aren’t egregiously inconsistent, most are so unprepared to correct for detectable (and correctable) human biases that they can’t reliably reach sound conclusions on an abstract topic anyway, even with 10 years of thought put into a field. So where I previously imagined something like a sizable group of experts who were more or less “above the system” and could look down on problems from a higher level than me and just inevitably get the correct answers for the correct reasons, I now accept the less magical (and obvious in retrospect) belief that scientists and other thinkers are inside the system too. And for the reasons I mentioned above, combined with lots of predictably biased behavior, most scientists and thinkers are “worse than noise” in terms of their contribution to the progress of human thought.
And don’t you also need to assume you have a way to generate beliefs that is substantially better at avoiding the desire to sound interesting or smart?
Intellectually engage more with people who are at least trying to use reason. You know, instead of with 4chan or people in my real life.