A few thoughts I’ve been brooding over that are vaguely relevant to your post...
One thing that I find is often disappointingly absent from LW discussions of epistemology is how much the appropriate epistemology depends on your goals and your intellectual abilities. If you are someone of median intelligence who just wants to carry out an ordinary trade like making shoes or something, you can largely get by with received wisdom. If you are a researcher, your entire job consists of coming up with things that aren’t already present in the market of ideas, and so using at least some local epistemology (or ‘inside view’, or ‘figuring things out’) is a job requirement. If you are trying to start a start-up, or to generate any kind of invention, again, you usually have to claim some knowledge advantage, and so you need a more local epistemology.
Relatedly, even for a single person, the kind of thinking I should use depends very much on context. Personally, in order to do research, I try to do a lot of my thinking by myself, so as to train myself to think well. Sure, I engage in a lot of scholarship too, and I often check my answers by discussing my thinking with others. But I do much more independent thinking than I did two years ago. If I am ever making a truly important decision, though, such as whom to work for, it makes sense for me to be much more deferential: to seek the advice of the people I know to be best at making that decision, and then to defer to them to a fairly large degree (adjusting for the information they lack).
It would be nice to see people relax their blanket pronouncements (not that this is any worse in this post than elsewhere) and give a bit more attention to this dependence on context.
RyanCarey writes:
If you are someone of median intelligence who just wants to carry out an ordinary trade like making shoes or something, you can largely get by with received wisdom.
AFAICT, this only holds if you’re in a stable sociopolitical/economic context—and, more specifically still, the kind of stable sociopolitical environment that provides relatively benign information-sources. Examples of folks who didn’t fall into this category: (a) folks living in Eastern Europe in the late 1930s (especially if Jewish, but even if not, and regardless of how traditional their trade was); (b) folks living in the Soviet Union (which required navigating a non-explicit layer of received-from-underground knowledge); (c) folks literally making shoes during the period when shoe-making was disrupted by the industrial revolution. It is to my mind an open question whether any significant portion of the US/Europe/etc. will fall into the “can get by largely with received wisdom” reference class across the next 10 years. (They might. I just actually can’t tell.)
It’s possible to be in a situation where the consensus is terrible and you have no ability to rebuild a better knowledge base. It’s tragic, but tragedies happen.
It seems like we’re anchoring excessively on the question of sufficiency, when what matters is the net expected benefit. If we rephrase the question and ask “are there populations that are made worse off, in expectation, by more independent thought?”, the answer is clearly yes. That, I think, is the question we should be asking (and it fits the point I’m making).
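One way to make the contrast explicit (my own formalization, not something stated in the thread): the sufficiency framing asks whether independent thought can ever yield good outcomes, while the expected-benefit framing asks whether some population P exists for which it lowers expected utility U:

$$\exists\, P:\quad \mathbb{E}[\,U \mid \text{more independent thought},\, P\,] \;<\; \mathbb{E}[\,U \mid \text{default deference},\, P\,]$$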
In order to research existential risk, and to actually survive, yes, we need more independent thought; but that is exactly the kind of research I had in mind in my original comment.
Independent thought helps not just with x-risk, but with personal well-being: things like not joining the army, not playing the slot machines, getting off Facebook. In other words, it helps not just with accepting action requests culturally construed as exotic, but with rejecting some normal action requests. “Stable sociopolitical/economic context” is actually a fairly strong requirement, given how much the current global narrative is based on exponential growth.
The question is, what do you mean by “independent thought”? If “independent thought” means “mistrust everyone”, then clearly it can be a harmful heuristic. If “independent thought” means “use your own thinking faculties to process all the evidence you have, including the evidence which consists of certain other people expressing certain beliefs”, then it’s not clear to me there are significant populations that would be harmed by it. If there are, it means that the society in question has wise and benevolent leaders such that, for large swaths of the population, it is impossible to verify (given their cognitive abilities) that these leaders are indeed wise and benevolent. It seems to me that arriving at such a situation would require a lot of luck, given that the leadership of a society is ultimately determined by the society itself.
My view is that any world where the value of possible outcomes follows a heavy-tailed distribution (x-risk is a thing, but so are more everyday things like income and, I’d guess, life satisfaction) is a world where the best opportunities are nonobvious, and better epistemology thus has very strong returns.
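A minimal simulation sketch of this claim (the distributions, sample sizes, and the returns_to_selection helper are my own illustrative choices, not anything from the thread): under a heavy-tailed distribution, the best of many options exceeds a random pick by a far larger factor than under a thin-tailed one, so skill at spotting the best option pays off far more.

```python
# Illustrative sketch (assumptions mine): compare "pick the best of n options"
# to "pick a random option" under thin- vs. heavy-tailed distributions of
# opportunity value.
import random

random.seed(0)
N_TRIALS = 2_000   # decision problems simulated (assumed)
N_OPTIONS = 100    # opportunities visible per decision (assumed)

def mean(xs):
    return sum(xs) / len(xs)

def returns_to_selection(draw):
    """Ratio of the average best-of-N_OPTIONS value to the average random pick."""
    best = [max(draw() for _ in range(N_OPTIONS)) for _ in range(N_TRIALS)]
    rand = [draw() for _ in range(N_TRIALS)]
    return mean(best) / mean(rand)

# Thin-tailed: Uniform(0, 1). The best of 100 is only about 2x a random pick.
print("uniform:", round(returns_to_selection(random.random), 1))

# Heavy-tailed: Pareto(alpha=1.5). The best of 100 is tens of times a random
# pick, so the ability to identify it (better epistemology) is worth far more.
print("pareto :", round(returns_to_selection(lambda: random.paretovariate(1.5)), 1))
```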
I’m maybe open to an argument that a 10th-century peasant literally has no ability to have a better life, but I basically think the claim holds for everyone I talk to day-to-day.
So you’re saying rationality is good if your utility is linear in the quantity of some goods? (For most people it is more like logarithmic, right?) But it seems that you want to say that independent thought is usually useful...
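A toy calculation of this worry (entirely my own illustration; the Pareto parameters are assumptions): with heavy-tailed outcomes, expected linear utility is dominated by rare enormous draws, while expected log utility is nearly insensitive to them, so a log-utility agent gains much less from hunting tail opportunities.

```python
# Toy calculation (assumptions mine): the same heavy-tailed outcomes look very
# different to a linear-utility agent and a log-utility agent.
import math
import random

random.seed(0)
# Treat each outcome as a wealth multiplier drawn from a heavy-tailed Pareto.
outcomes = [random.paretovariate(1.5) for _ in range(100_000)]

mean_linear = sum(outcomes) / len(outcomes)                    # dominated by rare huge draws
mean_log = sum(math.log(x) for x in outcomes) / len(outcomes)  # close to 1/alpha; tail-insensitive

print(f"average linear utility: {mean_linear:.2f}")  # roughly 3, pulled up by the tail
print(f"average log utility:    {mean_log:.2f}")     # roughly 0.67, barely moved by the tail
```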
I’m sure the 10th-century peasant does have ways to have a better life; they just don’t necessarily involve rationality training, which pretty obviously does not (and should not) help in all situations. Right?
Yes, it seems to me that we should care about some things linearly, though I’ll have to think some more about why I think that.
“One thing that I find is often disappointingly absent from LW discussions of epistemology is how much the appropriate epistemology depends on your goals and your intellectual abilities”—Never really thought of it that way, but makes a lot of sense.