By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
because otherwise they are going to engage in tribal politics.
Heh. Otherwise? You just said they’re engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don’t want to teach them anything until they stop, you just will not teach them anything, period.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
Well, that makes sense for people who know what they are talking about, are good at compensating for their biases and avoid tribal politics. Less so for people who have trouble with rationality.
Remember: I’m not against doing stuff in Inside View, but I think it will be hard to ‘fix’ completely broken belief systems in that context. You’re going to have trouble even agreeing on what constitutes a valid argument; having a discussion where people don’t just end up more polarized is going to be impossible.
Heh. Otherwise? You just said they’re engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don’t want to teach them anything until they stop, you just will not teach them anything, period.
I want to teach them to not get endlessly more radical before I teach anything else. Then I want to teach them to avoid tribalism and stuff like that. When all of that is done, I would begin working on the object-level stuff. Doing it in a different order seems doomed to failure, because it’s very hard to get people to change their minds.
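So, is this an elites vs dumb masses framework? Quod licet Iovi, non licet bovi?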
Your approach seems to boil down to “First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them”. I don’t think it’s a good approach—either desirable or effective. You don’t start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
And that’s besides the rather obvious power/control issues.
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can’t really be justified in considering your opinion more accurate than the mainstream one.
Once we have the framework, the question is the cause of the dumb masses. Personally, I think it’s tribal stuff, which means that I honestly believe tribalism should be solved before people can be made more rational. In my experience, tribal stuff seemed to die down when I got more accepting of majoritarianism (because if you respect majoritarianism, you can’t really say “the mainstream is silencing my tribe!” without having some important conclusions to draw about your tribe).
Your approach seems to boil down to “First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them”. I don’t think it’s a good approach—either desirable or effective. You don’t start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
It’s probably not a good approach for young children or similarly open minds, but we’re not working with a blank slate here. Also, it’s not like the policies I propose are Dark Side Epistemology; avoiding the object level is perfectly sensible if you are not an expert.
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can’t really be justified in considering your opinion more accurate than the mainstream one.
Epistemologically, the final arbiter is reality. Besides, what do you call “mainstream”—the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
From the context it seems you associate “mainstream” with “dumb masses”, but the popular views are often remarkably uninformed and are also actively shaped by a variety of interests (both political and commercial). I doubt just being a contrarian in some aspect lifts you into “elite” status (e.g. paleo diets, etc.)
the question is the cause of the dumb masses. Personally, I think it’s tribal stuff
I don’t understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
but we’re not working with a blank slate here
Which you want to wipe down to the indifferent/accepting/passive moron level before starting to do anything useful?
avoiding the object level is perfectly sensible if you are not an expert.
Another claim I strongly disagree with. Following this forces you to believe everything you’re told as long as sufficient numbers of people around you believe the same thing—even though it’s stupid on the object level. I think it’s a very bad approach.
Besides, what do you call “mainstream”—the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
Perhaps I focused too much on ‘mainstream’ when I really meant ‘outside view’. Obviously, outside view can take both of these into account to different degrees, but essentially, the point is that I think teaching the person to use outside view is better, and outside view is heavily biased (arguably justifiably so) in favor of the mainstream.
I doubt just being a contrarian in some aspect lifts you into “elite” status (e.g. paleo diets, etc.)
But that’s my point: a lot of different contrarian groups have what the OP calls “a web of lies that sound quite logical and true”. Do you really think you can teach them how to identify such a web of lies while they are stuck in one?
Instead, I think you need to get them unstuck using outside view, and then you can teach them how to identify truth correctly.
I don’t understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
Yes. The masses try to justify their ingroup, they don’t try to seek truth.
Another claim I strongly disagree with. Following this forces you to believe everything you’re told as long as sufficient numbers of people around you believe the same thing—even though it’s stupid on the object level. I think it’s a very bad approach.
The way I see it is this: if I got into a debate with a conspiracy theorist, I’m sure they would have much better object-level arguments than I do; I bet they would be able to consistently win when debating me. The reason for this is that I’m not an expert on their specific conspiracy, while they know every single shred of evidence in favor of their theory. This means that I need to rely on meta-level indicators, like nobody respecting Holocaust deniers, in order to determine the truth of their theories, unless I want to spend huge amounts of time researching them.
Sure, there are cases where I think I can do better than most people (computer science, math, physics, philosophy, gender, generally whatever I decide is interesting and start learning a lot about), and in those cases I’m willing to look at the object level, but otherwise I really don’t trust my own ability to figure out the truth—and I shouldn’t, because it’s necessary to know a lot of the facts before you can even start formulating sensible ideas on your own.
If we take this to the extreme where someone doesn’t understand truth, logic, what constitutes evidence or anything like that, I really would start out by teaching them how to deal with stuff when you don’t understand it in detail, not how to deal with it when you do.
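To put toy numbers on this reasoning (everything below is invented for illustration): in odds form, Bayes’ rule multiplies your prior odds by the likelihood ratio of each argument, so a strong meta-level prior can survive losing quite a few object-level exchanges.

```python
# Toy Bayesian sketch (all numbers invented for illustration):
# a strong meta-level prior vs. a string of lost object-level exchanges.

def posterior_odds(prior_odds, likelihood_ratios):
    """Odds-form Bayes: multiply prior odds by each likelihood ratio."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# Meta-level indicator: essentially no respected expert endorses the
# theory, so start at (say) 1:1000 odds in its favor.
prior_odds = 1 / 1000

# Ten object-level exchanges, each one "lost": each argument sounds
# 1.2x as likely if the theory were true than if it were false.
lost_exchanges = [1.2] * 10

odds = posterior_odds(prior_odds, lost_exchanges)
p = odds / (1 + odds)
print(f"posterior probability of the theory: {p:.4f}")  # ~0.006
```

On these made-up numbers, losing every single exchange still leaves the theory below one percent; the meta-level prior does almost all of the work.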
Let’s sort out the terminology. I think we mean different things by “outside view”.
As far as I understand you, for you the “outside view” means not trying to come to any conclusions on your own, but rather accept what the authorities (mainstream, experts, etc.) tell you. Essentially, when you recommend “outside view” to people you tell them not to think for themselves but rather accept what others are telling them (see e.g. here).
I understand “outside view” a bit more traditionally (see e.g. here) and treat it as a forecasting technique. Basically, when you want to forecast something using the inside view, you treat that something as ‘self-propelled’, in a way: you look at its internal workings and mechanisms to figure out what will happen to it. If you take the outside view, on the other hand, you treat that something as a black box that is moved primarily by external forces, and so to forecast you look at these external forces and not at the internals.
Given this, I read your recommendation “teaching the person to use outside view is better” as “teach the person to NOT think for himself, but accept whatever most people around think”.
I disagree with this recommendation rather strongly.
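For what it’s worth, the forecasting sense of the two views fits in a few lines of code (the task names and numbers are hypothetical): the inside view decomposes the thing and sums its internals; the outside view treats it as a black box and uses the track record of similar things.

```python
# Hypothetical sketch: forecasting one project's duration two ways.

# Inside view: look at the internal workings and sum the parts.
task_estimates_weeks = {"design": 2, "build": 4, "test": 2}
inside_view = sum(task_estimates_weeks.values())

# Outside view: treat the project as a black box and use the observed
# durations of similar past projects (the reference class).
similar_projects_weeks = [14, 11, 18, 12, 16]
outside_view = sum(similar_projects_weeks) / len(similar_projects_weeks)

print(f"inside view:  {inside_view} weeks")       # 8
print(f"outside view: {outside_view:.1f} weeks")  # 14.2
```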
Do you really think you can teach them how to identify such a web of lies while they are stuck in one?
Why, yes, I do. In fact, I think it’s the normal process of extracting oneself from “a web of lies”—you start by realizing you’re stuck in one. Of course, no one said it would be easy.
An example—religious deconversion. How do you think it will work in your system?
Yes. The masses try to justify their ingroup, they don’t try to seek truth.
Well, this theory implies some consequences. For example, it implies high negative correlation between IQ (or more fuzzy “smartness”) and the strength of tribal affiliation. Do we observe it? The theory also implies that if the tribal affiliation increases (e.g. because your country got involved in a military conflict), everyone suddenly becomes much dumber. Do we observe that?
if I got into a debate with a conspiracy theorist …I bet they would be able to consistently win when debating me.
I don’t know about that. You think of winning a debate in high-school debate club terms, or maybe in TV debate terms—the one who scores the most points with the judges wins. That’s not how real life operates. The way for the conspiracy theorist to win the debate is to convince you. Unless you became a believer at the end, he did NOT win the debate. Most debates end in a draw.
otherwise I really don’t trust my own ability to figure out the truth
That’s probably the core difference that leads to our disagreements. I do trust my own ability (the fact that I’m arrogant should not be a surprise to anyone). Specifically, I trust my own ability more than I trust the mainstream opinion.
Of course, my opinion and the mainstream opinion coincide on a great deal of mundane things. But when they don’t, I am not terribly respectful of the mainstream opinion and do not by default yield to it.
In fact, I don’t see how your approach is compatible with being on LW. Let’s take Alice who is a LessWrongian and is concerned about FAI risks. And let’s take Bob who subscribes to your approach of deferring to the mainstream.
Alice goes: “I’m concerned about the risk of FAI.”
Bob: “That’s silly. You found yourself a cult with ridiculous ideas. Do you have a Ph.D. in Comp Sci or something similar? If not, you should not try to have your own opinion about things you do not understand. Is the mainstream concerned about FAI? It is not. So you should not be, either.”
What can Alice reply to Bob? She is, in fact, not a Ph.D. and has no particular expertise in AI.
If we take this to the extreme where someone doesn’t understand truth, logic, what constitutes evidence or anything like that
I don’t think you can extrapolate from very-low-IQ people to general population. By the same token, these people should not manage their own money, for example, or, in general, lead an unsupervised life.
I understand “outside view” a bit more traditionally and treat it as a forecasting technique.
The thing is, you can apply it more widely than just forecasting. Forecasting is just trying to figure out the future, and there’s no reason you should limit yourself to the future.
Anyway, the way I see it, in inside view, both when forecasting and when trying to figure out truth, you focus on the specific problem you are working on, try to figure out its internals, etc. In outside view, you look at things outside the problem, like the track record of similar things (which I, in my list, called “looks like cultishness”; arguably I could have named that better), others’ expectations of your success (hey bank, I would like to borrow money to start a company! what, you don’t believe I will succeed?), etc. Perhaps ‘outside view’ isn’t a good term either (which kinda justifies me calling it majoritarianism to begin with...), but whatever. Let’s make up some new terms: how about calling them the helpless and the independent views?
Why, yes, I do. In fact, I think it’s the normal process of extracting oneself from “a web of lies”—you start by realizing you’re stuck in one. Of course, no one said it would be easy.
Well, how often does it happen?
An example—religious deconversion. How do you think it will work in your system?
How much detail do you want it in, and how general do you want it to be? What is the starting point of the person who needs to be deconverted? Actually, to skip all these kinds of questions, could you give an example of how you would describe deconversion working in your system?
Well, this theory implies some consequences. For example, it implies high negative correlation between IQ (or more fuzzy “smartness”) and the strength of tribal affiliation. Do we observe it?
IQ != rationality. I don’t know if there is a correlation, and if there is one, I don’t know in which direction. Eliezer has made a good argument that higher IQ gives a wider possible range of rationality, but I don’t have the evidence to support that.
Anyway, I at least notice that when people are wrong, it’s often because they try to signal loyalty to their tribe (of course, there is often an opposing tribe that is correct on the question where the first one was wrong...). This is anecdotal, though, so YMMV. What do you observe? That people who have made certain answers to certain questions part of their identity are more likely to be correct?
The theory also implies that if the tribal affiliation increases (e.g. because your country got involved in a military conflict), everyone suddenly becomes much dumber. Do we observe that?
...probably? Not so much with military conflicts, because you are not doing as much politics as you are doing fighting, but I generally see that if a discussion becomes political, everybody starts saying stupid stuff.
I don’t know about that. You think of winning a debate in high-school debate club terms, or maybe in TV debate terms—the one who scores the most points with the judges wins. That’s not how real life operates. The way for the conspiracy theorist to win the debate is to convince you. Unless you became a believer at the end, he did NOT win the debate. Most debates end in a draw.
But the only reason I don’t get convinced is because of the helpless view (and, of course, things like tribalism, but let’s pretend I’m a bounded rationalist for simplicity). In the independent view, I see lots of reasons for believing him, and I have no good counterarguments. I mean, I know that I can find counterarguments, but I’m not going to do that after the debate.
In fact, I don’t see how your approach is compatible with being on LW.
Again, I believe in an asymmetry between people who have internalized various lessons on tribalism and other people. I agree that if I did not believe in that asymmetry, I would not have good epistemic reasons for being on LW (though I might have other good reasons, such as entertainment).
What can Alice reply to Bob? She is, in fact, not a Ph.D. and has no particular expertise in AI.
“Smart people like Bill Gates, Stephen Hawking and Elon Musk are worried about AI along with a lot of experts on AI.”
This should also be a significant factor in her belief in AI risk; if smart people or experts weren’t worried, she should not be either.
I don’t think you can extrapolate from very-low-IQ people to general population. By the same token, these people should not manage their own money, for example, or, in general, lead an unsupervised life.
I’ve been in a high-IQ club and not all of them are rational. Take selection effects into account and we might very well end up with a lot of irrational high-IQ people.
there’s no reason you should limit yourself to the future
Actually, there is—the future is the only thing you can change—but let’s not sidetrack too much.
how about calling them the helpless and the independent views?
Sure, good names, let’s take ’em.
[Religious deconversion]
The reason I brought it up is that there is no default “do what the mainstream does” position there. The mainstream is religious and the helpless view would tell you to be religious, too.
I don’t have much experience with deconversions, but even looking at personal stories posted on LW, they seem to revolve around doubting particular elements on the object level, not on the “this belief is too weird” level.
IQ != rationality.
Well, yes, but “rationality” is not terribly well defined and is a whole other can of worms. In particular, I know how to measure IQ and I know how it’s distributed in populations and sub-groups. I do NOT know how to measure the degree of rationality or what’s happening with it in populations. That makes discussions of rationality as an empirical variable handwavy and… not grounded in solid data.
because they try to signal loyalty to their tribe
First, signaling loyalty could be a perfectly rational thing to do. Second, there is the issue of the difference between signaling and true beliefs—signaling something other than what you believe is not uncommon.
the only reason I don’t get convinced is because of the helpless view
No, I don’t think so. You have priors, don’t you? Presumably, quite strong priors about certain things? That’s not the same thing as a helpless view. Besides, being convinced or not involves much more than being able to debunk every single point of the argument. Gish Gallop is not a particularly convincing technique, though it’s good at scoring points :-/
I believe in an asymmetry between people
How do you know who is who? And who gets to decide? If I am talking to someone, do I first have to classify her into enlightened or unenlightened?
Smart people … are worried about AI
That’s not a winning line of argument—it’s an argument from popularity, and it can be easily shut down by pointing out that a lot more smart people are not worried, and the helpless approach tells you not to pick fringe views.
we might very well end up with a lot of irrational high-IQ people.
The basic question is, how do you know? In particular, can you consistently judge the rationality of someone of noticeably higher IQ?
The reason I brought it up is that there is no default “do what the mainstream does” position there. The mainstream is religious and the helpless view would tell you to be religious, too.
Of course, but you can ask what asymmetry there is between $yourcountry, the USA, Germany, Italy, Japan and Israel (or whichever group of places you prefer). These places have wildly different attitudes to religion (or, well, at least they follow different religions, somewhat), with no one being in a better position in terms of figuring out the right religion, so you can conclude that while some religion must be correct, we don’t know which one.
I don’t have much experience with deconversions, but even looking at personal stories posted on LW, they seem to revolve around doubting particular elements on the object level, not on the “this belief is too weird” level.
Something something selection bias.
Anyway, I don’t know about religious deconversion, but I know I’ve had a lot of stupid political views that I’ve removed by using helpless view.
IIRC my brother deconverted via helpless view, but I might misremember. Either way, that would be n=1, so not that useful.
Well, yes, but “rationality” is not terribly well defined and is a whole other can of worms. In particular, I know how to measure IQ and I know how it’s distributed in populations and sub-groups. I do NOT know how to measure the degree of rationality or what’s happening with it in populations. That makes discussions of rationality as an empirical variable handwavy and… not grounded in solid data.
I quite like Eliezer’s suggestion of using the question of MWI as a test for rationality, but I’m biased; I independently discovered it as a child. :P
First, signaling loyalty could be a perfectly rational thing to do. Second, there is the issue of the difference between signaling and true beliefs—signaling something other than what you believe is not uncommon.
The problem here is that there isn’t necessarily a difference between signaling and true beliefs. Imagine your outgroup saying the most ridiculous thing you can. That thing is likely a kind of signaling, but in some ways (not all, though) it acts like a belief.
You have priors, don’t you?
… I can sometimes for simplicity’s sake be modeled as having priors, but we’re all monkeys after all. But yeah, I know what you mean.
Presumably, quite strong priors about certain things?
Sure. But if I lived in a world where most people believed the Holocaust was a hoax, or a world where it was controversial whether it was a hoax but the knowledge of the evidence was distributed in the same way as it is today, I’m pretty sure I would be a Holocaust denier.
(Of course, in the second case the evidence in favor of the holocaust having happened would rapidly emerge, completely crush the deniers, and send us back to the current world, but we’re “preventing” this from happening, counterfactually.)
Anyway, this shows that a large part of my belief in the Holocaust comes from the fact that everybody knows Holocaust deniers are wrong. Sure, the evidence in favor of the Holocaust is there, but I assume I would have no way of dealing with the deniers’ counterarguments (I haven’t actually bothered checking what the deniers are saying), because I would have to dig through mountains of evidence every time.
(If Holocaust deniers are actually trivial to disprove, replace them with some other conspiracy theory that’s trickier.)
How do you know who is who? And who gets to decide? If I am talking to someone, do I first have to classify her into enlightened or unenlightened?
Well, most of the time, you’re going to notice. Try talking politics with them; the enlightened ones are going to be curious, while the unenlightened ones are going to push specific things. Using the word ‘majoritarian’ for the helpless view might have made it unclear that in many cases, it’s a technique used by relatively few people. Or rather, most people only use it for subjects they aren’t interested in.
However, even if you can’t tell, most of the time it’s not going to matter. I mean, I’m not trying to teach lists of biases or debate techniques to every person I talk to.
That’s not a winning line of argument—it’s an argument from popularity, and it can be easily shut down by pointing out that a lot more smart people are not worried, and the helpless approach tells you not to pick fringe views.
Gates is one of the most famous people within tech, though. That’s not exactly fringe.
Actually, I just re-read your scenario. I had understood it as if Alice subscribed to the helpless view. I think that in this case, Bob is making the mistake of treating the helpless view as an absolute truth, rather than a convenient approximation. I wouldn’t dismiss entire communities based on weak helpless-view knowledge; it would have to be either strong (as with many conspiracy theories) or come from the independent view.
In the case described in the OP, we have strong independent-view knowledge that the pseudoscience stuff is wrong.
The basic question is, how do you know? In particular, can you consistently judge the rationality of someone of noticeably higher IQ?
I think so. I mean, the club even had a Seer who hosted a reasonably popular event, so… yeah. IQ high, rationality at different levels.
Also, ‘noticeably higher IQ’ is ambiguous. Do you mean ‘noticeably higher IQ’ than I have? Because it was just an ordinary high-IQ thing, not an extreme IQ thing, so it’s not like I was lower than the average of that place. I think its minimum IQ was lower than the LW average, but I might be mixing up some stuff.
so you can conclude that while some religion must be correct, we don’t know which one.
Sorry, under the helpless approach you cannot conclude anything, much less on the basis of something as complex as a cross-cultural comparative religion analysis. If you are helpless, you do what people around you do and think what people around you think. The end.
I quite like Eliezer’s suggestion of using the question of MWI as a test for rationality
Oh-oh. I’m failing this test hard.
Besides, are you quite sure that you want to make an untestable article of faith with zero practical consequences the criterion for rationality? X-/
But if I lived in a world where most people believed the Holocaust was a hoax … I’m pretty sure I would be a Holocaust denier.
That’s an argument against the helpless view, right? It sure looks this way.
Well, most of the time, you’re going to notice.
Well, yes, I’m going to notice and I generally have little trouble figuring out who’s stupid and who is smart. But that’s me and my personal opinion. You, on the other hand, are setting this up as a generally applicable rule. The problem is who decides. Let’s say I talk a bit to Charlie and decide that Charlie is stupid. Charlie, on the basis of the same conversation, decides that I’m stupid. Who’s right? I have my opinion, and Charlie has his opinion, and how do we resolve this without pulling out IQ tests and equivalents?
It’s essentially a power and control issue: who gets to separate people into elite and masses?
Do you mean ‘noticeably higher IQ’ than I have?
In your setup there is the Arbiter—in your case, yourself—who decides whether someone is smart (and so is allowed to use the independent approach) or stupid (and so must be limited to the helpless approach). This Arbiter has a certain level of IQ. Can the Arbiter judge the smartness/rationality of someone with noticeably higher IQ than the Arbiter himself?
Sorry, under the helpless approach you cannot conclude anything, much less on the basis of something as complex as a cross-cultural comparative religion analysis. If you are helpless, you do what people around you do and think what people around you think. The end.
It seems like we are thinking of two different views, then. Let’s keep the name ‘helpless view’ for mine and call yours ‘straw helpless view’.
The idea behind helpless view is that you’re very irrational in many ways. Which ways?
You’re biased in favor of your ingroups and your culture. This feels like your ingroups are universally correct from the inside, but you can tell that it is a bias from the fact that your outgroups act similarly confident.
You’re biased in favor of elegant convincing-sounding arguments rather than hard-to-understand data.
Your computational power is bounded, so you need to spend a lot of resources to understand things.
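Mount Stupid (being most confident about topics you know only a little about).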
There are obviously more, but biases similar to those are the ones the helpless view is intended to fight.
The way it fights those biases is by not allowing object-level arguments, arguments that favor your ingroups or your culture over others, and things like that.
Instead, in helpless view, you focus on things like:
International mainstream consensus. (Things like cross-cultural analysis on opinions, what organizations like the UN say, etc.)
Expert opinions. (If the experts, preferably in your outgroup, agree that something is wrong, rule it out. Silence on the issue does not let you rule it out.)
Things that you are an expert on. (Looking things up on the web does not count as expertise.)
What the government says.
(The media are intentionally excluded.)
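A minimal sketch of how such a checklist could be applied mechanically, assuming invented weights and source names (this is one possible reading, not a method anyone in this thread actually specified):

```python
# Hypothetical sketch of the helpless-view checklist. The weights are
# invented; the point is only which sources count, and that silence is
# treated as no evidence at all (it cannot rule anything in or out).

SOURCE_WEIGHTS = {
    "international_consensus": 3,  # cross-cultural agreement, UN-type bodies
    "expert_opinion": 3,           # experts, preferably from your outgroup
    "own_expertise": 2,            # only fields you genuinely know
    "government_position": 1,
    # media intentionally excluded
}

def helpless_verdict(verdicts):
    """verdicts maps a source name to +1 (endorses), -1 (rejects), or
    0 (silent). A negative total means 'rule the claim out'."""
    return sum(SOURCE_WEIGHTS[s] * v for s, v in verdicts.items())

# Example: outgroup experts reject the claim, everyone else is silent.
score = helpless_verdict({"expert_opinion": -1,
                          "international_consensus": 0,
                          "own_expertise": 0,
                          "government_position": 0})
print(score)  # -3: the helpless view says drop the claim
```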
Oh-oh. I’m failing this test hard.
evil grin
Besides, are you quite sure that you want to make an untestable article of faith with zero practical consequences the criterion for rationality? X-/
Nah, it was mostly meant as a semi-joke. I mean, I like the criterion, but my reasons for liking it are not exactly unbiased.
If I were to actually make a rationality test, I would probably look at the ingroups/outgroups of the people I make the test for, include a bunch of questions about facts where there is a lot of ingroup/outgroup bias, and look at the answers to that.
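As a hedged sketch of what such a test might look like (the items and the scoring rule are invented for illustration, not the actual design):

```python
# Hypothetical sketch of the proposed rationality test: factual items
# chosen so that a known tribe's bias pushes toward the wrong answer,
# scored by how often a respondent overrides their own tribe's bias.

# (statement, true_answer, tribe_whose_bias_favors_answering_True)
ITEMS = [
    ("factual claim that flatters tribe A but is false", False, "A"),
    ("factual claim that flatters tribe B but is false", False, "B"),
    ("factual claim that flatters tribe A and is true",  True,  "A"),
    ("factual claim that flatters tribe B and is true",  True,  "B"),
]

def bias_resistance(answers, tribe):
    """Accuracy on just the items where the respondent's own tribe is
    biased toward the wrong answer. answers: one bool per item."""
    hard = [(a, truth) for a, (_, truth, biased) in zip(answers, ITEMS)
            if biased == tribe and truth is False]
    return sum(a == truth for a, truth in hard) / len(hard)

# A tribe-A member who rejects the flattering-but-false claim scores 1.0.
print(bias_resistance([False, True, True, True], "A"))
```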
That’s an argument against the helpless view, right? It sure looks this way.
Except that we live in the current world, not the counterfactual world, and in the current world the helpless view tells you not to believe conspiracy theories.
Well, yes, I’m going to notice and I generally have little trouble figuring out who’s stupid and who is smart. But that’s me and my personal opinion. You, on the other hand, are setting this up as a generally applicable rule. The problem is who decides. Let’s say I talk a bit to Charlie and decide that Charlie is stupid. Charlie, on the basis of the same conversation, decides that I’m stupid. Who’s right? I have my opinion, and Charlie has his opinion, and how do we resolve this without pulling out IQ tests and equivalents?
It’s essentially a power and control issue: who gets to separate people into elite and masses?
I dunno.
For what purpose are you separating the people into elite and masses? If it’s just a question of whom to share dangerous knowledge with, there’s the obvious possibility of just letting whoever wants to share said dangerous knowledge decide.
In your setup there is the Arbiter—in your case, yourself—who decides whether someone is smart (and so is allowed to use the independent approach) or stupid (and so must be limited to the helpless approach). This Arbiter has a certain level of IQ. Can the Arbiter judge the smartness/rationality of someone with noticeably higher IQ than the Arbiter himself?
I don’t know, because I have a really high IQ, so I don’t usually meet people with noticeably higher IQ. Do you have any examples of ultra-high-IQ people who write about controversial stuff?
Instead, in helpless view, you focus on things like: International mainstream consensus.
Hold on. I thought the helpless view was for the “dumb masses”. They are certainly not able to figure out what the “international mainstream consensus” is. Hell, even I have no idea what it is (or even what it means).
A simple example: Western democracy. What’s the “international mainstream consensus”? Assuming it exists, I would guess it says that Western-style democracy needs a strong guiding hand lest it devolve into degeneracy and amoral chaos. And hey, if you ask the experts in your outgroup (!), they probably wouldn’t be great fans of the Western democracies; that’s why they are in the outgroup to start with.
I have a feeling you want the helpless view to cherry-pick the “right” advice from the confusing mess of the “international consensus” and various experts and government recommendations. I don’t see how this can work well.
in the current world the helpless view tells you not to believe conspiracy theories.
Heh. You know the definitions of a cult and a religion? A cult is a small unsuccessful religion. A religion is a large successful cult.
In exactly the same way what gets labeled a “conspiracy theory” is already a rejected view. If the mainstream believes in a conspiracy theory, it’s not called a “conspiracy theory”, it’s called a deep and insightful analysis. If you were to live in a culture where Holocaust denial was mainstream, it wouldn’t be called a conspiracy theory, it would be called “what all right-minded people believe”.
For what purpose are you separating the people into elite and masses?
For the purpose of promoting/recommending either the independent view or the helpless view.
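By the way, you asked for a helpless-view deconversion. TomSka just posted one, so...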
Hold on. I thought the helpless view was for the “dumb masses”. They are certainly not able to figure out what the “international mainstream consensus” is. Hell, even I have no idea what it is (or even what it means).
The “dumb masses” here are not defined as being low-IQ, but just low-rationality. Low-IQ people would probably be better served just doing what people around them are doing (or maybe not; I’m not an expert in low-IQ people).
A simple example: Western democracy. What’s the “international mainstream consensus”?
Well, one of the first conclusions to draw with helpless view is “politics is too complicated to figure out”. I’m not sure I care that much about figuring out if democracy is good according to helpless view. The UN seems to like democracy, and I would count that as helpless-view evidence in favor of it.
I would guess it says that Western-style democracy needs a strong guiding hand lest it devolve into degeneracy and amoral chaos.
I would guess that there is an ambiguously pro-democratic response. 48% of the world lives in democracies, and the places that aren’t democratic probably don’t agree as much on how to be un-democratic as the democratic places agree on how to be democratic.
For the purpose of promoting/recommending either the independent view or the helpless view.
Whoever does the promoting/recommending seems like a natural candidate, then.