Just wanted to point out an implicit and not necessarily correct assumption, leading to poor-quality advice:
Suppose you know a not-very-smart person (around or below average intelligence)
It seems that you assume that intelligence is one-dimensional. In my experience, while there is a correlation, most people are smarter in some areas than in others. For example, a mathematical genius may be incapable of introspection and have no interest in rational thinking outside math. Let’s take your example:
S/he read about rationality, has utilitarian inclinations, and wants to make the world better. However, s/he isn’t smart enough to discover new knowledge in most fields, or contribute very much to a conversation of more knowledgeable experts on a given topic. Let’s assume s/he has no exceptional talents in any area.
First, an “average person” does not read about rationality and has no “utilitarian inclinations”. They often do want to make the world better when socially conditioned to do so by their church or by TV commercials showing a sick child in the third world whom you can save for a dollar a day. So the person you describe is not “average”.
Second, this “average person” might be (and likely is) intelligent in a way that does not show up on IQ tests: he or she might be unusually good at running a corner store, or at being a great parent, or whatever. Some of these talents may be latent because they have had no chance to manifest. I would still call it “intelligence” by Eliezer’s definition: the ability to optimize the universe, or at least some small slice of it.
As a consequence, your advice is suspiciously indistinguishable from the advice you’d give an “LW-smart” person. My inclination would be to find this person’s area of aptitude and offer custom advice that plays to their strengths.
IIRC, the optimization power has to be cross-domain according to his definition; otherwise Deep Blue would count as intelligent.
That doesn’t seem to count as a problem with the above definition. Taboo “intelligent.” Is Deep Blue an optimizing process that successfully optimizes a small part of the universe?
Yes.
Is it an optimizing process that should count as sentient for the purposes of having legal rights? Should we be worried about it taking over the world?
No.
Deep Blue is a narrow AI...
I agree that the word “intelligence” is too vague, but I’m specifically not including a mathematical genius (who would have an exceptional talent in the area of mathematics).
I strongly disagree that average people can’t or don’t have utilitarian inclinations. I think utilitarianism is one of the easiest philosophies to grasp, and I know a lot of average-IQ people who express the desire to “do as much good as possible” or “help as many people as possible.” Even the advertisements for charities that you mention tend to stress how much good can be achieved with how little money.
It’s certainly good to customize advice, but I think there is a class of advice I would offer to smart, skeptical people that I would hesitate to give to others. For example, I would tell my brightest students to question expert advice, because then they can more deeply understand why experts think what they do, or potentially uncover a true fault in expert reasoning. With my less-bright pupils, I find that this pushes them towards conspiracy theories and pseudoscience, so I more frequently advise them to trust experts and distrust people on the fringe. When smart people question mainstream scientific thinking, they may go astray. In my experience, when average-or-below-intelligence people question mainstream scientific thinking, they almost always go astray, and when they don’t, it’s usually coincidence.
I’m trying to figure out how to help them understand things more deeply and question things in a more productive manner, and I’m definitely borrowing lots of ideas from LW, but I still think there is a lot more room for improvement.
I’m sure they express the desire, but do they actually desire it and do they actually do it?