Right. This is why I think it's underrated how important it is for contrarians who actually believe in the potential efficacy of their beliefs not to seem like contrarians. If you truly believe that your ideas are underrepresented, then you will promote them much better by appearing generally “normal” and passing off the underrepresented idea as a fairly typical part of your ostensibly coherent worldview. I will admit that this is more challenging.
Great debate starter.
One quibble: I don’t think that it’s even ostensibly normal to have, or aspire to have, a coherent worldview.
Strong agreement.
Couldn’t do it if I tried for a hundred years. Not disagreeing, though.
Actually, I’d say that you do a much better job at this than many contrarians on the Internet, MW notwithstanding. At least, you have the “passing off the underrepresented idea as a fairly typical part of your ostensibly coherent worldview” part down.
Something interesting may be going on here.
Consider the question, “Could you appear generally ‘normal’ and pass off the underrepresented idea as a fairly typical part of your ostensibly coherent worldview, if the fate of the world depended on it?”
I could imagine the same person giving two different answers, depending on how they understood the question to be meant.
One answer would be, “Yes: if I knew the fate of the world depended on it, then I would appear normal”.
The other would be, “No, because the fate of the world could not depend on my appearing normal; in fact the fate of the world would depend on my not appearing normal”.
(A third possible answer would be about one form of akrasia: “In that case I would believe that the fate of the world depended on my appearing normal, but I wouldn’t alieve it enough to be able to appear normal.”)
This seems psychologically parallel to the question “Would you kill babies if it was intrinsically the right thing to do?”. One answer is, “Yes: if I knew the intrinsically right thing to do was to kill babies, then I would kill babies”. But the other is, “No, because killing babies would not be the intrinsically right thing to do; in fact the intrinsically right thing to do would be to not kill babies”. (The third answer would be, “In that case I would believe that killing babies was intrinsically the right thing to do, but I wouldn’t alieve it enough to be able to kill babies”.)
Maybe the important question is: what would the meaning be of the imagined statement, “the fate of the world would depend on my not appearing normal”?
For the statement “the intrinsically right thing to do would be to not kill babies”, one possible meaning would be conceptual and epistemic: “I have thought about possible ethical and meta-ethical positions, and it was impossible for killing babies to be the right thing to do.” Another possible meaning would be non-epistemic and intuitive: “If someone has an argument that killing babies is the intrinsically right thing to do, then from priors, I and others would both estimate that the chances are too high that they had made a mistake in ethics or meta-ethics. If I were to agree that they could be right, that would be a concession that would be both too harmful directly and too costly to me socially.”
Similarly, for the statement “the fate of the world would depend on my not appearing normal”, one possible meaning would be conceptual and epistemic: “I have thought about my abilities, and I could do at most two of ‘value appearing normal’, ‘be psychologically able to privately reason to truthful beliefs’, and ‘have enough mental energy left over for other work’, but it was impossible for me to do all three.” Another meaning would be non-epistemic and intuitive: “If someone has an argument that it would be good on net to appear normal and not contrarian, then from priors, I and others would both estimate that the chances are high that they had made a motivated mistake about how hard it is to have good epistemology. If I were to agree that they could be right, that would be a concession that would be both too harmful for me directly, and too costly socially as an understood implicit endorsement of that kind of motivated mistake.”
Use the try harder, Luke.
It’s a good link. But I would strongly recommend that Eliezer not try harder to do this. Some considerations:
Eliezer is a terrible politician. Ok, he can get by on a high IQ and plenty of energy. But if you are considering comparative advantage, Eliezer should no more devote himself to political advocacy than he should build himself a car out of iron ore.
Apart from details of presentation, the important thing is to be conformist in all areas except the one in which you make your move. This is a significant limitation on what you can achieve, particularly when what you are attempting to achieve involves interacting with physical reality and not just social reality.
The Sesame Street approach to belief (one of these things is not like the other ones) is a status optimisation, not necessarily an optimal way to increase the influence of an idea. It involves spending years defending the positions of high-status individuals and carefully avoiding all contrarian positions until you have the prestige required to make a play for your own territory. Then you select the work of (inevitably lower-status) innovators in a suitable area, present the ideas yourself, and use your prestige to ensure that your terminology becomes adopted and your papers most frequently cited. The innovators can then choose between marginalization, supplication, or moving to a new field. If any innovator happens to come up with ideas that challenge your position, you dismiss them as arrogant and smug and award status to others who, by way of supplication, do likewise.
Does this help make a contrarian idea mainstream? Perhaps. But maybe the market for status exploitation of ideas is efficient and your participation makes no particular difference. Either way, I consider gaining power in this manner useful for achieving Eliezer’s aims only in the same way it would be useful for him to gain power through selling stationery or conquering a small nation. Possibly instrumentally useful, but far from comparatively advantageous.
This only works well if you’re really high status in the first place. So someone who reads Andy’s comment should instead try to bootstrap the memetic fitness of their idea via adoption by progressively higher-status people until they snag a Dawkins. The way to do this isn’t obviously to try to appear especially high status oneself; I suspect a strong method would be to appear just high-status enough to spam as many moderate-status people as possible with reasonably optimized memes and rely on a halfway decent infection rate. The disease would thus become endemic and hopefully reach fixation. One way to reach that stage would be to become high status oneself, but I’m guessing it’d be more effective to predict who will soon become high status and talk to them while they’re still approachable.
(The above may be obvious, but it was useful for me to think through such a memetic strategy explicitly; a toy simulation of the seeding comparison is sketched just below.)
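To make that seeding comparison concrete, here is a minimal toy sketch. Everything in it is an assumption for illustration: the population size, the uniform status scores, the per-contact adoption rate, the contacts per round, and the fixed time horizon are all invented, and random mixing is a crude stand-in for a real social network. It only compares how many people have adopted a meme after a fixed number of rounds when you seed one very-high-status person versus thirty moderate-status people.

```python
# Toy meme-spread sketch (all parameters made up for illustration).
import random

random.seed(0)

N = 1000                                        # population size (assumed)
STATUS = [random.random() for _ in range(N)]    # each person's status in [0, 1]
ADOPT_RATE = 0.05                               # baseline adoption chance per contact (assumed)
CONTACTS = 3                                    # contacts per adopter per round (assumed)
ROUNDS = 20                                     # fixed time horizon (assumed)

def spread(initial_adopters):
    """Crude adoption process: each round, every current adopter exposes a few
    random people, who adopt with probability scaled by the carrier's status."""
    adopted = set(initial_adopters)
    for _ in range(ROUNDS):
        new = set()
        for person in adopted:
            for _ in range(CONTACTS):
                target = random.randrange(N)
                # higher-status carriers are assumed to be more persuasive
                if target not in adopted and random.random() < ADOPT_RATE * (0.5 + STATUS[person]):
                    new.add(target)
        adopted |= new
    return len(adopted)

# Strategy A: convince the single highest-status person ("snag a Dawkins").
top = max(range(N), key=lambda i: STATUS[i])
# Strategy B: convince thirty moderate-status people (status between 0.4 and 0.7).
moderate = [i for i in range(N) if 0.4 <= STATUS[i] <= 0.7][:30]

print("seed one top-status person:        ", spread([top]))
print("seed thirty moderate-status people:", spread(moderate))
```

With these made-up numbers, broad seeding of moderate-status people typically ends up well ahead over the fixed horizon, which matches the intuition in the comment above; given a long enough horizon both strategies saturate, so the toy only says something about speed of spread, not endpoints.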
This ability is one that is rather useful for the goal of gaining status, too. (As well as being reflected in the mating strategy of young females.)
But, I think, you’d better be vocal, visible, and brash to some extent, or you risk science advancing only by funerals. If someone believes that replacing status quo beliefs with a correct contrarian belief is very important, then IMO her optimal strategy will be somewhere between total crackpottery and total passivity.
More challenging, but definitely satisfying in its own way, one that tastes quite different from victory in argument. Highly recommended strategy.
Again, I feel suspicious of this idea without being sure why.