The relevant data isn’t “I can convince people if I use all the social tricks in the book”
I do start from an expectation that raising the sanity waterline, especially inside my circle of friends and colleagues, has significant moral value, as well as long-term hedonic (quasi-monetary) value for me. So that makes it worth an investment of a bit of focused discussion. Do you disagree? Because if not, by the same argument, it is worth doing right so the message is received and sticks.
What’s the other guy doing, still talking about goats and doors, isn’t he just saying he’s the smarter thinker (stupid elitist)?
Two good points. I very much avoid the standard examples, because they’re too hard to relate to for the discussion to stay interesting and worth remembering. I prefer to pick up some apparent confusion in the behavior of the person I’m talking to—something central with an extensive object level works best—and keep asking questions and throwing in stories of how I used to make similar mistakes and how I try to fix them.
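To be clear about what I’m avoiding: “goats and doors” is the Monty Hall problem, the stock example of counterintuitive probability. A minimal simulation sketch (Python, with function and variable names of my own choosing) reproduces the classic result, that switching doors wins about two thirds of the time:

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round of the goats-and-doors game; return True if the player gets the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)    # the car is hidden behind a random door
    pick = random.choice(doors)   # the player picks a door at random
    # The host opens a door that hides a goat and isn't the player's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

if __name__ == "__main__":
    trials = 100_000
    for switch in (False, True):
        wins = sum(monty_hall_trial(switch) for _ in range(trials))
        print(f"switch={switch}: win rate {wins / trials:.3f}")
    # Staying wins ~1/3 of the time; switching wins ~2/3.
```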
So that makes it worth an investment of a bit of focused discussion. Do you disagree?
It would be a rookie mistake to disagree with “this has value to me, so it is worth an investment”, which is nearly tautological (barring fringe cases).
What I disagree with is “I find it much easier to change people’s minds about rationality than about, say, the NSA.” If you’re at the right spot on the totem pole and send all the right social signals, you can convince people of nearly anything, excepting things that could reflect negatively on them (such as rationality)*. The latter is still possible, but much harder than many of the bullshit claims which happen to reflect positively on someone’s ego. That you happen to be right about rationality is just, you know, a happy coincidence from their point of view.
If you’re up for it, I dare you to go out and convince someone of average reasoning skills of some bullshit proposition, using the same effort and enthusiasm you display for rationality. Then after that beware of rethinking your life and becoming a used car salesman.
* There is a special kind of shtick where the opposite applies, namely the whole religious “original sin, you’re not worthy, now KNEEL, MAGGOT!”-technique. Though that’s been losing its relevance, now that everyone’s a special snowflake. New age, new selling tactics.
Political beliefs like beliefs about the NSA can reflect negatively on a person depending on their social circle.

Take a non-political wrong belief, then. The same applies to selling sugar pills, I’m sorry, homeopathy. At least some people are earning billions with it.
Also, guarded statements such as “political beliefs can reflect negatively (...) depending on their social circle” are as close to null statements as you can reasonably get. I could substitute “can help you become an astronaut” and the statement would be correct.
I’m not sure who in their right mind would argue against “can … under certain circumstances …”-type social statements. It’s good to qualify our statements and to hedge our bets, wary of blanket generalizations, but at some point we need to stop and make some sort of stand, or we’re doing the equivalent of throwing empty chat bubbles at each other.
I don’t think that chaosmage accidentally chose a political belief. Replacing it with a less controversial claim would be strawmanning the original post.
You don’t think “convincing someone that homeopathy works” is controversial enough? Are you objecting to both political and non-political beliefs, and wouldn’t that make the initial claim, you know, unfalsifiable?
For reference, the initial mention was:
I find it much easier to change people’s minds about rationality than about, say, the NSA.
As far as homeopathy goes, the belief is differently controversial for different people. Convincing the average new atheist that homeopathy works is very hard; there’s identity involved. Convincing people who don’t care, on the other hand, is easier.
Are you objecting to both political and non-political beliefs, and wouldn’t that make the initial claim, you know, unfalsifiable?
I do think that chaosmage has experience in trying to change someone’s mind about the NSA. I do think that he found in his experience that it’s easier to change someone’s mind about rationality.
There’s nothing unfalsifiable about making that observation.
When I try to convince people of rationality, I very much rely on people’s natural distaste for inconsistency. This distaste seems quite primal and I don’t think I’ve met anybody who doesn’t have it. Of course people tolerate all sorts of inconsistencies, but it takes System 2 effort, while System 1 clearly prefers the simplicity that comes with consistency. Rationality is great at consistency.
Therefore, a lot of what I do is pointing out inconsistencies and offering rationality as a way of fixing them. So while their System 2 analyzes whether what I’m saying can be made to fit with what they already believe, their System 1 keeps pointing out that this search for consistency “feels right”. And when we’re finishing up at the object level, I can surprise them by predicting they’re having this feeling, explain why they do, and maybe go into the System 1 / System 2 paradigm and a recommendation for the Kahneman book.
I don’t see how I could adapt this method in order to convince people of random bullshit.
And I disagree with your claim that people can be convinced of nearly anything. It is easy to convince them of things that don’t conflict with their world-view, but if they have the (thankfully now quite common) habit of checking Wikipedia, that will leave little room. You can wager social capital on your claim, but you’ll lose that investment if they continue to disagree and you risk them faking agreement in order to salvage your relationship. And unlike the used car salesman, I’m not satisfied if they agree today and disagree tomorrow.
It would be a rookie mistake to disagree with “this has value to me, so it is worth an investment”, which is nearly tautological
Obviously, which is why my question was about the valuation, not the consequence from it.
This reminds me of the HPMOR chapter in which Harry tests Hermione’s hypothesis checking skills. What you’re telling me is just what you’d expect if people were easily convinced of pretty much anything (with some caveats, admittedly), given social capital and social investment (which you have mentioned in your initial explanation). You have a specialized mechanism-of-action to explain your apparent success, one which indeed may not be easily adaptable to other ventures.
The problem is that it doesn’t explain the ubiquitous occurrence of people being convinced of pretty much any topic you could imagine (requiring more specialized theories). From organized religion, cults, homeopathy, nationalism, anti-nationalism, consumerism, anti-consumerism, big weddings, small weddings, no weddings, monogamy, polygamy, psychic energy, Keynesianism, Austrian economics, rationality, anti-rationality, the list goes on. It doesn’t matter that some of these happen to be correct when their exact opposite is also on the list, with plenty of adherents.
You have the anecdote, but looking at the human condition, I see plenty of data to the contrary. Though if you interpret that differently, please share.
I’m not satisfied if they agree today and disagree tomorrow.
Do you think that when (some time after your rationality talk with them) they display a bias in a real-life situation, and you kindly make them notice that they did, they’ll agree and have learned a lesson? It’s all good as long as it stays in the abstract and you’re just a friendly guy sharing a cool topic you’re passionate about.
Which is good in a way, because just as the first rationality talk doesn’t stick, neither does the first “convert to your girlfriend’s religion” talk, usually.
Also, what, in your opinion, are the relative weights you’d ascribe to your success, in terms of “social investment / social strategy” versus your System 2/System 1 approach?
I would be interested in you actually trying the real, falsifying experiment: convincing someone of something obviously false (to you). It’s not hard, in the general case. Though, as you say, in recent years it has become slightly harder in some ways, easier in others: Far from creating one shared space, today’s interconnectivity seems to have led to a bunch of echo-chamber bubbles, even if Wikipedia is a hopeful sign.
Then again, Wikipedia exists. As does the multi-billion (and growing) homeopathy market.
looking at the human condition, I see plenty of data to the contrary.
I see it too—but you’re only talking about the present. Put it into historical context and it indicates the opposite of what you think it indicates. The history of bullshit is that, while there is still too much of it, it started at vastly worse premodern levels that were really weird, and has been losing arguments ever since.
One of my favorite examples is the reports of the first Protestant ministers who went through 16th-century villages, talked to peasants about what they actually believed and, because they weren’t evaluating their own work, actually recorded it. Turns out even after hundreds of years of Catholicism, these peasants had all sorts of ideas, and those ideas varied wildly from village to village or person to person. They’d have three gods, or eight, or even believe in forms of reincarnation. The Protestants set out to homogenize that, of course, and in places like Brazil they still do. But even in Europe and the US, the number of distinct belief systems continues to decline.
Far from creating one shared space, today’s interconnectivity seems to have led to a bunch of echo-chamber bubbles
Medieval villages and similarly unconnected societies are echo-chamber bubbles too. So the number of bubbles has been going down, sharply, and competition between ideas has clearly become tougher. If an exception like homeopathy is growing, that means it has been unusually successful in the harder environment (a big part, in this case, was greatly reduced claims of effectiveness). But that shouldn’t distract from the fact that lots of pseudotherapies that were comparable to it fifty years ago, such as Anthroposophic medicine and Orgone therapy, have declined. And of course the quacks and witch doctors that we used to have before those were even more heterogeneous and numerous.
And that’s exactly what you’d expect to see in a world where whether someone accepts an idea very much depends on what else that someone already believes. People aren’t usually choosing rationally what to believe, but they’re definitely choosing.
There seems to be a hard-coded exception for young kids, who will believe any bullshit their parents tell them, and that, rather than active conversion, is how some religions continue to grow. Surely it also helps bullshit that isn’t religion.
I’m obviously not doing this experiment you’re talking about, because it is wildly unethical and incurs severe social cost. And even if it turned out I can convince people of bullshit just as well as I convince them of rationality, that wouldn’t be relevant to my original assertion that convincing the unconvinced is not at all an “open and hard problem”.
The problem is that it doesn’t explain the ubiquitous occurrence of people being convinced of pretty much any topic you could imagine (requiring more specialized theories).
It’s quite easy to say in the abstract that people can be persuaded. It’s quite different to see what effort it takes to convince another person in real life.
I think we all have conversations where we try to convince someone and fail.