When I try to convince people of rationality, I very much rely on people’s natural distaste for inconsistency. This distaste seems quite primal and I don’t think I’ve met anybody who doesn’t have it. Of course people tolerate all sorts of inconsistencies, but it takes System 2 effort, while System 1 clearly prefers the simplicity that comes with consistency. Rationality is great at consistency.
Therefore, a lot of what I do is pointing out inconsistencies and offering rationality as a way of fixing them. So while their System 2 analyzes whether what I’m saying can be made to fit with what they already believe, their System 1 keeps signaling that this search for consistency “feels right”. And when we’re finishing up at the object level, I can surprise them by predicting they’re having this feeling, explain why they do, and maybe go into the System 1 / System 2 paradigm and recommend the Kahneman book.
I don’t see how I could adapt this method in order to convince people of random bullshit.
And I disagree with your claim that people can be convinced of nearly anything. It is easy to convince them of things that don’t conflict with their world-view, but if they have the (thankfully now quite common) habit of checking Wikipedia, that will leave little room. You can wager social capital on your claim, but you’ll lose that investment if they continue to disagree, and you risk them faking agreement in order to salvage your relationship. And unlike the used car salesman, I’m not satisfied if they agree today and disagree tomorrow.
It would be a rookie mistake to disagree with “this has value to me, so it is worth an investment”, which is nearly tautological.
Obviously, which is why my question was about the valuation, not the consequence that follows from it.
This reminds me of the HPMOR chapter in which Harry tests Hermione’s hypothesis-checking skills. What you’re telling me is just what you’d expect if people were easily convinced of pretty much anything (with some caveats, admittedly), given social capital and social investment (which you mentioned in your initial explanation). You have a specialized mechanism-of-action to explain your apparent success, one which indeed may not be easily adaptable to other ventures.
The problem is that it doesn’t explain the ubiquitous occurrence of people being convinced of pretty much any topic you could imagine (requiring more specialized theories). From organized religion, cults, homeopathy, nationalism, anti-nationalism, consumerism, anti-consumerism, big weddings, small weddings, no weddings, monogamy, polygamy, psychic energy, Keynesianism, Austrian economics, rationality, anti-rationality, the list goes on. It doesn’t matter that some of these happen to be correct when their exact opposite is also on the list, with plenty of adherents.
You have the anecdote, but looking at the human condition, I see plenty of data to the contrary. Though if you interpret that differently, please share.
I’m not satisfied if they agree today and disagree tomorrow.
Do you think that when (some time after your rationality talk with them) they display a bias in a real-life situation, and you kindly make them notice that they did, they’ll agree and have learned a lesson? It’s all good as long as it’s in the abstract, coming from a friendly guy who is sharing a cool topic he’s passionate about.
Which is good in a way, because just as the first rationality talk doesn’t stick, neither does the first “convert to your girlfriend’s religion” talk, usually.
Also, what relative weights would you ascribe to your success in terms of “social investment / social strategy” versus your System 1 / System 2 approach?
I would be interested in you actually trying the real, falsifying experiment: convincing someone of something obviously false (to you). It’s not hard, in the general case. Though, as you say, in recent years it has become slightly harder in some ways, easier in others: Far from creating one shared space, today’s interconnectivity seems to have led to a bunch of echo-chamber bubbles, even if Wikipedia is a hopeful sign.
Then again, Wikipedia exists. As does the multi-billion (and growing) homeopathy market.
looking at the human condition, I see plenty of data to the contrary.
I see it too, but you’re only talking about the present. Put it into historical context and it indicates the opposite of what you think it does. The history of bullshit is that, while there is still too much of it, it started at vastly worse premodern levels that were really weird, and it has been losing arguments ever since.
One of my favorite examples is the reports of the first Protestant ministers who went through 16th-century villages, talked to peasants about what they actually believed and, because they weren’t evaluating their own work, actually recorded it. Turns out even after hundreds of years of Catholicism, these peasants had all sorts of ideas, and those ideas varied wildly from village to village and person to person. They’d have three gods, or eight, or even believe in forms of reincarnation. The Protestants set out to homogenize that, of course, and in places like Brazil they still do. But even in Europe and the US, the number of distinct belief systems continues to decline.
Far from creating one shared space, today’s interconnectivity seems to have led to a bunch of echo-chamber bubbles
Medieval villages and similarly unconnected societies are echo-chamber bubbles too. So the number of bubbles has been going down, sharply, and competition between ideas has clearly become tougher. If an exception like homeopathy is growing, that means it has been unusually successful in the harder environment (a big part, in this case, was greatly reduced claims of effectiveness). But that shouldn’t distract from the fact that lots of pseudotherapies that were comparable to it fifty years ago, such as Anthroposophic medicine and Orgone therapy, have gone down. And of course the quacks and witch doctors that we used to have before those were even more heterogeneous and numerous.
And that’s exactly what you’d expect to see in a world where whether someone accepts an idea very much depends on what else that someone already believes. People aren’t usually choosing rationally what to believe, but they’re definitely choosing.
There seems to be a hard-coded exception for young kids, who will believe any bullshit their parents tell them, and that, rather than active conversion, is how some religions continue to grow. Surely it also helps bullshit that isn’t religion.
I’m obviously not doing this experiment you’re talking about, because it is wildly unethical and incurs severe social cost. And even if it turned out I can convince people of bullshit just as well as I convince them of rationality, that wouldn’t be relevant to my original assertion that convincing the unconvinced is not at all an “open and hard problem”.
The problem is that it doesn’t explain the ubiquitous occurrence of people being convinced of pretty much any topic you could imagine (requiring more specialized theories).
It’s quite easy to say in the abstract that people can be persuaded. It’s quite different to see what effort it takes to convince another person in real life.
I think we all have conversations where we try to convince someone and fail.