(I use the term “full reversal” to mean going from high confidence in a belief to high confidence in the opposite belief. A “hard reversal” is when a full reversal happens quickly.)
When have you noticed and remembered peers or colleagues changing their minds?
I think the question might need some modifiers to exclude the vast number of boring examples. Obviously your question does not evoke answers as boring as “Oh, the store is closed? Okay, then we can’t get milk tonight”, but what about a corporate executive pivoting his strategy when he hears business-relevant news? By now I am bored of Losing-the-Faith stories, but I don’t deny their relevance to human rationality.
Anyway, I think full reversals tend to happen much less frequently than moderate reductions in confidence. Much more common are things of the form “I used to be totally anti-X, but now I see that the reality is a lot more complex and I’ve become much less certain” or “I used to be completely convinced that Y was true and the deniers were just being silly, but I read a couple decent challenges and now I’m just pretty confused overall”. One way in which this happens is when someone accepts that their strong belief actually depends on some fact that they don’t know much about.
But to try to directly answer your question, I might list:
Megan Phelps-Roper left the Westboro Baptist Church, in part due to having respectful debates on Twitter.
Bostrom’s Hypothetical Apostasy never really caught on, despite sounding pretty cool on paper. Too bad.
Rationalists have gotten some recognition for anticipating the pandemic early—you might be able to find some good examples of mind-changing there.
Rationalist-adjacent blogger Tim Urban had a fairly sharp reversal on cryonics.
There’s that classic (boring?) example of a person quitting grad school after spending a few minutes answering reasonable questions about their motivations.
If you want a more politically charged example: Scott Alexander loosely identifies as libertarian, having formerly been vocally anti-libertarian. Seems like this happened via deliberate argumentation, including some email exchanges with David Friedman (son of Milton Friedman).
I’ve seen some of my friends and acquaintances change their minds about psychoactive drugs.
Thanks, those are all promising directions! I’ve edited the question to add [about important things]; in phrasing the post I had swung from over-specified to under-specified, and your feedback helps me target a happier medium. “Important” is still vague, of course.
One way in which this happens is when someone accepts that their strong belief actually depends on some fact that they don’t know much about.
This suggests a hypothesis: “rationalism reduces a thinker’s odds of forming or maintaining a strong belief which depends on facts they know little about”. That’s a nice counterpoint to “for all the talk about changing minds, I don’t see it happening as much as I’d expect”. It also implies that seeing many hard reversals among thinkers of a particular school would be evidence that the school encourages drawing strong conclusions too soon.
I’ve seen some of my friends and acquaintances change their minds about psychoactive drugs.
The conversations about both the pros and cons of altered states, which don’t or can’t resort to “just get into the state and see for yourself”, seem likely to have great examples of communication about difficult-to-communicate experiences. And I have access to a lot of that content online! Thank you for the nudge toward connecting these existing observations in a more useful way than I did before.