The Sin of Persuasion
Related to Your Rationality is My Business
Among religious believers in the developed world, there is something of a hierarchy in terms of social tolerability. Near the top are the liberal, nonjudgmental, frequently nondenominational believers, of whom it is highly unpopular to express disapproval. At the bottom you find people who picket funerals or bomb abortion clinics, the sort with whom even most vocally devout individuals are quick to deny association.
Slightly above these, but still very close to the bottom of the heap, are proselytizers and door-to-door evangelists. They may not be hateful about their beliefs; indeed, many find their local Jehovah’s Witnesses to be exceptionally nice people. But they’re simply so annoying. How can they go around pressing their beliefs on others and judging people that way?
I have never known another person to criticize evangelists for not trying hard enough to change others’ beliefs.
And yet, when you think about it, these people are dealing with beliefs of tremendous scale. If the importance of saving a single human life is worth so much more than our petty discomforts with defying social convention or our own cognitive biases, how much greater must be the weight of saving an immortal soul from an eternity of hell? Shouldn’t they be doing everything in their power to change the minds of others, if that’s what it takes to save them? Surely if there is a fault in their actions, it’s that they’re doing too little given the weight their beliefs should impose on them.
But even if you believe you believe this is a matter of eternity, of unimaginable degrees of utility, it sure is annoying to be pestered about the state of your immortal soul when you haven’t internalized that belief.
This is by no means exclusive to religion. Proselytizing vegans, for instance, occupy a similar position on the scale of socially acceptable dietary positions. You might believe that nonhuman animals possess significant moral worth, and that by raising them in oppressive conditions only to slaughter them en masse, humans are committing an enormous moral atrocity, but may heaven forgive you if you try to convince other people of this so that they can do their part in reversing the situation. Far more common are vegans who are adamantly non-condemnatory. They may abstain from using any sort of animal product on strictly moral grounds, but, they will defensively assert, they’re not going to criticize anyone else for doing otherwise. Individuals like this are an object lesson that disapproval of evangelism does not simply come down to distaste for the principles being preached.
So why the taboo on trying to change others’ beliefs? Well, as a human universal, it’s hard to change our minds. Having our beliefs confronted tends to make us anxious. It might feel nice to see someone strike a blow against the hated enemy, but it’s safer and more comfortable to not have a war waged on your doorstep. And so, probably out of a shared desire not to have our own beliefs confronted, we’ve developed a set of social norms where individuals have an expectation of being entitled to their own distinct factual beliefs about the universe.
Of course, the very name of this blog derives from the conviction that that sort of thinking is not correct. But it’s worth wondering, when we consider a society which upholds a free market of ideas which compete on their relative strength, whether we’ve taken adequate precautions against the sheer annoyingness of a society where the taboo on actually trying to convince others of one’s beliefs has been lifted.
I think if you drop the view that door-to-door evangelism is about conversion and see it mostly as a retention mechanism instead, it addresses some of the issues in the article.
The statistics on the number of people converted to a religion because someone came to their door and argued with them are pitiful. If you spend two years trying to convert people that way, the median expectation is “no conversions” (one or two is substantial success, and more than that is a crazy outlier). If your goal is actually to change people’s minds, your approach has to be much more subtle. It requires building a relationship of trust, respect, and care for a long time before a normal person will take your words as worthwhile evidence for updating their life philosophy.
As near as I can tell, this is pretty much the correct strategy for honestly convincing someone to adopt the practices suggested by a belief, whether it’s a matter of veganism or Mormonism. My understanding from reading about the sociology of evangelism is that the Mormons give pamphlets to the laity that tell them not to proselytize to friends, co-workers, neighbors, etc. Their job is to be friendly, helpful, and admirable while deflecting questions about their beliefs as “helpful for me but not that important” until someone is so impressed by the quality of their lives and how nice they are at a BBQ that they ask two or three times about their beliefs. Then the family invites the interested party (usually an entire family, with bonds of friendship that run husband-to-husband, wife-to-wife, kids-to-kids) to a church social event to help them see how things work. If the family is still interested, the believing family doesn’t “close the sale” itself; they invite the interested family over to talk with a specialized member of the church who can help the interested family formally articulate their by-now-pre-existing interest in joining the church.
In the meantime, from what I can tell, the real adaptive function of going door to door is that it exposes a new adult believer to a lot of debate-style conversation about the idea they are promoting. In the absence of effective epistemic warnings to the contrary, they become more committed to their idea through commitment and consistency effects, among other cognitive biases. This makes them less likely to feel that the belief is just a matter of lip service, which would make a lapse in belief more likely in the absence of the social processes of regular church attendance.
One of my pet hypotheses for “where rationalists come from” is that many of us engage in debate-style conversations (similar to proselytizing) for fun when we are young, in the absence of any coherent doctrine to defend that has been handed down to us by an institution. This practice teaches us respect for the usefulness of defending “the right content” as a helpful technique for the intrinsically valuable outcome of winning a debate. Then epistemic rationality is intelligible as a truth-biased (and hence nominally moral) method to find the content and justifications that will be useful for winning future debates… then you have enough to bootstrap into logical fallacies and cost-benefit analysis and so on. This isn’t a very admirable theory, but if it were true then it seems helpful to know if you’re trying to teach or find rationalists.
Also, if this hypothesis is on the mark it probably has sociological implications: If someone gets an emotional boost from winning a debate then someone else is going to lose a debate and get negative reinforcement for the same verbal interactions. Unless this process was engineered somehow (with people losing on purpose and not taking it hard?) I suspect that rationalists produced by this method can never be more than a fraction of the population, and they will necessarily be intermixed with people for whom debate has almost entirely negative emotional associations.
From this (admittedly very weird) perspective, evangelists for false doctrines may unwittingly be performing an epistemic public service that, on net, raises the sanity waterline :-)
That’s useful information (for the cultishness discussion thread). Do you have a cite to hand?
I must admit, I’d assumed this method had some success, enough to bother. So I’m glad we don’t have to fear rationalists going door to door waking people too early on Saturday morning and saying “Say, friend, have you read the Sequences?”
Searching for confirmation on Google, I found different statistics quoted in an article about relative religious growth rates here. The source may have incentives to inflate their numbers, but they claim:
I don’t remember where I read the numbers I gave above, but the first place I’d look would be The Future of Religion: Secularization, Revival and Cult Formation which is a generally interesting read.
I think the book would be helpful for the “cultishness discussion” here, but also for the general “rapture of the nerds” critique aimed at the singularity hypothesis when people first encounter it. One helpful thing about The Future of Religion is that it dissolves the confusion that comes up from the conjunction of particular supernatural theories and particular sociological processes.
Sloppy thinking can lead to accusations that some political movements or psycho-therapeutic fads are “cults” when there is no substantial religious doctrine, just a certain set of “human tribal instincts” being deployed in a manner that is characteristic of many voluntary social processes. The LaRouche Movement is an instructive edge case, which is sometimes accused of being a “political cult”.
Any conscious/rational human improvement system probably will have to make use of some of these basic mechanisms if it is to be effective, and the question (as always) is simply whether they are being used for good ends, given the actual state of the world.
Another useful item for the cultishness discussion might be the Bonewits’ Cult Danger Evaluation Frame, because it encodes a useful set of symptoms that predict whether something is going seriously wrong with a group… or at least signal that it is “playing with fluorine”. There are reasons to play with fluorine, but not many good reasons.
Those numbers and references are utterly wonderful, thank you!
Calling something with no substantial religious doctrine a “cult” is not a category error for the thing I’m talking about as the word “cult”. It’s not my private definition either. It’s a particular sociological phenomenon in a group. It’s the thing someone is worried about when they say “Has Bob joined some sort of cult?” I need to actually describe what I’m talking about, which I’m not sure how to unpack with the requisite accuracy. But LaRouche is right in there, and Amway is too for another one.
Now to relate that more specifically to predatory infectious memes. And dig up all the stuff I read ten or so years ago.
Yeah, I read into that stuff largely out of a curiosity about memetics too :-)
I like “the mom test” for that. If you’re hanging out with a group based around a common set of beliefs that are taken by the group to be formally true, would you be too embarrassed to ask your mom for advice about the group or the beliefs? If so, then for you (given your parents and their community and so on) it’s close enough to a cult that you really should stay away (unless you’re just doing participant observation as a research project). On the other hand, if you’re not too embarrassed, then you really should actually go ask her for advice, because that kind of practical and emotionally grounded feedback is good input for people to be mindful of, even if it isn’t always perfect.
This is a primary reason I recommend that people talk to one or both parents about SIAI if they start suspecting that they should get personally involved with existential risk activism and FAI and so on :-)
I’m personally not at all surprised that the success rates are so low. If the evangelists had actually internalized the idea that they’re doing it to save people from hell, they’d take it much more seriously. Instead, it’s generally framed as a matter of duty to the religious community, or a sign of personal virtue for making the effort at all. It doesn’t need to have a significant success rate to sustain itself, although I imagine that if it were inefficient enough that most door-to-door evangelists had never heard of a successful conversion, they might reevaluate their methods.
If you really wanted to convert others, though, not merely to do the most that was comfortable or to convince yourself you had made an honest effort, I think that the mainstream Mormon approach would not be the most effective method. Or rather, it doesn’t take the approach far enough. Maybe I’m merely projecting an atypical attitude, but I think it would be far more effective to dedicate your life to moral causes. Give away everything you own, work your hands to the bone to give away more, try to set a standard that would make Gandhi look undercommitted. The number of people who will be impressed and interested in the beliefs of a mere upstanding community member is nothing compared to that which would be interested in a moral paragon.
Of course, one might argue that people will be driven off if they suspect that the religious beliefs demand too much of them, but while religious believers tend to claim status from the efforts of the most exemplary members of their faith, they rarely try to meet their standards. Also, I’m highly skeptical of any argument that states that the most effective approach conveniently intersects with what is most comfortable.
Of course, in this context, it’s easy to see how putting one’s beliefs into their proper perspective and fully internalizing them can be a tremendous disadvantage. It’s no wonder if most people interpret their religions to only demand as much of them as is convenient.
I’ll second a part of your post: some people simply don’t like debate (either debate in person or argumentative writing.) Communicating with a non-debater about a debate-like topic is very strange; it’s like trying to fence against an opponent wielding a bowl of Jello instead of a sword. LessWrong has a number of essays about the ways people debate irrationally (arguments as soldiers, dark side epistemology, etc.) but I think far more of the population just doesn’t debate at all, and will respond blankly if you try to debate them.
People without a debate mindset might think “Oh, I’m not very good at arguments.” People with a debate mindset, I believe, don’t ever think in those terms: if we believe something strongly, it’s because we think we have a good argument for it, and if we don’t have a good argument for it, then we don’t have particularly strong convictions about it. But a non-debater can both believe something deeply and believe she could be out-argued by the opposing view.
For what it’s worth: I consider myself a “debater” in the sense you mean, but there are plenty of things where I believe them, I feel strongly about them, and I believe I could be out-argued by a sufficiently clever articulation of an opposing view, even if that view was wrong.
Out-argued in what sense? Do you think that you wouldn’t be able to see why their arguments were wrong, or just that you wouldn’t be able to persuade an impartial audience that you were more right?
On subjects that I hold strong beliefs in, I anticipate that I could not be out-argued in the first sense. If someone was able to offer such arguments, I would either have to conclude that they were right, or that I didn’t understand the topic as well as I thought I did in the first place, and would have to revise the strength of my beliefs.
I’m not certain that the dividing line between those two senses is as crisp as you make it sound, but I guess I mean something like the latter sense. That is, I can imagine someone articulating their arguments for -P in such a way that their arguments are more compelling than mine are for P, even when P is true.
The dividing line comes from the fact that an impartial audience is not at all the same thing as a rational audience, and there’s a lot more to rhetoric than making arguments that are logically sound and tenable.
My general defence against this is to be too difficult to actually convince. I nod and smile and acknowledge the quality of the arguments but am not actually convinced to change my mind. I may well have taken this too far. (It certainly frustrates the heck out of people.) It’s useful if you know you’re fond enough of new ideas to be susceptible to neophilia-induced bad ideas. It’s somewhat like being just plain dim.
Ah. It sounds like we have different interpretations of what SarahC meant by out-argued.
I don’t believe a clever debater can long-term convince me of the falsehood of something I believe and feel strongly about (sadly, even if it’s true), although they might induce me to go along temporarily.
This is, incidentally, not to say that I can’t be caught up in cultishness, merely to say that clever arguments are not sufficient (nor, sadly, necessary) to do it.
I think there is a difference between (1) people who enjoy verbally competing to be correct, (2) people who enjoy such competition and have formally trained, studied, and practiced it, and (3) people who have trained, studied, practiced and come out disillusioned with strategic communication. When I was 10 I was the first kind of debater. I think many law school students, by stereotype, are the second kind. Having spent two years in college on a policy debate team and later coached a high school debate team I find myself feeling kinship with the third group.
I think formal competitive debate experience plus philosophy can help to calibrate people in very useful ways. Of a proposition Q you can ask what P(Q|H) is for various values of H if you’re sticking to pure Bayesian rationality. You can also have a sense of how difficult it would be to go aff (or neg) on Q (or not-Q) in front of different audiences.
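For readers unfamiliar with the notation, the P(Q|H) framing above is just Bayes’ rule. Here is a minimal toy sketch of what "asking what P(Q|H) is for various values of H" cashes out to; all numbers and the function name are hypothetical, chosen purely for illustration:

```python
# Toy Bayes' rule calculation of P(Q | H): how a prior on proposition Q
# shifts after observing evidence H. All numbers are made up.

def posterior(prior_q, p_h_given_q, p_h_given_not_q):
    """Return P(Q | H) = P(H | Q) P(Q) / P(H), with P(H) expanded by
    the law of total probability over Q and not-Q."""
    p_h = p_h_given_q * prior_q + p_h_given_not_q * (1 - prior_q)
    return p_h_given_q * prior_q / p_h

# A 50/50 prior on Q, where the evidence H is four times as likely
# under Q as under not-Q, yields a posterior of 0.8:
print(posterior(0.5, 0.8, 0.2))
```

Trying the same proposition against several different hypothetical bodies of evidence H is what gives the calibrated sense, mentioned above, of how hard it would be to go aff or neg on Q in front of different audiences.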
Among skilled debaters where both sides have a research library and knowledge of Q in advance and skilled debaters are the audience, whoever goes aff and controls the content of Q should generally win.
But a good debater should also be able to take a large number of “open questions” as Q, and win in all four scenarios (aff/neg x Q/not-Q) with an arbitrary audience against an unskilled opponent. Watching this second thing happen, and learning to do it, and teaching other people to do it has given me relatively little respect for casual debates as a truth seeking process, but a lot of respect for formal debates as an educational process.
If anyone reading this is picking colleges, I recommend looking for one with a CEDA program and spending a year on the team :-)
I agree that casual debate isn’t so great as a truth seeking process.
What I was saying was that some people seem to have trouble (or dislike) thinking propositionally; I’ve had conversations where I’m proposing an argument and the other person seems to think that I just want to be cheered up or something, and doesn’t realize I actually want to discuss the substance.
Speaking as a fencer, I’m having a very hard time imagining what this would actually be like.
Really? I’m not a fencer, but I just imagine a fencer standing in a kitchen while the non-debater pulls out the Jello from the fridge. The fencer stands there confused for a bit, while the non-debater goes on their way, but eventually realizes ze’s a fencer for a reason! The fencer lunges, misses, and hits the bowl, breaking it and spilling the Jello. The non-debater then either gets angry and annoyed, or sighs, pulls out another bowl and begins to make a second batch of Jello.
Did that help? :)
Not much, no.
I knew there was a reason I like to hide a Glock in my Jello.
This explains some more of the South Park episode about the Mormons. It’s actually pretty accurate and well done.
You raise a good point: if beliefs about a subject (say, theology) are deadly serious, then it makes sense to try to persuade people to agree with you—after all, you’d be saving lives (or, at least, souls.) The institution of religious toleration, which is relatively recent, is a way of not taking religion quite that seriously. In religiously tolerant societies, we take a meta-value, tolerance, more seriously than the particular religion to which we belong.
I think people who believe in religious tolerance may see religion as a kind of personal identity or choice that should be respected. Not a statement of fact that can be challenged. “You can’t expect her to believe in the Virgin Birth, she’s Jewish!” is a normal thing to think, but “You can’t expect him to believe that an asteroid wiped out the dinosaurs, he’s from California!” isn’t. Dinosaurs are fair game for argument; religious identity, not so much.
Honestly, on topics where “opinions” are really more about personal identity and taste than evaluations of the facts, the religious toleration model might make more sense than the spirited debate model. Religious toleration tacitly admits that most people are not really capable of staying at a factual level when we talk about religion. I wonder what else might be better relegated to the realm of toleration instead of debate.
What else causes people to depart from the factual level?
Yep, I had that in mind. And I have noticed a trend (both in person and in the psych literature) to treat political orientation as more of an identity than as a set of ideas that may be true or false. Studies that show liberals and conservatives have different core moral values, different patterns of brain activity, etc. That’s a model of politics as a kind of overall temperament. Maybe in the future we’ll be as tolerant of political differences as we sometimes are of religious or sexual orientation differences—and it will come to be taboo to investigate or challenge them. I’m still on the fence as to whether I like that or not.
That doesn’t sound to me like a very viable arrangement. People control the direction of the country on the basis of their political beliefs (their religious beliefs too, to an extent.) I think that would make them top priorities for making them things that are rationally discussed. It might be difficult to train people to discuss them rationally, but the payoff is high.
If political persuasion heads the way of religious tolerance, I sincerely hope it is after we find a better model for running nations than politics.
Not in China they don’t. And this last decade, the Chinese government seems to have been doing a better job than, say, the government of Greece.
The country is doing better, but many of the government’s policies are comparably unsustainable. In theory you might be able to restrict all political decisions to a class who actually know how to work things out in a rational manner and run things properly, but I would not regard China as a good example of this.
Don’t forget that some people in China are still controlling the country on the basis of their political beliefs, the majority of the country simply has no power and thus little incentive to be politically active.
Maybe you’re right that (1) religious and political beliefs are about identity more than truth and that (2) this explains why evangelism is frowned upon. But could strategy be another explanation for the taboo on proselytism?
It would seem that if people are annoyed by badgering (no matter the content or intent), then badgering people to give up animal products or accept Christ would be counterproductive. Thus non-evangelism would end up more effective in such a society.
Thought experiment:
You, a white male, step through a magic portal and find yourself in a place that seems very similar to America in 1850. You are “recognized” as the missing son of a wealthy slaveowner; you have the same name, and a photograph of that son looks exactly like you did a few years ago. Your “family” appears to believe that your odd behavior and loss of memory must be due to an injury that you sustained during your absence.
Having familiarized yourself with your new identity and environment, you come to the distressing conclusion that your 21st century education taught you relatively little technical information of practical use to someone in 1850. On the bright side, your father seems very willing to spend money on your behalf, and you even have quite a lot in your own name that you could rely on if you happen to end up alienating the rest of your family somehow… by, say, becoming an outspoken abolitionist. It’s obvious to you that slavery is a great moral wrong, but what should you do about it? And because you’re not actually in your own country’s past, you can’t count on a civil war deciding the issue in your favor fifteen years from now—and fifteen years is a long time to wait for freedom...
For some reason, I actually like it when Jehovah’s Witnesses come to my door. I find it a very warm and fuzzy experience. I know a little bit about the Bible (Catholic school), so I can communicate with them, and I enjoy having people read to me. I am very upfront that I am willing to listen, but very unlikely to be receptive.
I think “taboo” is a bit of a strong word. If most people don’t go around trying to persuade others to change their minds, it’s not because they learned a social norm against that, but rather that they learned through trial and error that 1) having someone try to change your mind is annoying, 2) when they try to change someone’s mind, that person is likely to be annoyed and less friendly, and 3) that person is also likely to counter-argue and may push you to re-examine your beliefs, which is uncomfortable.
From a behaviorist point of view, it’s a behavior that gets a lot of punishment and no reward, so it’s bound to disappear.
Contrast that with something like saying bad things of somebody behind his back, which doesn’t get immediate negative feedback, and is kept down mostly by social norms.
It may be a behavior learned by feedback, but that doesn’t mean that there aren’t strong social norms against it. When I mentioned to a friend my intention to write this article, the first thing it made her think of was a comedian who said that, as a strong Christian, one of the most discomforting things he could hear someone say was “I’d like to talk to you about Jesus.” Similarly, consider all the people who take dietary restrictions on themselves for moral rather than health reasons, but are uncomfortable seeing people try to convince others of that same position. These people have internalized a value that causes them to react negatively to others confronting people on closely personal beliefs.
I can think of a special case where this taboo doesn’t hold (or has very different dynamics), which is worth studying for comparison: courtship. When courting someone (which in practice means a man to a woman), you’re basically trying to persuade someone to make decisions she wouldn’t otherwise make, and which will benefit you. Yet this kind of persuasion is in some sense expected.
In NYC, I asked a PUA what makes me, approaching a woman with romantic intent, any different from a panhandler? His answer was (paraphrasing) “because that’s the role they expect out of men”.
Different kind of persuasion.
PUA is more like salesmanship, or professional self-promotion. People don’t object to that much if it’s done skillfully. This post is more about the kind of persuasion that tells people they are incorrect, about things like religious beliefs. People tend to be more offended by that.
People don’t object to skillful persuaders of the other kind, either. See, when you can limit yourself to the set of “happy customers”, both kinds look the same. But that doesn’t answer the question about the basis for why one kind is accepted and one kind isn’t.
If beliefs have consequences, and persuasion is effective, then, with the benefit of hindsight, most people would be grateful for most episodes of persuasion. For example, suppose I want to visit my cousins on Mars for a few decades, and I believe that the regular space shuttle carries a 10% risk of death and the mass driver carries a 20% risk of death. If you successfully convince me that I’ve got it backward, and that really it’s the mass driver that’s safer, then even if I was really annoyed for a few hours, I’ll probably thank you afterward for potentially saving my life.
Note that there are three main ways persuasion could fail: an argument has little evidence to back it up, you are bad at arguing, or I am bad at listening. If the argument has little evidence to back it up, then the belief, so far as we can tell, does not have consequences, and you should leave me alone. If you are bad at arguing, then you need to go practice with friends and leave me alone. If I am bad at listening, then I need to practice with friends, and unless the argument is about something both urgent and important, the best thing you can do for me is be my friend and help me practice being convinced on topics that I am less emotionally invested in.
But remember that anyone with a sufficiently high vested interest in convincing others of an idea would feel free to do so. The example you gave doesn’t sound much like something people would be hesitant to convince you of today; it’s easy to envision a friend telling you that it’s safer to take a trip by plane. The scenario looks much different if you simply envision it as an increase in the exchange of advice rather than an increase in confrontations of what we currently consider identity politics.
I guess the reason why I chose an example that sounded like an exchange of advice is to point out that even if you had a huge chunk of your identity wrapped up in your belief that the shuttle was safer, you would still be glad, in hindsight, that I confronted you, no matter how uncomfortable the confrontation was, because knowing the truth has set you free.
Where individual beliefs most likely do not have consequences (theology, national politics, parenting styles, sports-team-affiliation, etc.), there should still be a norm against unwanted confrontation.
All of those seem to be things that do have significant consequences though, with the possible exception of sports team affiliation.
Admittedly some questions of theology seem almost completely inconsequential (what does it matter if Jesus is consubstantial with God?) but others would be matters of extreme importance if true. Anything with a bearing on how to achieve a desired afterlife, for example. They only seem inconsequential if you haven’t internalized the idea that they apply to anything real.
My subjective impression is that most moderately religious people in industrialized countries haven’t. Otherwise, when relatives drop out of the faith, you would expect to see them get daily evangelical phone calls, rather than frosty silence.
Likewise parenting and politics—there are 10 partisan hacks who have trouble making friends with people of the opposite party for every 1 activist who actually leaves her county to do some electioneering. You hear a lot about parents who don’t want their kid associating with what they see as the children of unduly (lax / anal-retentive) parents, and these people might urgently defend their views at, e.g., a dinner party, but you rarely hear of campaigns where a parent goes around trying to convince all her closest friends (let alone the whole community) that X parenting style ruins kids’ lives. Hell, people usually don’t even do that when they think mercury in vaccines causes autism.
People believe that they believe that parenting, politics, and religion have consequences, but they don’t actually believe it. That’s my opinion, anyway.
By the way, this comment has inspired me to make a top-level post on the distinction between belief in belief and internalization.
That’s more a case of people saving their own kids before saving their neighbors’. If it’s sufficiently hard to save oneself, people won’t always get to the “save one’s neighbor” part.
That makes plenty of sense, Eugine_Nier, but the premise of this whole little exchange (admittedly, several layers up in the comment thread) was that at least some people do care enough to try to save their neighbors, and only refrain because of social norms against being annoyingly evangelical.
In particular violating that social norm would make it harder for them to save themselves.
That sounds like a factual belief rather than an opinion.
And I think you’re right that most people haven’t internalized a sense of the consequences of their beliefs, although they may consciously recognize that those beliefs have consequences. This isn’t surprising; people have a pretty general weakness at internalizing beliefs that pertain to things they can’t observe up close on a regular basis.
I think there’s probably a salient difference, though, between things you only believe you believe and things that you believe but haven’t truly internalized. I really do believe, for instance, that more than a billion people in this world suffer from starvation. I can confidently make predictions contingent on it being true. But if I had really internalized that belief, it would have a significantly greater bearing on my actions than it does.
Difference?
Opinions are subjective, and thus can’t be confirmed or denied as matters of fact. Perhaps people will sometimes try to employ “in my opinion” as a fully general defense against having their statements disputed, but some beliefs are opinions and some are not.
Let’s unpack this. When some statement is expressed “as an opinion”, does the statement have any meaning? If it doesn’t have any meaning, that’s a serious problem. If it does, can we inquire about its correctness? If we can’t, that’s rather surprising; give an example of when that happens.
The statement does have meaning, but it’s subjective to the person expressing it. For instance, I might say that “In my opinion, Cowboy Bebop is the greatest animated series ever made.” It has factual implications; I may predict that I will enjoy watching Cowboy Bebop more than any other animated series, or notice more artistic choices that I consider to be well done. But I will not be able to predict that other people will enjoy Cowboy Bebop more than other series, or have similarly positive assessments of its artistic merit. I could make those predictions for anyone I knew to have the same preferences and values as I do, and I can provide arguments in favor of those preferences and values, but I can’t provide evidence for them.
A factual claim can well be limited by one’s inability to communicate its truth to others, but that doesn’t make the claim any less about the world; it just indicates a certain technical difficulty in managing it. Furthermore, if the claim is about your emotions, as you suggest with your example, and you set out to figure out a way of communicating or re-examining it (like with any other factual claim), then you can find creative ways of doing so, such as taking measurements of brain activity in the relevant contexts.
A statement of opinion can certainly be factual, in that it is objectively true that it is your opinion. If I say that I believe that kicking dogs is wrong, this can certainly be a true statement, but it’s a statement about me. You can’t go out into the world and measure the wrongness of kicking dogs.
If it were only Mass_Driver’s opinion that people do not internalize their beliefs on matters such as politics, parenting, or religion, you might be able to confirm that he believed it, but you would not expect to be able to test its truth by examining the behavior of other people. If the belief does carry the expectation that you could test its truth by examining the behavior of other people, then it’s not really an opinion.
This is exactly what I’m going to do. And the world will be filled with mass-produced goodness-of-not-kicking-dogs. But you won’t be there to see its moral hollowness, because you’re made out of atoms.
Then it can well be incorrect, for example because of a misremembered detail, a biased account, or an intentional lie. My previous comment describes a possible way of getting a second account of its correctness other than through your own words. Recall that what we started with was your statement, which this discussion seems to clearly disarm:
A statement of opinion can be a lie, its truth value is simply only observable as an effect on the person making it.
My original statement was imprecise, but I’m confused as to why you would take issue with the idea that there’s a distinction between statements that are and are not opinions.
Because statements that “can’t be confirmed or denied as matters of fact” are improper beliefs and shouldn’t be allowed to take up precious attention in one’s mind. What you call “opinions” either are such statements, and should be exorcised, or are not, in which case whether they are to be confirmed or denied as matters of fact is the main and only question to entertain about them, and the reason to keep them around, even if no further observations can help with knowing their status and all one can do is indulge in more efficient use of existing evidence.
Would it have helped if I had said that opinions are normative rather than positive?
Helped for what purpose? Have we made progress on the interpretation of your words where my arguments more easily apply? Do you see the problem with your statement now?
For normative statements, all the same points hold, but they’re more difficult to argue, and this position is less widely accepted. Let’s make sure we agree on the factual side first.
Which statement of mine are you asking if I see a problem with? My original description of what distinguishes opinions was imprecise, but I was confused by the idea that you thought such a description was necessary at all. I still see no problem with stating that Mass_Driver’s assertion did not qualify as an opinion.
Well, to me, persuasion is usually annoying because it is so blunt. Prolonged exposure to media that proclaim a neutral point of view (such as Wikipedia, where it is enforced by consensus) has seemingly “programmed” me so that biased arguments feel like attempts to rewire my brain.
When I try to convince someone, instead of stating my beliefs outright, I try to poke at logical gaps in my opponent’s, in the hope that this will get them to think critically.
Is it your experience that people don’t take offense at this, though, particularly in matters of identity-critical beliefs? It certainly isn’t mine.
I would change “another person” to “a non-evangelist,” because that sentence as is has some unfortunate implications.
It seems like it’s better to circumvent the taboo on persuasion than it is to remove it. You don’t want to be evangelized all of the time, so maybe a polite rationalist society would set aside Sunday as Persuasion Day, when people are encouraged to change others’ minds and have their own minds changed.
Or, perhaps we would have a situation like now, where if you want to argue you head to an internet forum, and if you want to have your mind changed you head to a good internet forum.
I haven’t yet encountered any evangelists criticizing other evangelists for not trying hard enough, if that’s what you mean.
If you mean that it implies that I’ve criticized evangelists for not trying hard enough, that’s entirely deliberate.
The first is what I meant; I imagine you must not attend gatherings of evangelists, then. There’s definitely strong encouragement to evangelize, and criticism of those who don’t (how well the criticism is hidden is just a matter of politeness).
And, actually, the criticism might be hiding in plain sight. The difference between “evangelize” and “persuade” is just whether or not you make a value judgment, once you’ve stripped out the implication that you’re persuading someone of the Christian gospel. Have you really never heard anyone criticize someone else for not trying to persuade a third party?