If you’re backing a cause that doesn’t inspire the action you think it deserves, and you find yourself twisting the truth a bit for dramatic effect, how strong is that as evidence that your cause is less worthy than you think it is? Can you give examples where you would go ahead and twist the truth anyway?
Ideally, I would estimate the negative effects: how many people would later learn I lied and abandon my cause, how the cause’s enemies might use the fact that I lied against it, and the reputational harm to my other causes and to my allies.
To stop me from lying, a moral theory that says lying is wrong for me as a person would have to give that wrongness greater weight than a non-negligible improvement in the prospects of a cause that could drastically affect millions of people for the better, e.g. Abolition.
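To make that weighing concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a purely illustrative assumption, not an estimate of anything real:

    # Toy expected-value comparison; all figures are hypothetical assumptions.
    p_success_honest = 0.30      # assumed chance the cause succeeds if I stay honest
    p_success_lying  = 0.35      # assumed chance if the lie works (the "non-negligible" boost)
    people_helped    = 1_000_000 # assumed number drastically helped if the cause succeeds

    p_exposed        = 0.50      # assumed chance the lie is eventually exposed
    cost_of_exposure = 40_000    # assumed harm, in the same units, from lost supporters,
                                 # enemy propaganda, and damage to allied causes

    benefit = (p_success_lying - p_success_honest) * people_helped  # 50,000
    cost    = p_exposed * cost_of_exposure                          # 20,000
    print("lie pays off" if benefit > cost else "honesty wins")

On these invented numbers the lie comes out ahead, which is exactly why a rule against lying would need to carry a very heavy weight of its own to override the calculation.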
Not to mention the damage the people who believe your lies might do by acting on them.
I had in mind lies that were intended to be acted on, to further my cause.
For instance, suppose my cause is to prevent the growth of a hole in the ozone layer. I tell people they must stop using CFCs. Actually, it would be enough to limit CFC use below some sustainable level. But not everyone is going to listen to me, and I need to offset the non-listeners’ CFC use with even lower usage from my followers. So I lie to my followers and tell them that everyone in the world must stop using CFCs absolutely for the ozone hole to mend. That’s a lie I want them to act on.
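The offset step here is simple arithmetic; a minimal sketch of it, with every figure an invented assumption:

    # Toy CFC-offset arithmetic; every number is an invented assumption.
    sustainable_total = 100.0  # assumed world CFC budget the ozone layer can tolerate
    population        = 100    # assumed total CFC users
    followers         = 40     # assumed users who will actually listen to me

    fair_share = sustainable_total / population  # 1.00: the honest per-person limit

    # The 60 non-listeners keep using CFCs at their old rate instead:
    non_listener_use = (population - followers) * 1.5  # 90.0 units

    # What each follower may use for the world to stay within budget:
    follower_quota = (sustainable_total - non_listener_use) / followers  # 0.25

    print(f"honest limit: {fair_share:.2f}, quota my followers need: {follower_quota:.2f}")

The truthful quota is so close to zero that the simpler, more motivating lie ("everyone must stop using CFCs absolutely") is what I actually preach.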
There are other reasons one might lie in the service of a cause, where my logic doesn’t hold. For instance, suppose my cause is to win a war. I need to convince my people to keep fighting and not accept the enemy’s armistice terms. So I lie to them, saying the enemy is building a magical doomsday weapon that can strike our people from afar, and that only taking over the enemy’s lands can prevent its construction. After we win the war, my people torture and kill many of the enemy population because they refuse to reveal the location of the doomsday weapon I made up.
In this case, I didn’t want people to actually act on the lie; I just wanted its side effect of making them fight in my war.
There are other cases. For example, the main supporters of my cause happen to come from the Purple Tribe, whose religion says the germ theory of disease is false. I know they’re wrong, but to gain their support for my cause, I must lie and publicly say they are right. Then they help me win my cause, while I help them stop effective disease prevention measures—a successful alliance.
Unfortunately, if your lie is successful, people will act on it anyway.
Well, that raises the question of just how serious a threat the “hole in the ozone layer” really was, and how much, if anything, it had to do with CFCs.
Suppose, for the sake of the example, that it was a huge threat, caused purely by CFCs, but that limiting CFC use (instead of stopping it entirely) would have been enough to resolve the issue.