I’ve seen this often in problems like climate change or animal exploitation:
“The solution is up to others. The powerful. The governments. The policy makers.”
In this way people frequently delegate their share of responsibility to more powerful or visible entities.
To illustrate with a hypothetical example: if we suddenly found out that mobile phone frequencies destroy the planet, instead of giving them up, many people would say:
“My actions won’t make any difference. Instead it’s up to the government to ban cell phones. Why should I be the fool that starts sacrificing, while everybody else keeps enjoying cell phones?”
But the only reason the government needs to ban cell phones is that the world is full of irresponsible people who need to be coerced into doing the right thing!
Does this phenomenon have a name? Does anybody here know the underlying psychological mechanism? Is it a genuine blindness about the sea being made up of millions of small droplets? An excuse to avoid responsibility? Something else?
Beyond normal consequentialism (as discussed in other answers), there’s a game theory angle, where if you aren’t trying to model a norm into existence, it’s worthwhile to only follow the norm once it’s agreed violators will be punished.
See paulfchristiano’s post on Moral Public Goods, which argues that you will often get into situations where people would be in favor of a norm were that norm enforced, while not being in favor of the behavior the norm calls for when the norm is not enforced.
Everything has an opportunity cost. I’d claim that when impact is very small, it is almost always the case that the opportunity cost is not worthwhile. In general, one can have far more impact by focusing on one or two high-impact actions rather than spending the same aggregate time/effort on lots of little things.
Much more detail is in The Epsilon Fallacy; also see the comments on that post for some significant counterarguments.
(I’m definitely not claiming that the psychological mechanism by which people ignore small-impact actions is to think through all of this rationally. But I do think that people have basically-correct instincts in this regard, at least when political signalling is not involved.)
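The opportunity-cost point can be put in toy numbers (every figure below is made up purely for illustration, not taken from any source):

```python
# Toy comparison (all numbers made up): one focused high-impact action vs.
# spending the same aggregate effort on 100 small-impact habits.
EFFORT_BUDGET = 100          # arbitrary effort units

small_habit_impact = 0.001   # assumed impact per unit of effort on a small habit
big_action_impact = 1.0      # assumed impact of one action costing the whole budget

total_small = small_habit_impact * EFFORT_BUDGET   # ~0.1
total_big = big_action_impact                      # 1.0

# Under these assumed numbers, the focused action wins by roughly 10x.
assert total_big > total_small
```

The conclusion obviously depends entirely on the assumed ratios; the claim in the comment above is that for most real small-impact actions, the ratio looks something like this.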
Your example about cell phones is a prisoner's dilemma. For each individual participant, continuing to use the cell phone has more utility than being the only person who stops. At the same time, everyone would get higher utility if everyone chose to cooperate in the prisoner's dilemma and stop using their cell phones.
Having a government legislate that everyone picks cooperate in a prisoner's dilemma is one way to solve it.
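The dilemma structure can be sketched with a toy payoff table (the specific numbers are invented; only their ordering matters):

```python
# Illustrative prisoner's dilemma payoffs for the cell-phone example
# ("defect" = keep using the phone, "cooperate" = stop using it).
# payoff[(my_move, their_move)] = my utility; numbers are made up.
payoff = {
    ("cooperate", "cooperate"): 3,  # everyone stops: planet saved, some inconvenience
    ("cooperate", "defect"):    0,  # I stop alone: inconvenience, no planetary benefit
    ("defect",    "cooperate"): 5,  # I keep my phone while everyone else sacrifices
    ("defect",    "defect"):    1,  # nobody stops: planet degrades
}

def best_response(their_move):
    """Return the move that maximizes my payoff given the other's move."""
    return max(["cooperate", "defect"], key=lambda m: payoff[(m, their_move)])

# Defecting is dominant: it is my best response whatever the other player does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual cooperation beats mutual defection for everyone.
assert payoff[("cooperate", "cooperate")] > payoff[("defect", "defect")]
```

This is exactly the structure that government coercion changes: a legislated ban with penalties lowers the payoff of "defect" until cooperation becomes the best response.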
Even if a person wants to do something about a problem, it's often much more impactful to donate to an effective charity than to change personal behavior.
The recent Founders Pledge article on climate change illustrates that principle for climate change. Animal Charity Evaluators might not be the most trustworthy source, but judging from the numbers I see from EAs, the same principle seems true for that area as well.
I have an observation:
One can do both things: donate to an effective charity and change personal behavior, no?
One example I like is: vegan lifestyle vs. vegan activism.
Activism is a lot more impactful than becoming vegan oneself, by far, because of the number of people potentially reached, and because activism can make a dent in group behavior and culture. One could even theoretically participate in activism while not being vegan… and have more impact than a non-activist vegan!
BUT… then I pictured a scenario: all of humankind participating in vegan activism, claiming we should stop animal exploitation… while at the same time everybody eats meat. That's just a massive-scale bluff. Collective hypocrisy.
I think that example illustrates nicely the gap we need to bridge between large scale action and personal change. And this is why I believe it’s ideal to avoid comparisons between large scale actions and personal actions. I claim they can and should be simultaneous.
A key concept of effective altruism is that you don't ask "what would be effective if everybody did it" but instead focus on tractability, neglectedness and importance when choosing your own actions.
Apart from that, it's a possible future that everybody eats meat, but it's artificially grown meat for which no animal had to suffer. It's one of the approaches that the EAs I know in the field consider tractable.
There are good reasons for thinking nuclear power is part of the solution, in the short to medium term, but it’s a major exaggeration to call it the only solution.
And no, not everyone concerned about CC is anti-nuclear. NASA scientist James Hansen was one of the earliest scientists to warn about climate change, and he is pro-nuclear.
https://www.google.com/amp/s/amp.theguardian.com/environment/2015/dec/03/nuclear-power-paves-the-only-viable-path-forward-on-climate-change
So? Most people who are concerned about CC don’t want or need expert knowledge, they want politicians to listen to experts.
Maybe, but we are nowhere near “enough”.
"Plans For New Reactors Worldwide (Updated January 2020)
Nuclear power capacity worldwide is increasing steadily, with about 50 reactors under construction. Most reactors on order or planned are in the Asian region, though there are major plans for new units in Russia. Significant further capacity is being created by plant upgrading. Plant lifetime extension programmes are maintaining capacity, particularly in the USA. Today there are about 450 nuclear power reactors operating in 30 countries plus Taiwan, with a combined capacity of about 400 GWe. In 2018 these provided 2563 TWh, over 10% of the world's electricity. About 50 power reactors are currently being constructed in 15 countries (see Table below), notably China, India, Russia and the United Arab Emirates."
https://www.world-nuclear.org/information-library/current-and-future-generation/plans-for-new-reactors-worldwide.aspx
I don’t support the idea that ordinary people can have a good enough level of understanding of everything it takes to run a society. For that matter they can’t fix their own cars or bodies.
One can vote on goals and leave the implementation to experts. People trying to outthink experts tends to lead to nonsense like anti-vaxxing.
If you think the private sector is never slow, lumbering or inefficient, I have news for you.
10% is 10%, not nothing. Since renewables are a thing, there is no need for nuclear to be 100% of the solution.
vs
These are not contradictory. States are Soylent Green—they’re made of people! There is literally no person who has a good enough level of understanding of everything it takes to run a society. More importantly, societies aren’t “run”, they’re … I don’t know. “followed”? “co-dependently-evolved”? Societies pick (or at least tolerate) the “leaders” that exemplify the confusion in goals that the society has.
Experts have fairly narrow focus, and tend to be just as incorrect as the rest of us outside their field (and often, inside, for fields with heavy political/funding influence).
Possible explanations:
1) Many impacts are not just small, but effectively zero, or even slightly negative. Spending more effort/resources to do things that APPEAR good but actually don't matter is a net harm.
2) Some items have threshold or nonlinear impact such that it’s near-zero unless everybody (or at least more than are likely) does them. This gets to second-order arguments of “my example won’t influence the people who need to change”, but the argument does recurse well.
3) The world is, in fact, full of irresponsible people. Unfortunately, it’s mostly governed by those same people.
4) Reasons given for something don’t always match the actual causality. “It wouldn’t matter” is more socially defensible than “I value my comfort over the aggregate effect”.
5) Relative rather than absolute measures—“I’m a sucker” vs “the world is slightly better”.
6) The https://en.wikipedia.org/wiki/Bystander_effect may not be a real thing, but there is an element of social proof in the idea that if most people are doing something, it’s probably OK.
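Point 2's threshold dynamic can be sketched with a toy model (the threshold value and the step shape are assumptions, chosen only to make the nonlinearity visible):

```python
# Hypothetical threshold-impact model (point 2 above): collective benefit is
# essentially zero until participation crosses a critical fraction.
def collective_impact(participation, threshold=0.6):
    """Toy model: no benefit below the (made-up) threshold, full benefit above."""
    return 1.0 if participation >= threshold else 0.0

# One extra participant barely moves the needle below the threshold...
assert collective_impact(0.10) == collective_impact(0.11) == 0.0
# ...so the marginal impact of joining is zero unless you happen to be pivotal.
assert collective_impact(0.59) == 0.0 and collective_impact(0.60) == 1.0
```

Real-world impact curves are presumably smoother than a hard step, but any sufficiently steep S-curve produces the same conclusion: far from the threshold, an individual's marginal contribution rounds to zero.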
If my action has a zero or infinitesimal positive impact on the relevant problem, while a negative and non-infinitesimal impact on me, cost-benefit analysis concludes I should not do it. I think OP needs to do more work to justify why they think this is not so.
I didn’t claim that is not the case.
You seem to think that an altruist action that harms me but benefits the whole planet should have at least a certain amount X of positive impact on the planet… otherwise it’s not worth certain sacrifices. And to that, I say: Fair enough!
To give an absurd example: giving up civilized life and starting to live in the middle of the forest without any technology would be a silly, disproportionate, ineffective sacrifice to make in order to help fight climate change. It's a nonsensical plan. And I agree with you.
I think what I’m trying to figure out is… how can we maximize benefit to the planet?
Can we aim at a certain ratio of personal sacrifice / benefit to the planet?
Can we even measure the benefit? Does it make sense to take it into account?
Perhaps we should just make the maximum sacrifice we'd be willing to make, try to inspire others to do the same, and hope for the best?
What do you think?
Laziness, apathy, indifference, lack of self-responsibility, weakness, stupidity, selfishness, herd mentality?
Ultimately, the only person's behaviour you can change is your own. Either you choose to do better things or you don't. Lead by example if you care; otherwise you don't care enough to change.