Further, of course, we know that lightning strikes are not controlled by intelligent beings, while terrorist strikes are.
If there’s a major multi-fatality lightning strike, it’s unlikely to encourage weather phenomena to engage in copycat attacks. Nor will all sorts of counter-lightning measures dissuade clouds from generating static electricity and instead dumping more rain or something.
Right. I think this is one of the key issues. When things are ‘natural’ or ‘random’ (in where, when, and how often they happen), or otherwise uncontrollable, humans are much keener to accept them. When agency comes into play, it changes the perspective completely: “how could we have changed culture/society/national policies/our surveillance system/educational system/messaging/nudges/pick your favorite human-controllable variable” to have prevented this, or to prevent it in the future? It’s the very idea that we could influence it, and/or that it’s perpetrated by ‘one of us’, that makes it so salient and disturbing. From a consequentialist perspective, it’s definitely not rational, and we shouldn’t (ideally) let it affect our allocation of resources to combat threats.
Is there a particular bias that covers “caring about something more, however irrelevant/not dangerous, just because a perceived intelligent agent was responsible?”
Well, there are definitely forms that are irrational, but there’s also the perfectly rational factor of having to account for feedback loops.
We don’t have to consider that shifting resources from lightning death prevention to terrorism prevention will increase the base rate of lightning strikes; we do have to consider that a shift in the other direction can increase (or perhaps decrease) the base rate of terrorist activity. It is thus inherently hard to compare the expected effect of a dollar of lightning strike prevention against a dollar of terrorism prevention, over and above the uncertainties involved in comparing the expected effect of (say) a dollar of lightning strike prevention against a dollar of large asteroid collision protection.
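To make the asymmetry concrete, here is a toy sketch (all numbers are made up for illustration): lightning deaths respond only to how much we spend on lightning prevention, while the terrorism base rate itself can shift, in either direction, in response to how much we spend on terrorism prevention.

```python
# Toy model (hypothetical numbers): comparing a dollar of lightning prevention
# against a dollar of terrorism prevention when one of the hazards has a
# feedback loop and the other does not.

def expected_lightning_deaths(spend: float) -> float:
    base_rate = 30.0                        # assumed annual deaths with no spending
    # Prevention reduces deaths with diminishing returns; the base rate is fixed,
    # since clouds don't react to our budget decisions.
    return base_rate / (1.0 + 0.001 * spend)

def expected_terrorism_deaths(spend: float, feedback_per_dollar: float) -> float:
    base_rate = 30.0                        # assumed annual deaths with no spending
    # Feedback: spending may deter attacks (negative feedback) or provoke /
    # advertise them (positive feedback); the sign and size are uncertain.
    adjusted_rate = base_rate * max(0.0, 1.0 + feedback_per_dollar * spend)
    return adjusted_rate / (1.0 + 0.001 * spend)

if __name__ == "__main__":
    for spend in (0, 1000, 5000):
        print(
            f"spend={spend}: "
            f"lightning={expected_lightning_deaths(spend):.2f}, "
            f"terrorism (deterrence)={expected_terrorism_deaths(spend, -0.00005):.2f}, "
            f"terrorism (provocation)={expected_terrorism_deaths(spend, +0.00005):.2f}"
        )
```

The point of the sketch is only that the terrorism estimate depends on an extra, hard-to-know parameter (the feedback term), which is exactly what makes the dollar-for-dollar comparison harder than comparing two purely natural hazards.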