If the victim is someone whom I have never met or interacted with, and am unlikely to meet or interact with, I shrug and think ‘so what? so many people get mugged every day, why should I worry about this one in particular?’
Am I the only person here who is in any way moved by accounts of specific victims? Nonfiction writers can switch you to near-mode too, or at least they do for me.
If the account is detailed enough, it does move me, but not much more than an otherwise identical account that I know is fictional.
Phew! I was getting worried there.
OK, so you care about detailed accounts. Doesn’t that suggest that if you, y’know, knew more details about all those people being mugged, you would care more? So it’s just ignorance that leads you to discount their suffering?
Fictional accounts … well, people never have been great at distinguishing between imagination and reality, which, if you think about it, is actually really useful.
No, I mean that more details will switch my System 1 into near mode. My System 2 thinks that’s a bug, not a feature.
Really? My System 2 thinks System 2 is annoyingly incapable of seeing details, and System 1 is annoyingly incapable of seeing the big picture, and wants to use System 1 as a sort of zoom function to approximate something less broken.
I guess I’m unusual in this regard?
Like army1987, I can be moved by accounts of specific victims, whether they are fictional or not. There is a bug here, and the bug is this: that I am moved the same amount by an otherwise identical fictional or nonfictional account, where the nonfictional account contains no one with whom I have ever interacted.
That is, simply knowing that an account is non-fictional doesn’t affect my emotional reaction, one way or another. (This doesn’t mean I am entirely without sympathy for people I have never met—it simply means that I have equivalent sympathy for fictional characters). This is a bug; ideally, my emotional reaction should take into account such an important detail as whether or not something really happened. After all, what detail could be more important?
It’s not a bug, it’s a feature (in some contexts).
Suppose you were playing two games of online chess against an anonymous opponent. You barely lose the first one. Now you’re feeling the spirit of competition, your blood boiling for revenge! Should you force yourself to relinquish the thrill of the contest, because “it doesn’t really matter”? That would be no fun! :-(
If you’re reading a work of fiction, knowing it is fiction, why are you doing so? Because emotional investment is fun? Why would you then sabotage your enjoyment by trying to downsize your emotional investment, since “it’s not real”? Also no fun! :-(
If the flawed heuristic you are employing in a certain context works in your favor in that context, switching it off would be dumb (although being vaguely aware of it would not be).
Oh, it does matter. There’s a real opponent there. That’s reality.
You make a good point.
I’m not sure I’d characterize that as a “bug”, more a feature we need to be aware of and take into account.
If you weren’t moved by fictional scenarios, you wouldn’t be able to empathize with people in those scenarios—including your future self! We mostly predict other people’s actions by using our own brain as a black box, imagining ourselves in their situation and how we would react, so there goes any situation featuring other humans. And we couldn’t daydream or enjoy fiction, either.
Would it be useful to turn it off? Maaaybe, but as long as you don’t start taking hypothetical people’s wishes into account, and you stop reading stuff that triggers you, you’re fine—I bet the costs of misuse would be higher than the marginal benefits.
I don’t think that empathising with fictional characters should be turned off. I just think that properly calibrated emotions should take all factors into account, with properly relevant weightings. I notice that my emotions do not seem to be taking the ‘reality’ factor into account, and I therefore conclude that my emotions are poorly calibrated.
My future self would be a potentially real scenario, and thus would deserve all the emotional investment appropriate for a situation that may well come to pass. (He also gets the emotional investment for being me, which is quite large).
I’m not sure whether I should be feeling more sympathy for strangers, or less sympathy for fictional people.
So … are you saying that they’re poorly calibrated, but that’s fine and nothing to worry about as long as we don’t forget it and start giving imaginary people moral weight? Because if so, I agree with you on this.
More or less. I’m also saying that it might be nice if they were better calibrated. It’s not urgent or particularly important, it’s just something about myself that I noticed at the start of this discussion that I hadn’t noticed before.
Fair enough. Tapping out, since this seems to have resolved itself.