I’m not sure I’d characterize that as a “bug”, more a feature we need to be aware of and take into account.
If you weren’t moved by fictional scenarios, you wouldn’t be able to empathize with people in those scenarios—including your future self! We mostly predict other people’s actions by using our own brain as a black box, imagining ourselves in their situation and how we would react, so there goes any situation featuring other humans. And we couldn’t daydream or enjoy fiction, either.
Would it be useful to turn it off? Maaaybe, but as long as you don’t start taking hypothetical people’s wishes into account, and stop reading stuff that triggers you, you’re fine—I bet the costs of misuse would be higher than the marginal benefits.
I don’t think that empathising with fictional characters should be turned off. I just think that properly calibrated emotions should take all factors into account, with properly relevant weightings. I notice that my emotions do not seem to be taking the ‘reality’ factor into account, and I therefore conclude that my emotions are poorly calibrated.
My future self would be a potentially real scenario, and thus would deserve all the emotional investment appropriate for a situation that may well come to pass. (He also gets the emotional investment for being me, which is quite large.)
I’m not sure whether I should be feeling more sympathy for strangers, or less sympathy for fictional people.
So … are you saying that they’re poorly calibrated, but that’s fine and nothing to worry about as long as we don’t forget it and start giving imaginary people moral weight? Because if so, I agree with you on this.
More or less. I’m also saying that it might be nice if they were better calibrated. It’s not urgent or particularly important, it’s just something about myself that I noticed at the start of this discussion that I hadn’t noticed before.
Fair enough. Tapping out, since this seems to have resolved itself.