I agree with you on the complexity of value. However, perhaps we are imagining the ideal way of aggregating all those complex values differently. I absolutely agree that the simple models I keep proposing for individual values are spherical cows, and ignore a lot of nuance. I just don’t see things working radically differently when the nuance is added in, and the values aggregated.
That sounds like a really complex discussion though, and I don’t think either of us is likely to convince the other without a novel’s worth of text. However, perhaps I can convince you that you already are suppressing some impulses, and that this isn’t always disastrous. (Though it certainly can be, if you choose the wrong ones.)
“…there aren’t large benefits to be gained by discarding some emotions and values.”
Isn’t that what akrasia is?
If I find that part of me values one marshmallow now at the expense of two later, and I don’t endorse this upon reflection, wouldn’t it make sense to try to decrease such impulses? Removing them entirely may be unnecessarily extreme, but perhaps that’s what some nootropics do.
Similarly, if I found that I derived sadistic pleasure from something, I wouldn’t endorse that outside of well-defined S&M. If I had an alcoholism problem, I’d similarly dislike my desire for alcohol. And I suspect that strongly associating cigarettes with disgust helps counteract the impulse to smoke.
If I understand correctly, some Buddhists try to eliminate suffering by eliminating their desires. I find this existentially terrifying. However, I think that boosting and suppressing these sorts of impulses is precisely what psychologists call conditioning. A world where no one refines or updates their natural impulses would be just as unsettling as the Buddhist suppression of all values.
So, even if you don’t agree that there are cases where we should suppress certain pro-social emotions, do you agree with my characterization of antisocial emotions and grey area impulses like akrasia?
(I’m using values, impulses, emotions, etc fairly interchangeably here. If what I’m saying isn’t clear, let me know and I can try to dig into the distinctions.)
I think I understand your point better now, and I agree with it.
My conscious, deliberative, speaking self definitely wants to be rid of akrasia and to reduce time discounting. If I could self-modify to remove akrasia, I definitely would. But I don’t want to get rid of emotional empathy, or filial love, or the love of cats that makes me sometimes feed strays. I wouldn’t remove those even if I could. This isn’t something I derive from or defend by higher principles; it’s just how I am.
I have other emotions I would reduce or even remove, given the chance, like anger and jealousy. These can be moral emotions no less than empathy: righteous anger serves justice and fairness. It stands to reason that some people might feel this way about any other emotion or desire, including empathy. When these things aren’t part of the values their conscious self identifies with, they want to reduce or discard them.
And since I can be verbally, rationally convinced to want things, I can be convinced to want to discard emotions I previously didn’t.
It’s a good thing that we’re very bad at actually changing our emotional makeup. The evolution of values over time can lead to some scary attractor states, and I wouldn’t want to permanently discard one feeling during a brief period of obsession with something else! Because actual changes take a lot of time and effort, we usually only go through with the ones we’re really resolved about, which is a useful filter to have. (Also, how can you want to develop an emotion you’ve never had? Do you just end up with very few emotions?)
Agreed. I’ll add two things in support of your point, though.
First, the Milgram experiment suggests that even seemingly antisocial impulses like stubbornness can be extremely valuable. Sticking to core values rather than deferring to authority likely led more people to resist the Nazis.
Also, I didn’t bring it up earlier because it undermines my point, but apparently sociopaths have smaller amygdalae than average, while altruistic kidney donors have larger ones, and empathy is linked to that region of the brain. So we probably could reduce or remove emotional empathy if we really wanted to. However, I’m not at all inclined to inflict brain damage on myself, even if it could somehow be targeted precisely enough not to interfere with cognitive empathy or anything else.
So, more generally, even reversible modification worries me, and the idea of permanently changing our values scares the shit out of me. For humanity as a whole (though perhaps not for small groups of individuals pursuing a specific end), I don’t endorse most modifications. I would much rather we retain a desire we approve of but which the laws of physics prevent us from satisfying than remove that value and be fulfilled.