I have a question related to the “Not the same person” part, the answer to which is a crux for me.
Let’s suppose you are imagining a character who is experiencing some feeling. Can that character be feeling what it feels, while you feel something different? Can you be sad while your character is happy, or vice versa?
I find that I can’t. If I imagine someone happy, I feel what I imagine they are feeling—this is the appeal of daydreams. If I imagine someone angry during an argument, I myself feel that anger. There is no other person in my mind having a separate feeling. I don’t think I have the hardware to feel two people’s worth of feelings at once. I think what’s happening is that my neural hardware is being hijacked to run a simulation of a character; while this is happening, I enter into the mental state of that character, and in important respects my other thoughts and feelings on my own behalf stop.
So for me, I think my mental powers are not sufficient to create a moral patient separate from myself. I can set my mind to simulating what someone different from real-me would be like, and have the thoughts and feelings of that character follow different paths than my own would. But I understand “having a conversation between myself and an imagined character”, which you treat as evidence that there are two people involved, as a kind of task-switching, processor-sharing arrangement: there are bottlenecks in my brain that prevent me from running two people at once, and the closest I can come is thinking as one conversation partner, then the next, then back to the first. I can’t, for example, have one conversation partner say something while the other isn’t paying attention because they’re thinking of what to say next, catches only half of what was said, and so responds inappropriately—which I hear is not uncommon in real conversations between two people. And if the imagined conversation involves a pause that, between two real people, would involve two internal mental monologues, I can’t have those two monologues at once. I fully inhabit each simulation/imagined character as it is speaking, and only one at a time as it is thinking.
If this is true for you as well, then in a morally relevant respect I would say that you and whatever characters you create are only one person. If you create a character who is suffering, and inhabit that character mentally such that you are suffering, that’s bad because you are suffering, but it’s not 2x bad because you and your character are both suffering—in that moment of suffering, you and your character are one person, not two.
I can imagine a future AI with the ability to create and run multiple independent human-level simulations of minds, watch them interact and learn from that interaction, and perhaps go off and do something in the world while those simulations persist without its being aware of their experiences any more. Such an AI, I would say, ought not to create entities that have bad lives. And if you can honestly say that your brain is different from mine in this way—that you can imagine a character and have the mental bandwidth to run it fully independently of yourself, with its own feelings that you know about somehow other than by having it hijack the feeling-bits of your brain and pause whatever you were feeling before (which is how I experience the feelings of characters I imagine), such that you could wander off and do other things with your life while that character suffers horribly, with no ill effects to you except the feeling that you’d done something wrong—then yeah, don’t do that. If you could do it for more than one imagined character at a time, that’s worse; definitely don’t.
But if you’re like me, I think “you imagined a character and that character suffered” is functionally/morally equivalent to “you imagined a character and one person (call it you or your character, it doesn’t matter) suffered”—which is bad in principle, unless there’s some greater good to be had from it, but it’s not worse than you suffering for some other reason.
Having written the above, I went away and came back with a clearer way to express it: for suffering-related (or positive-experience-related) calculations, one person = one stream of conscious experience, and two people = two streams of conscious experience. My brain can only run one stream of conscious experience at a time, so I’m not worried that by imagining characters I’ve created a bunch of people. But I would worry that something with different hardware than mine could.