More generally (since I have no idea why we’re using torture in this example and I find it distasteful to keep doing so), I’m pretty confident saying that any process that computes all and only the states of John Smith’s brain during some experience X involves John experiencing X, regardless of how those states are represented and stored, and regardless of how the computation is performed.
I picked the torture example because I’m not sure what “John experiences X” really means once you taboo all the confusing terms about personal identity and consciousness—but I think the moral question is a “territory” question, not a “map” question.
The “all states and only the states of the brain” part confuses me. Suppose we do time-slicing: the computer takes turns simulating John and simulating Richard, so it is no longer computing only John’s states. That can’t be a moral distinction. I suspect it will take some very careful phrasing to find a definition of “all states and only those states” that isn’t obviously wrong.
Well, as above, I’m pretty confident that re-computing the table causes John to experience X (in addition to causing there to have been a John to experience it). I’m not confident what I want to say about the moral implications of identical recomputations of an event that has a certain moral character. My intuitions conflict.
Yah. After thinking about this for a couple of days the only firm conclusion I have is that moral intuition doesn’t work in these cases. I have a slight worry that thinking too hard about these sorts of hypotheticals will damage my moral intuition for the real-world cases—but I don’t think this is anything more than a baby basilisk at most.
I picked the torture example because I’m not sure what “John experiences X” really means once you taboo all the confusing terms about personal identity and consciousness—but I think the moral question is a “territory” question, not a “map” question.
I don’t quite understand this. If a given event is not an example of John experiencing torture, then how is the moral status of John experiencing torture relevant?
The “all states and only the states of the brain” part confuses me.
I wasn’t trying to argue that if this condition is not met, then there is no moral difficulty; I was just trying to narrow my initial claim to one I could make with confidence.
If I remove the “and only” clause, I open myself up to a wide range of rabbit holes that confuse my intuitions, such as “we generate the GLUT (giant lookup table) of all possible future experiences John might have, including both torture and a wildly wonderful life”.
the only firm conclusion I have is that moral intuition doesn’t work in these cases.
IME moral intuitions do work in these cases, but they conflict, so it becomes necessary to think carefully about tradeoffs and boundary conditions to come up with a more precise and consistent formulation of those intuitions. That said, changing the intuitions themselves is certainly simpler, but has obvious difficulties.