Yes. I agree with that. The problem is that the same argument goes through for D—no real computationally-limited observer can distinguish an encryption of a happy brain from the encryption of a brain in pain. But they are really different: with high probability there’s no possible encryption key under which we have a happy brain. (Edited original to clarify this.)
And to make it worse, there’s a continuum between C and D as we shrink the size of the key; computationally-limited observers can gradually tell that it’s a brain-in-pain.
And there’s a continuum from D to E as we increase the size of the key—a one-time pad is basically a key the size of the data. The bigger the key, the more possible brains an encrypted data set maps onto, and at some point it becomes quite likely that a happy brain is also contained within the possible brains.
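The one-time-pad point can be sketched concretely. Below is a minimal Python illustration (the plaintext strings are placeholders, not anything from the thought experiment): when the key is as long as the data, for any ciphertext and any equally long alternative plaintext there exists a key that decrypts to that alternative, so the ciphertext alone pins down nothing.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Encrypt a stand-in for the brain data with a one-time pad
# (a random key exactly as long as the data).
plaintext = b"brain in pain"
key = os.urandom(len(plaintext))
ciphertext = xor(plaintext, key)

# For ANY equally long alternative plaintext, some key decrypts
# the same ciphertext to it:
alternative = b"a happy brain"
alt_key = xor(ciphertext, alternative)
assert xor(ciphertext, alt_key) == alternative
```

So with a full-size key, the "possible brains" the ciphertext maps onto really do include every string of that length, happy brains included.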
But anyhow, I’d start caring less as early as B (for Nozick’s Experience Machine reasons); since my caring is on a continuum, it doesn’t even raise any edge-case issues that reality is on a continuum as well.
> And to make it worse, there’s a continuum between C and D as we shrink the size of the key; computationally-limited observers can gradually tell that it’s a brain-in-pain.
So it is a brain in pain. The complexity of the key just hides the fact.
Except that “it” refers to the key together with the “random” bits: not just the random bits, and not just the key. Both the bits and the key contain information about the mind. Deleting either the pseudorandom bits or the key deletes the mind.
If you only delete the key, then there is a continuum of how much you’ve deleted the mind, as a function of how feasible it is to recover the key. How much information was lost? How easy is it to recover? As the key becomes more complex, more and more of the information that makes it a mind rather than a random computation is in the key.
> But they are really different: with high probability there’s no possible encryption key under which we have a happy brain.
In the case where only one key in the key space leads to a mind, we haven’t actually lost any information about the mind by deleting the key: a search through the space of all keys will eventually find the correct one.
I think the moral dimension lies in whatever pins a mind down within the space of possible computations.