Agreed. The really annoying part is that because, as you say:
Subjective experience is pretty much unique in that it is never observed by anyone other than the subject
It’s very difficult to point to evidence that subjective experience is just private by definition (as in, if it wasn’t uniquely yours it wouldn’t be subjective), rather than being private by virtue of having some special super-physical status that makes it impossible to share. The two theories predict the same experimental results in pretty much all cases.
I think that saying “subjective experience is private” can be rephrased as saying that “our ability to describe reality/the physical world is clearly incomplete”. Dualism happens when folks use the Typical Mind Fallacy to convert this fact about how we describe reality into an actual split between “physical stuff” and “the non-physical” that is held to be always true, regardless of the observer.
Ah, now see there I think I disagree a little. I think saying “subjective experience is private” is just expressing an analytic truth. We define subjective experience as experience as it occurs to an individual, and therefore subjective experience can only be known by the individual. This is not to say that people’s experiences can’t be identical to one another; it just says that my experiences can’t be your experiences, because if they were they’d be your experiences and not mine. So saying “subjective experience is private” doesn’t tell us anything new if we already knew what subjective experience was.
The mistake comes when people look for an explanation for why they experience their own sensations but have to hear about other people’s second-hand. You don’t need an explanation for this; it’s necessarily true!
Of course I might have misunderstood you. If so, sorry.
I think saying “subjective experience is private” is just expressing an analytic truth.
I’m not sure this is right, actually. Consider a least convenient case: a world populated by conscious beings (such as AIs) whose subjective experience is actually made up of simple numbers, e.g. bytes stored in a memory address space. (Of course this assumes that Platonic numbers actually exist, if only as perceived by the AIs. Let’s just concede this for the sake of argument.) Suppose further that any AI can read every other AI’s memory. Then the AIs could know everything there is to know about each other’s experiences, yet any one experience is still “subjective” in a sense, because it is associated with a single individual.
I think that if the AI read one another’s memory by copying the files across and opening them with remember.exe, then reading another AI’s memory would feel like remembering something that happened to the reader. In that case there would be no subjective experience, because Argency.AI would be able to relive Bogus.AI’s memories as though they were his own—experiences would be public, objective.
Alternatively, if the AI just look at each other’s files and consciously interpret them as I might interpret words that you had written on a page describing an experience, they’re in exactly the same circumstances as us, in which case I think my earlier argument holds.
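The difference between the two modes of access can be made concrete with a toy model. (All class and method names here are hypothetical, invented purely to illustrate the distinction; this is a sketch of the thought experiment, not a claim about how such AIs would actually work.)

```python
# Toy model of two ways one AI might access another AI's memory.
# All names here are hypothetical illustrations of the thought experiment.

class AI:
    def __init__(self, name):
        self.name = name
        self.memories = []  # each memory records an event and who it felt like it happened to

    def experience(self, event):
        # A first-person memory is stored as having happened to self.
        self.memories.append({"event": event, "experienced_by": self.name})

    def relive(self, other):
        # "remember.exe" mode: copy the other's records and replay them
        # as one's own. Provenance is lost, so the memory is stored as
        # indistinguishable from a first-person experience.
        for record in other.memories:
            self.memories.append({"event": record["event"],
                                  "experienced_by": self.name})

    def read_about(self, other):
        # "Interpretation" mode: inspect the other's records as external
        # data, like reading a written description. Provenance is kept.
        return [f"{other.name} experienced {r['event']}"
                for r in other.memories]

argency, bogus = AI("Argency"), AI("Bogus")
bogus.experience("seeing red")

argency.relive(bogus)          # stored as Argency's own memory
reports = argency.read_about(bogus)  # stored only as a report about Bogus
```

In the `relive` case the record ends up tagged as Argency’s own experience, which is the sense in which nothing would remain private; in the `read_about` case Argency only ever holds a third-person description, which is exactly our situation.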
But such experiences still aren’t subjective in the sense of “private”. I don’t see what you are getting at.
If subjective=private, your AIs don’t have subjective experience. Setting up another definition of subjective doesn’t stop subjective=private from being analytically true, or true at all. There are lots of things associated with individuals, such as names, which are not subjective.