Honestly, Illusionism is just really hard to take seriously. Whatever consciousness is, I have better evidence that it exists than for anything else, since it is the only thing I actually experience directly. I should pretend it isn’t real...why exactly? Am I talking to slightly defective P-zombies?
“If the computer emitted it for the same reasons...” is a clear example of begging the question. If a computer claimed to be conscious because it was conscious, then it logically must be conscious, but that is precisely what is in dispute in the first place. If you claim consciousness isn’t real, then obviously computers can’t be conscious. Note that you aren’t talking about real Illusionism if you don’t think we are p-zombies. Only the first of the two possibilities you mentioned is Illusionism, if I recall correctly.
You seem like one of the many people trying to systematize things they don’t really understand. It’s an understandable impulse, but it leads to an illusion of understanding, which is the only thing that could produce a systemization like Illusionism; it reads like frustrated people claiming there is nothing to see here. If you want a systemization of consciousness that doesn’t claim things it doesn’t know, then assume consciousness is the self-reflective and experiential part of the mind that controls and directs large parts of the overall mind. There is no need to state what causes it.
If a machine fails to be self-reflective or experiential, then it clearly isn’t conscious. It seems pretty clear that modern AI is neither. It probably fails the test of even being a mind at all, but that’s debatable.
Is it possible for a machine to be conscious? Who knows. I’m not going to bet against it, but current techniques seem incredibly unlikely to do it.
Whatever consciousness is, I have better evidence that it exists than for anything else, since it is the only thing I actually experience directly.
In an out-of-body experience, you can “directly experience” your mind floating on the other side of the room. But your mind is not in fact floating on the other side of the room.
So what you call a “direct experience”, I call a “perception”. And perceptions can be mistaken—e.g. optical illusions.
So, write down a bulleted list of properties of your own consciousness. Every item on your list is a perception that you have made about your own consciousness. How many of those items are veridical perceptions, perceiving an aspect of your own consciousness as it truly is, and how many are misperceptions? If you say “none is a misperception”: how do you know? Why does perception of your own consciousness differ from all other types of human perception in that respect? And how do you make sense of the fact that some people report that they were previously mistaken about properties of their own consciousness (e.g. “enlightened” Buddhists reflecting on their old beliefs)?
Or, if you allow that some of the items on your bulleted list may be misperceptions, why not all of them?
It seems pretty clear that modern AI is neither
To be clear, this post is about AGI, which doesn’t exist yet, not “modern AI”, which does.
This comment:
...could have been phrased as:
I do agree with your rephrasing. That is exactly what I mean (though with a different emphasis).