I notice that I am still confused. In the past I hit ‘ignore’ when people talked about consciousness, but let’s try ‘explain’ today.
The original post states:
Would we disagree over what my computer can do?
What about an animal instead of my computer? Would we feel the same philosophical confusion over any given capability of an average chicken? An average human?
Does this mean that if two systems have almost the same capabilities, we would also expect them to have a similar chance of receiving the ‘consciousness’ label? In other words: is consciousness, like IQ, a lossy compression of other properties? I would like to compare this idea with the comment from RichardKennaway, who asks:
Everyone reading this, please take a moment to see whether you have any sensation that you might describe by those words.
I personally have never thought anything like ‘I am feeling so conscious today’ or even ‘I am actively experiencing consciousness’. However, since I expect consciousness to be some sort of label attached to systems with certain properties, and since I am human and pretty much all other humans claim to be conscious, my prior that I am conscious is pretty high. But I don’t feel that I can answer ‘Yes’ to RichardKennaway’s question.
I have a hard time believing that my not feeling conscious is significant evidence against my being conscious, especially since feelings are pretty easy to manipulate. My current solution is therefore to reject RichardKennaway’s question as insignificant: feeling conscious might not be all that related to being conscious. That leaves me with two questions: are there better tests for being conscious available (presumably not, since the whole point of the discussion is that consciousness is hard to quantify), and if not, why should we bother to discuss whether or not something is conscious at all?
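To make the prior-versus-evidence intuition above concrete, here is a minimal Bayes-update sketch. All the numbers are invented purely for illustration; the point is only that weakly diagnostic evidence barely moves a strong prior:

```python
# Hypothetical numbers, purely for illustration of the update.
prior_conscious = 0.99  # strong prior: I am human, humans claim consciousness

# Assumed likelihoods of observing "I don't feel conscious" under each
# hypothesis. They are close together because introspective feelings are
# easy to manipulate and thus only weakly diagnostic.
p_obs_given_conscious = 0.3
p_obs_given_not_conscious = 0.5

evidence = (p_obs_given_conscious * prior_conscious
            + p_obs_given_not_conscious * (1 - prior_conscious))
posterior = p_obs_given_conscious * prior_conscious / evidence
print(f"P(conscious | don't feel conscious) = {posterior:.3f}")
# ≈ 0.983 — the weak evidence barely dents the 0.99 prior.
```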
I am sorry that my post isn’t all that coherent; I seem to be having trouble identifying exactly what my problem with ‘consciousness’ is.
Small note: while writing this reply I noticed that the original post can be interpreted as an explanation of what I am trying to say. If consciousness is a lossy compression, then whether something has it becomes a moot point once we know plenty of other properties. So if I notice that I can do most things that other humans can (abstract reasoning, planning and executing strategies, using tools, etc.), then whether or not I am conscious should be a moot point, as it does not influence any prediction of my actions.
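A minimal sketch of that “moot point” claim, with all names and thresholds invented for illustration: if the ‘conscious’ label is just a lossy summary computed from observed capabilities, then any prediction that already conditions on those capabilities gains nothing from the label.

```python
# Toy capability profile; the keys and values are hypothetical.
capabilities = {
    "abstract_reasoning": True,
    "plans_and_executes_strategies": True,
    "uses_tools": True,
}

def conscious_label(caps: dict) -> bool:
    # Lossy compression: many distinct capability profiles
    # collapse into a single bit.
    return sum(caps.values()) >= 2

def predict_can_solve_puzzle(caps: dict) -> bool:
    # The prediction depends only on the underlying capabilities...
    return caps["abstract_reasoning"] and caps["uses_tools"]

# ...so the label, being computed from those same capabilities,
# cannot change the prediction once they are known:
label = conscious_label(capabilities)
prediction = predict_can_solve_puzzle(capabilities)
print(f"label={label}, prediction={prediction}")
# The label is screened off by the capabilities it compresses.
```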