Maybe I missed this, but did you ever write up the Monday/Tuesday game with your views on consciousness? On Monday, consciousness is an algorithm running on a brain, and when people say they have consciously experienced something, they are reporting the output of this algorithm. On Tuesday, the true ontology of mind resembles the ontology of transcendental phenomenology. What’s different?
I’m also confused about why an algorithm couldn’t represent a mass of entangled electrons.
Oh, also: imagine that SIAI makes an AI. Why should they make it conscious at all? They’re just trying to create an intelligence, not a consciousness. Surely, even if consciousness requires whatever it is you think it requires, an intelligence does not.
Indeed. Is my cat conscious? It’s certainly an agent (it appears to have its own drives and motivations), with considerable intelligence (for a cat) and something I’d call creativity (it’s an ex-stray with a remarkable ability to work out how to get into places where the food it’s after is kept).
And the answer appears to be: yes. Quoting the Cambridge Declaration on Consciousness: “The absence of a neocortex does not appear to preclude an organism from experiencing affective states. Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Nonhuman animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates.”