“It has the weird aspect of putting consciousness on a continuum …”
I find I feel less confused about consciousness when thinking of it as a continuum. I’m reminded of this, from Heinlein:
“Am not going to argue whether a machine can ‘really’ be alive, ‘really’ be self-aware. Is a virus self-aware? Nyet. How about oyster? I doubt it. A cat? Almost certainly. A human? Don’t know about you, tovarishch, but I am.”
Absolutely, I do too. I just realized the continuum raises another interesting question.
Is the following scale of consciousness correct?
Human > Chimp > Dog > Toad > Any possible AI with no biological components
A biological requirement for consciousness seems to imply this ordering, and that seems wrong to me.