That is probably the best answer. It has the weird aspect of putting consciousness on a continuum, and one that isn't easy to quantify. If someone with 50% cyber brain cells is 50% conscious, but their behavior is the same as that of a fully biological, 100% conscious brain, that's a little strange.
Also, it means consciousness isn't a binary variable; for this to make sense, it must be a continuum. That is an important point regardless of the definition we use.
It has the weird aspect of putting consciousness on a continuum,
I find I feel less confused about consciousness when thinking of it as a continuum. I’m reminded of this, from Heinlein:
“Am not going to argue whether a machine can ‘really’ be alive, ‘really’ be self-aware. Is a virus self-aware? Nyet. How about oyster? I doubt it. A cat? Almost certainly. A human? Don’t know about you, tovarishch, but I am.”
I find I feel less confused about consciousness when thinking of it as a continuum.
Absolutely. I do too. I just realized that the continuum raises another interesting question.
Is the following scale of consciousness correct?
Human > Chimp > Dog > Toad > Any possible AI with no biological components
The biological requirement seems to imply this ordering, and that seems wrong to me.