Thanks. I know you don’t want to continue discussion; but I note, for others reading this, that in this explanation, you’re using the word “conscious” to mean “at the center of attention”. This is not the same question I’m asking, which is about “consciousness” as “the experience of qualia”.
I made my comment because it’s very important to know whether experiencing qualia is efficient. Is there any reason to expect that future AIs will have qualia, or can they do what they want to do just as well (maybe better) without that feature? If experiencing qualia does not confer an advantage on an AI, then we’re headed for a universe devoid of qualia. That would be a big loss for the universe.
Avoiding that common qualia/attention confusion is reason enough not to taboo “qualia”, which is more precise than “consciousness”.