Any AGI would be able to examine itself, so if self-examination is the definition of consciousness, every intelligence would be conscious. But Eliezer denies that every intelligence is conscious, so he also implicitly denies that definition of consciousness.
I’m not sure I’m parsing what you’ve written correctly. It may rest on your use of the word “intelligence”: how are you defining that term?
You could replace it with “AI.” Any AI can examine itself, so any AI will be conscious, if consciousness is or results from examining itself. I agree with this, but Eliezer does not.