He defines consciousness as “what an algorithm implementing complex social games feels like when reflecting on itself”.
In that case I’ll avoid the word “consciousness” and abstract to “things to which I ascribe moral weight” (which I think is a fair substitution given the later discussion of eating “BBQ GPT-3 wings”, etc.).
Eliezer’s claim is therefore something along the lines of: “I only care about the suffering of algorithms which implement complex social games and reflect on themselves”, or possibly “I only care about the suffering of algorithms which are capable of (and are currently doing some form of) self-modelling”.
I’ve not seen nearly enough evidence to convince me of this.
I don’t expect to see a consciousness particle called a qualon. I expect something more like: “These particular brain-activity patterns, which are robustly detectable in an fMRI, are extremely low in sleeping people, higher in dreaming people, higher still in awake people, and really high in people on LSD or in certain types of zen meditation.”