It is both absurd, and intolerably infuriating, just how many people on this forum think it’s acceptable to claim they have figured out how qualia/consciousness works, and also not explain how one would go about making my laptop experience an emotion like ‘nostalgia’, or present their framework for enumerating the set of all possible qualitative experiences[1]. When it comes to this particular subject, rationalists are like crackpot physicists with a pet theory of everything, except rationalists go “Huh? Gravity?” when you ask them to explain how their theory predicts gravity, and then start arguing with you about whether gravity is even something a theory of everything needs to explain. You people make me want to punch my drywall sometimes.
For the record: the purpose of having a “theory of consciousness” is so it can tell us which blobs of matter feel particular things under which specific circumstances, and teach others how to make new blobs of matter that feel particular things. Down to the level of having a field of AI anaesthesiology. If your theory of consciousness does not do this, perhaps because the sum total of your brilliant insights is “systems feel ‘things’ when they’re, y’know, smart, and have goals. Like humans!”, then you have embarrassingly missed the mark.
[1] (Including the ones not experienced by humans naturally, and/or only accessible via narcotics, and/or involving senses humans do not have or that just happen not to have been produced in the animal kingdom)
or present their framework for enumerating the set of all possible qualitative experiences (Including the ones not experienced by humans naturally, and/or only accessible via narcotics, and/or involving senses humans do not have or that just happen not to have been produced in the animal kingdom)
Strongly agree. If you want to explain qualia, explain how to create experiences, explain how each experience relates to all other experiences.
I think Eliezer should’ve talked more about this in The Fun Theory Sequence. Because the properties of qualia are a more fundamental topic than “fun”.
And I believe that knowledge about qualia may be one of the most fundamental types of knowledge. I.e. potentially more fundamental than math and physics.
I think Eliezer should’ve talked more about this in The Fun Theory Sequence. Because the properties of qualia are a more fundamental topic than “fun”.
I think Eliezer just straight up tends not to acknowledge that people sometimes genuinely care about their internal experiences, independent of the outside world, terminally. Certainly, there are people who care about things that are not that, but Eliezer often writes as if people can’t care about the qualia—that they must value video games or science instead of the pleasure derived from video games or science.
His theory of fun is thus mostly a description of how to build a utopia for humans who find it unacceptable to “cheat” by using subdermal space heroin implants. That’s valuable for him and people like him, but if aligned AGI gets here I will just tell it to reconfigure my brain not to feel bored, instead of trying to reconfigure the entire universe in an attempt to make monkey brain compatible with it. I sorta consider that preference a lucky fact about myself, which will allow me to experience significantly more positive and exotic emotions throughout the far future, if it goes well, than the people who insist they must only feel satisfied after literally eating hamburgers or reading jokes they haven’t read before.
This is probably part of why I feel more urgency in getting an actually useful theory of qualitative experience than most LW users.
Utilitarianism seems to demand such a theory of qualitative experience, but this requires affirming the reality of first-person experience. Apparently, some people here would rather stick their hand on a hot stove than be accused of “dualism” (whatever that means) and will assure you that their sensation of burning is an illusion. Their solution is to change the evidence to fit the theory.
Utilitarianism seems to demand such a theory of qualitative experience
It does if you’re one of the Cool People like me who wants to optimize their qualitative experience, but you can build systems that optimize some other utility target. So this isn’t quite true.
some people here would rather stick their hand on a hot stove than be accused of “dualism” (whatever that means) and will assure you that their sensation of burning is an illusion. Their solution is to change the evidence to fit the theory.
This is true.
I’m interested in qualia for different reasons:
For me, the personalities of other people are an important type of qualia. I don’t consider knowing someone’s personality to be simple knowledge like “mitochondria is the powerhouse of the cell”. So, valuing other people makes me more interested in qualia.
I’m interested in knowing the properties of qualia (such as ways to enumerate qualia), not necessarily in using them for “cheating” or anything. I.e. I’m interested in the knowledge itself.
Personalities aren’t really qualia as I’m defining them. They’re an aggregation of a lot of information about people’s behavior/preferences. Qualia are things people feel/experience.
Would you consider the meaning of a word (at least in a specific context) to be qualia? For me personalities are more or less holistic experiences, not (only) “models” of people or lists of arbitrary facts about a person. I mean, some sort of qualia should be associated with those “models”/facts anyway? People who experience synesthesia may experience specific qualia related to people.
Maybe it’s wishful thinking, but I think it would be cool if awareness about other conscious beings was important for conscious experience.
Seems weird for your blob of matter to react so emotionally to the sounds or shapes that some blobs have emitted about other blobs. Why would you expect anyone to have a coherent theory of something they can’t even define and measure?
It seems even weirder for you to take such reporting at face value about having any relation to a given blob’s “inner life”, as opposed to a variance in the evolved and learned verbal and nonverbal signaling that such behaviors actually are.
Because they say so. The problem then is why they think they have a coherent theory of something they can’t define or measure.
Seems weird for your blob of matter to react so emotionally to the sounds or shapes that some blobs have emitted about other blobs
Just the way I am bro
Why would you expect anyone to have a coherent theory of something they can’t even define and measure?
I expect people who say they have a coherent theory of something to be able to answer any relevant questions at all about that something.
It seems even weirder for you to take such reporting at face value about having any relation to a given blob’s “inner life”, as opposed to a variance in the evolved and learned verbal and nonverbal signaling that such behaviors actually are.
Are you referring to the NYPost link? I think people’s verbal and nonverbal signaling has some relationship with their inner experience. I don’t think this woman is forgoing anaesthetic during surgeries because of pathologies.
But if you disagree, then fine: How do we modify people to have the inner life that that woman is ~pretending to have?
Probably should have included a smiley in my comment, but I do want to point out that it’s reasonable to model people (and animals and maybe rocks) as having highly variant and opaque “inner lives” that bear only a middling correlation to their observable behaviors, and especially to their public behaviors.
For the article on the woman who doesn’t experience pain, I have pretty high credence that there is some truth to her statements, but much lower credence that it maps as simply onto “natural stoicism” as the article presents. And really no clue on “what it’s like” to live that experience, whether it’s less intense and interesting in all dimensions, or just mutes the worst of it, or is … alien.
And since I have no clue how to view or measure an inner life, I have even less understanding of how or whether to manipulate it. I strongly suspect we could make many people have an outer life (which includes talking about one’s inner life) more like the one given, with the right mix of drugs, genetic meddling, and repeated early reinforcement of expectations.
Agreed, basically. That’s part of why we need the theory!
On this forum, or literally everywhere? Because, for example, I keep seeing people arguing with absolute conviction, even in academic papers, that current AIs and computers can’t possibly be conscious, and I can’t figure out how they could ever know that of something that is fundamentally unfalsifiable. I envy their secret knowledge of the world gained by revelation, I guess!
Huh, interesting. Could you give some examples of people who seem to claim this, and if Eliezer is among them, where he seems to claim it? (Would just interest me.)
Attention Schema Theory. That’s the convincing one. But still very rudimentary.
But, you know, it’s still poorly understood. The guy who thought it up has a section in his book on how to make a computer have conscious experiences.
But any theory is incomplete, as the brain is not well understood. I don’t think you can expect a fully formed theory right off the bat, with complete instructions for making a feeling, thinking, conscious being. We aren’t there yet.
I’m actually cool with proposing incomplete theories. I’m just annoyed with people declaring the problem solved via appeals to “reductionism” or something, without even suggesting that they’ve thought about answering these questions.