“Moral weights depend on intensity of conscious experience.”—Just going to note that I’ve no particular reason to concede this point at the moment, so don’t directly consider the next question a question of moral weight; I’d rather dissociate it first:
Is there … any particular reason to expect intensity of conscious experience to grow ‘super-additively’, such that a tiny conscious mind experiences 1 intensity unit, but a mind ten times as large experiences (since you reject linear, we’ll step up to the exponential) 1024 intensity units? Given our general inability to exist as every mass of brain, what makes this more intuitive than no, marginal, or linear increase in intensity? (A toy sketch of these alternatives follows this comment.)
Personally, I would be actively surprised to spend time as a lower-brain-mass conscious animal and report that my experiences were (exceptionally) less intense. Why do our intuitions differ on this?
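Purely to make those alternatives concrete, here is a minimal sketch, assuming nothing beyond the four hypotheses named in the question; the functional forms and constants are illustrative choices, not anything argued in the thread.

```python
# Toy scaling hypotheses for "intensity of experience" as a function of
# relative brain mass m, normalised so the tiny mind (m = 1) has intensity 1.
# The exponential case here is 2**(m - 1); the "1024" above reads as 2**10,
# i.e. ten doublings rather than nine, so treat the constants as illustrative.
import math

HYPOTHESES = {
    "no increase": lambda m: 1.0,
    "marginal":    lambda m: 1.0 + math.log(m),   # sublinear growth
    "linear":      lambda m: float(m),
    "exponential": lambda m: 2.0 ** (m - 1),      # doubles per unit of mass
}

for name, f in HYPOTHESES.items():
    print(f"{name:12s}  m=1: {f(1):8.1f}   m=10: {f(10):8.1f}")
```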
It depends on the type of animal. It might well be that social animals feel pain very differently than non-social animals.
https://www.perplexity.ai/search/Find-evidence-supporting-_ZlYNrCuSSK5HNQMy4GOkA
Not all mammals have an Anterior Cingulate Cortex. For birds, there is an analogous structure, Nidopallium Caudolaterale, that has a comparable function but is present primarily in social birds.
I’m not saying that other animals don’t respond to pain, but the processing and the association of pain with social emotions (which non-social animals presumably lack) are missing.
That certainly seems distinct from brain mass, though (except that it takes a certain amount of it to implement in the first place). I’d expect similar variation in felt pain from inhabiting humans with different neurologies; there are already many reported variations in pain perception within our species.
Indeed. Women are known to report higher pain sensitivity than men. It also decreases with age. There are genes that are known to be involved. Anxiety increases pain perception, good health reduces it. It is possible to adapt to pain to some degree. Meditation is said to tune out pain (anecdotal evidence: I can tune out pain from, e.g., small burns).
I mostly agree: while intensity of consciousness is the ontological basis of moral weights, there are other relevant layers. On the other hand, consciousness looks to be some function of integrated information and computation in a network.
IIT, for example, suggests an entropic, combinatorial measure that would very likely explode as network size grows (a rough count after this comment illustrates why).
In any case, we are trapped in our own existence, so intersubjective comparison is both necessary and mostly dependent on intuition.
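A rough illustration of that explosion, sketched under the assumption that the measure has to range over partitions of the network; this only counts candidate bipartitions and is not IIT’s actual Φ computation.

```python
# Even before computing any information quantity, a partition-based measure
# like IIT's phi has to consider ways of cutting the system. The number of
# non-trivial bipartitions of n nodes alone is 2**(n - 1) - 1, which already
# explodes with network size.
for n in (5, 10, 20, 40, 80):
    bipartitions = 2 ** (n - 1) - 1
    print(f"{n:3d} nodes -> {bipartitions:.3e} candidate bipartitions")
```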
Because in the limit your intuition is that the experience of an electron is nonexistent. The smaller the brain, the closer to inanimate matter.
But that’s in the limit. A function of electron = 0, ant = 1, cockroach = 4, mouse = 300 fits it just as well as electron = 0, ant = 1, cockroach = 2, mouse = 2^75, as does electron = 0, ant = 100, cockroach = 150, mouse = 200.
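Restating that point in code, using only the comment’s own toy numbers: all three assignments are monotone in presumed brain size, so the limit intuition alone does not discriminate between them.

```python
# Three assignments of "intensity" that are all consistent with the
# "electron ~ 0, smaller brain ~ closer to inanimate matter" intuition,
# because each is monotone in (presumed) brain size.
assignments = {
    "near-linear": {"electron": 0, "ant": 1,   "cockroach": 4,   "mouse": 300},
    "exponential": {"electron": 0, "ant": 1,   "cockroach": 2,   "mouse": 2**75},
    "near-flat":   {"electron": 0, "ant": 100, "cockroach": 150, "mouse": 200},
}
order = ["electron", "ant", "cockroach", "mouse"]
for name, a in assignments.items():
    values = [a[k] for k in order]
    print(f"{name:12s} increasing: {values == sorted(values)}")
```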
What about an iPhone? It looks similar to an ant in terms of complexity; less annoying, too…
Suppose my intuition is that the ‘conscious experience’ of ‘an iPhone’ varies based on what software is running on it. If it could run a thorough emulation of an ant and have its sensory inputs channeled to that emulation, it would be more likely to have conscious experience in a meaningful-to-me way than if nobody bothered (presuming ants do implement at least a trivial conscious experience).
(I guess that there’s not necessarily something that it’s like to be an iPhone, by default, but the hardware complexity could theoretically support an iAnt, which there would be something that it’s like to be?)
This is also my intuition: the intensity of experience depends on the integrated information flow of the system, and the nature of the experience depends on the software details.
Then iPhones have a far more limited maximum experience intensity than ants, and an ant’s maximum experience intensity is only a fraction of a mouse’s.