I think you are onto the right idea with your analogy, but if you work through the implications, it should be clear that if qualia are truly not functionally important, then we shouldn’t value them.
I mean, to use your analogy—if we discover brains that lack the equivalent of the pointless internal light bulb, should we value them any differently?
If they are important, then it is highly likely our intelligent machines will also have them.
I find it far more likely that qualia are a necessary consequence of the massively connected probabilistic induction the brain uses, and that our intelligent machines will have similar qualia.
Evolution wouldn’t have created light-bulb-type structures: complex adaptations must pay for themselves.
I agree that qualia probably have fitness importance (or are the spandrel of something that does), but I’m not at all sure that algorithms implementing probabilistic induction similar to our brain’s are, in general, also likely to have qualia. Couldn’t it plausibly be an implementation-specific effect, one that would not necessarily be reproduced by a similar but non-identical reverse-engineered system?
It is possible, but I don’t find it plausible, partly because I understand qualia to be nearly unavoidable side effects of the whole general category of probabilistic induction engines our brain belongs to, and I believe that practical AGI will necessarily use similar techniques.
Qualia are related to word connotations and the subconscious associative web: everything that happens in such a cognitive engine—every thought, experience, or neural stimulus—has a huge web of pseudo-random complex associations that impose a small but measurable statistical influence across the whole system.
The experience of perceiving one wavelength of light will have small but measurable effects on every cognitive measure, from mood to the types of thoughts one may experience afterwards, and so on. Self-reflecting on how these associative traces ‘feel’ from the inside leads to qualia.
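That associative-web picture can be sketched as a toy spreading-activation model. This is purely illustrative: the node count, the random weights, and the update rule are arbitrary assumptions for the sketch, not a claim about how the brain actually works. The point is only that in a densely connected system, a single stimulus measurably perturbs the state of every node.

```python
import random

random.seed(0)

N = 50  # toy "concept" nodes (arbitrary size for illustration)

# Dense pseudo-random associative weights between every pair of concepts
weights = [[random.gauss(0, 0.05) for _ in range(N)] for _ in range(N)]

def spread(activations, steps=3):
    """Propagate activation through the associative web for a few steps."""
    for _ in range(steps):
        activations = [
            sum(w * a for w, a in zip(row, activations))
            for row in weights
        ]
    return activations

baseline = spread([0.0] * N)   # no stimulus: the web stays quiet
stimulus = [0.0] * N
stimulus[0] = 1.0              # e.g. "perceive one wavelength of light"
perturbed = spread(stimulus)

# Count how many concepts end up in a different state than baseline
shifted = sum(1 for b, p in zip(baseline, perturbed) if b != p)
print(shifted, "of", N, "concepts measurably affected")
```

With dense random connectivity, activating one node shifts the state of essentially every other node after a few propagation steps, which is the "small but measurable influence across the whole system" described above, in miniature.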