Well, we don’t agree about fish [...] I don’t think it’s *right*
Understanding what you mean by “right”, I think I might agree; it’s not complete, it’s not especially close to certainty.
It’s difficult to apply the mirror chamber’s reduction of anthropic measure across different species (it was only necessitated for comparing over a pair of very similar experiences), and I’m not sure the biomass difference between fishbrain and humanbrain is large enough for anthropics to be used either. Meaning… well, we can conclude, from the amount of rock in the universe, the tiny number of humans in the universe, and our being humans rather than rock, that it is astronomically unlikely that anthropic measure binds in significant quantities to rock. If it did, we would almost certainly have woken up in a different sort of place. But for fish, perhaps the numbers are not large enough for us to draw a similar conclusion. (Again, I’m realizing the validity of that sort of argument doesn’t clearly follow from the mirror chamber, though I think it is suggested by it.)
I think my real reasons for going with pescatarianism are fed by other sources here; it’s not just the anthropic measure thing. I’m also receiving a strong push from my friends in neuroscience, who claim that the neurology of fish is just way too simple to be given a lot of experiential weight, in the same way that a thermostat is too simple for us to think anything is suffering when … [reexamines the assumptions]...
Hmm. I no longer believe their reasoning there (I should talk to them again, I guess). I have seen too many bastards say “but that’s merely a machine, so it couldn’t have conscious experience” of systems that probably would have conscious experience, and here they are saying that a biological reinforcement learning system that observably learns from painful experience could not truly suffer. It’s not clear that there’s a difference between that and suffering. I think fish suffer. The quantity must be small, but that is not enough to conclude that it’s negligible.
(… qualia == the class of observations upon which indexical claims can be conditioned?? (I think I’m going to have to write this up properly and do a post))
On the note of *qualia* (providing in case it helps)
DD says this in BoI when he first uses the word:
Intelligence in the general-purpose sense that Turing meant is one of a constellation of attributes of the human mind that have been puzzling philosophers for millennia; others include consciousness, free will, and meaning. A typical such puzzle is that of qualia (singular quale, which rhymes with ‘baalay’) – meaning the subjective aspect of sensations. So for instance the sensation of seeing the colour blue is a quale. Consider the following thought experiment. You are a biochemist with the misfortune to have been born with a genetic defect that disables the blue receptors in your retinas. Consequently you have a form of colour blindness in which you are able to see only red and green, and mixtures of the two such as yellow, but anything purely blue also looks to you like one of those mixtures. Then you discover a cure that will cause your blue receptors to start working. Before administering the cure to yourself, you can confidently make certain predictions about what will happen if it works. One of them is that, when you hold up a blue card as a test, you will see a colour that you have never seen before. You can predict that you will call it ‘blue’, because you already know what the colour of the card is called (and can already check which colour it is with a spectrophotometer). You can also predict that when you first see a clear daytime sky after being cured you will experience a similar quale to that of seeing the blue card. But there is one thing that neither you nor anyone else could predict about the outcome of this experiment, and that is: what blue will look like. Qualia are currently neither describable nor predictable – a unique property that should make them deeply problematic to anyone with a scientific world view (though, in the event, it seems to be mainly philosophers who worry about it).
and under “terminology” at the end of the chapter:
Quale (plural qualia) The subjective aspect of a sensation.
This is in Ch7 which is about AGI.