Unless you can taboo “conscious” in such a way that it makes sense, I’m gonna substitute “intelligent” for “conscious” there (which is clearly what I meant, in context.)
The point with bees is that, as a “hive mind”, they act as an optimizer without any individual intention.
I’m gonna substitute “intelligent” for “conscious” there
I don’t see that you can substitute “intelligent” for “conscious”. Perhaps they are correlated, but they’re certainly not the same. I’m definitely more intelligent than my dog, but am I more conscious? Probably not. My dog seems to experience the world just as vividly as I do. (Knowing this for certain requires solving the hard problem of consciousness, but that’s where the evidence seems to point.)
(which is clearly what I meant, in context.)
It’s clear to you because you wrote it, but it wasn’t clear to me.
Well yes, that’s the illusion of transparency for you. I assure you, I was using “conscious” as a synonym for “intelligent”. Were you interpreting it as “able to experience qualia”? Because that is both a tad tautological and noticeably different from the argument I’ve been making here.
Whatever. We’re getting off-topic.
If you value an optimizer’s goals regardless of intelligence—whether valuing a bug’s desires as much as a human’s, a hive mind’s goals less than its individual members’, or an evolution’s goals at all—you get results that do not appear to correlate with anything you could call human morality. If I have misinterpreted your beliefs, I would like to know how. If I have interpreted them correctly, I would like to see how you reconcile this with saving orphans by tipping over the ant farm.
If ants experience qualia at all, which is highly uncertain, they probably don’t experience them to the same extent that humans do. Therefore, their desires are not as important. On the issue of the moral relevance of insects, the general consensus among utilitarians seems to be that we have no idea how vividly insects can experience the world, if at all, so we are in no position to rate their moral worth; and we should invest more into research on insect qualia.
I think it’s pretty obvious that (e.g.) dogs experience the world about as vividly as humans do, so all else being equal, kicking a dog is about as bad as kicking a human. (I won’t get into the question of killing because it’s massively more complicated.)
I would like to see how you reconcile this with saving orphans by tipping over the ant farm.
I cannot say whether this is right or wrong because we don’t know enough about ant qualia, but I would guess that a single human’s experience is worth the experience of at least hundreds of ants, possibly a lot more.
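The comparison being gestured at here is an expected-value calculation under uncertainty about sentience. A minimal sketch, where every number is an illustrative assumption rather than a claim about actual ants:

```python
# Expected-value sketch of the "one human vs. hundreds of ants" comparison.
# All numbers below are illustrative assumptions, not empirical claims.

p_ant_qualia = 0.1       # assumed probability that ants have qualia at all
ant_intensity = 0.001    # assumed vividness of ant experience relative to a human's
ants_in_farm = 1000      # assumed size of the ant farm

human_weight = 1.0  # one human's experience, taken as the unit
# Expected moral weight of the whole farm, in the same units:
farm_weight = p_ant_qualia * ant_intensity * ants_in_farm

print(f"human: {human_weight}, ant farm: {farm_weight}")
```

Under these particular assumptions the orphan outweighs the farm, but the verdict turns entirely on the assumed probability and intensity, which is why the research into insect qualia mentioned above matters.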
you get results that do not appear to correlate with anything you could call human morality.
Like what, besides the orphans-ants thing? I don’t know if you’ve misinterpreted my beliefs unless I have a better idea of what you think I believe. That said, I do believe that a lot of “human morality” is horrendously incorrect.
I think it’s pretty obvious that (e.g.) dogs experience the world about as vividly as humans do, so all else being equal, kicking a dog is about as bad as kicking a human.
This isn’t obvious to me. And it is especially not obvious given that dogs are a species where one of the primary selection effects has been human sympathy.
You make a good point about human sympathy. Still, if you look at the biological and neurological evidence, it appears that dogs are built in pretty much the same ways we are. They have the same senses—in fact, their senses are stronger in some cases. They have the same evolutionary reasons to react to pain. The parts of their brains responsible for pain look the same as ours. The biggest difference is probably that our cerebral cortex is far more developed than theirs, but that part of the brain isn’t especially important in responding to physical pain. Other forms of pain, yes; and I would agree that humans can feel some negative states more strongly than dogs can. But it doesn’t look like physical pain is one of those states.
If ants experience qualia at all, which is highly uncertain, they probably don’t experience them to the same extent that humans do. Therefore, their desires are not as important.
GOSH REALLY.
I think it’s pretty obvious that (e.g.) dogs experience the world about as vividly as humans do, so all else being equal, kicking a dog is about as bad as kicking a human. (I won’t get into the question of killing because it’s massively more complicated.)
Once again, you fail to provide the slightest justification for valuing dogs as much as humans; if this were “obvious” we wouldn’t be arguing, would we? Dogs are intelligent enough to be worth a non-negligible amount, but if we value all pain equally you should feel the same way about, say, mice, or … ants.
I would like to see how you reconcile this with saving orphans by tipping over the ant farm.
I cannot say whether this is right or wrong because we don’t know enough about ant qualia, but I would guess that a single human’s experience is worth the experience of at least hundreds of ants, possibly a lot more.
Like what, besides the orphans-ants thing? I don’t know if you’ve misinterpreted my beliefs unless I have a better idea of what you think I believe. That said, I do believe that a lot of “human morality” is horrendously incorrect.
How, exactly, can human morality be “incorrect”? What are you comparing it to?
if we value all pain equally you should feel the same way about, say, mice, or … ants.
Not if mice or ants don’t feel as much pain as humans do. Equal pain is equally valuable, no matter the species. But unequal pain is not equally valuable.
Huh? You value individual bees, yet not ants?
I worded my comment poorly. I didn’t mean to imply that bees are necessarily conscious. I’ve edited my comment to reflect this.
How, exactly, can human morality be “incorrect”? What are you comparing it to?
Well I’d have to get into metaethics to answer this, which I’m not very good at. I don’t think such a conversation would be fruitful.
GOSH REALLY.
Yes, really. You seemed to think that I believe ants were worth as much as humans, so I explained why I don’t believe that.
Firstly, I thought you said we were discussing disutility, not pain?
Secondly, could we taboo consciousness? It seems to mean all things to all people in discussions like this.
Thirdly, you claimed human morality was incorrect; I was under the impression that we were analyzing human morality. If you are working to a different standard than humanity’s (which I doubt), then perhaps a change in terminology is in order. If you are, in fact, a human, and the “morality” under discussion here is therefore that of humans, then your statement makes no sense.
Assuming the second possibility, you’re right; there is no need to get into metaethics as long as we focus on actual (human) ethics.
Not if mice or ants don’t feel as much pain as humans do. Equal pain is equally valuable, no matter the species. But unequal pain is not equally valuable.
What conceivable test would verify whether one organism feels more pain than another?
Good question. I don’t know of any such test, although I’m reluctant to say that it doesn’t exist. That’s why it’s important to do research in this area.
Some kind of brain scans? Probably not very useful on insects, etc, but would probably work for, say, chickens vs. chimpanzees.
Okay, say you had some kind of nociceptor analysis machine (or, for that matter, a machine for whatever you think “pain” will eventually reduce to). Would it count the number of discrete nociceptors, or would it measure nociceptor mass? What if we encountered extra-terrestrial life that didn’t have any (of whatever it is that we have reduced “pain” to)? Would they then count for nothing in your moral calculus?
To me, this whole thing feels like we are trying to multiply apples by oranges and divide by zebras. It also seems problematic from an institutional-design perspective, due to the poor incentive structure: it would reward those who self-modify towards being more utility-monster-like on the margin.
Well, there’s neurologically sophisticated Earthly life with neural organization very different from mammals’, come to that.
I’m not neurologist enough to give an informed account of how an octopus’s brain differs from a rhesus monkey’s, but I’m almost sure its version of nociception would look quite different. Though they’ve got an opioid receptor system, so maybe this is more basal than I thought.
I remember reading that crustaceans don’t have the part of the brain that processes pain. I don’t feel bad about throwing live crabs into boiling water.
If true, that is interesting. On the other hand, whether or not something feels pain seems like a much easier problem to solve than how much pain something feels relative to something else.
Really? I remember reading the opposite. Many times. If you’re regularly boiling them alive, have you considered researching this?
I’m not regularly boiling them alive, but I researched it a little anyway. Here’s a study often used to show that crustaceans DO feel pain: http://forms.mbl.edu/research/services/iacuc/pdf/pain_hermit_crabs.pdf