Valuing Sentience: Can They Suffer?
In the recent discussions here about the value of animals several people have argued that what matters is “sentience”, or the ability to feel. This goes back to at least Bentham with “The question is not, Can they reason? nor, Can they talk? but, Can they suffer?”
Is “can they feel pain” or “can they feel pleasure” really the right question, though? Let’s say we research the biological correlates of pleasure until we understand how to make a compact and efficient network of neurons that constantly experiences maximum pleasure. Because we’ve thrown out nearly everything else a brain does, this has the potential for orders of magnitude more sentience per gram of neurons than anything currently existing. A group of altruists intends to create a “happy neuron farm” of these: is this valuable? How valuable?
(Or say a supervillain is creating a “sad neuron farm”. How important is it that we stop them? Does it matter at all?)