That’s why it’s bad when mentally disabled people suffer, and it would still be bad even if we discovered that they were secretly not human.
Define “not human”. If someone is, say, completely acephalic, I feel justified in not worrying much about their suffering. Suffering requires a certain degree of sentience to be appreciated and be called, well, suffering. In humans I also think that our unique ability to conceptualise ourselves in space and time heightens the weight of suffering significantly. We don’t just suffer in the moment. We suffer, we remember not suffering in the past, we dread more future suffering, and so on and so forth. Animals don’t necessarily all live purely in the present (it’s hard to tell, but many behaviours don’t seem to lean that way), but they do seem to have a smaller and less complex time horizon than ours.
The problem is the distinction between suffering as “harmful thing you react to” and the qualia of suffering. Learning behaviours that lead you to avoid things associated with negative feedback isn’t hard; any reinforcement learning system can do that just fine. If I spin up trillions of instances of a chess engine that is always condemned to lose no matter how it plays, am I creating the new worst thing in the world?
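To make the point concrete, here is a minimal sketch (a hypothetical toy, not any particular library or agent) of exactly this: a plain Q-learning rule on a two-action problem where one action always yields negative feedback. The agent reliably learns to avoid the "painful" action, yet nothing in the system is plausibly a subject of experience.

```python
import random

def train(steps=1000, alpha=0.1, epsilon=0.1, seed=0):
    """Tabular Q-learning on a two-armed bandit where action 1
    always yields "pain" (reward -1) and action 0 yields 0."""
    rng = random.Random(seed)
    q = [0.0, 0.0]  # estimated value of each action
    for _ in range(steps):
        # epsilon-greedy action selection
        if rng.random() < epsilon:
            a = rng.randrange(2)
        else:
            a = 0 if q[0] >= q[1] else 1
        reward = -1.0 if a == 1 else 0.0   # the "negative feedback"
        q[a] += alpha * (reward - q[a])    # incremental value update
    return q

q = train()
# After training, q[1] is strongly negative and the greedy policy
# avoids action 1 entirely: avoidance behaviour with no one home.
```

The update rule is standard temporal-difference learning for a stateless problem; the point is only that "reacts to and avoids harmful stimuli" is cheap to implement and carries no implication of qualia.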
Obviously what feels to us like it’s worth worrying about is “there is negative feedback, and there is something that it feels like to experience that feedback in a much more raw way than just a rational understanding that you shouldn’t do that again”. And it’s not obvious when that line is crossed in information-processing systems. We know it’s crossed for us. Similarity to us does matter because it implies similarity in brain structure, and thus a higher prior that the system works roughly the same way in this specific respect.
Insects are about as different as it gets from us while still counting as having a nervous system that actually does a decent amount of processing. Insects barely have brains. We probably aren’t that far off from being able to decently simulate an EM (whole-brain emulation) of an insect. I am not saying insects can’t possibly be suffering, but they’re the least likely class of animals to be, barring stuff like jellyfish and corals. And if we go with the negative utilitarian view that any life containing net negative utility is worse than non-existence, and insect suffering matters this much, then you might as well advocate total Earth-wide ecocide of the entire biosphere (which, to be sure, is just about what you’d get if you mercy-extinguished a clade as vital as insects).