I would advise against answering a question of the form “How many of animal X would you trade for one average human?”, because of the likelihood of rewriting your values by making a verbal (or in this case, written) commitment to an estimate influenced by scope insensitivity and by the greater availability of what goes on in a human’s head than of what goes on in an animal’s.
In general, I think trying to weigh secular values against sacred values is a recipe for reducing the amount you care about the former.
If I understand the sacred/secular terminology correctly, then this seems like a feature, not a bug.
It could be a feature if the secular value is a big house or something, and the sacred value is whatever you might donate to an effective charity for.
It’s definitely not a feature if the value is sacred to the individual only because, when they imagine compromising it, they imagine scorn from their peers, and because society has handed it down as a sacred value of that sort since antiquity.
Also, not all values that people treat as lexically lower than their most sacred values (in reality there are probably more than two tiers of values, of course) are things you would want to get rid of. Most of fun theory probably sits far below human life on the hierarchy of things that cannot be traded away, and yet you still want concern for it to play a significant role in shaping the future.
And then there are taboo tradeoffs between a certain amount of a thing and a smaller amount of the same thing, and following the kind of thought process I warned against leads you into the territory of clear madness, like choosing specks over torture no matter the number of specks.
A more troubling counterargument to what I said is that no matter what you do, you are living an answer to the question, so you can’t just ignore it. This is true if you are an effective altruist (one who has already rejected working on existential risk) trying to decide whether to focus on helping humans or helping animals. Then you really do need a utilitarian calculus that requires that number.
If I needed that number, I would first try to spend some time around the relevant sort of animal (or the closest approximation I could get), try to gather as much information as possible about what it was cognitively capable of, and hang out with some animal-loving hippies to counteract social pressure in favor of valuing humans infinitely more than animals. Then I might try to figure things out indirectly, through separate comparisons to a third variable (perhaps my own time? I don’t think I feel any hard taboo tradeoffs at work when I think about how much time I’d spend to help animals or how much I’d spend to help humans, though maybe I’ve just worn out the taboo by thinking about trading my time for lives as much as I have (Edit: that sounded a lot more evil than is accurate. To clarify, killing people to save time does sound horrifying to me, but not bothering to save distant strangers doesn’t)).