I’m still not clear what constitutes an object/being and what does not. Is a proton an object?
Fundamentally, I think you’re having an understandably difficult time applying a binary classification system (value/not value) to a genuinely continuous one. The continuity of value I outlined above, where things are valuable in proportion to their degree of sentience or degree of life, resolves this to some extent.
I still don’t see how this is fundamentally about altruism. Altruism, loosely defined, is a value system that does not privilege the self over similar beings, and except under very extended definitions of self, that’s not what is going on in your example at all. The reason I bring this up is that the difficulty you pose is one we deal with every day. Your agent is suffering from having to choose between many possible futures, each containing some things he/she/it values, such that choosing some of those things sacrifices other “valuable” things. I fail to see how this is substantially different from any trip I make to the grocery store. Your concern about animals preying on other animals (A and B are mutually exclusive) seems directly analogous to my decision to buy either name-brand Fruit Loops or store-brand Color Circles. Both my money and my preference for Fruit Loops have value, but I have no difficulty deciding that one is more valuable than the other, and I certainly don’t give up and burn the store down rather than make a decision.