It’s not clear to me what you mean by value. To say that something has value is to say that it is more valuable than other things. This is why, at the end of your progression, valuing everything becomes equivalent to valuing nothing.
Yes, that’s another attractor, to my mind. Stuart 7 doesn’t value everything, though; he values objects/beings, and dislikes the destruction of these. That’s why he still has preferences.
But the example was purely illustrative of the general idea.
I’m still not clear what constitutes an object/being and what does not. Is a proton an object?
Fundamentally, I think you’re having an understandable difficulty applying a binary classification system (value/not value) to a real continuous system. The continuity of value I outlined above, where things are valuable in proportion to their degree of sentience or degree of life, resolves this to some extent.
I still don’t see how this is fundamentally about altruism. Altruism, loosely defined, is a value system that does not privilege the self over similar beings, but except under very extended definitions of self, that’s not what is going on in your example at all. The reason I bring this up is that the difficulty you pose is one we deal with every day. Your agent is struggling to choose between many possible futures, each of which contains some things he/she/it values, such that choosing some of those things sacrifices other “valuable” things. I fail to see how this is substantially different from any trip I make to the grocery store. Your concern about animals preying on other animals (A and B are mutually exclusive) seems directly analogous to my decision to buy either name-brand Fruit Loops or store-brand Color Circles. Both my money and my preference for Fruit Loops have value, but I have no difficulty deciding that one is more valuable than the other, and I certainly don’t give up and burn the store down rather than make a decision.