This is just confusing moral anti-realism with egoism. The point is that it makes no sense for anti-realists to worry about the probability of being mistaken about the truth of a moral fact, but it might make sense to worry about the probability of your value system evolving in a direction that causes you to regret prior decisions. That said, I suspect it only makes sense to worry about this when your uncertainty is very high (i.e., you are confused about the issue and are not sure how you will feel once you've had a chance to think it through).
You realize that's an argument against moral anti-realism, right?
If it is, it’s not a very good one.
Regardless, the comment that I replied to above is either confused or disingenuous. It is entirely consistent for anti-realists to agonize over ethical decisions, act from strictly altruistic motivations, and all the rest of it.