Beats me, but I don’t see how a system that isn’t guaranteed to keep its values fixed in the first place can be relied upon to store its values in such a way that epistemic insights won’t alter them. If there’s some reason I should rely on it to do so, I’d love an explanation (or a pointer to one).
Certainly, I have no confidence that I’m architected so that epistemic insights can’t alter whatever it is in my brain that we’re talking about when we talk about my “terminal values.”