If there isn’t a tiny grain of rationality at the core of that infinite regression, you’re in great trouble.
You did catch that I’m talking about a terminal value, right? It’s the nature of those that you want them because you want them, not because they lead to something else that you want. I want everybody to be happy. That’s a terminal value. If you ask me why I want that, I’m going to have some serious trouble answering, because there is no answer. I just want it, and there’s nothing that I know of that I want more, or that I would consider a good reason to give up that goal.
All I can do is point to the sky and hope that people will choose to pay less attention to the finger than what it indicates.
Right now, it’s pointing at “don’t make this mistake”, which I was unlikely to do anyway, but now I have the opportunity to point the mistake out to you, so you can (if you choose to; I can’t force you) stop making it, which would raise the rationality around here, which seems like a good thing to me. Or I can not point it out, and you keep doing what you’re doing. It’s like one of those lottery problems: I concluded that the chance of one or both of us becoming more rational was worth the cost of having this discussion. (And it paid off at least somewhat—I think I now have enough insight into that particular mistake to avoid it without avoiding the situation entirely.)