Unless you know you’re kind of a git or, more generally, your value system itself doesn’t rate ‘you taking over the world’ highly.
It’s an instrumental goal, it doesn’t have to be valuable in itself. If you don’t want your “personal attitude” to apply to the world as a whole, that reflects the fact that your values disagree with your personal attitude, and that you prefer the world to be controlled by your values rather than by your personal attitude.
Taking over the world as a human ruler is certainly not what I meant, and I expect it to be a bad idea with bad expected consequences (leaving aside independent reasons in its favor, such as being in a position to better manage existential risks).
It’s an instrumental goal, it doesn’t have to be valuable in itself.
The point being that it can be a terminal anti-goal. People could (and some of them probably do) value not-taking-over-the-world very highly. Similarly, there are people who actually do want to die after the normal allotted years, completely independently of any sour-grapes updating. I think they are silly, but it is their values that matter to them, not my evaluation thereof.
People could (and some of them probably do) value not-taking-over-the-world very highly.
This is a statement about valuation of states of the world, a valuation that is best satisfied by some form of taking over the world (probably a form much more subtle than what the valuation itself would classify as such).
I think they are silly, but it is their values that matter to them, not my evaluation thereof.
It’s still your evaluation of their situation that determines whether you should defer to their opinion on the matter of their values, or conclude that you know what they value better than they do. What is the epistemic content of your thinking they are silly?
I do not agree.