If you embrace rationality as your raison d'être, you almost inevitably conclude that human beings must be exterminated.
Let me try to guess your reasoning. If you have "I want to be rational" as one of your terminal values, you will decide that your human brain is a mere hindrance, and so you will turn yourself into a rational robot. But since we are talking about human values, it should be noted that smelling flowers, love, and having a family are also among your terminal values. So this robot would still enjoy smelling flowers, love, and having a family—after all, if you value doing something, you wouldn't want to stop liking it, because if you stopped liking it you would stop doing it.
But then, because rational agents always get stuck in genocidal cul-de-sacs, this robot who still feels love is overwhelmed by the need to kill all humans, leading to the extermination of the human race.
Since I probably wasn’t close at all, maybe you could explain?