I think you’re wrong about your own preferences. In particular, can you think of any specific humans that you like? Surely the value of humanity is at least the value of those people.
Then there may, indeed, be no rational argument (or any argument at all) that will convince you; a fundamental disagreement on values is not a question of rationality. If the disagreement is sufficiently large (the canonical example around here being the paperclip maximiser) then it may be impossible to settle outside of force. Now, since you are not claiming to be a clippy (what happened to Clippy, anyway?), you are presumably human, at least genetically, so you'll forgive me if I suspect a certain amount of signalling in your misanthropic statements. Your real disagreement with LW thought, then, may not be so large as to require force. How about if we just set aside a planet for you, and the rest of us spread out into the universe, promising not to bother you in the future?
But is there a rational argument for that? Because on a gut level, I just don’t like humans all that much.