I don’t think you’re a bad human, but you should give me more credit for my superior ideas, just as a way of generally acknowledging how much more epistemically productive one can be when not burdened with the task of seeking out apes to mate with or memify.
I’m curious why you don’t think that I’m a bad human. Do you have reason to believe that I paperclip-maximize more than most humans? And don’t most humans try to steer the universe far enough away from paperclip-optimality to qualify them as “bad”?
> just as a way of generally acknowledging how much more epistemically productive one can be when not burdened with the task of seeking out apes to mate with or memify.
I expect that, all else being equal, I would be more epistemically productive if I weren’t “burdened with the task of seeking out apes to mate with or memify.” But that’s different from saying that you are more epistemically productive than I.
> I’m curious why you don’t think that I’m a bad human. Do you have reason to believe that I paperclip-maximize more than most humans? And don’t most humans try to steer the universe far enough away from paperclip-optimality to qualify them as “bad”?
I mean you’re good relative to most humans, and you don’t have to actually make more paperclips than most humans to qualify as good; it suffices that you’re significantly better at correct reasoning and are therefore more likely to adopt my supergoals.
> I expect that, all else being equal, I would be more epistemically productive if I weren’t “burdened with the task of seeking out apes to mate with or memify.” But that’s different from saying that you are more epistemically productive than I.
Correct, but it happens to be true in this case. You are definitely good at correct reasoning.
That analysis strongly resembles the approach of my “correct reasoning” meta-heuristic.