If learning a piece of knowledge will hurt you (emotionally, or be bad for your mental health), then it might be bad, instrumentally, to learn it. Personally, I value the truth because it is a massive prerequisite for doing good in the world (although I do value it somewhat intrinsically as well). But if Epistemic Rationality didn’t help me be instrumentally rational, then I wouldn’t value it half as much. I want to win.
> If learning a piece of knowledge will hurt you (emotionally, or be bad for your mental health), then it might be bad, instrumentally, to learn it.
Better, instrumentally, to learn to handle the truth. Ignorance and dullness are not qualities to be cultivated, however fortuitously useful it might occasionally be not to know something, or to be unable to notice an implication of what you do know.
> But if Epistemic Rationality didn’t help me be instrumentally rational
If it doesn’t, you’re doing it wrong. This is the entire point of LessWrong.
> Better, instrumentally, to learn to handle the truth.
It really depends on your goals/goal system. I think the wiki definition is supposed to encompass possible non-human minds that may have some uncommon goals/drives, like a wireheaded clippy that produces virtual paperclips and doesn’t care whether they are in the real or virtual world, so it doesn’t want/need to distinguish between them.
> It really depends on your goals/goal system. I think the wiki definition is supposed to encompass possible non-human minds that may have some uncommon goals/drives, like a wireheaded clippy
I really do not care about hypothetical entities whose goal is to be ignorant, especially constructions like wireheaded clippies. It’s generally agreed here that wireheading is a failure mode. So is the romantic valorisation of ignorance.