“Death shall lose” as an attitude may not be strictly correct, but under the circumstances it was instrumentally rational as demonstrated by the fact that it worked.
No, that demonstrated (or at least gave some small amount of evidence) that some people may be able to use self-delusion to useful effect. In fact, there are dozens of posts here on the subject. But if Harry wants to actually fight death instead of just using his beliefs for the purpose of signalling then denial doesn’t cut it. If he can’t even judge the probabilities of death-defeating strategies succeeding then how can he be expected to choose between them rationally?
No, I’d say that if Harry is deliberately deceiving himself because he thinks it is instrumentally rational, then that would be a bigger concern than if he had a case of simple naivety.
instead of just using his beliefs for the purpose of signalling
“Death” isn’t a particularly cohesive force. There’s no central armory which, if emptied or sabotaged, would simultaneously disable everything that kills us. Ending a Dementor isn’t ‘just signalling’; in doing that, Harry permanently removed something which would otherwise have gone on to destroy countless objects and minds. However many Dementors there are on Earth, Harry is now equipped to defeat them all in, at worst, linear time, which would also e.g. stop the ongoing atrocity at Azkaban.
For that matter, Harry doesn’t seem to be deliberately, consciously deceiving himself. He just did something, said what he believed, and it worked. The rationality of whatever it is he did is clear in hindsight, specifically because it worked.
Is there any course of action you can think of that Harry could have taken under the circumstances, which would have ‘actually fought death’ more effectively than what he did?
The rationality of whatever it is he did is clear in hindsight, specifically because it worked.
No, you are fundamentally confused about what rationality means. Betting your entire life savings at even odds that a fair die roll comes up 6 is irrational even if in hindsight it worked. Eliezer’s catch phrase just confuses people.
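To make the arithmetic behind the die example explicit, here is a minimal sketch (the stake is normalized to one unit, a placeholder for “entire life savings”): the bet’s expected value is negative, so it is irrational ex ante no matter how the roll happens to land.

```python
from fractions import Fraction

# A fair six-sided die, bet at even odds: the bet pays +stake if it
# comes up 6 (probability 1/6) and loses the stake otherwise (5/6).
p_win = Fraction(1, 6)
stake = 1  # the "entire life savings", normalized to one unit

expected_value = p_win * stake + (1 - p_win) * (-stake)
print(expected_value)  # -2/3: the bettor loses two thirds of a unit in expectation
```

A decision is judged on the information available when it is made; a lucky 6 doesn’t retroactively change that sign.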
Is there any course of action you can think of that Harry could have taken under the circumstances, which would have ‘actually fought death’ more effectively than what he did?
Killed the dementor the same way he did, except making claims based off a sane model of reality.
For some background on just why self-delusion is harmful for people with the kind of goals Harry has, see Eliezer’s Ethical Injunctions. An excerpt:
Self-deceptions are the worst kind of black swan bets, much worse than lies, because without knowing the true state of affairs, you can’t even guess at what the penalty will be for your self-deception. They only have to blow up once to undo all the good they ever did. One single time when you pray to God after discovering a lump, instead of going to a doctor. That’s all it takes to undo a life. All the happiness that the warm thought of an afterlife ever produced in humanity, has now been more than cancelled by the failure of humanity to institute systematic cryonic preservations after liquid nitrogen became cheap to manufacture. And I don’t think that anyone ever had that sort of failure in mind as a possible blowup, when they said, “But we need religious beliefs to cushion the fear of death.” That’s what black swan bets are all about—the unexpected blowup.
First, I’m not so sure Harry’s claims are as crazy as you’re making them out to be. There’s at least one charm which violates the second law of thermodynamics, which means some basic assumptions about what’s possible and what isn’t need to be reworked.
Second, you’re comparing the immediate, apparently permanent and total defeat of a Dementor to the warm-fuzzy feelings from religion, and you’re also comparing the risk of Harry being wrong about the possibility of eliminating death to the risk of someone with strong religious beliefs neglecting proper medical care. Both comparisons are deeply flawed, due to substitution effects.
If someone wants warm fuzzy feelings, they can get them from something other than religion. A good meal, hanging out with friends, arguing about fanfiction, or even certain types of recreational drug use can provide comparable benefits without the same risks. Other people in the MoRiverse have tried to destroy Dementors before, but Harry is apparently the first to succeed, so substitutes simply aren’t available. Considering the way partial transmutation works, Harry’s attitude toward death may very well be an inextricable part of the technique.
If Harry is wrong, and people will continue to die until the human race goes extinct and all evidence that we ever were slowly fades toward heat-death, if it’s really true that nothing can be done about all that, it’s not clear (to me at least) how he’s making the situation worse by trying. Hastening the collapse by a few minutes, using up resources that might otherwise have produced a slightly more amusing light-show near the end? Insufficient data for a meaningful conclusion, if you ask me. For all we know, his insane obsessions might provide a net benefit to humanity in the long term. If someone is wrong about faith-healing, the consequences are much less ambiguous: sickness and death, which could have been prevented.
Are you saying that there’s some way to end death which would, for whatever perverse reason, elude anyone totally determined to find it, but be discoverable by those with a more nuanced attitude? That there’s some better, but mutually-exclusive goal? What, exactly, is the black-swan risk you’re worried about here?
Killed the dementor the same way he did, except making claims based off a sane model of reality.
It sounds to me like you’re just upset that he used the wrong ritual but it worked anyway.
Second, you’re comparing the immediate, apparently permanent and total defeat of a Dementor to the warm-fuzzy feelings from religion, and you’re also comparing the risk of Harry being wrong about the possibility of eliminating death to the risk of someone with strong religious beliefs neglecting proper medical care. Both comparisons are deeply flawed, due to substitution effects.
I’m not doing either of those things. I did refer you to a document that explains why the author of HP:MoR believes self-delusion is a mistake when it comes to important beliefs. That document did include extreme examples to demonstrate the principle tangibly.
It sounds to me like you’re just upset
I downvoted this. I am disagreeing with you because you are confused about what rational decisions are. I have explained the reasons.
that he used the wrong ritual but it worked anyway.
It didn’t work. Nor did it fail—success or failure in defeating death hasn’t happened yet. I have no reason to expect that self-delusion would prevent Harry from killing a dementor, which is why I never suggested that it would.
That article was about doing things you know to be wrong, in pursuit of a flawed ‘greater good.’ The specific worst-case was believing something you know to be false. What knowably false belief are you saying Harry has accepted in the face of contravening evidence?
The one you conceded at the beginning of this conversation. This is the entire basis of the disagreement:

“Death shall lose” as an attitude may not be strictly correct, but under the circumstances it was instrumentally rational as demonstrated by the fact that it worked.
Ah, in that case I apologize for miscommunicating. By ‘strictly correct’ I meant ‘literally, objectively true in the context of the story.’ Whether Harry’s goal is in fact possible most likely won’t be revealed for quite some time; spilling the beans now wouldn’t be dramatic. But, by the same token, it’s not (yet?) knowably false.
I agree that Harry is being extremely, perhaps excessively, confident about something he can’t really prove, and that such behavior is risky. However, it’s an acceptable sort of risk, since he can always find contrary evidence later, change his mind, and do something else with the rest of his life. It’s the sort of risk entrepreneurs take. He hasn’t hit any self-modifying point-of-no-return.
What’s confusing in discussions such as this is the lack of a clear definition of self-deception.
Minds are complex. They contain stuff other than conscious verbal beliefs, things like gut-level feelings (aliefs?), unconscious assumptions, imagery, emotions, desires.
We absolutely suck at conveying mental phenomena other than explicit beliefs and attempts to do so result in silliness like “believe in yourself” or “just do it”.
This leads to two problems. First, it is not clear what you mean by self-deception. Trying to deliberately alter your beliefs is obviously bad. But what about controlling your attention? Do I self-deceive about something by refusing to look at it? What about influencing emotions through positive mental imagery? Or using a relaxation technique to calm myself down?
The second problem is that when someone says “I will win” you can’t be sure whether he really means “I expressly believe that my success is certain” or maybe “I know of the possibility of failure but refuse to bring it to the forefront of awareness. I feel energized, motivated and determined to achieve my goal.” The second option seems like a more reasonable interpretation, unless you already have reasons to suspect the speaker of being an idiot.
Killed the dementor the same way he did, except making claims based off a sane model of reality.
Which incorrect claim, specifically, is an example of what you talk about? Death will lose more or less inevitably, under the condition that civilization survives (and death has no say in whether it does).
Which incorrect claim, specifically, is an example of what you talk about?
p(death is defeated). Not p(death is defeated | civilization survives).
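The gap between the two quantities follows from the law of total probability. A quick sketch with made-up placeholder numbers (none of these probabilities come from the discussion; they are purely illustrative):

```python
# All numbers here are hypothetical placeholders, for illustration only.
p_civ = 0.7                # P(C): civilization survives -- assumed
p_defeat_given_civ = 0.95  # P(D | C): death defeated if it does -- assumed
p_defeat_given_doom = 0.0  # death is certainly not defeated otherwise

# Law of total probability:
#   P(D) = P(D|C) * P(C) + P(D|not C) * P(not C)
p_defeat = p_defeat_given_civ * p_civ + p_defeat_given_doom * (1 - p_civ)
print(p_defeat)  # roughly 0.665, noticeably below the conditional 0.95
```

Conditioning on civilization’s survival silently drops the whole (1 − P(C)) branch, which is exactly the objection being made here.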
Death will lose more or less inevitably, under the condition that civilization survives (and death has no say in whether it does).
Yes, more or less. The most obvious cases where it wouldn’t are
If one of Robin’s speculated Malthusian futures came to pass. Or
If someone goes and creates a dystopian singularity. (For example, if a well-intentioned AI researcher implements CEV, gives humanity what it wishes for, and it turns out that humans are coherently extrapolatably as silly as Dumbledore.)