If the placebo effect actually worked exactly like that, then yes, you would die while the self-deluded person would do better. However, from personal experience, I highly suspect it doesn’t (I have never had anything that I was told I’d be likely to die from, but I believe even minor illnesses give you some nonzero chance of dying). Here is how I would reason in the world you describe:
There is some probability I will get better from this illness, and some probability I will die.
The placebo effect isn’t magic; it is a real part of the way the mind interacts with the body, and it will also decrease my chances of dying.
I don’t want to die.
Therefore I will activate the effect.
To activate the effect for maximum efficiency, I must believe that I will certainly recover.
I have activated the placebo effect. I will recover (Probability: 100%). Max placebo effect achieved!
The world I live in is weird.
In the real world, the above mental gymnastics are not necessary. Think about the things that would make you, personally, feel better during your illness. What makes you feel more comfortable, and less unhappy, when you are ill? For me, the answer is generally a tasty herbal tea, being warm (or cooled down if I’m overheated), and sleeping. If I am not feeling too horrible, I might be up to enjoying a good novel. What would make you feel most comfortable may differ. However, since both of us enjoy thinking rationally, I doubt spouting platitudes like “I have a 100% chance of recovery! Yay!” is going to make you personally feel better. Get the placebo effect’s benefits of pain reduction, and possibly a better immune response, by making yourself more physically and mentally comfortable. When I do these things, I don’t think they help me get better because they have some magical ability in and of themselves. I think they help me get better because of the positive associations I have with them. Hope that helps you in some way.
Well, yeah, obviously it’s a simplified model to make the math easier, but the end result is the same. The real formula might, for example, look more like P = 0.2 + (expectation^2)/3 rather than P = expectation/2. In that case, the end result is still both a real probability and an expectation, both equal to roughly 0.2155 (source: http://www.wolframalpha.com/input/?i=X%3D0.2%2B%28X^2%29%2F3 )
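If it helps to see the fixed-point arithmetic, here is a minimal sketch in Python. The 0.2 and /3 constants are just the illustrative numbers from the formula above, not a real model of the placebo effect; iterating the formula until expectation and probability agree converges to the self-consistent value.

```python
# A minimal sketch of the fixed-point arithmetic above. The 0.2 and /3 constants
# are just the illustrative numbers from this comment, not a real model of the
# placebo effect.

def recovery_probability(expectation):
    """Hypothetical model: a 20% baseline plus a boost that grows with expectation."""
    return 0.2 + expectation ** 2 / 3

# An honest reasoner wants expectation == actual probability, so iterate the map
# until the two agree (it is a contraction near the fixed point, so this converges).
p = 0.2  # start from the no-placebo baseline
for _ in range(100):
    p = recovery_probability(p)

print(round(p, 6))  # ~0.215477: the self-consistent expectation/probability
```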
Also, while I used the placebo effect as a dramatic and well-known example, the same pattern crops up in a myriad of other places. I am uncomfortable revealing too much detail, but it has an extremely real and devastating effect on my daily life, which means I’m kind of desperate to resolve this, and I get pissed when people say the problem doesn’t exist without showing mathematically why.
You’re asking too general a question. I’ll attempt to guess at your real question and answer it, but that’s notoriously hard. If you want actual help you may have to ask a more concrete question so we can skip the mistaken assumptions on both sides of the conversation. If it’s real and devastating and you’re desperate and the general question goes nowhere, I suggest contacting someone personally or trying to find an impersonal but real example instead of the hypothetical, misleading placebo example (the placebo response doesn’t track calculated probabilities, and it usually only affects subjective perception).
Is the problem you’re having that you want to match your emotional anticipation of success to your calculated probability of success, but you’ve noticed that on some problems your calculated probability of success goes down as your emotional anticipation of success goes down?
If so, my guess is that you’re inaccurately treating several outcomes as necessarily having the same emotional anticipation of success.
Here’s an example: I have often seen people (who otherwise play very well) despair of winning a board game when their position becomes bad, and subsequently make moves that turn their 90% losing position into a 99% losing position. Instead of that, I will reframe my game as finding the best move in the poor circumstances I find myself in. Though I have a low calculated probability of overall success (10%), I can have quite a high emotional anticipation of task success (>80%), and can even be right about that anticipation, retaining my 10% chance rather than throwing 9% of it away through self-induced despair.
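For what it’s worth, here is a toy sketch of those numbers (the 10% and 1% figures are just the illustrative ones above, not measured from real games):

```python
# Toy comparison of the two framings, using the illustrative numbers above
# (assumed for the example, not measured from real games).

p_win_best_play = 0.10  # calculated chance if I keep finding the best moves
p_win_despair = 0.01    # chance left after despair-driven, sloppy moves

thrown_away = p_win_best_play - p_win_despair
print(f"Reframed play keeps {p_win_best_play:.0%}; "
      f"despairing play keeps {p_win_despair:.0%}; "
      f"{thrown_away:.0%} is thrown away by self-induced despair.")
```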
Sounds like we’re finally getting somewhere. Maybe.
I have no way to store calculated probabilities other than as emotional anticipations. Not even via the logistical nightmare of writing them down, since they are not introspectively available as numbers, and I also have trouble expressing myself linearly.
I can see how reframing could work for the particular example of game-like tasks; however, I can’t find a similar workaround for the problems I’m facing, and even if I could, I don’t have the skill to reframe and self-modify with sufficient reliability.
One thing that seems relevant here is that I mainly practice rationality indirectly, by changing my general heuristics; I usually don’t have direct access to the data I’m operating on, nor the ability to practice rationality in real time.
… that last paragraph somehow became more of an analogy because I can’t explain it well. Whatever, just don’t take it too literally.
I asked a girl out today shortly after having a conversation with her. She said no and I was crushed. Within five seconds I had reframed it as “Woo, I made a move! In daytime, in a non-pub environment! Progress on flirting!”
My apologies if the response is flip, but I suggest going from “I did the right thing, woo!” to “I made the optimal move given my knowledge, and that’s kinda awesome, innit?”
That’s still the same class of problem: “screwed over by circumstances beyond reasonable control”. Stretching it to full generality, to “I made the optimal decision given my knowledge, intelligence, rationality, willpower, state of mind, and character flaws”, only makes the framing WORSE, because it reminds you of how many things you suck at.