For actual humans, I’d look into ways of possibly activating the placebo effect without explicit degrees of belief, such as intense visualization of the desired outcome.
This is an interesting idea, but I'm skeptical that it would actually work. There are studies (cited in Richard Wiseman's "59 Seconds"; I don't have the citations to hand) which strongly suggest that positive thinking in many forms doesn't work. In particular, having people visualize extreme possibilities of success (e.g. how strong they'll be after they've worked out, or how much better-looking they'll be when they lose weight) makes people less likely to actually succeed, possibly because they spend more time thinking about the goal than actually working toward it. This is not strong evidence, but it is suggestive evidence that visualization is not sufficient to do that much. Note that these studies didn't look at medical issues, where placebos are more relevant.
http://articles.latimes.com/2010/dec/22/health/la-he-placebo-effect-20101223
The human brain is a weird thing. Also, see the entire body of self-hypnosis literature.
Another method to try is affirmations.
Any data on whether this is actually possible, and if so, how to do it? Does it work for other things, such as social confidence, positive thinking, etc.?
It certainly SEEMS like it's the declarative belief itself, not visualizations of outcomes, that causes effects. And the fact that so many attempts at perfect deception have failed seems to indicate it's not possible to disentangle [your best rational beliefs] from what your "brain thinks" you believe.
(… I really need some better notation for talking about these kinds of things unambiguously.)
I’m skeptical as to how common it is for your beliefs to influence anything outside of your head, except through your actions. If your belief X makes Y happen because of method Z, then in order to get Y you only need to know about Z, and that it works. Then you can do Z regardless of X, because what you do mostly screens off what you think.
If you can’t get yourself to do something because of a particular belief, that’s another issue.
No, in humans this is not the case, unless you have a much broader definition of "action" than is useful. For example, other humans can read your intentions and beliefs from your posture and facial expression, the body reacts autonomously to beliefs by doing things like producing drugs and shunting around blood flow, and some entire classes of problems, such as mental illness or subjective well-being, reside entirely in your brain.
Sorry about my last sentence in the previous post sounding dismissive; that was sloppy and not representative of my views.
I guess my real issue with this is that I don't think there's a 50% placebo, and I disagree that the "declarative belief" does things directly. My anticipation of success or failure influences my actions, but a 50% placebo, I imagine, would work in real life through hidden, unanticipated factors, to the point that someone with accurate beliefs could say "my anticipation contributes this much, X contributes this much, Y contributes this much, Z contributes this much, and given my X, Y, Z I anticipate this" and be pretty much correct.
In the least convenient possible universe, there seem to be enough hacks that rationality enables that I could reject the 50% placebo and still net a win. I don't think we live in a universe where the majority of utility is behind 50% placebos.
Why does everyone get stuck on that highly simplified example, which I made that way just so the math would be easy to follow?
Or are you simply saying that placebos and the like are an unavoidable cost of being a rationalist and we just have to deal with it and it’s not that big a cost anyway?
More the latter, with the added caveat that I think that there are fewer things falling under the category of “and the like” than you think there are.
I used to think that my social skills were being damaged by rationality, but through a combination of "fake it till you make it", learning a few skills, and dissolving a few false dilemmas, they're now better than they were pre-rationality.
If you want to go into more personal detail, feel free to PM.
It certainly SEEMS like it's the declarative belief itself, not visualizations of outcomes, that causes effects.
Taboo "declarative". To me, it sounds like you're talking about a verbal statement ("declared"), in which case it's pretty obviously false. AFAIK, priming effects work just fine without words.
Yeah, bad choice of words. Maybe "explicit", "direct", or "first order" would work better?