Sorry that my last sentence in the previous post sounded dismissive; that was sloppy and not representative of my views.
I guess my real issue with this is that I don’t think a 50% placebo exists, and I disagree that the “declarative belief” does anything directly. My anticipation of success or failure influences my actions, but I would expect a real-world 50% placebo to work through hidden, unanticipated factors, to the point that someone with accurate beliefs could say “my anticipation contributes this much, X contributes this much, Y contributes this much, Z contributes this much, and given my X, Y, and Z I anticipate this” and be pretty much correct.
In the least convenient possible universe, there seem to be enough hacks that rationality enables that I could reject the 50% placebo and still net a win. I don’t think we live in a universe where the majority of utility is locked behind 50% placebos.
Why does everyone get stuck on that highly simplified example, which I only made that way so the math would be easy to follow?
Or are you simply saying that placebos and the like are an unavoidable cost of being a rationalist, that we just have to deal with it, and that it’s not that big a cost anyway?
More the latter, with the added caveat that I think that there are fewer things falling under the category of “and the like” than you think there are.
I used to think that my social skills were being damaged by rationality, but then, through a combination of “fake it till you make it”, learning a few skills, and dissolving a few false dilemmas, they became better than they were pre-rationality.
If you want to go into more personal detail, feel free to PM.