There seems to be an important point here, but it all feels a little un-rigorous, rather like you just did a post on methods of alchemy or astrology.
In the end, it turned out that the alchemists were on to something (chemistry and nuclear physics) but the astrologers weren't. At this level of speculativeness, how can we tell which kind of pre-science this is? Is there any good peer-reviewed research on the predictive value of these ideas?
I often get the feeling that Alicorn’s posts could use more evidence. However, given her status here, I take the very fact that she recommends something as evidence that she has herself encountered good evidence that the recommendation works; you know, Aumann agreement and all that.
Besides, even though it would be nice to see which evidence she has encountered, I know that I wouldn’t bother to read the research if she linked to it. Intellectually, I trust Alicorn’s conclusions. Therefore, I wish to believe in her conclusions; you know, Tarski’s litany and all that.
Emotionally, however, I can't help but doubt. Fortunately, I know that I'm prone to being emotionally convinced by unreliable arguments like personal experience stories. That's why I can't wait to reach the end of this sequence, with the promised "how Alicorn raised her happiness setpoint" story.