No meta-analyses, please. The last time I did a literature review on the subject, most papers were of embarrassingly low quality, and a meta-analysis can easily obscure that. This rule does not introduce bias: if a meta-analysis is based on high-quality studies, then those individual studies are admissible here.
I’m a little confused by this, given that meta-analyses still reference the individual studies that they include. So if people did link meta-analyses, even if you didn’t trust the analysis itself, you could still go through the individual studies included in it to see whether they met your criteria.
But then all the readers have to perform that work themselves, duplicating each other’s efforts, in addition to the commenter (answerer) doing it. And the answerer has to perform that work anyway, in order to establish whether the meta-analysis consists of decent-quality publications.
This is especially true when the articles are paywalled and this verification costs not only time and effort, but also money (or at least more effort in circumventing the paywall).
There is also a pragmatic reason: When challenging people’s beliefs and asking them for some evidence, they will often respond by throwing a lot of material at the wall and hoping that something sticks, or more likely that the sceptic gives up (e.g., ‘see this 800-page book for details’ or ‘look up the research of Dr. Xyz’ or ‘follow the 57 references on this Wikipedia page’). Pointing at a meta-analysis that one hasn’t verified is exactly that tactic. And if the answerer has verified the meta-analysis, then picking out one of the studies is hardly any work.
But most LW readers aren’t experts at verifying the validity of psychological studies; I know that I’m not. To the extent that I have formed an opinion about the state of meditation research, it has been by reading things like meta-analyses and books summarizing the research, and trusting the authors to at least be somewhat better at evaluating the quality of the studies than I would be. You say that pointing at a meta-analysis that one hasn’t verified is exactly the tactic of dishonesty—but it can also be the tactic of “honestly sharing the evidence that I’m drawing on, so that if you are interested in looking at this in more detail than I have, you’ll have a slightly easier time doing so”.
I expect most readers here to be in a similar position, assuming that they have even looked at the literature at all. I’m not saying that it’s wrong to ask for a more detailed analysis, but I suspect that you might not get very many answers.
I think your prediction is likely correct. Another motivation of mine (which I didn’t want to name, so as not to seem hostile) was precisely to increase the cost of posting an answer.
In my previous attempt at answering this question, a lot of people flooded me with vague references. The cost of sifting through them outweighed the benefit (if any) of broadening my search space. To be honest, it was all noise and no signal. But then again, I wasn’t posing the question to a rationalist community.
You say that most LW readers aren’t experts at verifying the validity of psychological studies, and that you aren’t either. Neither am I. But the manipulations and shortcomings I’ve seen so far were painfully obvious. Maybe I missed some, but I would still rather trust my own honest scrutiny, despite my lack of expertise, than academics and journal editors in a field whose incentive system I don’t understand.
One more thing that I think I haven’t voiced clearly enough: If you have your sources at your fingertips and want to share them in a comment, I will be grateful for that. It just might take me a while before I get to them.
You’re interested in quality rather than quantity.