Some “facts” just set my spidey-sense tingling, and I find it usually well worth the time to check out the references in such cases. In general, at the slightest doubt I will at least Google the reference and read the abstract—this is quick and will at the very least confirm that the source exists.
Particular things that set my spidey-sense off are:
sensationalistic claims—anything that seems designed to shock the reader into action
excess precision—like claims about “56% of all software projects” vs “roughly 60% in the population we sampled”
excess confidence about hard-to-test claims, in software those tend to be “productivity” claims
claims that are ambiguous or that would be very hard to confirm experimentally, e.g. the well-known canard about how many times a day men think about sex; basically “is this even a sane question to ask”
hard-to-find primary sources—when you can’t readily check it, a claim becomes more suspicious
abstracts that don’t contain the claim being put forward—a citation is more suspicious when the claim is a minor point buried deep inside the paper rather than its headline finding
(ETA) “telephone game” claims—a citation to a citation of a citation, with many indirections to traverse before reaching the primary source
Let’s look at some specifics of the page you cite—at the outset we note that it’s designed to be sensationalistic, a marketing brochure basically. It’s up to you to factor that into your assessment of how much you trust the references.
“As many as 98,000 people die in hospitals each year as a result of medical errors”—doesn’t feel implausible, but a quick Google for US annual death rates suggests this would be roughly twice the yearly death toll from suicide; the mortality document I turn up seems to contradict the claim, so I’d check out the reference
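A claim like this can be sanity-checked with a few seconds of arithmetic; a minimal sketch, where the suicide figure is a rough ballpark I’m supplying for comparison, not something taken from the page:

```python
# Back-of-envelope check: compare the claimed medical-error death toll
# against another well-known cause of death to see if it is plausible.
claimed_medical_error_deaths = 98_000  # the figure under scrutiny
us_suicide_deaths_per_year = 40_000    # rough ballpark, my assumption

ratio = claimed_medical_error_deaths / us_suicide_deaths_per_year
print(f"That would be about {ratio:.1f}x the annual suicide toll.")
```

The point is not the exact ratio but that the comparison is cheap to make, and a surprising ratio tells you which references to chase first.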
“Doctors spend an average of only ten minutes with patients”—an average like that isn’t too hard to work out from a survey and squares with personal experience, I’d take it at face value
“By some estimates, deaths caused by medicine itself total 225,000 per year”—excess precision for a phrase as vague as “caused by medicine itself”, I’d check out the reference just to know what that’s supposed to mean
“Most published research findings are false”—this is the title of the Ioannidis article, and should not be taken at face value to mean “all research in all fields of medicine”, read the ref for sure
“Up to 30% of patients who die in hospitals have important diseases or lesions that are not discovered until after death”—I’d want to know more about how they estimate this—are we talking about extrapolation from the few deaths which result in autopsy?
“It takes an average of 17 years for scientific research to enter clinical practice”—uh, maybe; somewhat ambiguous (what’s an objective criterion for “entering clinical practice”?)
“In oncology alone, 90% of preclinical cancer studies could not be replicated.”—I run into trouble almost immediately trying to check this reference, given as “Begley, C. G. (n.d.). Preclinical cancer research, 8–10.”: Google gives me partial matches on the title but no exact matches, and “(n.d.)” means “no date”, which is odd, since the paper does have a publication date and even a URL
“deaths from cancer have barely been touched”—I wouldn’t be surprised, cancer is a tough bastard
“If a primary care physician provided all recommended [...] care [...] he would need to work 21.7 hours a day”—excess precision (also a hypothetical, so not really “evidence”); check the source
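It’s worth seeing how mechanically a figure like “21.7 hours a day” falls out of a few multiplications; a sketch with entirely hypothetical inputs (none of these numbers come from the source) showing that the decimal place inherits no more accuracy than the roughest input:

```python
# How a spuriously precise "hours per day" figure is typically derived.
# Every input below is hypothetical, chosen only for illustration.
panel_size = 2500                    # hypothetical: patients per physician
minutes_per_patient_per_year = 130   # hypothetical: recommended-care minutes
working_days_per_year = 250          # hypothetical: days worked per year

hours_per_day = (panel_size * minutes_per_patient_per_year / 60) / working_days_per_year
print(f"{hours_per_day:.1f} hours a day")  # → 21.7 hours a day
```

A tenth-of-an-hour answer built from round guesses looks authoritative, but shifting any input by 10% moves the result by hours; that is what makes the precision suspect.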
Also, this is a Web page, so I get suspicious on principle when no hyperlinks to the sources are provided.
“As many as 98,000 people die in hospitals each year as a result of medical errors”—doesn’t feel implausible, but a quick Google for US annual death rate—it turns out to be twice the death rate for suicide. This document seems to contradict the finding, I’d check out the reference
This number may not only include US data.
Indeed. And how do we find that out?