The evidence that saturated fat is bad for you is dubious. However, there is good evidence that processed meats are bad for you, even though every test of a possible causal pathway has failed so far.
[quote]However there is good evidence that processed meats are bad for you[/quote]
Really? I heard something on the radio a few days ago about a study to that effect, and then I came across a blog post by someone apparently reputable who found little substance in the original paper. So what am I to make of that?
For that matter, what is unprocessed meat? Raw?
ETA: This is the study (open access), and “processed” means “having had its shelf life extended”. From a brief glance at the paper, I don’t think they did any sort of causal analysis beyond controlling for possible confounders such as the tendency of high consumers of red meat to smoke more. I don’t care enough about this to study it in any more detail.
Unprocessed means not treated to extend shelf life: not smoked, salted, or dried, and not dosed with preservatives such as potassium benzoate. The evidence I’m referencing is a meta-review of epidemiological studies. The lack of a causal pathway refers to the failure to find anything when doing intervention studies on particular substances. So it could very well be that the epidemiological studies are all failing to properly control for confounding factors. Nutritional self-reporting is notoriously terrible: epidemiological studies often rely on spaced surveys, sometimes asking about food habits over an entire year, and it is unsurprising that people cannot provide accurate answers. Still, it is not zero evidence.
My own hypothesis is that the animal’s diet has a lot more to do with the potential harm to you than currently realized. Animals with crappy diets are sickly. We likely have a natural aversion to eating sickly animals for a reason.
Uh, yeah. The reason for that is that sickly animals carry parasites. It is logical that we wouldn’t want to eat parasite-ridden or diseased animals, because then WE get the parasites. If the animal is not parasite-ridden, there’s no good reason to believe it would be unhealthy to eat.
My personal suspicion for the cause is underlying SES factors (wealthy people tend to eat better, fresher food than the poor) as well as the simple issue of dietary selection—people who watch what they eat are also more likely to exercise and generally have healthier habits than those who are willing to eat anything.
There might be some factors which the study is failing to control for, but from the link in the grandparent:
[quote]Included in the analysis were 448,568 men and women without prevalent cancer, stroke, or myocardial infarction, and with complete information on diet, smoking, physical activity and body mass index[/quote]
The study seems to control for the more obvious associated factors.
Also, the full text states that, controlling for the confounders assessed in the study, red meat consumption is associated with an increase in mortality, processed meat with a greater increase, and poultry with no increase at all.
The problem is that the choice to eat differently itself is potentially a confounding factor (people who pick particular diets may not be like people who do not do so in very important ways), and any time you have to deal with, say, 10 factors, and try to smooth them out, you have to question whether any signal you find is even meaningful at all, especially when it is relatively small.
The study in particular notes:
[quote]Men and women in the top categories of red or processed meat intake in general consumed fewer fruits and vegetables than those with low intake. They were more likely to be current smokers and less likely to have a university degree [/quote]
At this point, you have to ask yourself whether you can even do any sort of reasonable meta analysis on the population. You’re seeing clear differences between the populations and you can’t just “compensate for them”. If you take a sub-population which has numerous factors which increase their risk of some disease, and then “compensate” for those factors and still see an elevated level of the disease, it isn’t actually suggestive of anything at all, because you have no way of knowing whether your “compensation” actually compensated for it or not. Statistics is not magic; it cannot magically remove bias from data.
This is the problem with virtually all analysis like this, and it is why you should never, ever believe studies like this. Worse still, there’s a good chance you’re looking at the blue M&M problem: if you run enough analyses on a large population, you will find significant trends which are not really there. And different studies (noted in the paper) give different results: that study showed no increase in mortality or morbidity from red meat consumption, an American study showed an increase, and several vegetarian studies showed no difference at all.

Because of publication bias (positive results are more likely to be reported than negative ones), potential researcher bias (belief that a vegetarian diet is good for you is more common than usual among people who study diet, because vegetarians are more interested in diet than the population as a whole), and the fact that the studies conflict, I’d say that is pretty good evidence that there is no real effect and it is all nonsense. If I see five studies on diet, and three of them say one thing and two say another, I’m going to stick with the null hypothesis, because it is far more likely that the three positive studies are the result of publication bias.
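To make the residual-confounding worry concrete, here is a toy simulation with made-up numbers (nothing to do with the actual paper): a single lifestyle factor drives both meat intake and mortality, meat itself does nothing, and we “control” for the lifestyle factor the way it would actually be measured, through noisy self-report. The adjusted analysis still hands back a spurious meat effect.

[code]
# Toy residual-confounding simulation (invented numbers, not the EPIC data).
# A lifestyle factor raises both red-meat intake and mortality; meat itself
# has no effect. Adjusting for a noisy self-reported measurement of the
# lifestyle factor still leaves a "significant" meat coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100_000

lifestyle = rng.normal(size=n)                # true confounder
meat = 0.8 * lifestyle + rng.normal(size=n)   # intake tracks lifestyle
reported = lifestyle + rng.normal(size=n)     # noisy self-report of the confounder

# Mortality depends only on the true lifestyle factor, never on meat.
p_death = 1.0 / (1.0 + np.exp(-(-2.0 + lifestyle)))
death = rng.binomial(1, p_death)

# "Adjusted" logistic regression using the self-reported confounder.
X = sm.add_constant(np.column_stack([meat, reported]))
fit = sm.Logit(death, X).fit(disp=0)
print(fit.summary())  # the meat coefficient comes out positive despite no true effect
[/code]

The point is not that this is what happened in the study, only that adjustment with imperfectly measured confounders can leave exactly this kind of residual signal, and the analysis itself cannot tell you whether it has.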
[quote]At this point, you have to ask yourself whether you can even do any sort of reasonable meta analysis on the population. You’re seeing clear differences between the populations and you can’t just “compensate for them”. If you take a sub-population which has numerous factors which increase their risk of some disease, and then “compensate” for those factors and still see an elevated level of the disease, it isn’t actually suggestive of anything at all, because you have no way of knowing whether your “compensation” actually compensated for it or not. Statistics is not magic; it cannot magically remove bias from data.[/quote]
Well, if you already know how much each of the associated factors contributes alone via other tests where you were able to isolate those variables, you can make an educated guess that their combined effect is no greater than the sum of their individual effects.
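For what it’s worth, here is the arithmetic at stake, with purely hypothetical numbers: two factors that each multiply baseline risk by 1.5 when studied in isolation can combine additively, multiplicatively, or with an interaction, and the isolated studies cannot tell you which, which is why it remains a guess.

[code]
# Hypothetical numbers only: how two individually measured risk factors combine
# depends on a modelling assumption that the isolated studies never test.
baseline = 0.10          # 10% baseline risk
rr_a, rr_b = 1.5, 1.5    # each factor alone multiplies risk by 1.5

additive       = baseline * (1 + (rr_a - 1) + (rr_b - 1))  # 0.20   the "sum of individual effects"
multiplicative = baseline * rr_a * rr_b                    # 0.225  already exceeds that sum
synergistic    = baseline * rr_a * rr_b * 1.3              # 0.2925 if the factors interact

print(additive, multiplicative, synergistic)
[/code]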
The presence of other studies that didn’t show the same significant results weighs against it, but on the other hand such cases are certainly not unheard of with respect to associations that turn out to be real. The Cochrane Collaboration’s logo comes from a forest plot of results for whether an injection of corticosteroids reduces the chance of early death in premature birth. Five out of seven studies failed to achieve statistical significance, but when their evidence was taken together, it achieved very high significance, and further research since suggests a reduction in mortality of between 30% and 50%.
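To spell out the mechanics of that pooling with made-up numbers (these are not the actual Cochrane trial data, just seven hypothetical studies of a similar shape), an inverse-variance fixed-effect combination looks like this:

[code]
# Fixed-effect (inverse-variance) pooling of log odds ratios.
# Invented numbers for illustration; NOT the Cochrane corticosteroid trials.
import math

# (log odds ratio, standard error) for seven hypothetical small trials
studies = [(-0.65, 0.30), (-0.30, 0.28), (-0.55, 0.35), (-0.25, 0.26),
           (-0.60, 0.40), (-0.35, 0.22), (-0.70, 0.33)]

for lor, se in studies:
    print(f"z = {lor / se:+.2f}  significant alone: {abs(lor / se) > 1.96}")

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * lor for (lor, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled log OR = {pooled:.2f}, z = {pooled / pooled_se:+.2f}")  # |z| is about 4
[/code]

Five of the seven made-up studies miss significance on their own, but because they all point the same way the pooled estimate is unambiguous. That is the shape of the corticosteroid result, whether or not the meat data behave the same way.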
While a study of the sort linked above certainly doesn’t establish the truth of its findings with the confidence of its statistical significance, “never believe studies like this” doesn’t leave you safe from a treatment-of-evidence standpoint, because even in the case of a real association, the data are frequently going to be messy enough that you’d be hard pressed to locate it statistically. You don’t want to set your bar for evidence so high that, in the event that the association were real, you couldn’t be persuaded to believe in it.
You can’t make an educated guess that a combination of multiple factors is no greater than the sum of their individual effects, and indeed, when you’re talking about disease states, this is the OPPOSITE of what you should assume. Harm done to your body taxes its ability to deal with further harm: the more harm you apply, whatever the source, the worse things get. Your body has only so much capacity to fight off insults, so if you stack two bad things on top of each other, you’re likely to see harm worse than the sum of their individual effects, because part of each effect on its own is masked by your body’s own repair mechanisms.
On the other hand, you could have a case where the negative effects of the two things counteract each other.
Moreover (and worse), you’re assuming you have any independent data to begin with. Given that there is a correlation between smoking and red meat consumption, your smoking numbers are already suspect, because we’ve established that the two are not independent variables.
In any event, guessing is not science; it is nonsense. I could guess that the impact of the factors was greater than the sum of the parts and get a different result, and as you can see, it is perfectly reasonable to make that guess as well. That’s why it is called a guess.
When we’re doing analysis, guessing is bad. You guess BEFORE you do the analysis, not afterwards. All you’re doing when you “guess” how large the impact is, is manipulating the data.
That’s why control groups are so important.
Regarding glucocorticoid use in pregnancy, there is actually quite a bit of debate over whether their use is a good thing, because corticosteroids are teratogens.
And yes, actually, failing to believe a true correlation is generally less harmful than believing a false one. Look at all the people who are raising malnourished children on vegan and vegetarian diets.
Well, there’s certainly no shortage of evidence that it’s unhealthy for children to be malnourished, so that amounts to defying one true correlation in favor of the possibility of another.
Supposing that there were a causative relation between red meat consumption and mortality, with a low effect size, under what circumstances would you be persuaded to believe in it?
Well, the fact that they have about the same calories per gram doesn’t mean that they’re equally healthy. The fat in bacon is almost all saturated.
Bacon is probably more filling per calorie though, so you’d be less likely to gain weight snacking on bacon than potato chips.