If they do encounter deficiency issues, they can simply reintroduce other foods without suffering catastrophic effects.
And how would they know? Let’s say a child develops an iodine deficiency—a common consequence is a drop in IQ of 10-15 points on average. Do you think this would be detected in time to fix it? Let’s imagine—not an improbable scenario at all—that a deficiency of micronutrient X doubles or triples your risk of some old-age disease Y. By the time you’re diagnosed with Y it’s way too late to do anything.
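To put toy numbers on that last point (purely illustrative: the 10% baseline lifetime risk and the 2-3x relative risks are assumptions, not real epidemiology):

```python
# Toy numbers only: how a 2-3x relative risk plays out in absolute terms.
baseline_risk = 0.10  # assumed 10% lifetime risk of disease Y without the deficiency

for relative_risk in (2, 3):
    absolute = baseline_risk * relative_risk   # risk with the deficiency
    added = absolute - baseline_risk           # extra absolute risk incurred
    print(f"RR {relative_risk}x: lifetime risk goes from {baseline_risk:.0%} "
          f"to {absolute:.0%} (+{added:.0%} absolute)")
```

Even at these modest made-up numbers, the deficiency silently adds 10-20 percentage points of lifetime risk before any diagnosis is possible.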
I’d point out that iodine deficiency’s effect on IQ seems to be entirely prenatal—that is, there is a window of vulnerability during a human’s development, and once they’re past it, iodine deficiency no longer operates on IQ (for better or worse); all that’s left are more minor effects, like goiter. It seems possible that a lot of nutrients are like that: the main effects of deficiency occur prenatally or in infancy and childhood.
By the time you’re diagnosed with Y it’s way too late to do anything.
OK, that’s probably true. It is also true of deficiencies arising from a more conventional diet, however. How frequently do micronutrient deficiencies occur on regular diets? And how do the chances of timely detection and intervention compare between controlled intake with deliberate ongoing monitoring and an unmonitored ad libitum diet?
Let’s consider two contrasting propositions regarding diet and nutrition:
A) Consuming a varied diet of naturally occurring, unprocessed foods prevents micronutrient deficiencies.
B) Deliberately engineered supplementation prevents micronutrient deficiencies.
Before reading on, take a moment to consider how much confidence you have in proposition A versus B.
Since you introduced the example of iodine deficiency, let’s consider it in more depth. The Wikipedia page on iodine deficiency indicates: “According to World Health Organization, in 2007, nearly 2 billion individuals had insufficient iodine intake …”
Furthermore, this deficiency appears to be common even in wealthy industrialized countries where a wide variety of food is readily available: “In a study of the United Kingdom published in 2011, almost 70% of test subjects were found to be iodine deficient.”
The article proceeds to explain that iodine deficiency is addressed by deliberately and artificially introducing supplemental iodine into the food supply.
I’m shocked to learn just how widespread iodine deficiency is among people eating a “normal” diet (I would have guessed less than 10% prevalence). It seems like traditional diets do a startlingly poor job of avoiding this particular deficiency. I’m updating my beliefs in favor of proposition B over A in light of this data.
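As a minimal sketch of what that update looks like in odds form (the ~70% figure is from the UK study quoted above; the prior and both likelihoods are made-up assumptions, and A and B are treated as competing hypotheses for simplicity):

```python
# Odds-form Bayesian update of proposition B over proposition A; toy numbers only.
# Evidence E: ~70% of UK subjects found iodine deficient (quoted study above).
prior_odds = 1.0     # assume we start indifferent between A and B
p_e_given_a = 0.05   # assumed: if varied natural diets sufficed (A), E would be very surprising
p_e_given_b = 0.50   # assumed: if engineered supplementation is needed (B), E fits much better

likelihood_ratio = p_e_given_b / p_e_given_a
posterior_odds = prior_odds * likelihood_ratio
posterior_p_b = posterior_odds / (1 + posterior_odds)

print(f"Likelihood ratio for B over A: {likelihood_ratio:.0f}:1")
print(f"Posterior P(B over A): {posterior_p_b:.0%}")
```

The exact likelihoods are arguable, but any assignment on which widespread deficiency is more surprising under A than under B pushes the odds the same direction.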
I’m shocked to learn just how widespread iodine deficiency is among people eating a “normal” diet (I would have guessed less than 10% prevalence).
The worst part is, a lot of that deficiency is from people trying to eat ‘healthy’ - and cutting out as much salt as possible. Guess what the main source of iodine is for people who can’t afford seafood at every meal? Iodized salt.
(If you want US figures for deficiency, you can find some cites in my iodine page, or simply search for papers related to the long-running NHANES survey, which is the main source of evidence.)
When my iodine levels get low, I develop symptoms of diabetes. Sushi can induce insulin shock/hypoglycemia in me.
It screws with my hunger and thirst levels as well.
Apparently I’m not alone, either; there seems to be some evidence that there’s a link between iodine and diabetes more generally.
Could you please post or link to it?
See for example https://www.mja.com.au/journal/1999/171/9/iodine-deficiency-ambulatory-participants-sydney-teaching-hospital-australia
and http://link.springer.com/article/10.1007/BF00257427
and http://www.sciencedirect.com/science/article/pii/S1262363607702860
Severe iodine deficiency tends to be much more common in diabetic patients, and hypothyroidism (most commonly caused by iodine deficiency or Hashimoto’s thyroiditis) tends to be comorbid with diabetes.
I got chewed out constantly growing up by -everybody- because of my salt intake.
Guess what happened to my salt cravings when I added an iodine supplement as an adult?
People don’t just screw themselves up on that one.