December 29, 2006

Salt’s pedigree

They come in an array of sparkling shades and shapes with exotic names like Black Kala Namak from India; Fleur de Sel from Guérande, France; Himalayan Pink Gros Sel; Hawaiian red clay salt; Maldon sea salt; and Korean bamboo sea salt. Salt tastings are all the rage, salt samplers are among the choicest holiday gifts, and chefs tell us each salt has a different flavor and has to be used to its best advantage in cooking. “Sea salt tastes a lot better than mined salt,” Rocco DiSpirito told the New York Times. “It’s got a real saline, ocean character that comes across in the food.”

Thanks to celebrity chefs, popular cookbook authors and gourmet catalogs, entire mythologies have developed about salts and their healthful virtues. Culinary gurus talk passionately of various salts and continually try to outdo each other for the most alluring, exotic and lavish offerings. And salt fads are born. Gourmet salts can now sell for more than 100 times the price of plain table salt. For most of us, following these food fads seems like harmless fun. It never occurs to us it might not be.

Often taken as gospel are claims that sea salt is unrefined, more natural and more healthful than ordinary table salt because it comes from the sea and is high in minerals. Sea salt has been praised for tasting pure, fresh, bright, delicate, sweet, sharp, refined, balanced and well-rounded. Everyday table salt is condemned as tasting bitter, tinny, metallic, acrid, characterless, and chemical-like, because it’s said to be cheap and highly refined.

In actuality, all edible salt sold is about 99% pure sodium chloride. The remaining 1%, a negligible trace in any dish, is far too minute to make a difference nutritionally. Scientists tell us that those minuscule amounts of minerals are also undetectable by our taste buds, but it’s easy to convince ourselves otherwise when we’re paying so much for them. :)

Salts do not differ in their saltiness or taste. What we perceive as a brighter, saltier flavor has nothing to do with a salt’s exotic origin and everything to do with the size and shape of its grains. When we bite into a big, crunchy crystal, we get a burst of salty flavor on the tongue. But when a fancy salt is dissolved in a moist or cooked dish, its special taste attributes dissolve too, leaving the food indistinguishable from food salted from an ordinary shaker.

Despite popular beliefs, sea salt is not rich in the minerals found in sea water, because those minerals are left behind when the sodium chloride crystallizes out of evaporating sea water. It is Mother Nature’s own refining. Gram for gram, sea salt contains only about a tenth of the minerals in sea water.

Some of the most amusing claims surround salt’s freshness. Salt doesn’t change when it’s exposed to oxygen, and it doesn’t contain volatile aromatic oils that grinding could release. So “fresh-ground” salt is an oxymoron.

We don’t dare admit that we can’t taste any difference among salts, though. We might appear to have an “insensitive palate”! So we choose to ignore the scientific reality or let ourselves get swept up in the fun of the popular fad. But when we forget our food history, we can find ourselves having to relearn the same hard lessons as our great-grandparents.

Old-fashioned salt-shaker salt differs from most gourmet varieties in one very important way. Since the 1920s, most table salts have had iodine added. Today’s trendiest salts haven’t. Sea salt contains less than 2% of the iodine in iodized salt, despite beliefs that it is naturally iodized (perhaps because some associate it with iodine-rich seaweeds).

Iodized salt was the first “functional” food. Iodine was also added to chicken feed, given to dairy cows and cattle to prevent hoof rot and reduce fertility problems, and used to sanitize milking teats (a practice now waning), so eggs and milk products can also provide some iodine. What led up to iodized salt isn’t widely known anymore. Before iodized salt, Americans in many areas of the country were deficient in iodine. It was not uncommon for children in some regions to be considered “dull” or “dim-witted,” with IQs 15 points below those of children from areas where iodine deficiency was less common. During World War I, goiter (from severe iodine deficiency) ran as high as 64% in some areas, and it was the single biggest cause of medical disqualification for military service!

By 1955, about 76% of American homes used only iodized salt. Where salt is adequately iodized and people are not limiting their food choices, deficiencies are rare.

Iodine is a natural element the body needs to make thyroid hormones. It is essential for normal growth and development of the nervous system (brain), for sexual development and fertility, and for regulating metabolism and maintaining body temperature. Adults with insufficient iodine in their diet show signs of hypothyroidism; women have higher rates of miscarriage; infant mortality is higher; and children are at risk for reduced intelligence and can suffer permanent mental retardation, neurological defects and growth abnormalities.

The U.S. Institute of Medicine’s recommended dietary allowance (RDA) for iodine is 150 mcg/day for adults and adolescents, 220 mcg/day for pregnant women, 290 mcg/day for lactating women, and 90–120 mcg/day for children aged 1–11 years.

But over recent decades, Americans have increasingly shunned ordinary table salt, and commercial restaurants, food processors and chefs have abandoned iodized salt in response to consumer concerns that it could affect the taste of foods, preferring “natural” sea salts, kosher salts and other noniodized salts. Last month, Food Technology reported that this fad, along with attempts to reduce salt intakes, may be the most significant factor leading us toward deficiencies again. The National Health and Nutrition Examination Surveys found that between the 1971–1974 and 2001–2002 examinations, urinary iodine excretion in adults dropped by nearly half, from 320 mcg/L to 168 mcg/L, and the frequency of iodine deficiency in pregnant women jumped from 1% to 7%.
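As a quick arithmetic check on that “nearly half” figure, the relative decline follows directly from the two survey values quoted above:

$$\frac{320 - 168}{320} = \frac{152}{320} = 0.475$$

That is a drop of about 47.5%, indeed nearly half of the 1971–1974 level.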

These are astounding changes. While iodine levels are not yet low enough to declare a public health emergency (remember, RDAs are not minimum requirements; they are set above what most people need to prevent deficiencies, precisely to allow a safety margin), they indicate a trend of serious concern to health professionals.

This summer, researchers at the Conway Institute of Biomolecular & Biomedical Research at University College Dublin reported that iodine intakes among Irish women of childbearing age were significantly below World Health Organization recommendations. They reported that a mere 3.3% of all salt sold in Ireland and the UK was iodized. This past spring, researchers reported in the Medical Journal of Australia that iodine deficiency was re-emerging in Australia.

A week ago, the New York Times reported that about one-third of the world’s population, those eating only locally produced foods, is short on iodine, contributing to stunted growth among children, and that “even a moderate deficiency lowers intelligence by 10 to 15 IQ points, shaving incalculable potential off a nation’s development.” Multiple international salt-iodization efforts are underway, following the same path the United States took in the 1920s. Meanwhile, we may be poised to relearn our own history lessons.
