Monica Reinagel – Food & Nutrition Magazine
https://foodandnutrition.org
Award-winning magazine published by the Academy of Nutrition and Dietetics

The Science on Soy
https://foodandnutrition.org/from-the-magazine/the-science-on-soy/
Sat, 05 Jan 2019

Soy seems to have an equal number of promoters and detractors. Should we go out of our way to eat more soy or avoid it? One website touts it as a cure for hot flashes, while another warns it may bring on early puberty. One source says it’s good for your heart, and another says it’s bad for your thyroid.

The naysayers may be prevailing among the public. A 2016 survey by the International Food Information Council Foundation revealed that, while 68 percent of participants said they were trying to eat more beans, nuts and seeds, 27 percent said they were trying to eat less soy (which is a bean) as a protein source.

Unlike some hot-button food trends, soy has been studied extensively, allowing food and nutrition professionals to offer evidence-based answers to questions and concerns about soy-based foods and beverages, including soymilk, tofu, edamame and soy protein powder, as well as fermented soy foods such as miso, tempeh and natto.

Focusing mainly on dietary soy (evidence for soy-based isoflavone supplements should be evaluated separately), here’s what the science says in answer to eight frequently asked questions:

Do soy foods contain estrogen?
Soybeans contain phytoestrogens, compounds that are similar in structure to the human hormone estrogen (estradiol). In vitro studies have shown that phytoestrogens can interact with estrogen receptors on the surface of human cells, giving them the potential to exert mild estrogenic effects. Phytoestrogens also may have anti-estrogenic effects by blocking the more active human estrogen from reaching the receptors.

Soy’s potential estrogen agonist and antagonist properties have led to questions about how eating soy might influence hormone levels in men, women and children. The answer: Evidence is mixed.

Does feeding soy to infants or children cause early puberty?
The average age of puberty onset has decreased in Western societies, and the popularity and consumption of soy has increased. Is there a connection? Evidence says probably not.

A case-control study found no difference in age of puberty between children who were given soy infant formula and kids who were breast-fed or given cow’s milk formula.

A cross-sectional analysis of pre- and post-pubescent girls found no relationship between reported soy consumption and reported age of menarche. A similar analysis of teenage boys found the reported age at puberty was an average of six months earlier for those who reported high soy consumption compared to those who consumed low amounts of soy.

However, a prospective study of pre-pubertal boys and girls found adding soy protein to the diet on a daily basis for a year did not affect sexual maturation or age of puberty.

Does soy consumption affect male fertility?
Among male patients at an infertility clinic, soy food intake was associated with lower sperm concentration but not with reduced motility, and it was not associated with the success of fertility treatment. Soy isoflavone supplementation did not appear to affect sperm quality in healthy males. Evidence on this topic is limited, and more research is needed.

Does eating soy protect against breast cancer?
Epidemiological evidence has linked higher soy food consumption with reduced breast cancer risk. Women from countries such as Japan, where soy foods are central to the diet, experience much lower incidence of breast cancer than Western women who traditionally eat less soy. It appears that eating soy foods early in life may deliver the most protective benefit.

Is soy safe for those who have been diagnosed with breast cancer?
The potential estrogenic effects of soy have been a source of concern for those with hormone-sensitive cancers such as breast cancer. The concern is that even weak phytoestrogens in soy might promote the growth of cancer or its recurrence. However, a 10-year follow-up study of thousands of women with a previous breast cancer diagnosis found that soy isoflavone intake from foods did not increase their risk of recurrence. To the contrary, it was associated with reduced all-cause mortality, although this association was most significant in women with hormone-negative cancers and those who did not receive hormone treatment.

Can soy relieve hot flashes?
Hot flashes and other menopausal symptoms caused by waning estrogen levels can be affected by many factors, including diet, body weight, exercise and stress. Estrogen replacement therapy often is prescribed to women experiencing hot flashes; could the phytoestrogens in soy foods be potent enough to reduce hot flashes? Although a prospective study of Japanese women found that hot flashes were inversely associated with soy consumption, a review of 10 intervention studies found that incorporating more soy foods into the diet did not reliably reduce hot flashes and other symptoms of menopause.

Although evidence does not support recommending soy foods for preventing or alleviating hot flashes, soy can be recommended as a nutritious food.

How might soy affect thyroid function?
Some warn that soy may negatively affect thyroid function. As long as a person’s iodine intake is adequate, this is unlikely to be a concern. A review of 14 trials found that soy and soy isoflavones from dietary sources and supplements had either no effect or minimal effect on thyroid function in healthy people with normal iodine status.

Soy foods may affect the absorption of synthetic thyroid hormone. This does not necessarily mean patients on thyroid replacement need to avoid soy. If their soy food consumption is reasonably consistent, their medication dose can be calibrated to accommodate typical eating patterns: for example, a slightly higher dose to compensate for any absorption-inhibiting effects of soy. Patients with thyroid conditions should discuss this with their health care provider.

Do soy foods lower cholesterol or reduce the risk of coronary heart disease (CHD)?
There are at least two potential mechanisms by which soy foods might reduce cholesterol and thereby heart disease risk. Soy contains phytosterols and stanols, plant compounds that have been shown to lower LDL cholesterol in a dose-dependent manner. (These fat-soluble compounds are concentrated in soybean oil.) Whole soybeans also contain soluble fiber, which has been shown to reduce LDL cholesterol.

Earlier studies suggested that replacing animal protein with soy protein foods reduced total and LDL cholesterol levels. In 1999, the Food and Drug Administration approved the claim that “soy protein included in a diet low in saturated fat and cholesterol may reduce the risk of CHD by lowering blood cholesterol levels.”

However, in 2017, the FDA proposed a rule to revoke this claim on the grounds that “numerous studies published since the claim was authorized in 1999 have presented inconsistent findings on the relationship between soy protein and heart disease.”

But in 2017, the FDA also approved a qualified health claim that “supportive but inconclusive scientific evidence suggests that eating about 1½ tablespoons of soybean oil daily may reduce the risk of coronary heart disease.”

Soy foods may play a role in a heart-healthy diet, especially to the extent that they replace foods higher in saturated fat. However, soy does not appear to be a silver bullet against high cholesterol or heart disease.

References

21 CFR Part 101 Food Labeling: Health Claims; Soy Protein and Coronary Heart Disease. Federal Register website. Published October 26, 1999. Accessed November 30, 2018.
Anderson JW, Johnstone BM, Cook-Newell ME. Meta-analysis of the effects of soy protein intake on serum lipids. N Engl J Med. 1995;333(5):276-282.
Balentine DA. Qualified Health Claims Petition – Soybean Oil and Reduced Risk of Coronary Heart Disease. Food and Drug Administration letter. Published July 21, 2017. Accessed December 11, 2018.
Brown L, Rosner B, Willett WW, Sacks FM. Cholesterol-lowering effects of dietary fiber: a meta-analysis. Am J Clin Nutr. 1999;69(1):30-42.
Cederroth CR, Nef S. Soy, phytoestrogens and metabolism: A review. Mol Cell Endocrinol. 2009;304(1-2):30-42.
Chavarro JE, Toth TL, Sadio SM, Hauser R. Soy food and isoflavone intake in relation to semen quality parameters among men from an infertility clinic. Hum Reprod. 2008;23(11):2584-2590.
Duitama SM, Zurita J, Cordoba D, Duran P, Ilag L, Meija W. Soy protein supplement intake for 12 months has no effect on sexual maturation and may improve nutritional status in pre-pubertal children. J Paediatr Child Health. 2018;54(9):997-1004.
Food Decision 2016: Food & Health Survey. International Food Information Council Foundation website. Accessed November 30, 2018.
Herman-Giddens ME. Recent data on pubertal milestones in United States children: the secular trend toward earlier development. Int J Androl. 2006;29(1):241-246.
Hot Flashes: What Can I Do?. National Institute on Aging website. Reviewed June 26, 2017. Accessed December 11, 2018.
Levis S, Griebeler ML. The role of soy foods in the treatment of menopausal symptoms. J Nutr. 2010;140(12):2318S-2321S.
Messina M. Impact of Soy Foods on the Development of Breast Cancer and the Prognosis of Breast Cancer Patients. Forsch Komplementmed. 2016;23(2):75-80.
Messina M, Redmond G. Effects of soy protein and soybean isoflavones on thyroid function in healthy adults and hypothyroid patients: A review of the relevant literature. Thyroid. 2006;16(3):249-258.
Mínguez-Alarcón L, Afeiche MC, Chiu YH, et al. Male soy food intake was not associated with in vitro fertilization outcomes among couples attending a fertility center. Andrology. 2015;3(4):702-708.
Mitchell JH, Cawood E, Kinniburgh D, Provan A, Collins AR, Irvine DS. Effect of a phytoestrogen food supplement on reproductive health in normal males. Clin Sci (Lond). 2001;100(6):613-618.
Morito K, Aomori T, Hirose T, et al. Interaction of phytoestrogens with estrogen receptors alpha and beta (II). Biol Pharm Bull. 2002;25(1):48-52.
Nagata C, Takatsuka N, Kawakami N, Shimizu H. Soy product intake and hot flashes in Japanese women: results from a community-based prospective study. Am J Epidemiol. 2001;153(8):790-793.
Ras R, Geleijnse J. LDL-cholesterol-lowering effect of plant sterols and stanols across different dose ranges: a meta-analysis of randomized controlled studies. Br J Nutr. 2014;112(2):214-219.
Segovia-Siapco G, Pribis P, Messina M, Oda K, Sabaté J. Is soy intake related to age at onset of menarche? A cross-sectional study among adolescents with a wide range of soy food consumption. Nutr J. 2014;13:54.
Segovia-Siapco G, Pribis P, Oda K, Sabaté J. Soy isoflavone consumption and age at pubarche in adolescent males. Eur J Nutr. 2018;57(6):2287-2294.
Sinai T, Ben-Avraham S, Guelmann-Mizrahi I, et al. Consumption of soy-based infant formula is not associated with early onset of puberty [published online ahead of print March 20, 2018]. Eur J Nutr. 2018. Accessed December 11, 2018.
Statement from Susan Mayne, Ph.D., on proposal to revoke health claim that soy protein reduces risk of heart disease. U.S. Food & Drug Administration website. Published October 30, 2017. Accessed November 30, 2018.
Zhang FF, Haslam DE, Terry MB, et al. Dietary isoflavone intake and all-cause mortality in breast cancer survivors: The Breast Cancer Family Registry. Cancer. 2017;123(11):2070-2079.

Should You Eat a Less Varied Diet?
https://foodandnutrition.org/blogs/stone-soup/should-you-eat-a-less-varied-diet/
Tue, 11 Sep 2018

“Eat a varied diet” is a fairly standard piece of advice. The idea is that by eating a greater variety of foods, you’ll be more likely to check off all the nutritional boxes. But a new report suggests that the enormous amount of variety in our diet may be leading us astray.

When we have lots of different foods on our plates (or on a buffet line), we tend to eat more. You’ve no doubt experienced this countless times. After eating a bowl of chili, we might feel no desire to continue eating … until a piece of cheesecake appears. Suddenly, we have a little more room.

But we can use this effect to our advantage, by limiting the variety of snacks and sweets that we keep around and increasing the variety of fresh vegetables, for example.

Just for fun, why not take an inventory of what’s in your house right now? How many different types of crackers, salted nuts, chips or other snack foods are on hand? How many different kinds of cookies, cereal, muffins, granola bars, ice cream, chocolate or other sweet treats? How many types of bread, rolls, tortillas and other starchy foods?

Now open up that crisper drawer. How many different kinds of vegetables and fruits are in there, ready to eat? How many different sources of lean protein?

How does the variety (or lack thereof) in various categories of food correlate with your consumption patterns?

If you want to cut down on snacking, try keeping fewer snack foods around. If you want to eat more vegetables, surround yourself with more different kinds of produce.

Can Collagen Supplements Make Your Skin Younger?
https://foodandnutrition.org/blogs/stone-soup/can-collagen-supplements-make-skin-younger/
Mon, 16 Apr 2018

There’s a lot of buzz about collagen peptide supplements these days. Collagen is a structural protein present in the skin, joints, hair and nails. The gradual loss of collagen as we age can make the skin look less plump. The idea is that collagen supplements can replace some of that lost collagen and improve the look of the skin.

Assessing the effectiveness of skin care products or supplements is notoriously difficult. For one thing, it’s difficult to isolate the effects of any particular cream or pill. The condition of our skin surface can be affected by diet, hydration, sun exposure, temperature and humidity. It’s also really hard to be objective about what we’re seeing in the mirror. So how do we know whether these supplements are actually working?

There have been a couple of small studies attempting to objectively assess the effectiveness of these supplements by measuring the skin’s moisture levels and elasticity after several weeks of use. In some (but not all) cases, the differences reached statistical significance. But that doesn’t necessarily translate into something that you’d be able to see in the mirror.

(It’s much the same story for claims that collagen supplements will relieve joint pain.)

Collagen supplements are generally safe and well-tolerated. If nothing else, they provide a small amount of protein — although not the highest quality. You could try them to see whether you perceive a positive difference in your skin, and whether the improvement is enough to justify the expense. Both of those judgments are entirely subjective, of course.

But based on the published research so far, I suspect that money would be better spent on healthy foods and a decent moisturizer.

What is the Sirtfood Diet?
https://foodandnutrition.org/blogs/stone-soup/what-is-the-sirtfood-diet/
Wed, 07 Mar 2018

Will there ever be an end to silly new diet trends?

The Sirtfood Diet is the latest to cross my desk and, boy, is it a doozy.

The premise is that certain foods increase the activity of sirtuins in your body. Sirtuins are special proteins that allegedly have all sorts of beneficial effects, everything from fighting inflammation to preventing cancer and neurodegenerative diseases, all the way to reversing aging and extending lifespan.

The Sirtfood Diet protocol consists of lots of green smoothies and other meals made from “sirtfoods,” which include capers, celery, cocoa powder, green tea, kale, parsley, onions, strawberries, turmeric and walnuts.

Nothing wrong with those foods. But there is a lot wrong with this diet.

First, the claims for this diet are not only unproven, they verge on the preposterous. Although sirtuins are an area of promising research, what we don’t know about them far exceeds what we do know about them.

Even if we did know more about how sirtuins promote health and longevity, the idea that these foods will increase sirtuin activity is pure speculation. These foods are rich in polyphenols, compounds that might boost sirtuin activity. Then again, they might not. We’ll have to get back to you on that.

The other problem with this diet is that it is designed to produce extreme (and extremely fast) weight loss. As you’ve heard me say before, dieting is counter-productive. Extreme dieting is extremely counter-productive.

I bet a lot of these so-called “sirtfoods” are already in your diet. Stay the course! And some of the “sirtfood” recipes I’ve seen look delicious. Feel free to add them to your repertoire. But the actual Sirtfood Diet protocol? I’d pass on that if I were you.

Meet GenED: The Next Generation of Biotechnology
https://foodandnutrition.org/from-the-magazine/meet-gened-next-generation-biotechnology/
Thu, 01 Mar 2018

There is enormous excitement about the potential applications of genome editing, or GenEd. Billed as a set of techniques that promise more precision, flexibility and efficiency, these new breeding technologies are fast replacing the biotechnology used to create the first genetically engineered crops. Scientific and agricultural communities are eager to foster a better understanding and greater acceptance of this technology in hopes of moving past the “Frankenfood” stigma shadowing genetic engineering.

Genetic Modification Through the Ages

TAMING THE TERMINOLOGY

Mutagenesis: Changes (mutations) in an organism’s DNA, which may occur spontaneously, as a result of exposure to a mutagenic agent or through direct manipulation.
Enhanced mutagenesis: The use of high-dose radiation and/or chemicals to produce mutations at a higher rate than would occur naturally.
Hybridization: Cross-breeding two closely related plants to create a new strain.
Genetic Engineering (GE): The manipulation of genetic material through means that bypass the reproductive process.
Transgenic modification: The transfer of genetic material from one organism to another through means other than reproduction.
Gene Editing/Genome Editing (GenEd): A type of genetic engineering in which DNA of a living organism is modified by inserting or deleting individual nucleotide bases.
Genetically Modified Organism (GMO): Although cross-breeding, hybridization and genome editing are all forms of genetic modification, the term “genetically modified organism” usually is used to refer to crops created by transgenic modification.
New Breeding Technologies (NBTs): Techniques such as CRISPR-Cas9, zinc finger nuclease technology and TALENs, which allow precise editing of an organism’s DNA.
CRISPR: Acronym for Clustered Regularly Interspaced Short Palindromic Repeats, a genetic feature used in the CRISPR-Cas9 genome editing technique.
Cas9: A protein used by certain bacteria to edit DNA and utilized in the CRISPR-Cas9 genome editing technique.

Genetic modification of organisms happens in nature, without any human intervention. As plants and animals reproduce, random genetic changes called mutations occur. Some mutations are beneficial, some are detrimental and some are neutral. Growers have long sought to take advantage of this natural phenomenon by selecting the most successful “mutants” as seed stock or by cross-breeding similar plants in an attempt to transfer desirable traits to other strains (a technique called hybridization).

Starting in the early 20th century, breeders began applying chemicals or high-dose radiation to seeds in order to create mutations at a higher rate than would occur naturally — called enhanced mutagenesis. Plants with desirable mutations could then be crossed with other seed stock or crossed back into the parent stock to improve the quality and characteristics of the original variety.

Regardless of how the mutations occur, opportunities for crossbreeding are limited by the ability of two plants to successfully reproduce. A grapefruit can be crossed with an orange, but not with a tomato, for example. This limitation was overcome in the 1970s by gene transfer technology that allowed scientists to transfer genetic material from one organism to another, called “transgenic modification,” regardless of whether the two organisms were closely enough related to cross naturally. The first transgenic plant (the Flavr Savr tomato) was approved for human consumption and brought to market in 1994 but was not sufficiently profitable to continue production. Although there are no genetically engineered tomatoes on the market, nine transgenically modified food crops are commercially available (corn, soybean, squash, papaya, alfalfa, sugar beets, canola, potato and apples) — most of which were modified to increase resistance to disease or pests or tolerance to a specific herbicide.

But next to genome editing techniques, the gene-splicing techniques used to create GMO crops look comparatively clunky and primitive. Instead of transferring genetic material from one organism to another, scientists now can edit an organism’s own DNA by deleting a few nucleotide bases (the basic units in DNA) or shifting them slightly. No foreign genetic material is involved. It’s like the difference between using a photo editor to paste one person’s head on another person’s body and editing a few pixels to remove a blemish or fix flyaway hairs in a picture.

Risks and Rewards of New Technologies

Out of the handful of genome editing techniques, the CRISPR-Cas9 method has emerged as the most promising. CRISPR stands for Clustered Regularly Interspaced Short Palindromic Repeats — a sequence of nucleotides that acts as a DNA “bookmark,” making it easier to locate specific pieces of an organism’s genetic code. Cas9 is an enzyme produced by certain bacterial strains as part of the bacteria’s acquired immune response. Its function is to search for and edit, or “cleave” (inactivate), specific CRISPR sequences that belong to previously recognized pathogens. However, it also can be programmed to recognize and edit any CRISPR sequence and is now being used by genetic engineers to tweak the genetic code of plants and other organisms. The result is fewer unintended effects, and undesirable traits are revealed much more quickly. Research and development of new crops and traits is more efficient and less costly, and products can be brought to market more quickly. CRISPR technology is so much faster and more precise that many geneticists believe pursuing new transgenically modified crops no longer makes sense.

Genetic researchers working with gene editing, along with farmers and growers, are excited about the potential for CRISPR technology to expedite solutions to a wide array of pressing concerns, including climate change, malnutrition and population growth. Existing food crops can be modified to increase yields, improve drought and pest resistance, and enhance nutrient profiles.

Coming Soon to a Shelf Near You

ANTIBROWNING MUSHROOMS
Scientists from Penn State University used CRISPR-Cas9 gene editing to disable an enzyme that causes white mushrooms to brown, thereby extending shelf-life. The mushroom has been cleared by the USDA for commercial cultivation.

DISEASE-RESISTANT CITRUS
Genetic scientists hope CRISPR-Cas9 technology may provide a solution to the “citrus greening” disease that is decimating Florida orange groves by editing the genome of the trees to make them more resistant to the pathogen that causes the disease.

FUNGUS-RESISTANT BANANAS
The global banana crop is currently under threat from a widespread fungal disease. Australian scientists already have succeeded in introducing resistance via transgenic modification. Now, they hope to use CRISPR-Cas9 techniques to produce disease-resistant bananas without introducing any foreign DNA.

REDUCED-GLUTEN WHEAT
Scientists in Spain have successfully used CRISPR-Cas9 techniques to modify the genome of wheat, producing strains that are significantly lower in gluten.

All breeding and agricultural methods have the potential to impact the environment and the genetic makeup of our food supply, and with any new technology, the unknowable looms particularly large — especially when advances have the potential to alter the landscape so quickly and dramatically. Genetically engineered foods have been part of the food system for more than 20 years, and despite scientific consensus that these foods are safe for human consumption, a 2016 NPD Group report found GMOs continue to be a growing concern for consumers — so much so that some food manufacturers have removed GMO ingredients from their products.

The evolution away from transgenics to genome editing renders certain issues moot. For example, questions about food allergens are largely obviated by CRISPR techniques because no foreign genetic material is introduced in GenEd foods. Other concerns, such as a loss of biodiversity or the development of pesticide and herbicide resistance, are not unique to new breeding technologies. Any use of pesticides or herbicides — including approved organic ones — has the potential to promote resistance. The process through which weeds and pests develop resistance to chemicals is the same process used by traditional breeding: mutagenesis. Random mutation can produce a pest or weed that is resistant to a given chemical, increasing its chance of survival and therefore its ability to pass that trait to the next generation.

Regulatory Status

Meanwhile, as society weighs the potential risks of new technologies against the real need for solutions to pressing problems, lawmakers are grappling with how new breeding technologies and their products should be regulated. Because CRISPR technology does not involve the transfer of foreign genetic material, it does not fit the current definition of a genetically modified food — making its regulatory status a bit of a gray area. So far, the USDA has opted not to regulate foods created with CRISPR technology as GMO foods. The FDA is reviewing its existing guidelines for determining the safety of new plant varieties and is seeking input from the scientific community.

To the extent that regulations are put in place, scientists are calling for a “product vs. process-based” review system, whereby as new foods are brought to market, they are evaluated according to the attributes and makeup of the food itself, not how these attributes were acquired.

Have You Tried Black Rice?
https://foodandnutrition.org/blogs/stone-soup/tried-black-rice/
Fri, 26 Jan 2018

I have a confession to make. Although I know that brown rice is more nutritious than refined white rice, I actually prefer the softer texture and milder flavor of white rice, especially white basmati rice. Because I don’t eat rice all that often or in large quantities, I often choose white rice instead of the whole-grain brown stuff.

But my discovery of black rice — sometimes called Forbidden Rice — has changed everything. Like brown rice, black rice is a whole grain, with all of the fiber-rich bran and nutritious germ intact. In fact, black rice has even more fiber and protein than brown rice!

But where black rice really shines is in its antioxidant content. The deep pigmentation of black rice, which is actually a very dark purple, comes from anthocyanins, the same compounds that give blueberries their color — and health benefits.

Anthocyanins may help protect your heart and brain, lower your cholesterol and even guard against cancer and dementia. In fact, anthocyanins may be responsible for many of the health benefits attributed to diets high in fruits and vegetables.

Black rice is by far the richest source of anthocyanins of any grain. Take that, brown rice.

What Does It Taste Like?

Like brown rice, black rice is chewier and denser than white rice. But it really doesn’t taste like either white or brown rice. It’s plumper, almost juicy. And its flavor is more robust and complex. If you can imagine it, it’s almost like a cross between brown rice, black beans and huckleberries. (Black beans and berries are both good sources of anthocyanins, come to think of it.)

How to Cook with Black Rice

The recipe that first made me fall in love with black rice was a salad made with cooked black rice, roasted butternut squash, pomegranate seeds, toasted pecans and slivered scallions with a maple-lemon dressing. But you don’t have to work that hard every time you want to enjoy black rice. You can toss cooked black rice with whatever you’ve got on hand: raw or roasted veggies, leftover cooked chicken or tofu, dried or fresh fruit, chopped nuts or herbs. Drizzle with a bit of oil and vinegar, if you like. Serve it as a side dish, main course or pack it for your lunch.

The 411 on Hydroponics
https://foodandnutrition.org/september-october-2017/the-411-on-hydroponics/
Thu, 19 Oct 2017

Hydroponically grown lettuce, tomatoes, strawberries and herbs make up an ever-increasing share of produce on display at many grocery stores and farmers markets. But growing plants indoors in controlled conditions is hardly new. For centuries, large- and small-scale growers have used greenhouses and container gardening to extend the growing season, create ideal growing conditions and increase growing space. What makes hydroponics different is the absence of soil.

Instead of drawing water and nutrients from soil, hydroponic produce is grown with the roots submerged in nutrient-fortified water. Aeroponics is a related method in which the roots hang in the air and are regularly misted with water and nutrients.

One benefit of hydroponic farming is that it allows large amounts of produce to be grown on a relatively small piece of land and in places where traditional farming would be untenable, such as urban centers, or limited by geography and weather, such as in Alaska.

Hydroponic growers help satisfy demand for local produce in areas that are not well-suited for traditional farming. But even in farm-friendly regions where rain and sun are plentiful, storms, heat waves and unexpected freezes can cause unpredictable and costly crop losses, to which indoor farmers are largely immune.

Hydroponic crops also are less susceptible to weeds, insects and other pests, which means the plants can be produced without herbicides and pesticides.

All this has made hydroponics one of the fastest growing sectors of the agriculture industry, with more traditional growers investing in hydroponics as a supplement to existing operations or switching over entirely.

Another potential advantage of hydroponic farming is the degree of control the grower has over the conditions to which plants are exposed and the ability to precisely replicate those which produce better results.

However, not every plant does equally well in hydroponic growing conditions, at least with current technology. Lettuce and other leafy greens, herbs, tomatoes, peppers, cucumbers and strawberries are the most commonly grown hydroponic crops.

Hydroponic operations, which often use sophisticated water recycling systems, may use up to 90 percent less water than traditional farming, depending on the location. Traditional farms in water-stressed areas rely heavily on irrigation and use far more water than hydroponic operations, while farms in other regions get most of their water from rain, narrowing the difference.

Some hydroponic growers also rely on electric grow-lights instead of the sun. This allows growers to artificially extend the length of the day and manipulate light wavelengths to increase yield and productivity year-round.

Not having to till and plow fields can reduce the amount of greenhouse gases associated with crop production. However, the overall carbon footprint of a food depends on additional factors, such as the distance it is transported after harvest, which complicates comparisons of environmental impact between hydroponically grown and soil-grown crops.

How Do Hydroponics Compare Nutritionally?

Comparing the nutritional content of hydroponic and soil-grown produce is challenging, and research involving direct comparisons is limited. Both traditional and hydroponic farmers can influence the nutritional content of produce by adding nutrients to the soil or growing medium. But the nutritional composition of a fruit or vegetable also depends on the particular cultivar or variety, the degree of ripeness when harvested and the storage period after harvest.

With hydroponics, there is no danger of plants absorbing heavy metals that may be in the soil. On the other hand, growing vegetables in soil may yield benefits that we don’t yet fully understand. Raw produce can be a source of beneficial probiotic soil-based bacteria, for example. Although it’s not yet clear to what extent this can be replicated indoors, commercial hydroponic growers are experimenting with techniques that foster a healthy and diverse microbiome.

What about Flavor?

Although you may think sun and soil are essential for good flavor, the factors that influence flavor (including air temperature, humidity, the amount and color spectrum of light, nutrient availability and moisture) can be precisely controlled and more reliably replicated in a hydroponic environment.

As with nutrient content, the flavor of produce depends on the variety and freshness, as well as growing conditions. “We can breed for flavor, texture and nutrition instead of mold resistance, pest resistance or shelf life,” says Alina Zolotareva, RDN, marketing manager for AeroFarms, an aeroponic grower of salad greens in Newark, N.J. “And because hydroponics can move the farms to the people, it can give people access to fresher produce.”

But the hydroponic produce at a local grocer isn’t necessarily grown locally. Although hydroponics make it possible to grow fruits and vegetables anywhere, the largest hydroponic growers in the U.S., accounting for about half of hydroponic sales, are clustered in Pennsylvania, California, New York, Vermont and Wisconsin.

The Organic Debate

Because hydroponically grown fruits and vegetables can be produced without the use of pesticides or herbicides, many hydroponic growers are pursuing organic certification, which allows produce to be sold at a higher price point. But many traditional organic farmers are crying foul.

“[Hydroponics] is a really efficient model of production,” says Mark Kastel, co-founder of The Cornucopia Institute, a public interest group supporting sustainable and organic agriculture. “If unchecked, hydroponics could squeeze out traditional organic farmers to the extent that consumers wouldn’t really have the option to buy soil-grown organic produce.”

“Allowing hydroponic produce to be certified as organic is completely contrary to the values of the organic movement,” Kastel says. “Improving and maintaining the health and biodiversity of soil is one of the core principles of organic growing. How can you be stewarding the soil if there is no soil?”

Proponents of organic hydroponics argue that hydroponic systems are aligned with the principles of stewardship, conservation and environmental harmony outlined in the Organic Foods Production Act of 1990. They also view new technology, such as hydroponics, as essential to meeting the demand for organics.

In 2010, the National Organic Standards Board, or NOSB, a federal advisory board, recommended that the United States Department of Agriculture make hydroponically grown vegetables ineligible for organic certification, a recommendation the USDA has so far declined to adopt.

“The USDA has made it clear that hydroponics will be allowed to be certified organic,” says Nate Lewis, farm policy director for the Organic Trade Association. “At the very least, we’d like to see a rule that requires hydroponically grown vegetables to be labeled as such.”

In April 2017, the NOSB convened and discussed potential recommendations it would provide the USDA for organic certification of hydroponic systems. No decisions were made, and the Crops Subcommittee will be developing a proposal on this topic for a meeting in the fall.

]]>
Trend Alert: What’s the Deal with Ginger Shots? https://foodandnutrition.org/blogs/stone-soup/trend-alert-whats-deal-ginger-shots/ Wed, 21 Jun 2017 17:03:08 +0000 https://foodandnutrition.org/?p=8378 ]]> Ever since Selena Gomez was filmed tossing one back with “Carpool Karaoke” host James Corden, ginger shots have become the latest celebrity-driven superfood craze.

Ginger shots are usually made with one to two ounces of fresh ginger juice, and a pound of ginger will produce about 12 ounces of juice. Will a daily ginger shot detox your organs, kill cancer cells or melt away fat? Of course not. But ginger does have some legitimate health benefits.

Ginger root contains small amounts of various vitamins and minerals, but not enough to have a substantial impact on your overall nutrition. However, ginger also contains compounds called gingerols that have anti-inflammatory activity in the body. If your ginger shot also contains a substantial amount of sugar, which tends to be pro-inflammatory, the benefit might end up being a wash.

Gingerols also have anticoagulant properties, reducing the tendency of the blood to form clots. Consuming natural foods with these properties can help reduce the risk of heart attack and stroke. However, they also could interact with prescription anticoagulants such as warfarin to cause excessive bleeding. Even if you’re not on any meds, too many ginger shots could potentially cause you to bruise more easily. Ginger shots probably are best avoided by those who take prescription blood thinners, those with any kind of bleeding disorder, and anyone planning surgery or dental procedures.

Ginger also is possibly effective in reducing and preventing nausea, so a ginger shot might settle a queasy stomach. On the other hand, some people report heartburn or mild stomach discomfort after consuming ginger.

Fresh ginger juice has a little too much kick for most people to enjoy straight up, so it’s usually blended with other ingredients such as lemon juice, honey, turmeric or, in Gomez’s case, yerba mate tea. Bottoms up!

]]>
What Type of Diet is Best for People with a Previous Cancer Diagnosis? https://foodandnutrition.org/january-february-2017/type-diet-best-people-previous-cancer-diagnosis/ Tue, 28 Feb 2017 05:52:45 +0000 https://foodandnutrition.org/?p=6857 ]]> Dietary recommendations for people with a history of cancer are essentially the same as they are for anyone seeking to optimize their health and longevity — and offer similarly profound benefits in terms of risk reduction. Given the evidence on diet and cancer incidence and recurrence, the American Institute for Cancer Research, National Comprehensive Cancer Network and American Cancer Society all recommend people with a history of cancer consume a diet high in fruits and vegetables, whole grains, plant-based fats, nuts and legumes, with limited amounts of refined grains, added sugars, red and processed meats, and alcohol.

The impact of individual dietary factors, such as fat, fiber and meat, on recurrence of various types of cancer has been evaluated in studies with mixed or inconclusive results. Overall, diets rich in plant foods have a positive impact on health and quality of life after cancer treatment, due in part to their nutrient density and fiber, which also help promote a healthy weight. Research suggests a lower risk of cancer recurrence in people who eat a diet rich in fruits, vegetables and whole grains, with most dietary fat coming from nuts and olive oil, and low amounts of red and processed meats, refined grains and full-fat dairy.

Although many people with a history of breast cancer worry about the potential cancer-promoting effects of dairy or soy, evidence indicates that neither low-fat dairy nor soy is linked to increased risk of recurrence — and moderate consumption of minimally processed soy foods may even have a protective effect. Similar questions have been raised regarding the effect of phytoestrogens in soy or flax on prostate cancer recurrence. The best available evidence suggests consuming soy and flaxseed may have a protective benefit, but people with a history of prostate cancer should avoid consuming large amounts of flax oil.

Restrictive dietary regimens, such as The Gerson Therapy or macrobiotic diets, are often promoted to people with a history of cancer. However, there is no evidence that such approaches reduce the risk of recurrence any more than a prudent dietary pattern.

The American Cancer Society recommends working with a registered dietitian nutritionist during and after cancer treatment for individualized nutrition care. Research suggests working with an RDN is especially helpful for healthy weight management, particularly for individuals with a history of female reproductive cancers.


]]>
Brewed Cacao: Your New Afternoon Pick-Me-Up? https://foodandnutrition.org/blogs/stone-soup/brewed-cacao-your-new-afternoon-pick-me-up/ Mon, 13 Feb 2017 22:47:02 +0000 https://foodandnutrition.org/?p=502 ]]> The health benefits of chocolate are widely celebrated, but there’s always that pesky asterisk attached. Although chocolate contains flavonoids and other antioxidants that may benefit your heart and buoy your mood, it also can be loaded with sugar and calories. That’s why we’re usually told to limit our consumption to one small square of dark chocolate a day.

Enter brewed cacao: a trendy new beverage that delivers all the benefits and decadent flavor of dark chocolate, without any of the sugar and fat, and virtually no calories. It also is low in caffeine, so you can enjoy it any time of day. Brewed cacao is, however, rich in theobromine, another mild central nervous system stimulant. Some users believe that theobromine boosts their mental energy and focus but without the nerve-jangling effects of caffeine. Controlled trials have so far failed to support these impressions.

Cacao beans intended for brewing are processed a bit differently than those destined to become chocolate or cocoa powder. Instead of the slow “baking” required to bring out the optimal flavor for chocolate, the beans are handled more like coffee beans and roasted quickly at high temperatures. Less processing means more of the beneficial compounds are preserved. As a result, a cup of brewed cacao contains roughly twice the amount of the flavonoids catechin and epicatechin as a square of dark chocolate or cup of hot cocoa.

With its rich, chocolatey aroma and full-bodied mouthfeel, brewed cacao is somehow more than just a beverage. Although an afternoon cup of tea or coffee often feels incomplete without a little something to go with it, a cup of brewed cacao, with or without a splash of milk, feels completely satisfying. As an after-dinner offering, it easily replaces both coffee and dessert.

To brew the perfect cup, place 2 tablespoons of ground cacao beans in a French press, add 1 cup of boiling water, stir briefly, and steep for 5 to 7 minutes before pressing. Brewed cacao can be enjoyed black or with milk or nondairy creamer and a touch of sweetener, if desired. The spent grounds make great compost, and your compost heap will smell wonderful.

The ground beans also are edible — and quite tasty! Try blending a spoonful into a smoothie, stirring it into hot cereal or sprinkling it over ice cream.

Although newly trendy, brewed cacao actually is nothing new. It was enjoyed by native Central Americans as early as 1500 BC. So, the next time you get a craving for chocolate, instead of heading for the candy counter, why not make like the ancient Mayans and brew up a cup of cacao?

]]>
Osteoporosis Prevention throughout the Lifespan https://foodandnutrition.org/may-june-2016/osteoporosis-prevention-throughout-lifespan/ Thu, 28 Apr 2016 22:05:16 +0000 https://foodandnutrition.org/?p=6540 ]]> Although fractures due to thinning bones are rare before age 65, the risk of this common and often debilitating condition is heavily influenced by choices we make decades earlier. Each phase of life offers a unique set of challenges and opportunities to maximize bone density or minimize bone loss.

This decade-by-decade guide outlines the most effective ways to protect bones at every age.

Teens

Even as teens approach their mature height, bone density and thickness continue to increase. Maximizing bone mineral acquisition early in life increases lifetime peak bone mass and reduces the risk of osteoporosis in later years; calcium enhances the rate of bone mineral acquisition by providing more of the material of which bones are made.

Unfortunately, only 42 percent of teenage boys and 13 percent of teenage girls get the recommended 1,300 milligrams of calcium per day.

In the past, studies reported an inverse correlation between soda consumption and bone mass, leading many to hypothesize that phosphoric acid or other components of soda (such as caffeine) may rob bones of calcium. Subsequent research suggests the negative effect on bone density is more likely due to the fact that teens who drink more soda also consume less calcium.

Recommendation: Make the most of the rapid bone formation of the teen years by prioritizing foods rich in calcium and vitamin D.

Twenties

Bone mass continues to increase but the rate of bone acquisition begins to slow. Research has found that white women may reach their peak lifetime bone mass up to five years sooner than non-white women and men.

Young adults are often living on their own for the first time and are more likely to engage in behaviors such as binge drinking and social smoking, both of which can have a lasting and detrimental impact on bone health.

Extreme dieting and other disordered eating patterns also are common at this age, especially in women. Weight loss leads to bone loss, and a history of repeated weight loss, such as yo-yo dieting, has been shown to have negative consequences on bone density later in life.

Recommendation: Continue to prioritize adequate calcium and vitamin D intake and avoid behaviors such as smoking, excessive drinking and yo-yo dieting, which rob the bones of minerals.

Thirties

By the early 30s, bone density has typically reached its maximum level. Osteoporosis prevention shifts from maximizing bone acquisition to maintaining strong bones.

Caffeine consumption typically increases throughout adulthood and is known to increase urinary calcium losses. The effect is relatively modest, however; a cup of coffee may result in the loss of 2 to 3 milligrams of calcium. As long as calcium intake is adequate, the effect on bone density appears to be minimal.

Epidemiological surveys also report a positive association between fruit and vegetable consumption and bone health later in life. Unfortunately, those in their 30s consume less produce than any other age group.

Recommendation: Focus on eating more fruits and vegetables as part of a balanced diet. Heavy caffeine consumers should take extra care to ensure adequate calcium consumption.

Forties

Total body bone density continues its gradual decline in both men and women throughout this decade. Weight-bearing exercise, which includes walking, jogging, tennis, hiking and strength training (but not swimming or biking), is one of the best ways to preserve bone mass.

Although the percentage of Americans who exercise regularly has increased over the last decade, only one in four men and one in five women in their 40s get the recommended amount of exercise.

Typical sodium consumption for Americans in their 40s is about 3,800 milligrams per day, which is 65 percent higher than the recommended daily consumption of 2,300 milligrams per day. In addition to increasing the risk of hypertension, high-sodium diets also increase urinary calcium excretion.

Recommendations: Develop or maintain a regular exercise habit, and be mindful of sodium consumption. Increase intake of fruits, vegetables and calcium-rich foods to mitigate the effects of sodium on blood pressure and bones.

Fifties

Although bone mass may decline gradually in both men and women after the early 30s, the rate of bone loss begins to accelerate dramatically when women reach menopause. The recommended calcium intake for adults over 50 increases from 1,000 to 1,200 milligrams per day.

Not surprisingly, women older than 50 are more likely than men or younger women to take calcium supplements, but several recent studies have found supplementation is of questionable benefit.

Other researchers have noted an association between calcium supplementation and increased risk of cardiovascular events in men and women over the age of 50.

Recommendation: Get as much calcium as possible from foods rather than supplements. Leafy greens, such as kale and collards, are good sources of calcium and are rich in vitamin K, which helps move calcium out of the arteries and into the bones. However, some greens (particularly spinach and beet greens) also are high in oxalates, which can bind to calcium. Oxalates can be significantly reduced by cooking.

Sixties

Bone density continues to decline gradually in men and resumes a more gradual decline in postmenopausal women. As total bone density decreases, fracture rate increases, but fracture risk also increases with age, independent of bone density. A 65-year-old woman is at greater risk of fracture than a 45-year-old woman with the same bone density.

Sixty-eight percent of men and 61 percent of women in their 60s fail to get the recommended daily amounts of calcium. Declining protein consumption among older adults also may contribute to bone density loss.

Although protein consumption can increase urinary calcium excretion, it also increases absorption of calcium from the gut. A prospective study of healthy men and women in their 60s found that higher protein intake was associated with higher bone density and that supplementation with calcium provided no bone benefit in those with low protein consumption.

Recommendations: Increasing protein consumption can help preserve bone density and muscle mass. Develop an exercise routine that includes activities to improve balance and flexibility.

Seventies and Beyond

Age-related osteoporosis typically sets in around age 70 and beyond. One in five women has osteoporosis by age 70; by age 90, it’s one in three.

Although men never completely “catch up” with women in terms of osteoporosis risk, those living into their 80s and 90s are increasingly likely to be affected.

Elderly men and women who consume more protein have a reduced rate of bone loss. Vitamin D deficiency, which accelerates bone loss, affects 80 percent of those older than 70.

Recommendations: Although calorie needs decline with age, protein requirements do not. Be sure to keep protein intake up, even as total food consumption diminishes. A vitamin D supplement may be necessary to maintain adequate levels.

]]>
Who Is Affected by Hyponatremia? https://foodandnutrition.org/march-april-2016/who-is-affected-by-hyponatremia/ Fri, 26 Feb 2016 19:43:59 +0000 https://foodandnutrition.org/?p=6465 ]]> Hyponatremia is a potentially life-threatening condition that occurs when there is too little sodium in the blood. Early signs may include fatigue, headaches, confusion and nausea. If not quickly resolved, hyponatremia can lead to seizures, coma and death.

Defined as a blood sodium level below 135 mmol/L, hyponatremia can occur when excessive amounts of sodium are lost through urination, perspiration, vomiting or diarrhea. Health conditions or medications that cause fluid retention can result in dilutional hyponatremia, as can overhydration. Medical conditions that may lead to hyponatremia include congestive heart failure, kidney disease and syndrome of inappropriate antidiuretic hormone secretion. Psychogenic polydipsia, which leads sufferers to drink excessive amounts of water, affects up to one-fifth of psychiatric patients and frequently leads to hyponatremia.

Overhydration leading to hyponatremia used to be more common among athletes, particularly women, who participated in long-duration sports, such as marathons. Nowadays, those who engage in endurance sports — as well as emergency personnel who treat athletes in distress — are far better equipped to prevent, recognize and manage hyponatremia, thanks to a concerted effort in the sports medicine community to raise awareness about risks and signs of overhydration.

But there are other at-risk populations whom health professionals may encounter. Being alert to risk factors and early signs of hyponatremia enables nutrition professionals to work with the entire health-care team to avert acute problems. Diuretics may deplete the body of electrolytes, including sodium. SSRI antidepressants, such as paroxetine, which increase levels of antidiuretic hormone, have led to life-threatening cases of hyponatremia, especially in elderly patients.

Intravenous administration of hypotonic fluids, which contain a lower concentration of sodium than blood, in excessive quantities or at excessive rates can cause dilutional hyponatremia. This occurs most commonly in children and the elderly. Patients receiving parenteral nutrition also need to be closely monitored, and their orders adjusted as needed, to maintain proper fluid and electrolyte balance.

Avoiding dietary sodium, on the other hand, is unlikely to cause hyponatremia. Even a very low-sodium diet (500 to 1,000 milligrams/day) should maintain adequate levels under normal circumstances.

]]>
Livestock Antibiotics: Not Just Another Food Fight https://foodandnutrition.org/may-june-2015/livestock-antibiotics-not-just-another-food-fight/ Wed, 29 Apr 2015 02:20:16 +0000 https://foodandnutrition.org/?p=5951 ]]> Prophylactic use of antibiotics in animal feed helps livestock stay healthy and grow faster, but it also may speed the emergence of drug-resistant bacteria that, according to the Centers for Disease Control and Prevention, infect up to 2 million people and kill 23,000 each year.

Although Alexander Fleming’s discovery of penicillin in 1928 was a game changer in the ability to treat routine bacterial infections, the development of antibiotic resistance is a natural and inevitable phenomenon. In his 1945 Nobel Prize acceptance speech, Fleming noted, “It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them.”

Just five years after that speech, in 1950, researchers discovered that putting low levels of antibiotics in the feed of healthy animals (specifically poultry) increased the speed at which they grew, resulting in higher production yields at lower costs. Under pressure to provide meat for an expanding urban population — and still smarting from angry protests over high meat prices after World War I and farming struggles during World War II — meat and poultry producers quickly adopted the practice.

However, adding antibiotics to the feed of healthy animals also exposed disease-causing microbes present in these animals to non-lethal doses of antibiotics — precisely the scenario that accelerates the survival and growth of resistant bacteria. Illnesses due to resistant bacteria are more prevalent among those who work in concentrated animal feeding operations, but these “super-bugs” don’t necessarily stay on the farm. Research shows they can end up in the soil, water and air surrounding meat-growing operations, where they may drift to adjacent farms, waterways or a neighbor’s backyard vegetable patch. Resistant bacteria (as well as antibiotic residue) also are present in the manure of antibiotic-fed animals, which often is used as fertilizer. And although all soil contains resistant bacteria, soil amended with manure from antibiotic-fed animals is more likely to contain bacteria that are resistant to multiple drugs and require higher doses of antibiotics to kill.

In addition, resistant bacteria can be readily transported (by an infected person or simply on a farm worker’s shoes or car tires) to other environments, including hospitals and clinics. Resistant strains of staphylococcus and E. coli have become medical menaces in hospitals and day care centers, putting at risk those whose natural defenses are low. Without effective treatments, health-care workers can do little but try to manage the symptoms of the sick and attempt to prevent transmission to new patients.

Part of what makes the antibiotics issue unique is that the risk of resistant bacteria affects everyone, regardless of personal dietary choices — making it not as much a food safety issue, but an environmental and public health issue. Avoiding meat from animals that were treated with antibiotics won’t protect individuals from antibiotic-resistant bacteria, but that doesn’t mean consumers aren’t trying. In a 2012 Consumer Reports poll of American consumers, 86 percent said they agreed that local supermarkets should carry meat and poultry raised without antibiotics and more than 60 percent said they’d be willing to pay more for it.

A vast majority of grocery stores now carry meat raised without antibiotics. Niman Ranch, the producer that supplies pork to the national Chipotle restaurant chain, raises its pigs without antibiotics except as needed to treat disease in individual animals. McDonald’s Corporation announced its commitment to eliminate use of antibiotics important to human medicine by 2017, while the two largest chicken producers in the country, Tyson and Perdue, now offer chicken raised without antibiotics — accounting for about 9 percent of total sales.

But turning the tide, or even just holding the line, on antibiotic resistance may require more than a few boutique brands of chicken and bacon. In fact, despite an increased demand for “antibiotic-free” meat, the most recent available figures indicate sales of antibiotics to American livestock producers increased 16 percent from 2009 to 2013, including those antibiotics considered most critical to human medicine.

Raising more animals without antibiotics is a good thing, according to Patrick Baron, a fellow at the Johns Hopkins Center for a Livable Future who specializes in the epidemiology of antibiotic resistance. “But we’ve gone so far down this road that turning the ship around will take a much broader and more sustained effort,” he says. “Subtracting a million pounds a year from a 30-million pound a year system can only do so much.”

Antibiotics Reduction Legislation: Yeas and Nays

In 2007, U.S. Rep. Louise Slaughter (N.Y.) made antibiotic resistance her signature legislative issue with the Preservation of Antibiotics for Medical Treatment Act (H.R. 1150), or PAMTA. [Its respective bill in the Senate was the Preventing Antibiotic Resistance Act (S. 1256).] First introduced in 1999, the bill would have amended the Federal Food, Drug and Cosmetic Act to require comprehensive risk assessments and to require drug manufacturers to demonstrate that certain uses of antimicrobials would not harm human health. The bill eventually garnered support from 450 medical, consumer advocacy and public health groups including the American Medical Association and the World Health Organization — as well as intense oppositional lobbying.

Anti-legislation arguments included concerns for animal health and welfare expressed by the American Veterinary Medical Association, which said PAMTA eliminated “the ability for veterinarians to prevent disease through the judicious use of antimicrobials” and pointed out that there were regulations and requirements already in place that oversee the use of antibiotics in agriculture.

Mike Apley, DVM, PhD, professor in clinical sciences at Kansas State University in Manhattan and beef production consultant, agrees. “We want drugs that are effective for humans and animals and harm neither, which is what the regulatory processes do,” Apley says. “We don’t want legislation that would ultimately result in circumventing regulation altogether. For example, completing qualitative risk assessments for every single drug would be physically impossible within the [PAMTA] bill’s two-year timeline.”

Some opponents warned that an accelerated reduction or elimination of prophylactic antibiotic use could destroy the livelihoods of meat and poultry farmers by driving down efficiency and driving up production costs — and by extension, meat and poultry prices. “Further limiting or eliminating animal antibiotic use for livestock will have negative economic and animal health consequences,” according to an American Farm Bureau Federation issue brief.

Another argument is that scaling back antibiotic use in livestock is not the only means to decelerate the natural development of resistance. Up to 80 percent of the antibiotics sold today are for agricultural use; however, direct administration of antibiotics to humans also plays a role in the development of resistance. Patients demand (and physicians prescribe) antibiotics for conditions that are not caused by bacteria, such as colds, flus and unconfirmed ear or sinus infections. People may stop taking antibiotics when they begin to feel better rather than finishing the prescribed course as directed, thus eliminating the weaker bacteria but potentially allowing the stronger to survive and mutate. And pharmaceutical residues in wastewater are not always adequately removed by water treatment programs — another factor contributing to conditions that favor the emergence of drug-resistant bacteria.

About the only thing everyone does agree on is that there is a significant need for committed resources to research the systemic complexities of antibiotic-resistant bacteria and the practices that create the greatest human health risk. The PAMTA bill died in committee in 2013. That year, the U.S. Food and Drug Administration issued guidelines encouraging meat producers to voluntarily phase out routine use of those antibiotics most important in human medicine.

Characterizing the FDA’s guidelines as “woefully inadequate,” Rep. Slaughter announced the re-introduction of PAMTA (H.R. 1552) in March 2015 [the Preventing Antibiotic Resistance Act (S. 621) was re-introduced in the Senate as a separate bill], in addition to plans to reintroduce the Delivering Antimicrobial Transparency Act, or DATA Act, which requires drug manufacturers to disclose how antimicrobials are used in food-producing animals, and large-scale poultry and livestock producers to detail the type and amount of antibiotics in their feed.

Days later, the White House released the National Action Plan for Combating Antibiotic-Resistant Bacteria, a five-year roadmap requiring “sustained, coordinated and complementary efforts of individuals and groups around the world, including public and private sector partners, health-care providers, health-care leaders, veterinarians, agriculture industry leaders, manufacturers, policymakers and patients.” In addition, the U.S. Department of Agriculture’s National Institute of Food and Agriculture announced more than $6.7 million in funding for research programs across the country to develop mitigation strategies for antimicrobial resistance.

“Antibiotic resistance is the most pressing public health crisis of our time,” wrote Rep. Slaughter in a statement. “Both the American people and the U.S. government need to give this issue the attention it demands.”

]]>
Co-Op Comeback: What the New Co-Ops Do Different https://foodandnutrition.org/january-february-2015/co-op-comeback-new-co-ops-different/ Mon, 29 Dec 2014 22:25:57 +0000 https://foodandnutrition.org/?p=5847 ]]> With approximately 300 food co-ops in the U.S. today, cooperative grocers represent fewer than 1 percent of American grocery stores. But food co-ops — which are owned and directed by customers rather than corporations — are holding their own.

Despite an image associating co-ops with back-to-nature, collectivist sensibilities of the 1970s, American food cooperatives actually were born of the Great Depression. Consumers had little cash and many local groceries were shuttered, leaving communities with limited access to fresh food. Families pooled resources in order to buy in volume, direct from wholesalers. By the end of the 1930s, there were approximately 600 well-organized food cooperatives across the country.

In the 1940s and 1950s, as a rebounding economy coincided with a seismic shift in the retail food industry, the first self-service grocery stores (antecedents of today’s supermarkets) brought lower prices and a bigger selection. Nearly all of the original Depression-era co-ops went the way of the Model T and disappeared.

In the 1970s, a new co-op movement took shape — this one fueled more by idealism than necessity. Organic farming, vegetarian diets and communal living were hip, and conventional grocery stores were suddenly out of sync. For a generation looking to buck the establishment, food cooperatives offered a way to practice communitarian ideals and fuel alternative lifestyles, and the number of food co-ops swelled again to about 500 nationwide.

Throughout the 1980s, well-capitalized grocery chains specializing in local and organic foods sprouted like mushrooms. And as vegetarian and organic options became increasingly available, cheaper and more slickly merchandized, the need for food co-ops once again seemed less relevant. By the end of the millennium, half of the second-wave co-ops had closed their doors.

Today, food co-ops once again are on the rise in numbers and revenues. This latest growth spurt appears to be fed by a number of cultural forces. “The notion that diet is a primary driver of health and wellness has become truly mainstream,” says C.E. Pugh, chief operating officer of the National Cooperative Grocers Association. More shoppers are shunning highly processed, industrial foods in favor of products that are (or at least seem) more wholesome, while today’s “conscious consumer” wants to know where and how their food is produced. Many even are willing to spend a bit more in order to support the environment, animal welfare and small local farmers while keeping dollars within the community.

“There is a growing thirst among consumers to vote with their dollar and to buy from places where business is being done fairly,” says Dan Nordley, executive director of the Cooperative Grocers Network.

Operating at the intersection of such interests are co-ops. However, these are not your grandmother’s (or her grandmother’s) food cooperatives. The worker/member model, in which every member spent a few hours a month bagging bulk raisins or operating the cash register to contribute to the cause, has all but disappeared — along with bi-level pricing for members and nonmembers and unwieldy efforts to manage by committee.

Although members still may vote on organizational policies, day-to-day operations are overseen by professional managers (often, veterans of the for-profit grocery industry) who supervise paid employees. Stores are well-lit and feature eye-catching displays, modern technology and expanded inventories. And in addition to organic produce, soy burgers, bulk grains and natural cleaning solutions, you’re likely to find a robust selection of meats, conventional produce, convenience foods and packaged goods. Don’t be surprised to see circulars and coupons, either.

But while co-ops have come to look much more like conventional grocery stores, their tenets remain constant. All cooperatives operate under seven core principles, which state that members will decide how the co-op will operate and that any operating surplus will be shared by members. A commitment to sustainable community development is another core principle, as is a commitment to education of members and the public.

Some of these principles are costly. Compared to commercial grocers, co-ops have a higher percentage of full-time employees, a higher average hourly wage for part-time staff and more generous benefits packages. Money that could be spent on advertising and marketing may instead be spent on educational programming and outreach. And buying from small farmers and local vendors may support the community, but it often costs more than buying from mainline wholesalers.

Although they work hard to keep prices competitive, co-ops likely never will be the cheapest place to buy food. At their heart, co-ops are value-driven, not profit-driven — and success is not calculated solely in dollars and cents. “We measure many bottom lines,” says Mary Saucier Choate, MS, RDN, food and nutrition educator for the Co-Op Food Stores of New Hampshire and Vermont. “Not just the financial one, but also things like quality of life for our workers, education of our members and sustainability of our buying practices.”

As food co-ops ride this third wave of growth, those within the movement (many of whom are veterans of the second wave’s boom and bust) are thinking hard about how to avoid a third colony collapse. “In the last couple of years, the competitive landscape has changed dramatically, in a way that’s putting pressure on food co-ops,” says Pugh. “Investors are buying up and consolidating the remaining independent natural grocers. And although their early efforts to enter the natural and organic market were haphazard and ineffective, the big industry players have now gotten serious about it, expanding their offerings and launching generic organic labels. We’ll have to become more efficient and productive in order to compete.”

However, co-ops are also becoming more organized, forming coalitions such as the Cooperative Grocers Network and the National Cooperative Grocers Association. These meta-cooperatives allow individual co-ops to support and learn from one another, sharing educational materials, best practices and operational procedures and negotiating better prices from suppliers, all while preserving their autonomy. “But the primary thing co-ops offer is local ownership and cooperative structure,” Pugh points out. “Our success doesn’t benefit investors; it goes back to the members and the community. That’s always had value and resonance — and that won’t change.”

Monica Reinagel, MS, LD/N, CNS, is the author of Nutrition Diva’s Secrets for a Healthy Diet: What to Eat, What to Avoid, and What to Stop Worrying About (St. Martin’s Griffin 2011).

]]>
Kefir: From Russia with Love https://foodandnutrition.org/july-august-2014/kefir-russia-love/ Tue, 01 Jul 2014 02:19:38 +0000 https://foodandnutrition.org/?p=5637 ]]> Fermented foods are having a big moment, prompted in part by a flood of new research on the human microbiome — the ecological community of microorganisms living in the human body — and the benefits of probiotic foods. One of the fastest rising stars on this scene is kefir, an ancient fermented milk beverage believed to have originated in the Caucasus Mountains.

The very first kefir grains — a semi-solid complex of microorganisms, proteins and polysaccharides — probably arose spontaneously. The exact details of kefir's advent are lost to history, but it seems likely that kefir was originally discovered by nomads, who used animal skins to carry milk from their sheep or camels. The fresh milk served as a food source for microorganisms present in the animal skins, and a symbiotic colony of bacteria and yeasts (or SCOBY) eventually formed, turning the containers into natural fermentation tanks.

Although it dramatically altered the flavor and texture of fresh milk, fermentation also greatly extended its shelf life without refrigeration. Early kefir consumers learned to strain the grains out of the soured milk and use them to ferment the next batch. These living colonies — gelatinous, walnut-sized blobs resembling cauliflower florets — were carefully nurtured, often passed down through generations.

Kefir is similar to yogurt, but the two differ in several important ways. In both, lactobacillus bacteria digest lactose in milk, producing lactic acid. But kefir also involves yeasts that produce carbon dioxide and ethanol as they multiply, which means that in addition to having the tartness or sourness we associate with yogurt, kefir is also slightly carbonated and contains small amounts of alcohol. Commercially produced kefir is usually processed to remove the alcohol. Home-brewed kefir, however, can range from 0.5-percent to 2-percent alcohol, depending on how long it is allowed to ferment. As a rule of thumb, the fizzier it is, the more alcohol it contains.

Kefir may also be lower in residual lactose than yogurt. In both cases, longer fermentation allows bacteria to digest more lactose. In yogurt, however, increasing lactic acid levels lower the pH, which eventually inhibits the bacteria's activity and slows fermentation. In kefir, yeasts and their by-products buffer some of the lactic acid's acidity, allowing bacteria to continue to work on the lactose. (The bacteria return the favor by creating conditions that are favorable to yeast growth).

Kefir's reputation as a functional food extends back hundreds of years. It was traditionally believed to improve digestion, boost well-being and enhance longevity. More recently, it's been put forth as a treatment for everything from allergies to tuberculosis to heart disease, although evidence for these uses is largely anecdotal.

Modern research confirms kefir's role as a probiotic food. The beneficial microorganisms it contains inhibit the growth of pathological microorganisms (both in foods and in the digestive tract), enhance digestion of other foods, and synthesize valuable nutrients such as vitamin B12 and vitamin K. Preliminary research in vitro and in animal models also suggests possible anticarcinogenic, immune-stimulating and cholesterol-lowering effects. But with kefir, researchers face some unique challenges.

The specific health benefits of any probiotic food depend on the particular strains of bacteria or yeasts it contains, and kefir grains host an extremely diverse population. Using electron microscopy and genome sequencing, researchers have cataloged hundreds of different bacteria and yeasts in traditional kefir products, including numerous strains and subspecies of lactobacilli, streptococci, acetobacter and saccharomyces. Many are thought to be unique to kefir; several are even named after it.

Adding to the complexity is the enormous variation in kefirs from different sources. Each colony of kefir grains develops a unique microbial profile, depending on the milk in which it is grown and the ambient microbial environment where fermentation occurs. This, in turn, makes every batch of kefir unique. (Even if I share my colony of grains with you, the kefir I ferment on my counter will not be exactly the same as the kefir you brew in your kitchen.)

Traditionally produced kefir typically contains dozens of different bacteria and yeasts. Commercial producers, on the other hand, use a limited number of carefully selected species to create a more consistent product. But some health benefits ascribed to kefir may depend on the more varied cultures found in traditionally fermented kefir. Fortunately, kefir is exceedingly easy to make at home. Unlike yogurt, which must be incubated between 100 degrees Fahrenheit and 110 degrees Fahrenheit, kefir ferments best at room temperature. Simply pour any type of milk (cow, goat or sheep; non-fat, reduced-fat or whole) over the grains and leave the container on the counter for one to two days.

When the milk has thickened, pass it through a mesh strainer to remove the grains. As long as they have regular access to a fresh food supply (milk), the grains remain viable indefinitely; in fact, you could end up making kefir faster than you can consume it. Refrigeration slows the grains' activity, and although they will survive on their own for a few days between uses, without fresh milk they will eventually starve.

Alternatively, you can store the strained grains in a small amount of milk (just enough to cover them) in the refrigerator for up to ten days. When you're ready to use them again, discard the milk they've been stored in and start over with fresh milk.

Kefir can be an acquired taste. If you enjoy drinking buttermilk, you'll probably enjoy unsweetened kefir. For those who find it too sour, it can be flavored with pureed fruit or vanilla extract, or used in smoothies. It can also be used in place of milk or buttermilk in things like muffins and pancakes and to add a sourdough-like tang to breads or pizza crust. Like any probiotic food, however, heating kefir above 110 degrees Fahrenheit or so will kill the beneficial bacteria and yeasts and nullify its probiotic benefits.

]]>
Sugar and the Science of Addiction https://foodandnutrition.org/november-december-2013/sugar-science-addiction/ Mon, 28 Oct 2013 15:13:25 +0000 https://foodandnutrition.org/?p=5378 ]]> When we say that a dessert is addictive, we usually mean it’s very delicious. To those who study the physiological and neurochemical aspects of substance abuse, however, “addictive” is a term with serious health implications.

Yet even those on the front lines of addiction research can’t quite agree on what qualifies as addiction. Witness the debate in scientific journals over whether sugar can be classified as an addictive substance or whether obesity should be added to the Diagnostic and Statistical Manual of Mental Disorders as a form of substance (food) abuse or addiction.

Advocates of treating food as a potentially addictive substance point out that palatable food and drugs like cocaine stimulate some of the same regions of the brain, and both trigger a flood of dopamine and feelings of well-being. In both cases, the euphoria is short-lived and the brain craves more. When these pleasure pathways are powerfully and repeatedly stimulated, the brain adapts and it takes more of the substance to achieve the same high. Abusers will continue to pursue that pleasure despite painful consequences.

Research done by Bart Hoebel, Ph.D., and Nicole Avena, Ph.D., at Princeton University lends support to the concept of food addiction. Like humans, rats prefer sweetened water to plain water. Once rats have become habituated to drinking sugar water, they exhibit symptoms of withdrawal when it’s taken away — nearly identical to those observed with other chemical dependencies — and they will binge when access to sugar water is restored.

Experiments designed by neuropharmacologist Paul J. Kenny, Ph.D., at the Scripps Research Institute demonstrate that rats will continue to pursue palatable foods despite painful consequences — another hallmark of addiction. Rats will scamper away from regular rat chow when they hear a sound signaling an impending electrical shock, but rats with access to chocolate, cheesecake and sausage will keep eating those foods, even when they know a painful shock is coming.

Opponents argue that there are important differences between humans and rats, and between the world we live (and eat) in and the conditions required to induce sugar addiction in lab animals. Even those who argue for the food addiction model acknowledge that — for humans at least — it’s not sugar that pushes our buttons; it’s the combination of sugar, fat and salt. But as unpleasant as it may be to give up favorite foods, nothing like true drug withdrawal symptoms have ever been observed in dieting humans.

Treating Obesity through the Lens of Addiction
For nutrition professionals, the salient question is whether the concept of food addiction suggests more effective treatments for obesity. If food addiction is a real illness, then drugs developed to combat other addictions might offer some hope.

Drugs that block endorphin activity in the brain have been shown to reduce use of heroin, cocaine and alcohol in addicted humans. They also inhibit the consumption of appetizing food in both humans and rats. Similarly, some appetite-suppressing drugs have the bonus effect of reducing the desire to smoke.

One problem is that response to dopamine-blocking drugs depends a lot on a person’s individual baseline. In someone with elevated production of (or sensitivity to) dopamine, a drug that suppresses dopamine makes the reward less rewarding, making compelling foods a bit easier to resist. But in someone with low production of (or sensitivity to) dopamine, such a drug can suppress feelings of well-being so much that depression or even suicidal thoughts may ensue.

Further complicating matters is that obesity is sometimes characterized by hyper-responsiveness to dopamine and sometimes by hypo-responsiveness. Either state, an exaggerated pleasure response to foods or a reduced ability to derive pleasure from foods, could plausibly lead to overeating. Kenny believes that the production of and responsivity to dopamine and other endorphins may change over the course of the disease, similar to the way in which the body’s production of and responsiveness to insulin changes as metabolic syndrome progresses — first surging and ultimately declining.

Although Kenny is hopeful that pharmaceutical interventions may one day be a potent weapon in the fight against obesity, he’s also quick to point out that — as with any addiction — behavioral modification is a crucial element of successful treatment.

Just as those struggling to maintain sobriety are advised to avoid bars and ex-smokers need an activity to replace the after-dinner cigarette, dieters still will need to modify behaviors and habits. But if an anti-addiction drug could safely improve the dismal success rate for dieters, it would be a welcome addition to the arsenal.

]]>
Are Bugs the New Beef? https://foodandnutrition.org/september-october-2013/bugs-new-beef/ Tue, 27 Aug 2013 01:43:43 +0000 https://foodandnutrition.org/?p=5298 ]]> I once opened a container of rolled oats that had gotten lost in the back of my pantry and was horrified to find a few mealworms wriggling in it. Not only did the oats go straight into the trash but the trash bag was immediately taken out to the curb—just to be on the safe side. In the not too distant future, however, nutrition-conscious consumers might be paying extra for flour fortified with protein-rich mealworms.

Entomophagy (the practice of eating insects) still elicits disgust from most North Americans and Europeans. You won’t find any nutrition information for insects in the USDA’s Food Composition Database, for example. (The USDA does, however, view the presence of insect parts in processed foods as “natural or unavoidable” and maintains that this poses no hazard to humans. See USDA publication: Food Defect Action Levels Handbook for regulations on amounts of insect material allowed in manufactured foods.)

But our aversion is far from universal. Various types of insects are an important food source — as well as a delicacy — for close to a third of the world’s citizens.

As the world’s population swells, a growing consortium of entomologists, entrepreneurs, venture capitalists, adventurous eaters, and “ethicureans” are rallying around insects (a.k.a. micro-livestock) as a more sustainable, climate- and animal-friendly way to supply the world’s calorie and protein needs. Insects, they argue, provide high-quality nutrition (see chart below) and require far less feed, water, space, time or energy to produce than traditional livestock. For a comprehensive examination of the issues and opportunities surrounding the use of insects for food, see the FAO’s 2013 report “Edible Insects: future prospects for food and feed security.”

Although still a decidedly niche market, the first insect-based products are already on the market. Cricket-based energy bars, for example, are available in three flavors at Chapul.com. And for a $1 surcharge, you can now order Mama Bird’s Granola with added insect protein.

Entomologist Harman Johar is founder and CEO of World Ento, a company that raises crickets and mealworms for human consumption. Although World Ento ships whole, toasted crickets and mealworms directly to a few intrepid consumers, Johar sees the development of insect-based flours and protein powders as the most likely route to gaining widespread consumer acceptance. “Getting people over the ‘ew’ factor is our biggest challenge,” says Johar. “Generally, if people can’t actually see the insects, they can get over it pretty quickly.” People who aren’t ready to eat a bug-based burger, suggests Johar, might be willing to accept a bun made with protein-rich cricket flour.

With few guidelines or regulations specifically addressing the production and marketing of insects for food, insect farmers are, for the moment, largely self-regulated. Most, however, hold themselves to high standards, both in terms of the quality and safety of the product as well as the quality of life for the insects. Johar’s insects, for example, live in climate-controlled conditions that mimic their natural habitat, and enjoy a diet of organic grains, fruits, and vegetables. When harvest time comes, the ambient temperature is gradually lowered, putting the bugs to sleep, before they are killed by freezing temperatures. The bugs are then sorted, cleaned, roasted, and packaged for sale. Toasted crickets reportedly have a faint grassy taste and a texture similar to potato chips. Mealworms are said to be a bit moister and chewier, with a nutty, creamy flavor. Both have a wide range of culinary uses, from cookies to calamari.

Despite the vastly lower inputs required, ento-culture has yet to develop the technologies, infrastructure, or economy of scale enjoyed by traditional livestock farmers. As a result, edible bugs and products made from them remain a bit pricey.

Nonetheless, when it comes to the next big thing in protein, the writing appears to be (crawling) on the wall.

]]>
Color Confusion: Identifying Red Meat and White Meat https://foodandnutrition.org/january-february-2013/color-confusion-identifying-red-meat-white-meat/ Wed, 02 Jan 2013 16:54:34 +0000 https://foodandnutrition.org/?p=5089 ]]> The idea that red meat is less healthful than white meat may be generally undisputed; multiple studies link red meat consumption to increased health risks including diabetes, stroke, coronary heart disease, weight gain, certain cancers and all-cause mortality. But what exactly is "red meat?" A precise definition is hard to come by.

Virtually all dietary studies categorize poultry and fish as "white meat" and four-legged land animals such as beef, pork and lamb as "red meat." Yet in culinary or cultural contexts, veal is often considered a white meat and duck or goose may be classified as red. Food scientists point to higher concentrations of myoglobin and slow-twitch muscle fibers as the primary determinants of red meat; however, the dark meat of chicken or turkey usually has more myoglobin than veal or pork.

Even the U.S. Department of Agriculture seems inconsistent in its explanations. According to an online meat preparation fact sheet on lamb, the amount of myoglobin in the animal's muscle determines its meat color category. In a separate USDA fact sheet on poultry production, ratites (large flightless birds such as emu, ostrich and rhea) are identified as red meat because "the pH of their flesh is similar to beef."

What is it about red meat that is so bad for us? Observational studies can detect a correlation between dietary patterns and health outcomes, but they cannot prove causation, nor can they provide much information about the mechanism by which certain foods, including red meat, may promote or undermine health.

"I suspect that multiple factors contribute to adverse effects of red meat," says Walter Willett, MD, DrPH, chair of the department of nutrition at the Harvard School of Public Health and a primary investigator in both the "Nurses' Health Study II" and the "Health Professionals Follow-up Study," from which many recent red meat associations are drawn.

"High amounts of heme iron, which is absorbed even when we have adequate iron stores, is probably a contributing factor for type 2 diabetes," says Willett. "However, the high amounts of saturated fatty acids and cholesterol are also probably contributing risks of cardiovascular disease, and specific amino acids may also be a factor."

Still, not one of these nutrients is consistent across the meat color categories (see chart). If we suspect that consuming myoglobin (or heme iron, or cholesterol, or fat) might shorten lives, why not collect, analyze and report the data on those nutrients, rather than continue to rely on vague and arbitrary designations like "red" and "white" meat?

Furthermore, according to analysis of data from the "Health Professionals Follow-up Study" and "Nurses' Health Study II," people who eat the most beef, pork and lamb live less healthful lifestyles in general. They tend to exercise less, eat fewer vegetables, are more likely to smoke and less likely to take multivitamins. But given the steady stream of bad press for red meat, should we be surprised that health-conscious people tend to eat less of it? Are they healthier because they eat less red meat, or do they eat less red meat because we keep telling them it's bad for them?

Another factor that looms large and is typically unaccounted for in dietary questionnaires is cooking method. Animal protein of any color cooked at high temperatures or over direct heat produces carcinogenic and atherogenic compounds. Without specifying preparation methods in these studies, it is impossible to distinguish between a char-broiled burger and a slow-braised pot roast.

As new factors of potential research interest are identified, questionnaires are updated with new categories and questions, but the ability to examine longitudinal effects is often limited by the less-specific questions included in the earliest versions.

Meanwhile, when it comes to improving public health, simple messages are usually the most effective. In that spirit, perhaps warning people about the dangers of "red meat" is the simplest way to encourage people to eat fewer burgers (and the fries and sodas that often accompany them), even if the color of the meat is not the primary culprit. But do we run the risk of creating the impression that chicken nuggets are more healthful than pork tenderloin?

]]>
Today’s Contemporary Spice Cabinet https://foodandnutrition.org/winter-2012/todays-contemporary-spice-cabinet/ Wed, 15 Feb 2012 19:35:02 +0000 https://foodandnutrition.org/?p=4954 ]]> When I was a child, my mother (who was—and still is—a wonderful cook) had a small wooden spice rack with two rows of stoppered glass bottles. Those 16 little vials contained pretty much everything she needed, from ground cloves for Thanksgiving pumpkin pies to paprika she dusted on deviled eggs. Today, my mother’s spice collection fills a cupboard that’s four feet high and two feet deep. Ask her for the paprika, and she’ll ask you whether you want sweet, hot or smoked.

According to McCormick, which has been selling herbs and spices since 1889, today’s home cook is likely to keep at least 40 different seasonings on hand, whereas the typical 1950s American homemaker relied on fewer than 10 spices. Although the three best-selling flavorings (black pepper, vanilla extract and cinnamon) haven’t changed since the end of World War II, the rest of the spice rack has undergone a dramatic transformation. McCormick’s sales figures indicate that allspice, lemon extract, ground mustard and celery seed—all top sellers in the postwar years—have slid in popularity, while oregano, cumin, coriander and smoked paprika have risen through the ranks.

Adventurous Cooks, Sophisticated Palates

Generations of immigrants have brought a rich array of culinary traditions from around the globe to North America—and today, these exotic cuisines are sought and celebrated like never before. Tanya Wenman Steel, editor-in-chief of the popular Epicurious website, sees users filling their online recipe boxes with Indian and Middle Eastern dishes in addition to the usual lasagna and chicken salad. A recent search for the most popular recipes on Epicurious tells the tale: South American arepas and Javanese chicken curry vie for top billing with chocolate fudge and apple tart.

“American cooks have gotten a lot more adventurous,” says food writer and cookbook editor Amanda Hesser, “but also more discerning. We not only use a wider range of spices, but we’ve gotten more sophisticated about layering different flavors to create more complex and intense flavor profiles.”

There’s also an increased value on authenticity. Instead of relying on cayenne pepper as an all-purpose source of heat, today’s cooks might use certain peppers for Asian cuisine and others for Latin dishes. “Chefs are traveling more widely—and not just to Europe—and bringing back exotic ingredients and techniques,” says Hesser. “And thanks to the speed of information sharing, an ingredient can make the leap from esoteric to household name in a much shorter period of time.”

Unusual spices are also getting easier to procure, with mainstream spice vendors vastly expanding their product lines in response to the growing popularity of global cuisines. These days, you might not have to go farther than the grocery store to find chipotle pepper or garam masala—seasonings that most of us didn’t even know how to pronounce just a few years ago. And with a few days’ lead time, you can get just about anything else you could possibly want from online vendors.

From Spice Cabinet to Medicine Cabinet

As ancient healers realized and modern science has confirmed, herbs and spices do much more than flavor our foods. They can also be concentrated sources of nutrients and compounds with medicinal effects. Virtually all herbs and spices display antimicrobial and antioxidant activity. Some of the most commonly used spices display other useful properties, as well. According to the Therapeutic Research Center’s Natural Medicines Comprehensive Database:

  • Cinnamon has been shown to improve blood sugar control in people with diabetes in some studies, although other studies showed no benefit.
  • Garlic may slow the development of atherosclerosis and seems to be able to modestly reduce blood pressure. In addition, eating garlic has been linked to a decreased risk of developing stomach and colon cancers, although garlic supplements do not seem to offer this benefit.
  • Ginger can help lessen nausea and vomiting in pregnant women and post-operative patients, and it may reduce symptoms of dizziness, including nausea. Also, there is preliminary evidence that ginger may have some benefits in managing osteoarthritis, specifically in reducing pain.
  • Turmeric, a key ingredient in curry blends, has been shown to reduce symptoms of indigestion.

Making Food Safer with Spices

The preservative power of spices was also well understood by the ancients, who used spices to slow the spoilage of perishables and even to embalm their dead. Today, food scientists are increasingly interested in spices as a way to make our food supply safer, testing the ability of various herbs and extracts to kill E. coli bacteria in meat, for example. Although far from providing surefire protection from foodborne illness, garlic, clove and oregano are among the most promising candidates.

Even more encouraging is the ability of certain herbs and spices to reduce the formation of harmful compounds when meats are cooked at high temperatures, such as on a grill. According to a study in the October 2011 Journal of Food Science, adding rosemary or turmeric to a burger can reduce the formation of heterocyclic amines by up to 40 percent. A separate study published in the May 2010 American Journal of Clinical Nutrition found that adding a mixture of ground cloves, cinnamon, oregano, ginger, rosemary, black pepper, paprika and garlic powder to burgers reduced the formation of carcinogenic and atherogenic compounds by 70 percent.

Storing and Using Spices

One possible downside to an ever-expanding repertoire of spices: Stocking dozens of different seasonings to fuel your culinary adventures increases the chances that spices will sit in the cupboard for a long time before you use them. Whenever possible, buy spices in amounts that you can use within 12 months. For unusual ingredients that you use rarely, look for a store that sells spices in bulk so that you can buy only as much as you need. And to keep herbs and spices at their peak of flavor and nutritional potency, that wooden rack next to the stove has got to go. Spices should be kept away from heat, moisture and light, and they are best stored in a cool, dark cupboard in airtight containers.

]]>