Published by
Stanford Medicine


Fertility, Myths, Pediatrics, Pregnancy, Sexual Health, Women's Health

Research supports IUD use for teens


A large body of scientific research supports the safety and effectiveness of intrauterine devices and other forms of long-acting, reversible contraception (LARC) for adolescents, and physicians should offer these birth control methods to young women in their care. That’s the message behind a series of review articles published this week in a special supplemental issue of the Journal of Adolescent Health.

Stanford ob/gyn expert Paula Hillard, MD, who edited the supplement, explained to me that doctors are missing a great opportunity to prevent unwanted pregnancies by not offering young women LARC methods, which include IUDs and hormonal implants. Not only are LARC methods very safe, but the rate of unintended pregnancy with typical use of these techniques is 20 times lower than for alternative methods such as the Pill or a hormone patch.

But a design flaw in one specific IUD used in the 1970s - the Dalkon Shield - increased women’s risk for pelvic infections and gave all IUDs a bad rap. Use of IUDs among adult American women has been low ever since; it’s even lower in teens.

“Long after it was proven that the Dalkon Shield was particularly bad and newer IUDs were much safer, women were just scared,” Hillard said. “Not only did women stop asking for them, many doctors also stopped using IUDs.”

The new review articles that Hillard edited are targeted at physicians but contain some interesting tidbits for general readers as well. The article titled “Myths and Misperceptions about Long Acting Reversible Contraception (LARC)” provides scientific evidence to refute several common myths, concluding, for instance, that IUDs don’t cause abortions or infertility, don’t increase women’s rates of ectopic pregnancy above the rates seen in the general population, and can be used by women and teens who have never had children.

And, as Hillard put it for me during our conversation, “These birth control methods are very safe and as effective as sterilization but completely reversible. They work better than anything else, and they’re so easy to use.”

Previously: Will more women begin opting for an IUD?, Promoting the use of IUDs in the developing world, and Study shows women may overestimate the effectiveness of common contraceptives
Photo, by ATIS547, shows a public sculpture on the campus of the University of California, Santa Cruz that is affectionately known as the “Flying IUD”

In the News, Myths, Nutrition, Parenting, Pediatrics

Debunking a Halloween myth: Sugar and hyperactivity


Does sugar make children hyperactive? To the surprise of many, particularly parents gearing up for tonight’s Halloween craziness, the answer is no.

A large body of scientific evidence debunks the notion of a cause-and-effect relationship between sugar consumption and children’s hyperactivity. So what’s actually going on? The San Francisco Chronicle interviewed a Stanford nutrition expert today to find out:

Dr. Tom Robinson, director of the Center for Healthy Weight at Lucile Packard Children’s Hospital at Stanford, explains that because so many parents (and thus children) expect eating sweets to make them hyper, it becomes a self-fulfilling prophecy.

“The way we think we should feel has a lot to do with how we do feel,” he said.

The story mentions one of my favorite studies on the subject, in which parents who thought their kids were sugar-sensitive were asked to rate their child’s behavior after the children had consumed soda. Parents who heard that their children received sugar-sweetened sodas rated the youngsters’ behavior significantly worse than those who were told their kids drank artificially sweetened soda. The catch? All the kids in the study consumed artificially sweetened sodas.

Several other studies have attacked this question from different angles and reached the same conclusion: eating sugar doesn’t make children hyperactive. But as Robinson notes in the Chronicle piece, there are plenty of other good reasons, besides hyperactivity, to limit children’s sugar consumption. Two such reasons are sugar’s roles in promoting obesity and dental cavities.


Myths, Nutrition, Stanford News

Fact or fiction: Talkin’ turkey and tryptophan


I’m pretty sure you’ve heard of the so-called turkey coma: Tryptophan, an amino acid present in all that turkey you’re going to eat tomorrow, makes you sleepy. Heck, it was fodder for a whole Seinfeld episode. And, some of you may have even used it as an excuse to get out of doing the dishes on Thanksgiving.

Though scientists have debunked the tryptophan/turkey myth, the urban legend lives on. I decided to turn to Stanford neuroimmunologist Lawrence Steinman, MD, to finally put the turkey talk to rest. Back in 2005, his lab showed that tryptophan plays a pivotal role in the immune system.

So I asked Steinman: If we feel sleepy after eating a big turkey meal on Thanksgiving, is it due to tryptophan (which is allegedly very high in turkey)? He told me:

Humans cannot make tryptophan. Tryptophan is not higher in turkey than in most other muscle tissue from other animals, more commonly known as meats. When we ingest tryptophan, most is metabolized in the liver. However, some tryptophan gets to the brain, where it is metabolized into serotonin and melatonin. This uptake and conversion may take hours. The effects of alcohol are much faster.

It is not the turkey that makes us sleepy at Thanksgiving. It is the massive ingestion of calories, carbohydrates and often alcohol that results in the desire to sleep. Whatever makes you sleepy on the Thanksgiving holiday just enjoy it. Kick off your shoes, stretch out on the couch and watch a football game. Just refrain from snoring or you risk alarming the guests. But please ask someone to wake you from your nap, so you can help with the dishes!

Previously: Wherein we justify eating more cranberry sauce tomorrow and A guide to your Thanksgiving dinner’s DNA
Photo by orphanjones

Addiction, Behavioral Science, Mental Health, Myths, Neuroscience, Women's Health

Research links bulimia to disordered impulse control

Although some consider eating disorders like bulimia to be the over-hyped, Hollywoodian maladies of the wealthy and superficial, the fact is that they are serious psychiatric disorders. Bulimia seems to be particularly complex from a psychological standpoint.

A recent article in the East Bay Express focused on the disorder and discussed research by Stanford’s James Lock, MD, PhD, psychiatric director of the Comprehensive Eating Disorders Program at Lucile Packard Children’s Hospital. Lock’s research suggests that bulimia is an impulse-control disorder (where the impulse is binge eating), a class of disorders that also includes compulsive shoplifting and drug addiction:

As young women with bulimia grow older, destructive impulses like bingeing and purging may become more powerful while parts of the brain that govern impulse control may weaken. And according to… studies, the bulimic brain is more likely to succumb to a variety of self-destructive impulses, making the disorder a sort of psychological Hydra. Over time, these impulses may turn into compulsions, or bad habits, much like drug addiction.

Lock, who has been working with eating-disordered youth at Stanford’s clinic for nine years, noticed that his patients often exhibited behavior consistent with impulse-control issues. Such behavior included sexual promiscuity and kleptomania. In a study requiring both healthy and bulimic girls to avoid performing a task, Lock noticed that bulimic girls had significantly more difficulty controlling their impulse to perform the forbidden task. Moreover, Lock noticed increased brain activity in the sections of these girls’ frontal lobes responsible for impulse control. His findings seemed to suggest that the girls’ brains were working overtime to manage impulses that healthy girls had no trouble controlling.

Eating disorders and many other mental disorders are medically elusive, since their physiological causes are practically unknown. Research like Lock’s, which considers disorders like bulimia to be serious psychiatric conditions and attempts to link them to other psychological disparities, is a crucial step in solving the mystery.

Previously: KQED health program examines causes and effects of disordered eating

Myths, Nutrition, Obesity

Effects of diet sodas on weight gain remain uncertain


Recent studies suggesting that diet sodas may lead to weight gain have stirred up interest among diet-soda drinkers and non-drinkers alike, confirming suspicions that the “diet” label and zero-calorie contents may be too good to be true. One of these studies, presented to the American Diabetes Association in June, associated diet soda consumption with a waistline increase 70 percent greater than that of non-drinkers. These results, along with those of several related studies, support the idea that diet or “light” foods and beverages may contribute to weight gain.

But Loyola University obesity specialist Jessica Bartfield, MD, thinks that we should take these studies with a grain of salt (or, if you prefer, aspartame). A release that came out today quotes her take on the issue:

“I suspect that people are likely drinking those diet sodas to wash down high-fat and high-calorie fast-food or take-out meals, not as a complement to a healthy meal prepared at home or to quench a thirst after a tough workout.”

In other words, it’s not the fake sugar in diet soda that causes weight gain; it’s the lifestyle choices that usually accompany it. Switching from regular soda to zero-calorie diet varieties, she argues, may be an effective weight-loss strategy, as long as users aren’t canceling it out with an otherwise high-calorie diet.

Bartfield also points out the importance, in the case of obesity studies, of taking all factors into account:

“The association studies are significant and provocative, but don’t prove cause and effect,” said Bartfield, who counsels weight-loss patients at the Chicago-area Loyola University Health System. “Although these studies controlled for many factors, such as age, physical activity, calories consumed and smoking, there are still a tremendous number of factors such as dietary patterns, sleep, genetics, and medication use that account for weight gain.”

Dieters looking for a satisfying answer to their weight-loss questions may be annoyed by the back-and-forth on issues like these. Then again, if obesity were a straightforward issue, we’d have solved it already.

Photo by computerjoe

Autism, Genetics, Myths, Neuroscience, Research

Unsung brain-cell population implicated in variety of autism

Like the late Rodney Dangerfield, and as I once wrote in Stanford Medicine, glial cells “don’t get no respect.” Combined, the three glial cell types - astrocytes, oligodendrocytes, and microglia - constitute a good 90 percent of all the cells in the brain. Yet the remaining 10 percent - the neurons - are so celebrated they’ve lent their name to brain science: neurobiology.

Stanford’s Ben Barres, MD, PhD, a lonely voice in the wilderness, has long advocated paying more attention to glial cells. His experience as a young neurologist in the 1980s convinced him that they’re involved in all sorts of brain pathology.

And, belatedly, glial cells are getting some grudging respect, in appreciation of their increasingly well-characterized roles in everything from directing blood vessels to increase their diameters in the vicinity of busy nerve circuits to determining which synapses will live and which will die.

In a new study just published in Nature Neuroscience, a genetic deficiency known to be responsible for Rett syndrome, the most physically disabling of the autistic disorders, has been shown to wreak many of its damaging effects via astrocytes. These gorgeous star-shaped glial cells, alone, account for almost half of all cells in the human brain (although by volume not so much, as they’re smaller than neurons).

In the study, investigators at Oregon Health & Science University employed a mouse model of Rett syndrome in which the condition’s defining gene defect was present in every cell of every mouse. When the investigators restored that defective gene to the mice’s astrocytes - and only their astrocytes - many of the signature symptoms of the disease cleared up.

Rett syndrome was once assumed to be exclusively a function of damaged neurons. This latest finding, like many others over the past decade, goes to show that glial cells aren’t just a bunch of packing peanuts whose main job is to keep our neurons from jiggling when we jog.

Photo by Ibtrav

Myths, Nutrition, Obesity

The dark side of “light” snacks: study shows substitutes may contribute to weight gain


Yet another get-thin-quick scheme has been debunked: A new study by Purdue University researchers shows that fake fats used in low-calorie snacks may actually contribute to weight gain.

Synthetic fat substitutes are the cornerstone of zero-calorie snack foods that market themselves as “diet” products. But when Purdue researchers put one group of rats on a high-fat diet and another group on a mixed diet containing both fatty products and products containing fat substitutes, they noticed that the mixed-diet rats gained more weight than the high-fat-diet rats.

“But,” I cried, dejectedly stuffing a handful of chips into my face, “how can low-calorie foods cause weight gain?” The release explains:

Food with a sweet or fatty taste usually indicates a large number of calories, and the taste triggers various responses by the body, including salivation, hormonal secretions and metabolic reactions. Fat substitutes can interfere with that relationship when the body expects to receive a large burst of calories but is fooled by a fat substitute.

“Is there any good news?” I whimpered, wiping tears and crumbs off onto my sleeve. Certainly. Synthetic fat substitutes only seem to promote weight gain when consumed as part of a high-fat diet. Low-fat dieters, like some of the rats in Purdue’s study, are safe from fat substitutes’ greasy grasp:

The rats that were fed a low-fat diet didn’t experience significant weight gain from either type of potato chips. However, when those same rats were switched to a high-fat diet, the rats that had eaten both types of potato chips ate more food and gained more weight and body fat than the rats that had eaten only the high-calorie chips.

As always, easy-weight-loss solutions like fat substitutes and artificial sweeteners (which have also been linked to weight gain) can prove disappointing and even dangerous. Looks like healthy foods and exercise are still your best bet.

Photo by bamalibrarylady

Global Health, Health Policy, Medicine and Society, Myths, Patient Care

The English patient meets the British health-care system… eventually

In all respects excluding its expense, health care in the U.S. has been given a bum rap. (See here, here and here.) I’ve kvetched about the shortcomings of the Canadian and French health-care systems, as experienced by my own flesh and blood relatives (or theirs).

Now, the latest knock on the British one. (Not that it’s the only thumbs-down on British health care.) A snippet:

Katherine Murphy, the director of the Patients Association, said it had heard from people whose hip or knee replacement had been postponed once or twice without them being offered a new date, leaving them in pain and with their independence compromised.

Previously: Forbes: U.S. still most-innovative country in biomedicine, Rand Corp. study says U.S. health care for elderly superior to UK’s, U.S. health system’s sketchy WHO rating is bogus, says horse’s mouth and Rush to judgment regarding the state of U.S. health care?


Infectious Disease, Myths, Nutrition

Does vitamin C work for the common cold?

Since I spend quite a bit of time traveling and can’t afford to get many colds, the slightest tickle in my throat usually sends me bounding into the kitchen in search of a bolus of vitamin C. There I tear into a small blue packet containing a pastel powder, dump it into a glass of water, and gulp down the resulting fizzy elixir. Thinking rationally, I recognize that the vitamin C is unlikely to help much - and yet I still practice this ritual every time I feel unwell.

Now I’ve come across an excellent analysis on Clinical Correlations of whether vitamin C works for the common cold, and it offers another reminder that I’m probably wasting my money. Carolyn Bevan, MD, writes:

…at the end of the day, is there any benefit to taking a daily vitamin C supplement, or for chugging down that fizzy shot of mega-dose vitamin C when you feel a cold coming on? If you are a marathon runner, or if you are planning a winter adventure in the arctic tundra, you should certainly consider a daily dose of vitamin C. For the rest of us, it doesn’t seem to be worth the hassle and expense of adding one more pill to our daily routine.

How Bevan gets to that conclusion is definitely worth reading - and her analysis is even peer reviewed. Thanks to her effort, at least until I replace marathon flights with marathon runs, it seems that I can probably skip the vitamin C.

Aging, Global Health, Health Policy, In the News, Myths, Patient Care

Rand Corp. study says US health care for elderly superior to UK’s

I’ve written before - in fact more than once - about statistics purporting to show that the U.S. health care system stinks compared with those of, say, Canada or Europe. It appears these apples-and-oranges comparisons may be full of beans.

The evidence keeps piling up. In a just-out Rand Corporation study, investigators compared death rates from various aging-associated diseases among older American and older English citizens. Older Americans are almost twice as likely to be diagnosed with such increasingly common conditions as type 2 diabetes, high blood pressure, and the like. Is this evidence of an inferior U.S. health care system?

Probably not. Those suffering from these conditions in the U.S. are only half as likely to die from them as their British counterparts. As a result, the afflicted Americans live at least as long as the afflicted British. Overall, 65-year-old Americans can expect to outlive their like-aged friends across the pond by about three months, despite their higher likelihood of having or getting an aging-associated illness.

The researchers note two possible explanations for why sick elders live longer in America than in England:

One is that the illnesses studied result in higher mortality in England than in the United States. The second is that the English are diagnosed at a later stage in the disease process than Americans.

Either explanation, they add, implies a more responsive health care system in the U.S. - at least for older people, who have nearly universal access to it. The fault appears to lie not in the health care system we Americans frequent (if you ignore its expense), but in our own lifestyle choices and, perhaps, other factors outside the control of both our health-care system and ourselves.
