Drug Helps Obese People Drop Weight And Keep It Off

The diabetes drug liraglutide can help obese people who don't have diabetes lose weight and keep it off, new findings confirm.

Researchers found that 63 percent of study participants given liraglutide for 56 weeks lost at least 5 percent of their body weight — the amount experts agree is needed to make a difference in obesity-related health problems — whereas just 27 percent of the placebo group lost that much.

"It is a very effective drug. It seems to be as good as any of the others on the market, so it adds another possibility for doctors to treat patients who are having trouble either losing weight or maintaining weight loss once they get the weight off," said Dr. Xavier Pi-Sunyer,  a professor of medicine at Columbia University Medical Center in New York City, and first author of the new study published today (July 1) in the New England Journal of Medicine. The company Novo Nordisk, the maker of liraglutide, funded the research.

Liraglutide has been available in the United States for treating people with diabetes since 2010. The drug mimics a naturally occurring hormone called glucagon-like peptide-1, which is released in the human intestine and reduces hunger, increases satiety and slows the rate at which the stomach empties its contents into the small intestine. The Food and Drug Administration approved liraglutide (at a higher dose than is used for diabetes) for treating obesity in December 2014.

In the new study, Pi-Sunyer and his colleagues randomly assigned 3,731 men and women with a body mass index of at least 30, or a BMI of at least 27 if they also had high cholesterol or high blood pressure, to receive a 3.0-milligram dose of liraglutide daily, or a placebo shot. Study participants also received counseling on ways to change their lifestyle to promote weight loss. About 2,500 patients in the study were given liraglutide, and about 1,200 were given the placebo injections.

After 56 weeks, the participants on liraglutide lost an average of 18.5 pounds, compared with 6.4 pounds for the people on the placebo. Among the patients on liraglutide, 33 percent lost at least 10 percent of their body weight, whereas just 11 percent of the placebo group lost that much. [7 Biggest Diet Myths]

The most common side effects of the drug were nausea and diarrhea. Patients on the medication were also at increased risk of gallbladder-related problems, which, the authors noted, could have been due to their above-average weight loss.

Starting patients at a lower dose and then increasing it gradually helps reduce gastrointestinal side effects, Pi-Sunyer said. For most patients, the nausea went away after they had been on the drug for four to six weeks, he added.

Drawbacks to the medication include its high cost — about $1,000 for a month of treatment — and the fact that it must be given by injection. Currently, most insurers don't cover liraglutide for treating obesity. Also, Pi-Sunyer said, patients will probably have to be on the drug indefinitely to maintain weight loss.

Nevertheless, "every tool we discover for obesity is good news," said Dr. Elias Siraj, a professor of medicine at Temple University School of Medicine in Philadelphia, who was not involved in the new study but co-authored an editorial accompanying it in the journal. "The reason is, we are in the midst of a huge global obesity epidemic, and there's no question it has not been easy to manage obesity."

Many of the people in the study who lost weight on liraglutide remained obese, Siraj said, although this doesn't mean they didn't benefit from losing weight. "Previous studies have shown if you lose more than 5 percent of your body weight, it may not make a difference in how you look from outside, but it does make a difference in terms of metabolic parameters and cardiovascular risk factors," he said.

The patients who will likely benefit the most from liraglutide are those with diabetes, high blood pressure, cholesterol and other obesity-related problems, he added. "You can't make a blanket recommendation that everyone should be on it," he said. "Cost is going to be an issue initially, but hopefully down the road the cost will get better."

The increased risk of gallstones and other problems associated with liraglutide should be investigated further, Siraj said. "There is always room for caution until we have long-term data."

"While there's room for options, we also have to note that this is not a cure," he told Live Science. "Fundamentally, obesity is a disease of lifestyle — diet and exercise — and therefore lifestyle modification has to be the core, no matter what you do. Medications alone are not going to do it."


The Fascinating History Of Urine Tests

To a doctor, urine can provide much information. One way for doctors to find out what’s going on inside the body is to examine what flows out of it. So don’t be surprised the next time a doctor asks for a urine sample for a seemingly non-urinary complaint.

In fact, be a little proud. When you hand over that little cup, you’re participating in a medical tradition more than 6,000 years in the making.

Ancient Babylonian and Sumerian physicians first inscribed their evaluations of urine into clay tablets as early as 4,000 B.C.

Later, in ancient Greece, Hippocrates, often called the father of Western medicine, expanded on urine’s importance: “No other organ system or organ of the human body provides so much information by its excretion as does the urinary system,” he wrote.

By the late Middle Ages, the study of urine had solidified into the practice known as uroscopy. Medieval doctors associated nearly every known disease with urinary characteristics, and some would diagnose patients without even meeting them, just by examining a bottle of their urine.

Uroscopy was commonplace, and it shows up in Shakespeare’s writings. In Henry IV, when Falstaff asks, “What says the doctor to my water?” he isn't just asking about his urinary health; because urine was so central to medicine at the time, he is effectively asking for the results of his entire checkup.

Although many uroscopy tests done in those times have been discredited, certain tests are still done today because they accurately indicate health problems, said Eric Wallen, a professor of urology at the University of North Carolina. “Malodorous urine is accurately classified as infected, red urine still notable for the presence of blood, [and] brown urine for bilirubin or blood products,” Wallen said.

“But it would be a rare physician today who [only] utilized this form of analysis,” Wallen said, and it would be especially rare to find one still “tasting the urine to diagnose diabetes.”


Better Sleep May Help Improve Schizophrenia

Sleep problems and schizophrenia may have common roots, raising hopes that the devastating mental disorder could be improved by helping patients overcome insomnia.

In a new study monitoring the sleep and circadian rhythms of people with schizophrenia, researchers found many more sleep problems in the schizophrenia patients versus mentally healthy controls. Combined with other research linking a schizophrenia-related gene with sleep-wake cycles in mice, the findings suggest that sleep and schizophrenia are more closely intertwined than ever realized, study researcher Russell Foster told LiveScience.

"We've been thinking of sleep disruption as one of the genetic, developmental and environmental contributors to the development of these appalling conditions," said Foster, who is a circadian and visual neuroscientist at the University of Oxford. 

Sleep and schizophrenia

Clinicians have long recognized that schizophrenia and disturbed sleep go hand-in-hand — about 80 percent of schizophrenia patients have sleep problems, Foster said. But these problems have usually been dismissed as a medication side effect or as the result of social isolation and unemployment in people with the disorder. [10 Stigmatized Health Disorders]

"That didn't make too much sense to me," Foster said.

Many mental disorders come with a side of sleep problems, including depression and bipolar disorder, Foster and his colleagues realized. And intriguingly, genes linked to circadian rhythm — the neural and biological system that attunes our sleep-wake cycles to dark and light — may play a role in some of these disorders. A gene called SNAP25, for example, is known to be important in the circadian system. SNAP25 abnormalities have also been linked to schizophrenia.

Studying sleep

In order to take a systematic look at the circadian rhythms of people with schizophrenia, Foster and his colleagues recruited 20 people with the disease and instructed them to wear movement-detecting wristwatches for six weeks. The amount of motion detected can be analyzed to determine whether the person is asleep or awake, given the vastly different movement patterns between the two states.
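The study doesn't spell out the algorithm used to score those wristwatch readings, but actigraphy data of this kind are commonly scored by comparing smoothed movement counts against a cutoff. The sketch below is only a minimal illustration of that general idea, not the Oxford team's method; the epoch length, smoothing window and threshold are assumed values chosen for the example.

```python
# Minimal sketch of actigraphy-style sleep/wake scoring.
# NOTE: This is not the algorithm used in the study; the window size and
# threshold are illustrative assumptions only.

def score_sleep_wake(activity_counts, window=5, threshold=20.0):
    """Label each epoch (e.g., one minute of wrist movement) as sleep or wake.

    activity_counts: per-epoch movement counts from a wrist-worn device.
    window: number of epochs averaged around each point (smoothing).
    threshold: mean activity below this value is scored as sleep.
    """
    labels = []
    half = window // 2
    for i in range(len(activity_counts)):
        start = max(0, i - half)
        end = min(len(activity_counts), i + half + 1)
        mean_activity = sum(activity_counts[start:end]) / (end - start)
        labels.append("sleep" if mean_activity < threshold else "wake")
    return labels

# A stretch of low movement (likely sleep) followed by a burst of activity.
counts = [2, 0, 1, 3, 0, 0, 45, 80, 120, 60]
print(score_sleep_wake(counts))  # mostly 'sleep' early on, 'wake' later
```

In practice, researchers use validated scoring algorithms built into actigraphy software, but the underlying principle is the same: sustained low movement is read as sleep.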

The patients also filled out questionnaires and kept daily diaries of their sleep and activities. All of the patients were taking medication to control their symptoms, and they had all been stable on that medication for at least three months. Finally, the patients gave 48 hours' worth of urine samples to be tested for melatonin, a hormone that regulates sleep (melatonin makes a person sleepy).

For comparison, the researchers asked another 21 mentally healthy but unemployed adults to wear the same watches and keep the same records as the people with schizophrenia. Unemployed people were chosen because the patients with schizophrenia were all unemployed, and employment can alter sleep patterns by forcing people to get up with an alarm clock.

The insomnia of schizophrenia

A comparison between the two groups revealed that while the unemployed people kept fairly regular sleep hours, every person with schizophrenia in the sample had a sleep problem.

"What became very clear is that they are massively and completely disrupted," Foster said.

This disruption did not follow a common pattern. Some people with schizophrenia went to bed late and got up late, with their melatonin release patterns delayed by several hours compared with healthy counterparts. Others would get up later and later every day, their circadian rhythms "drifting" through time. The most severely affected showed no normal 24-hour sleep-wake pattern at all. They'd alternate sleep and activity throughout the day and night. [Are You Getting Enough Sleep? (Infographic)]

The sleep disruptions were not simply a result of unemployment, because the unemployed-but-healthy group did not show them. Nor could they be linked to any specific medication or dosage level, Foster said.

These results, published in the April issue of the British Journal of Psychiatry, mesh with another recent study by Foster's team, this one published in January in the journal Current Biology. In that study, the researchers examined the sleep-wake behaviors of mice with a SNAP25 gene mutation mimicking schizophrenia.

"Quite amazingly those mice show a [sleep] pattern which is just like the patients with schizophrenia," Foster said.

In mice, the problem arises in broken communication between the cells in the brain that set the body's "clock" and the neurons that then go on to match the body's physiology to that clock. If the same is true of humans with schizophrenia, Foster said, it's possible that by easing sleep troubles, you could also decrease schizophrenia symptoms. This could be done with light therapy, melatonin treatment or even cognitive-behavioral therapy, a kind of talk therapy that helps patients change behaviors such as when and how they fall asleep.

"We want to look at individuals with full-blown conditions, bipolar, psychosis, schizophrenia, to try to develop therapies which will stabilize sleep-wake," Foster said. "And at the same time look precisely at the impact we're having on their physiology."


Taking ADHD Medications May Help Reduce Car Accidents

Adults with attention-deficit/hyperactivity disorder (ADHD) who stay on their medication may be safer when they get behind the wheel than those who don't, a new study suggests.

Researchers found a link between a person's use of prescribed ADHD medications and a reduced risk for motor vehicle accidents: U.S. men with ADHD were 38 percent less likely to be involved in a car crash during the months when they received medication for the condition, compared with the months when these same men were not receiving medication.

For U.S. women with ADHD, the rates of motor vehicle crashes were 42 percent lower during the months when the women had received their ADHD medications than during the months when these same women went without treatment, according to the findings. The research was published today (May 10) in the journal JAMA Psychiatry. [10 Ways to Keep Your Mind Sharp]

People with ADHD are more likely to have motor vehicle crashes than people who do not have the condition, but taking medication may help to reduce this risk, the researchers concluded. The study's lead author was Zheng Chang, a postdoctoral researcher in medical epidemiology and biostatistics at the Karolinska Institutet in Stockholm.

This is the first study to examine the association between the use of ADHD medication and motor vehicle crashes in the U.S., Chang told Live Science.

Previous studies have suggested that people with ADHD are more likely to be involved in motor vehicle crashes, because many of the symptoms of the disorder, such as inattention and impulsivity, may interfere with the ability to drive safely. However, researchers also thought that the use of ADHD medication could have a protective effect, reducing unsafe driving behaviors and resulting in fewer crashes and injuries.

In one study done in Sweden and published in 2014 in JAMA Psychiatry, researchers found that the use of ADHD medication in men was tied to a lower risk of traffic crashes, but it was unclear whether women's use of ADHD medications had a similar effect. 

In the new study, the researchers tracked more than 2 million American adults who had been diagnosed with ADHD between 2005 and 2014. The data came from a health insurance claims database that contained patient information from more than 100 health insurers, including inpatient and outpatient hospital visits as well as prescriptions filled.

To determine whether individuals with ADHD had been involved in car crashes, the researchers looked at visits to hospital emergency rooms resulting from motor vehicle accidents.

The findings showed that ADHD medications were associated with a reduced risk of motor vehicle accidents in both men and women, Chang said. In a separate analysis, the researchers estimated that up to 22 percent of the car crashes could have been avoided if the patients had received ADHD medications at some point during the entire study period, he noted. 

Chang said he suspects the findings may actually underestimate the effect of ADHD medication on car accidents. That's because the study did not include crashes in which people did not seek out medical services, Chang said.

"Many (possibly most) vehicular collisions do not result in an emergency room visit," two other researchers wrote in an editorial accompanying the study that appeared in the same issue of JAMA Psychiatry. In addition, the data did not include car crashes that involved a person dying at the scene, the editorial noted.

In all, the new study probably underreports the benefits of appropriate ADHD medication and its effects on driving safety, suggested the editorial authors, Dr. Vishal Madaan, a child psychiatrist, and Daniel J. Cox, a professor of psychiatry and neurobiology, both at the University of Virginia School of Medicine in Charlottesville.

Both of the editorial authors acknowledged receiving research support from pharmaceutical companies. [10 Facts Every Parent Should Know About Their Teen's Brain]

Madaan and Cox suggested that another limitation of the study is the way the researchers defined a patient's "use" of the ADHD medication; this was based only on whether he or she had filled the prescription during a given month. This "may have little bearing on whether the medication was active in the driver's body at the time of the driving mishap," the editorial said. 

It is not uncommon for people with ADHD to forget to take their prescribed medications or for the medication to have worn off at the time of the crash, especially if the accident occurred later in the evening, the editorial said.

But overall, the editorial concluded that the study findings confirm and extend the existing evidence for the influence of ADHD medications on accidents. The results "have impressive implications for [the] judicious use of ADHD medication," the editorial said.


Reality Check: Genetic Test To Tell How Long You'll Live?

Researchers in Spain say they're close to marketing a genetic test that could tell consumers how fast they are aging and, potentially, how long they will live. But experts say that such claims are false.

The Independent, a British newspaper, reported today (May 16) that scientists are developing a blood test that would measure the length of an individual's telomeres, the caps on the tips of chromosomes that protect them from damage. Telomeres are thought to play a role in aging, and previous studies have found an association between telomere length and lifespan.

The test would tell an individual if their "biological age" — the age of their cells — matches their chronological age, the Independent reported. This information, in turn, might tell a person how many years of his or her life remain. The researchers hope to market their test to the general public later this year, sold by the company Life Length.

Estimating biological age

However, experts argue that the scientific understanding of telomeres is not at the point where such a test would be accurate. We know telomere length changes with age, said Carol Greider, a geneticist at Johns Hopkins School of Medicine in Baltimore who studies telomeres. But in the general population, the length of people's telomeres varies widely. A 20-year-old and a 70-year-old might have telomeres of the same length, Greider said.

"We cannot tell how old a person is by looking at their telomeres," Greider told MyHealthNewsDaily. In addition, you can't tell someone they have the cells of a 50-year-old, even though they're 20, she said. "I would say that it is not possible to tell a persons 'biological age' from their telomere length," Greider said. If a test says it will tell you how long you will live "clearly that's not true," she said.

Others say it is possible to get a ballpark "biological age" by looking at an individual's telomeres. But it's essential to have information on additional factors as well, including the person's gender, age at the time of the test, family history of disease, smoking history and how often they are exposed to sun, all of which can influence telomere length, said Gil Atzmon, a researcher at the Albert Einstein College of Medicine in New York who has studied the genetics of aging. Taking all of this information into account, you could estimate a person's biological age, but the estimate would be off by 5 to 10 percent, Atzmon said. That means that if a test predicted your biological age was 50, your real biological age could be anywhere between 45 and 55, Atzmon said.

The researchers say they will determine the percentage of very short, or "dangerous," telomeres within a cell.

"A short telomere represents a persistent and non-repairable damage to the cells, which is able to prevent their division or regeneration," said Maria Blasco, inventor of the test and researcher at the Spanish National Cancer Research Centre in Madrid. The researchers hope to construct a database of telomere length values for the general population so they can tell "whether the percentage of short telomeres of a given person is within normality for a given age or indicates a younger or older biological age," Blasco told MyHealthNewsDaily.

The genetic test would take into account other factors that affect aging, Blasco said.

However, Blasco stresses, "We will not tell the clients how long they will [live]."

Looking at the length of telomeres does have some known clinical uses. Individuals with the shortest telomeres — shorter than 99 percent of the population — are at risk for certain diseases, including bone marrow failure and lung disease, Greider said.

Consumer interest?

Jerry Shay, a professor at the University of Texas Southwestern Medical Center in Dallas and a consultant for Life Length, said consumers would be interested in such information. "I think people are just basically curious about their own mortality. If you ask people what they worry about, most people would say they are worried about dying," Shay told the Independent.

He added: "People might say 'If I know I'm going to die in 10 years I'll spend all my money now,' or 'If I'm going to live for 40 more years I'll be more conservative in my lifestyle.'"

Greider said it's up to consumers whether they want to have this information, but up to scientists to make sure the public understands the true meaning of the results.

"It's a very personal choice whether somebody wants to know their genetic status," Greider said. "It is up to [scientists] to accurately say what we understand the genetic changes mean," she said.

Pass it on: Experts question the accuracy of a genetic test that would use telomere length to tell an individual how fast they are aging.


Which States Are Eating Their Fruits And Veggies?

Despite fruits' and vegetables' firm place in a healthy diet — and certainly in a diet for weight loss — Americans just aren't eating enough produce each day.

Data from the Centers for Disease Control and Prevention show that the majority of Americans aren't getting enough fruits or vegetables into their diet on a daily basis.

The U.S. Department of Agriculture recommends that people eat three to five servings of vegetables daily, and two to four servings of fruit daily. But the CDC data show not only that just a small minority of Americans are eating the recommended amount, but also that many Americans aren't eating fruits or vegetables even one time each day.

People in Arkansas fared the worst for eating even a minimal amount of fruit, with about half (50.5 percent) reporting that they eat fruit less often than once per day.

For comparison, Californians appear to be the least likely to skimp on fruit. Only 30.4 percent of people in the Golden State eat fruit less often than once per day.

Americans are doing a slightly better job of at least occasionally eating vegetables. People in Louisiana were most likely to eat vegetables only minimally, with 32.7 percent reporting they eat vegetables less often than once per day.

Oregonians, on the other hand, were the most likely to eat at least some vegetables, with only 16.3 percent eating vegetables less often than once per day.

This map shows the percentage of people in each state who eat vegetables less often than once per day. (A higher percentage means generally lower rates of vegetable consumption.)

This map shows the percentage of people in each state who eat fruit less often than once per day. (A higher percentage means generally lower rates of fruit consumption.)

This article is part of a Live Science Special Report on the Science of Weight Loss.


Why Does Plague Still Occur In The Western US?

Three cases of plague have occurred in the United States in recent months, and although the illness is rare, it's not uncommon to have a few cases here each year.

Most recently, a girl in California became sick with plague after visiting Yosemite National Park and the nearby Stanislaus National Forest in mid-July, according to the California Department of Public Health. The girl was hospitalized and is recovering, and officials are investigating the source of her infection.

There were also two deaths from plague in Colorado this summer — in early June, a 16-year-old boy in Larimer County died, and this week, an adult in Pueblo City died.

Plague cases occur sporadically in the United States — between 1970 and 2012, an average of seven plague cases occurred yearly, according to the Centers for Disease Control and Prevention. In 2014, there were 10 plague cases, and in 2013 and 2012, there were four cases each year.

But plague cases don't show up everywhere. Rather, most occur in rural areas in western states, particularly southern Colorado, northern New Mexico, northern Arizona, California, southern Oregon and western Nevada, the CDC says.

One reason why cases of plague are restricted to the West is that the rodent populations there carry the disease, said Dr. Amesh Adalja, an infectious-disease specialist and a senior associate at the University of Pittsburgh Medical Center's Center for Health Security.

"Prairie dogs are one of the major rodent species that serves as a reservoir for plague, and they tend to be west of the 100th meridian" in the United States. For this reason, this line of longitude is sometimes referred to as the "plague line," Adalja said.

The disease is caused by bacteria called Yersinia pestis, which are carried by rodents and their fleas. The most common way for people to contract plague is from fleabites, but people can also get the disease if they have contact with infected animals, or their tissue or fluids, the CDC says. [Pictures of a Killer: A Plague Gallery]

There are several types of plague, with bubonic plague being the most common. This form of the disease causes swelling of the lymph nodes, as well as fever, chills and headache. Bubonic plague does not spread from person to person.

People can protect themselves from plague — as well as other diseases that are carried by insects — by wearing protective clothing (such as long pants tucked into socks) and using insect repellent, Adalja said.

People should also not feed squirrels or other rodents, and shouldn't touch dead rodents without appropriate protective equipment, Adalja said.

And although plague cases can occur when people visit rural areas, Adalja noted that people are more likely to be infected with tick-borne illnesses, such as Lyme disease, than with plague.

"The plague is a very rare disease in the United States, and people shouldn’t be too unduly concerned about it," Adalja said. "It's not something that people should really change their vacation plans over."


How Opioid Prescriptions Have Changed Recently: New Report

Too many Americans are prescribed too many opioids for too long, according to a new report from the Centers for Disease Control and Prevention (CDC).

Though the rates of doctors prescribing opioids have decreased since 2010, they remained high in 2015 in the U.S., according to the report. In addition, the amount of opioids prescribed to Americans remained high in 2015, though it, too, has declined since its 2010 peak.

There were enough prescription opioids in the U.S. in 2015 "for every American to be medicated around the clock for three weeks," Dr. Anne Schuchat, acting director of the CDC, said at a news conference today (July 6). [America's Opioid-Use Epidemic: 5 Startling Facts]

"Higher opioid prescribing practices place residents … at greater risk for opioid addiction, overdose and death," Schuchat said. And the United States is currently experiencing the highest opioid overdose death rates ever recorded in the country, she added. This high death rate is driven by prescription opioids as well as illegal opioids, including heroin and illegally manufactured fentanyl, a particularly powerful form of the drug.

Schuchat said "high opioid prescribing" can be thought of in three parts.

First, too many opioid prescriptions are being written, Schuchat said. The new report found that in 2015, there were 71 opioid prescriptions written for every 100 people in the United States. This rate is down from a high of 81 prescriptions per 100 people in 2010 to 2012, according to the report.

Second, opioids are being prescribed for too many days, Schuchat said. The average length of an opioid prescription has increased by about one-third, from 13 days in 2006 to nearly 18 days in 2015, the report found.

Even taking opioids for "just a few days makes a person more likely to take them long term," Schuchat said. And "taking even a low-dose opioid for more than three months increases the risk of addiction by 15 times," she said.

Third, the doses of the drugs that are prescribed are too high, Schuchat said.

To calculate the total amount of opioids prescribed to people, the CDC uses a measure called "morphine milligram equivalents" (MMEs). This measure takes into account the relative strengths of different types of opioids, using morphine as the standard. Some prescription opioids are weaker than morphine (1 milligram of codeine, for example, is equivalent to 0.15 milligrams of morphine), and others are stronger (such as 1 milligram of hydromorphone, which is equal to 4 milligrams of morphine).
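To make that arithmetic concrete, here is a minimal sketch of how a daily dose can be converted into MMEs. It uses only the conversion factors quoted in this article (morphine is the reference standard by definition); the function and the example numbers are illustrative assumptions, not the CDC's actual calculation tool, which covers many more drugs and formulations.

```python
# Minimal sketch of converting a daily opioid dose into morphine milligram
# equivalents (MMEs), using only the conversion factors cited in the article.
# A real calculation would draw on the CDC's full conversion table.

MME_PER_MG = {
    "morphine": 1.0,       # reference standard
    "codeine": 0.15,       # 1 mg codeine is equivalent to 0.15 mg morphine
    "hydromorphone": 4.0,  # 1 mg hydromorphone is equivalent to 4 mg morphine
}

def daily_mme(drug, mg_per_dose, doses_per_day):
    """Express a daily opioid regimen in morphine milligram equivalents."""
    return MME_PER_MG[drug] * mg_per_dose * doses_per_day

# Example: 2 mg of hydromorphone taken four times a day works out to
# 32 MMEs per day, below the 50- and 90-MME thresholds discussed below.
print(daily_mme("hydromorphone", 2, 4))  # -> 32.0
```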

Taking higher doses has been linked with an increased risk of dying from an overdose, Schuchat said. People who take a dose of 50 MMEs per day have double the risk of dying from an overdose compared with people who take a dose of 20 MMEs per day or less, Schuchat said. And a dose of 90 MMEs or more per day — which the researchers consider a "high" dose — is associated with a 10-fold greater risk of opioid overdose death compared with 20 MMEs per day. [11 Facts About Heroin]

The report found that about 7 out of 100 opioid prescriptions in 2015 were for 90 MMEs or higher a day.

Overall, opioid prescribing rates decreased by 18 percent from 2010 to 2015, the report found. But that decrease was limited to about half of the counties in the U.S., and every state has high-prescribing counties, Schuchat said.

There is "tremendous variation between counties," and rates vary "as much from place to place as the weather," Schuchat said. In 2015, for example, six times more opioids were dispensed in the highest-prescribing counties than in the lowest-prescribing counties, the report found.

The researchers noted that several factors were associated with counties that had high opioid prescribing rates. These included having a small city, a greater percentage of white residents, a higher concentration of primary care doctors or dentists, greater rates of people who did not have health insurance or were unemployed, and more people with diabetes, arthritis or a disability.

However, these factors explained only about one-third of the wide variation in opioid prescribing, Schuchat said.

Schuchat also noted that the CDC has released guidelines for doctors about prescribing opioids. But because these guidelines were published in 2016, the data in the new report doesn't reflect what impact, if any, they could have had on prescribing.

Researchers will be able to use the new report as a baseline with which to compare the effects of the CDC's 2016 guidelines, Schuchat said.


Pregnant Women Over 50 'Do Pretty Well,' Study Finds

The average age of women becoming mothers has risen in the United States, and in the last 20 years, a few women have even entered motherhood in their 60s.

Through in vitro fertilization, using embryos created from egg cells donated by younger women, women who have passed menopause can become pregnant and give birth.

A new study of 101 women age 50 and older who had children using donated eggs reveals that pregnancy at this age carries about the same risks as similarly induced pregnancies in younger women. The study is the largest one to date looking at pregnancy in post-menopausal women.

"These women do really pretty well," said Dr. Mark Sauer, senior author of the article and chief of the division of reproductive endocrinology and infertility at Columbia University Medical Center, where all the women in the study received IVF.

"If they're well-screened and well cared for, they really should do O.K.," Sauer said.

The study found women over age 50 had similar rates of complications, such as gestational diabetes and preterm labor, as women under age 42 who became pregnant after receiving donated eggs.

And although the older women had slightly higher rates of high blood pressure, that difference was small, and may have been due to chance.

The study is published in the February issue of the American Journal of Perinatology.

Pregnancy at older ages

While Sauer said the results of the study were surprising in terms of how well older mothers did, he noted that the women were highly screened and highly motivated.

"These are smart, educated, well-off people that are doing this," he said, and pregnancy after 50 is not common — the 101 cases in the study were collected over a decade.

One 49-year-old woman in the study died while pregnant (she was included in the study because she would have been 50 at the delivery). She had concealed from the doctors that she smoked three packs of cigarettes a day, which the doctors said likely contributed to her heart attack.

In general, carrying a pregnancy is much easier for an older woman's body than producing the egg needed to conceive one.

"The uterus is a very different organ than the ovaries," Sauer said.

Under a microscope, Sauer said, the uterus changes very little with age. Given adequate hormones, an older woman's uterus can sufficiently nourish a growing fetus.

Eggs, however, are a different story.

A 2009 study from the Sackler School of Medicine in Tel Aviv concluded that age 43 seems to be a cutoff point for IVF with a woman's own eggs, which is viable for only 5 percent of women at that age. While individual cases of natural pregnancy at older ages have been reported, the very fact of their publication suggests how rare such events are.

Sauer said celebrities who have given birth in their late 40s almost certainly used donor eggs, though they may not be acknowledging it. This may be preventing greater public acceptance of egg donation, he said.

In fact, a major challenge in infertility treatment is convincing women in their 40s and older to use donated eggs rather than their own, Sauer said. With donated eggs, the success rate is about 50 percent.

But how old is too old, and who decides?

Public attitudes towards older women having children have changed since research on such cases was first published.

Dr. Richard Paulson, who worked with Sauer in the 1990s on research at the University of Southern California and is currently the director of USC Fertility, said he has noticed a shift in acceptance.

"I think society has become comfortable with [alternative] parent situations," Paulson said.

Sauer believes women should have a choice as to when they have children, but said he understands the concerns.

It was in Sauer and Paulson's research group at USC that a 63-year-old woman became pregnant in 1996. Paulson said she misrepresented herself as 10 years younger.

"We tend to require ID now," Paulson said, noting that many IVF clinics restrict whom they give donated eggs to, with a cut-off age of 50 or 55.

Fifty-year-olds can expect to live another 30 years, and so will be able to raise their children.

"I lose my own personal comfort zone when you get over 60," he said, citing the physical, emotional and financial cost of raising a child, particularly for someone entering retirement. But the doctors agreed that age alone should not be a deciding factor in whether a woman should be treated.

"Of course IVF should not be denied solely based on age," said Dr. Sherman Silber, in Saint Louis, Mo., director of St. Luke's Hospital's Infertility Center.

Paulson said the new study provides more reassurance to doctors offering a reproductive option to older women.

"It points out that it is a relatively complicated pregnancy…but as you can see, most of them get through it just fine," he said.

"But before doing donor-egg IVF on a woman in her late 40s or 50s, you should ascertain that she has a good family support system to take care of the child if she should die before the average age for women in the U.S. of 84," Silber said.

Pass it on: Pregnancy in women over age 50 may be as safe as in younger women who use donated eggs.


Feel Controlled By Your Hunger? New Study May Show Why

Have you ever felt like you would do just about anything to satisfy a hunger craving? A new study in mice may help to explain why hunger can feel like such a powerful motivating force.

In the study, researchers found that hunger outweighed other physical drives, including fear, thirst and social needs.

To determine which feeling won out, the researchers did a series of experiments, according to the study, published today (Sept. 29) in the journal Neuron. [The Science of Hunger: How to Control It and Fight Cravings]

In one experiment, the mice were both hungry and thirsty. When given the choice of either eating food or drinking water, the mice went for the food, the researchers found. However, when the mice were well-fed but thirsty, they opted to drink, according to the study.

In an experiment meant to pit the mice's hunger against their fear, hungry mice were placed in a cage that had certain "fox-scented" areas and other places that smelled safer (in other words, not like an animal that could eat them) but also had food. It turned out that, when the mice were hungry, they ventured into the unsafe areas for food. But when the mice were well-fed, they stayed huddled in areas of the cage that were considered "safe," the researchers found. 

Hunger also outweighed the mice's social needs, the researchers found. Mice are usually social animals and prefer to be in the company of other mice, according to the study. When the mice were hungry, they opted to leave the company of other mice to go get food.

To figure out why hunger prevailed over other feelings, the researchers looked into the brains of the mice.

They focused on a specific type of nerve cell that has been linked to hunger. In the study, the researchers placed tiny fibers into the mice's brains, which gave them the ability to turn these nerve cells on and off.

When the researchers activated the nerve cells, the mice that had been fed acted the same way as the mice that had not been fed. In other words, "turning on" these nerve cells seemed to turn on hunger, and thus drove the mice to eat.

The findings suggest that these "hunger-tuned neurons" can "anticipate the benefits of searching for food, and then alter behavior accordingly," Michael Krashes, a principal investigator at the National Institute of Diabetes and Digestive and Kidney Diseases and the senior author of the study, said in a statement.

The decision to search for food instead of looking for water or hiding from predators has important evolutionary implications, Krashes said. Mice, as well as humans, are constantly presented with the chance to pursue an "array of behaviors," Krashes said.

But "we can't pursue all those behaviors at once," he said. Rather, we have to choose which feelings are most important to address during times of need. "Evolutionarily speaking, animals that consistently picked the right motivations over others have survived, while other animals have not," Krashes said.
