A new fertility drug, originally developed to prevent the recurrence of breast cancer, is 30 percent more effective in helping some women become pregnant than one used for more than 40 years, new research shows.
For a study published in the New England Journal of Medicine, researchers at seven different academic centers recruited 750 couples to compare the long-used fertility drug clomiphene citrate, commonly called clomid, to letrozole.
Of the 376 women who were given clomid, 72 became pregnant and gave birth. Of the 374 women who took letrozole, 103 gave birth.
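Converted to rates, those raw counts make the gap easier to see. Here is a quick sketch using only the numbers reported above; the published study adjusts for additional factors:

```python
# Live-birth rates from the raw counts reported above.
clomid_births, clomid_total = 72, 376
letrozole_births, letrozole_total = 103, 374

clomid_rate = clomid_births / clomid_total           # ~19.1%
letrozole_rate = letrozole_births / letrozole_total  # ~27.5%

print(f"clomid:     {clomid_rate:.1%}")
print(f"letrozole:  {letrozole_rate:.1%}")
print(f"difference: {letrozole_rate - clomid_rate:.1%} (absolute)")
```

On these unadjusted numbers, letrozole's live-birth rate is roughly 8 percentage points higher than clomid's.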
“Letrozole works better, has about the same cost, has fewer side effects, and has a slightly lower twin rate than clomid,” says Gregory Christman, director of the division of reproductive endocrinology and infertility at University of Florida.
“Clomid has been available for fertility treatment for more than 40 years, but with this new information, we may soon have to reconsider its role in the treatment of infertility due to anovulation in women with polycystic ovarian disorder.”
Clomid is often prescribed to women with polycystic ovary syndrome as a first step in their treatment—and that population accounts for about a third of women who seek fertility treatment, Christman says. About 1 in 20 women of childbearing age have the disorder.
Women with the condition typically have fewer menstrual periods—seven fewer cycles per year than women without the condition—and therefore have fewer opportunities to become pregnant.
How they work
Christman oversaw one of the trial’s sites as one of the principal investigators at the University of Michigan. There, he recruited 75 of the 750 couples for the study. Women in the study, who were an average of 29 years old, were randomly assigned to take either clomid or letrozole.
Because the drugs were administered in the same way—both were given for a five-day period at the beginning of a woman’s cycle—the study was double-blinded. Neither the doctor nor the patient knew which drug the patient was receiving.
Clomid works by traveling to the brain, where it partially blocks estrogen receptors. This triggers the brain to send a signal to the ovaries to produce more estrogen, which causes ovulation.
Letrozole is prescribed to prevent recurrence of breast cancer in women by shutting off an enzyme that converts circulating testosterone to estrogen. It works primarily in fat or adipose tissue throughout the body, causing estrogen levels in a woman’s bloodstream to fall. The brain sees this drop in estrogen and tells the ovaries to make more estrogen, which triggers ovulation, Christman says.
Fewer twins
The study also found that letrozole results in fewer twins. Approximately 10 percent of women who are treated with clomid give birth to twins. The rate drops to between 3 and 4 percent in women who take letrozole.
“It always makes you smile when you hear someone is expecting twins, but because of the increased risks of a multiple pregnancy it would be better and safer if people conceived one baby at a time,” Christman says.
“This study indicates that there is a safe and effective medical treatment to help infertility patients with polycystic ovarian syndrome, which is one of the most common conditions causing infertility,” says David S. Guzick, senior vice president for health affairs and president of University of Florida Health, who helped oversee the study.
Generic versions of both medications are available, making treatment with either drug affordable.
The National Institute of Child Health and Human Development branch of the National Institutes of Health funded the study.
Source: University of Florida
Leaf-mining insects disappeared from the western United States after the late-Cretaceous asteroid impact that also triggered the extinction of dinosaurs.
Only a million years later, at Mexican Hat, in southeastern Montana, fossil leaves show diverse leaf-mining traces from new insects that were not present during the Cretaceous, according to paleontologists.
“Our results indicate both that leaf-mining diversity at Mexican Hat is even higher than previously recognized, and equally importantly, that none of the Mexican Hat mines can be linked back to the local Cretaceous mining fauna,” says Michael Donovan, graduate student in geosciences at Penn State.
Insects that eat leaves produce very specific types of damage. One type is from leaf miners—insect larvae that live in the leaves and tunnel for food, leaving distinctive feeding paths and patterns of droppings.
Donovan, Peter Wilf, professor of geosciences at Penn State, and colleagues looked at 1,073 leaf fossils from Mexican Hat for mines.
They compared these with more than 9,000 leaves from the end of the Cretaceous, 65 million years ago, from the Hell Creek Formation in southwestern North Dakota, and with more than 9,000 Paleocene leaves from the Fort Union Formation in North Dakota, Montana, and Wyoming.
“We decided to focus on leaf miners because they are typically host-specific, feeding on only a few plant species each,” says Donovan. “Each miner also leaves an identifiable mining pattern.”
The researchers found nine different mine-damage types at Mexican Hat attributable to the larvae of moths, wasps, and flies, and six of these damage types were unique to the site.
No refuge for leaf miners
The researchers weren’t sure whether the high diversity of leaf miners at Mexican Hat compared to other early Paleocene sites, where there is little or no leaf mining, was caused by insects that survived the extinction event in refugia—areas where organisms persist during adverse conditions—or by range expansions of insects from somewhere else during the early Paleocene.
However, with further study, the researchers found no evidence that any leaf miners survived across the Cretaceous-Paleocene boundary, suggesting an even more complete collapse of terrestrial food webs than previously recognized.
“These results show that the high insect damage diversity at Mexican Hat represents an influx of novel insect herbivores during the early Paleocene and not a refugium for Cretaceous leaf miners,” says Wilf.
“The new herbivores included a startling diversity for any time period, and especially for the classic post-extinction disaster interval.”
Insect extinction across the Cretaceous-Paleocene boundary may have been directly caused by catastrophic conditions after the asteroid impact and by the disappearance of host plant species. While insect herbivores constantly need leaves to survive, plants can remain dormant as seeds in the ground until more auspicious circumstances occur.
Insect outbreak
The low-diversity flora at Mexican Hat is typical for the area in the early Paleocene, so what caused the high insect damage diversity?
Insect outbreaks involve a rapid population increase of a single insect species, so the high diversity of mining damage seen in the Mexican Hat fossils makes an outbreak an improbable explanation.
The researchers hypothesized that the leaf miners that are seen in the Mexican Hat fossils appeared in that area because of a transient warming event, a number of which occurred during the early Paleocene.
“Previous studies have shown a correlation between temperature and insect damage diversity in the fossil record, possibly caused by evolutionary radiations or range shifts in response to a warmer climate,” says Donovan.
“Current evidence suggests that insect herbivore extinction decreased with increasing distance from the asteroid impact site in Mexico, so pools of surviving insects would have existed elsewhere that could have provided a source for the insect influx that we observed at Mexican Hat.”
The researchers present their results in PLOS ONE.
Other researchers are from the National Museum of Natural History/Smithsonian Institution, University of Maryland, and Baylor University.
Source: Penn State
Earth’s biodiversity, the product of 3.5 billion years of evolutionary trial and error, is the highest in the history of life—but it may be reaching a tipping point.
A new review, published in Science, cautions that the loss and decline of animals is contributing to what appears to be the early days of the planet’s sixth mass biological extinction event.
Since 1500, more than 320 terrestrial vertebrates have gone extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life.
And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be attributed to human activity, a situation some researchers call an era of “Anthropocene defaunation.”
Across vertebrates, 16 to 33 percent of all species are estimated to be globally threatened or endangered. Large animals—a group known as megafauna that includes elephants, rhinoceroses, polar bears, and countless other species worldwide—face the highest rate of decline, a trend that matches previous extinction events.
Fewer megafauna, more rodents
Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.
Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health.
For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes, and elephants, and observed how an ecosystem reacts to the removal of its largest species.
Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops.
Consequently, the number of rodents doubles—and so does the abundance of the disease-carrying ectoparasites that they harbor.
‘A vicious cycle’
“Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission,” says Rodolfo Dirzo, professor of biology at Stanford University.
“Who would have thought that just defaunation would have all these dramatic consequences? But it can be a vicious circle.”
The scientists also detailed a troubling trend in invertebrate defaunation. Human population has doubled in the past 35 years; in the same period, the number of invertebrate animals—such as beetles, butterflies, spiders, and worms—has decreased by 45 percent.
As with larger animals, the loss is driven primarily by loss of habitat and global climate disruption, and could have trickle-up effects in our everyday lives.
Solutions are complicated
For instance, insects pollinate roughly 75 percent of the world’s food crops, a service representing an estimated 10 percent of the economic value of the world’s food supply. Insects also play a critical role in nutrient cycling and decomposing organic materials, which helps ensure ecosystem productivity. In the United States alone, the value of pest control by native predators is estimated at $4.5 billion annually.
The solutions are complicated. Immediately reducing rates of habitat change and overexploitation would help, but these approaches need to be tailored to individual regions and situations. Dirzo says he hopes that raising awareness of the ongoing mass extinction—and not just of large, charismatic species—and its associated consequences will help spur change.
“We tend to think about extinction as loss of a species from the face of Earth, and that’s very important, but there’s a loss of critical ecosystem functioning in which animals play a central role that we need to pay attention to as well,” Dirzo says.
“Ironically, we have long considered that defaunation is a cryptic phenomenon, but I think we will end up with a situation that is non-cryptic because of the increasingly obvious consequences to the planet and to human wellbeing.”
Researchers from University of California, Santa Barbara; Universidade Estadual Paulista in Brazil; Universidad Nacional Autonoma de Mexico; the Natural Environment Research Council Centre for Ecology and Hydrology in England; and University College London are coauthors of the study.
Source: Stanford University
A new vaccine uses a booster normally found in cancer vaccines to combat dust-mite allergies by naturally switching the body’s immune response.
In animal tests, the nano-sized vaccine package lowered lung inflammation by 83 percent, despite repeated exposure.
“What is new about this is we have developed a vaccine against dust-mite allergens that hasn’t been used before,” says Aliasger Salem, professor in pharmaceutical sciences at University of Iowa and a corresponding author of the paper.
Ubiquitous, microscopic dust mites that burrow in mattresses, sofas, and other homey spots are found in 84 percent of households in the United States, according to a published national survey.
Feeding on shed skin cells, the mites trigger allergies and breathing difficulties among 45 percent of those who suffer from asthma, according to some studies. Prolonged exposure can cause lung damage.
Treatment is limited to temporary relief from inhalers or regular exposure to the allergen to build up tolerance—a long-term process with no guarantee of success.
Alleviate mite-induced asthma
“Our research explores a novel approach to treating mite allergy in which specially-encapsulated miniscule particles are administered with sequences of bacterial DNA that direct the immune system to suppress allergic immune responses,” says Peter Thorne, public health professor and a contributing author of the paper. “This work suggests a way forward to alleviate mite-induced asthma in allergy sufferers.”
The vaccine takes advantage of the body’s natural inclination to defend itself against foreign bodies. A key to the formula lies in the use of an adjuvant—which boosts the potency of the vaccine—called CpG. The booster has been used successfully in cancer vaccines but never had been tested as a vaccine for dust-mite allergies.
Put broadly, CpG sets off a fire alarm within the body, springing immune cells into action. Those immune cells absorb the CpG and dispose of it.
This is important, because as the immune cells absorb CpG, they’re also taking in the vaccine, which has been added to the package, much like your mother may have wrapped a bitter pill in something tasty to get you to swallow it. In another twist, combining the antigen (the vaccine) and CpG causes the body to change its immune response, producing antibodies that dampen the damaging health effects dust-mite allergens generally cause.
In lab tests, the CpG-antigen package, at 300 nanometers in size, was absorbed 90 percent of the time by immune cells. Researchers followed up those experiments by giving the package to mice and exposing the animals to dust-mite allergens every other day for nine days total.
Packages with CpG yielded greater production of the desirable antibodies, and lung inflammation was lower than in animals given particles that did not contain CpG.
“This is exactly what we were hoping for,” Salem says.
First author of the paper, in the AAPS Journal, is Vijaya Joshi, a graduate fellow in pharmacy. The National Institutes of Health and the American Cancer Society partly funded the research.
Source: University of Iowa
In Aesop’s fable about the crow and the pitcher, a thirsty bird finds a vessel of water, but when he tries to drink from it, he finds that the water level is too low. Not strong enough to knock over the pitcher, the bird drops pebbles into it—one at a time—until the water level rises enough for him to drink his fill.
Highlighting the value of ingenuity, the fable demonstrates that cognitive ability can often be more effective than brute force. It also characterizes crows as pretty resourceful problem solvers.
New research conducted by University of California, Santa Barbara’s Corina Logan and collaborators suggests the birds’ intellectual prowess may be more fact than fiction. The findings appear in PLOS ONE.
“We showed that crows can discriminate between different volumes of water and that they can pass a modified test that so far only 7- to 10-year-old children have been able to complete successfully,” says Logan, who is lead author of the paper.
“We provide the strongest evidence so far that the birds attend to cause-and-effect relationships by choosing options that displace more water.”
Logan, a junior research fellow at UCSB’s SAGE Center for the Study of the Mind, worked with New Caledonian crows in a set of small aviaries in New Caledonia run by the University of Auckland.
Training wild crows
“We caught the crows in the wild and brought them into the aviaries, where they habituated in about five days,” she says. Keeping families together, they housed the birds in separate areas of the aviaries for three to five months before releasing them back to the wild.
Getting individual crows into the testing room proved to be an immediate challenge. “You open the testing room door and then open the aviary door, with the idea that the bird you want is going to fly through into the testing room,” she says. But with four birds in an aviary, directing a particular test subject is tricky at best.
“So I thought, let’s pretend the sky’s the limit and I can train them to do whatever I want,” Logan says. “I started by pointing at the one I wanted and continuing to point until he or she flew out. I got to the point where I could stand outside the aviary and point at the one I wanted and it would fly out while the other birds stayed put.”
Two birds in particular—007 and Kitty—became so well trained that Logan had only to call them by name and they’d fly into the testing room.
Narrow vs. wide
The testing room contained an apparatus consisting of two beakers of water, the same height, but one wide and the other narrow. The diameters of the lids were adjusted to be the same on each beaker.
“The question is, can they distinguish between water volumes?” Logan says. “Do they understand that dropping a stone into a narrow tube will raise the water level more?”
In a previous experiment by Sarah Jelbert and colleagues at the University of Auckland, the birds had not preferred the narrow tube. However, in that study, the crows were given 12 stones to drop in one or the other of the beakers, giving them enough to be successful with either one.
“When we gave them only four objects, they could succeed only in one tube—the narrower one, because the water level would never get high enough in the wider tube; they were dropping all or most of the objects into the functional tube and getting the food reward,” Logan explains. “It wasn’t just that they preferred this tube, they appeared to know it was more functional.”
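The physics the birds appear to exploit is plain displacement: a submerged stone raises the water level by its volume divided by the tube's cross-sectional area, so the same stone buys more height in a narrow tube. A rough sketch with made-up dimensions (the actual apparatus measurements are not given here):

```python
import math

def level_rise_mm(stone_volume_mm3, tube_diameter_mm):
    """Water-level rise from one fully submerged stone."""
    area = math.pi * (tube_diameter_mm / 2) ** 2
    return stone_volume_mm3 / area

STONE = 500.0  # mm^3 per stone; hypothetical value
print(f"narrow tube: {level_rise_mm(STONE, 20):.2f} mm per stone")
print(f"wide tube:   {level_rise_mm(STONE, 50):.2f} mm per stone")
```

With these illustrative dimensions each stone lifts the narrow tube's level more than six times as far as the wide tube's, so with only four stones the wide tube may never bring the food within reach—consistent with the birds' preference for the narrow tube.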
However, she notes, we still don’t know exactly how the crows think when solving this task. They may be imagining the effect of each stone drop before they do it, or they may be using some other cognitive mechanism. “More work is needed,” Logan says.
The U-tube task
Logan also examined how the crows react to the U-tube task. Here, the crows had to choose between two sets of tubes. With one set, when subjects dropped a stone into a wide tube, the water level raised in an adjacent narrow tube that contained food.
This was due to a hidden connection between the two tubes that allowed water to flow. The other set of tubes had no connection, so dropping a stone in the wide tube did not cause the water level to rise in its adjacent narrow tube.
Each set of tubes was marked with a distinct color cue, and test subjects had to notice that dropping a stone into a tube marked with one color resulted in the rise of the floating food in its adjacent small tube.
“They have to put the stones into the blue tube or the red one, so all you have to do is learn a really simple rule that red equals food, even if that doesn’t make sense because the causal mechanism is hidden,” says Logan.
As it turns out, this is a very challenging task for both corvids (a family of birds that includes crows, ravens, jays, and rooks) and children. Children ages 7 to 10 were able to learn the rules, as Lucy Cheke and colleagues at the University of Cambridge discovered in 2012. It may have taken a couple of tries to figure out how it worked, Logan notes, but the children consistently put the stones into the correct tube and got the reward (in this case, a token they exchanged for stickers).
Children ages 4 to 6, however, were unable to work out the process. “They put the stones randomly into either tube and weren’t getting the token consistently,” she says.
Recently, Jelbert and colleagues from the University of Auckland put the New Caledonian crows to the test using the same apparatus the children did. The crows failed.
Good work, Kitty
So Logan and her team modified the apparatus, expanding the distance between the beakers. And Kitty, a six-month-old juvenile, figured it out.
“We don’t know how she passed it or what she understands about the task,” Logan says, “so we don’t know if the same cognitive processes or decisions are happening as with the children, but we now have evidence that they can. It’s possible for the birds to pass it.
“What we do know is that one crow behaved like the older children, which allows us to explore how they solve this task in future experiments,” she continues. Research on causal cognition using the water displacement paradigm is only beginning to get at what these crows know about solving problems. This series of experiments shows that modifying previous experiments is useful for gaining a deeper understanding.
Smaller-brained grackles
The research on the crows is part of a larger project Logan is working on to compare the cognitive powers of crows with those of grackles.
“So far, no smaller-brained species have been tested with the tests we use on the crows, and grackles are smaller-brained,” she says. “But they’re really innovative. So they may have a reason to pay attention to causal information like this.”
The next research phase will begin next month, after the grackles’ breeding season ends and they are ready to participate.
The National Geographic Society/Waitt Grants Program supported the work.
Source: UC Santa Barbara
Mechanical engineers have found a way to dramatically increase the sensitivity of a light-based plasmon sensor. They say it could potentially be used to detect incredibly minute traces of a hard-to-detect explosive popular among terrorists.
The engineers put the sensor to the test with various explosives—2,4-dinitrotoluene (DNT), ammonium nitrate, and nitrobenzene—and found that the device successfully detected the airborne chemicals at concentrations of 0.67 parts per billion, 0.4 parts per billion, and 7.2 parts per million, respectively.
One part per billion would be akin to a blade of grass on a football field.
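Those detection limits are easier to compare when written out as plain fractions. A small sketch of the unit conversion (treating ppb and ppm as simple number fractions):

```python
PPB = 1e-9  # parts per billion as a fraction
PPM = 1e-6  # parts per million as a fraction

# Detection limits reported above.
limits = {
    "DNT": 0.67 * PPB,
    "ammonium nitrate": 0.40 * PPB,
    "nitrobenzene": 7.2 * PPM,
}
for name, fraction in limits.items():
    print(f"{name}: about 1 part in {1 / fraction:,.0f}")
```

At 0.4 ppb, for example, the sensor is picking out roughly one part in 2.5 billion.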
The results, published in the journal Nature Nanotechnology, are much more sensitive than those for other optical sensors, says Xiang Zhang, professor of mechanical engineering at University of California, Berkeley.
“Optical explosive sensors are very sensitive and compact,” says Zhang, who is also director of the Materials Science Division at the Lawrence Berkeley National Laboratory (Berkeley Lab) and director at UC Berkeley of the National Science Foundation Nanoscale Science and Engineering Center.
“The ability to magnify such a small trace of an explosive to create a detectable signal is a major development in plasmon sensor technology, which is one of the most powerful tools we have today.”
Dogs and swabs
The new sensor could have an advantage over current bomb-screening methods, says co-lead author Ren-Min Ma, an assistant professor of physics at Peking University, who did the work as a postdoctoral researcher in Zhang’s lab.
“Bomb-sniffing dogs are expensive to train, and they can become tired. The other thing we see at airports is the use of swabs to check for explosive residue, but those have relatively low sensitivity and require physical contact. Our technology could lead to a bomb-detecting chip for a handheld device that can detect the tiny-trace vapor in the air of the explosive’s small molecules.”
The sensor also could be developed into an alarm for unexploded landmines that otherwise are difficult to detect. Landmines kill 15,000 to 20,000 people every year, according to the United Nations. Most of the victims are children, women, and the elderly.
Terrorists’ explosive of choice
The nanoscale plasmon sensor used in the lab experiments is much smaller than other explosive detectors on the market. It consists of a layer of cadmium sulfide, a semiconductor, that is laid on top of a sheet of silver with a layer of magnesium fluoride in the middle.
In designing the device, the researchers took advantage of the chemical makeup of many explosives, particularly nitro-compounds such as DNT and its more well-known relative, TNT.
Not only do the unstable nitro groups make the chemicals more explosive, they also are characteristically electron deficient. This quality increases the interaction of the molecules with natural surface defects on the semiconductor. The device works by detecting the increased intensity in the light signal that occurs as a result of this interaction.
“We think that higher electron deficiency of explosives leads to a stronger interaction with the semiconductor sensor,” says study co-lead author Sadao Ota, a former PhD student in Zhang’s lab who is now an assistant professor of chemistry at the University of Tokyo.
Because of this, the researchers are hopeful that their plasmon laser sensor could detect pentaerythritol tetranitrate, or PETN, an explosive compound considered a favorite of terrorists. Small amounts of it pack a powerful punch, and because it is plastic, it escapes X-ray machines when not connected to detonators.
This is the explosive that was found in Richard Reid’s shoe bomb in 2001 and Umar Farouk Abdulmutallab’s underwear bomb in 2009.
High level of sensitivity
US Attorney General Eric Holder Jr. was recently quoted in news reports as having “extreme, extreme concern” about Yemeni bomb makers joining forces with Syrian militants to develop these hard-to-detect explosives, which can be hidden in cell phones and mobile devices.
“PETN has more nitro functional groups and is more electron deficient than the DNT we detected in our experiments, so the sensitivity of our device should be even higher than with DNT,” Ma says.
The sensor represents the latest milestone in surface plasmon sensor technology, which is now used in the medical field to detect biomarkers in the early stages of disease.
The ability to increase the sensitivity of optical sensors traditionally had been restricted by the diffraction limit, a limitation in fundamental physics that forces a tradeoff between how long and in how small a space the light can be trapped.
By coupling electromagnetic waves with surface plasmons, the oscillating electrons found at the surface of metals, researchers were able to squeeze light into nanosized spaces, but sustaining the confined energy was challenging because light tends to dissipate at a metal’s surface.
A sharper signal
The new device builds upon earlier work in plasmon lasers by Zhang’s lab that compensated for this light leakage by using reflectors to bounce the surface plasmons back and forth inside the sensor—similar to the way sound waves are reflected across the room in a whispering gallery—and using the optical gain from the semiconductor to amplify the light energy.
The amplified sensor creates a much stronger signal than the passive plasmon sensors currently available, which work by detecting shifts in the wavelength of light, Zhang says.
“The difference in intensity is similar to going from a light bulb for a table lamp to a laser pointer. We create a sharper signal, which makes it easier to detect even smaller changes for tiny traces of explosives in the air.”
The sensor could have applications beyond chemical and explosive detection, such as use in biomolecular research.
The US Air Force Office of Scientific Research Multidisciplinary University Research Initiative program helped support this work.
Source: UC Berkeley
Struggling with multiple chronic illnesses shortens life expectancy dramatically, and for older Americans, it threatens to reverse recent gains in average lifespans.
Nearly four in five older Americans now live with multiple chronic medical conditions, which perhaps could explain why increases in life expectancy for US seniors are already slowing, report researchers.
“Living with multiple chronic diseases such as diabetes, kidney disease, and heart failure is now the norm and not the exception in the United States,” says lead author Eva H. DuGoff, a graduate of the Johns Hopkins University’s Bloomberg School of Public Health.
“The medical advances that have allowed sick people to live longer may not be able to keep up with the growing burden of chronic disease. It is becoming very clear that preventing the development of additional chronic conditions in the elderly could be the only way to continue to improve life expectancy.”
Life expectancy in the US is rising more slowly than in other parts of the developed world. Many blame the obesity epidemic and related health conditions for the worsening health of the American population.
Current expectations
The researchers used a nationally representative sample of Medicare beneficiaries enrolled as of January 2008. The data included 21 defined chronic conditions and the records of nearly 1.4 million people 67 and older.
The analysis found that, on average, a 75-year-old American woman with no chronic conditions will live 17.3 additional years to more than 92 years old. But a 75-year-old woman with five chronic conditions will live, on average, only to age 87, and the average 75-year-old woman with 10 or more chronic conditions will survive only to age 80.
Women continue to live longer than men, while white people live longer than black people.
Which diseases, as well as how many, matters. At 67, an individual with heart disease is estimated to live an additional 21.2 years on average, while someone diagnosed with Alzheimer’s disease is only expected to live 12 additional years.
‘Greater than the sum of its parts’
On average, life expectancy is reduced by 1.8 years for each additional chronic condition, the researchers found. But in reality, the arithmetic is not that neat and simple: The first disease shaves just a fraction of a year off life expectancy for older people, but the impact grows as the number of diseases adds up.
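The gap between that flat 1.8-year average and the study's reported figures can be checked with quick arithmetic. The sketch below is illustrative only: it uses the averages quoted above for a 75-year-old woman, and the naive linear model is an assumption for comparison, not the study's actual method.

```python
# Compare a naive linear model of life-expectancy loss (1.8 years per
# chronic condition) against the averages reported in the article for
# a 75-year-old woman. Figures come from the article; the linear model
# is an illustration, not the study's method.

BASELINE_EXPECTED_AGE = 92.3    # 75 years + 17.3 additional years, no conditions
YEARS_LOST_PER_CONDITION = 1.8  # average reduction per added condition

def naive_linear_expected_age(n_conditions: int) -> float:
    """Expected age at death if every condition cost the same 1.8 years."""
    return BASELINE_EXPECTED_AGE - YEARS_LOST_PER_CONDITION * n_conditions

# Reported averages from the study: 5 conditions -> ~87, 10+ -> ~80.
for n, reported in [(0, 92.3), (5, 87.0), (10, 80.0)]:
    linear = naive_linear_expected_age(n)
    print(f"{n:2d} conditions: linear model {linear:.1f}, reported ~{reported:.1f}")
```

The flat rule fails to reproduce the reported averages at five and ten conditions, which is the researchers' point: the cost of each added condition is not constant, so a single per-condition figure is only a rough summary.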
“We tend to think about diseases in isolation. You have diabetes or you have heart failure. But many people have both, and then some,” says senior author Gerard F. Anderson, professor of health policy and management at Johns Hopkins.
“The balancing act needed to care for all of those conditions is complicated; more organ systems become involved, as do more physicians prescribing more medications. Our system is not set up to care for people with so many different illnesses. Each one adds up and makes the burden of disease greater than the sum of its parts.”
The study’s findings, published in Medical Care, could be useful to Social Security and Medicare planners in making population and cost predictions.
Policymakers are facing a different landscape as so many more people are living with multiple chronic conditions than before: 60 percent of those 67 and older in the United States have three or more of these diseases, the researchers found. Eventually, there may be a tipping point, when the medical advances that have boosted life expectancy for so long can no longer keep pace with the many illnesses people are collecting as they age.
“We already knew that living with multiple chronic conditions affects an individual’s quality of life; now we know the impact on quantity of life,” DuGoff says. “The growing burden of chronic disease could erase decades of progress. We don’t want to turn around and see that life expectancy gains have stopped or reversed.”
The American Insurance Group supported the study.
Source: Johns Hopkins University
In the United States, one infant is born each hour with drug withdrawal, but the medical care they receive varies widely, new research shows.
These infants, born with neonatal abstinence syndrome (NAS) after being exposed to opioid medications like oxycodone in utero, can have seizures, difficulty feeding, respiratory complications, and low birth weights.
A new study shows that infants treated with methadone have a shorter length of treatment and length of stay when compared to other treatment methods for infants with the condition.
For the study, published in the Journal of Perinatology, researchers looked at data from 1,424 infants with NAS at 14 major children’s hospitals and found significant variations in treatment.
“We found that, even among large children’s hospitals, only a couple of hospitals followed the same method of treatment more than 80 percent of the time,” says Stephen Patrick, an assistant professor of pediatrics at Vanderbilt University, who completed the work as a neonatology fellow at the University of Michigan.
More babies born with NAS
“I think, overall, what this calls for is standardizing how we care for these infants as the American Academy of Pediatrics and World Health Organization suggests: making sure we are giving the same treatment using the best possible evidence that we can to improve outcomes for our babies and families.”
Recently, there has been a rapid increase in the number of infants born with NAS in the United States. Tennessee has been particularly affected. In some areas of the state, infants born with NAS make up 5 percent of all births, Patrick says.
“Throughout the United States we have seen a fourfold increase in sales of opioid pain relievers and, from that, we have seen complications in women and pregnant women that include anything from overdose deaths to treatment facility admissions to neonatal abstinence syndrome,” Patrick says. “I think the recent rise that we have seen in Tennessee can be attributed to the increase in sales and use of opioid pain relievers.”
Standardize care
The study looked at a combination of hospital administration and pharmacy data to evaluate differences in drug treatment for infants with NAS. Six hospitals primarily used methadone, six prescribed morphine, and two used phenobarbital. Methadone showed the best outcome.
Though more research is needed, Patrick says it is likely that methadone-treated infants will have fewer instances where they show signs of withdrawal.
Fewer signs of withdrawal could decrease the need for opioid dose escalations, potentially shortening their length of treatment and hospital stay.
“The problem with neonatal abstinence syndrome is there is not a lot of research to inform what we do. We hope to add to the evidence by comparing medications and by showing how variable care is for this vulnerable population. I hope our work will serve as a call to hospitals caring for infants with NAS to standardize how they care for patients,” Patrick says.
The Robert Wood Johnson Foundation Clinical Scholars Program, a collaboration among the Children’s Hospital of Philadelphia, Cincinnati Children’s Hospital, and the University of Michigan Health System, supported the study.
Source: Vanderbilt University
The first assessment of community-led marine conservation in the Western Indian Ocean shows a revolution in the management of more than 4,200 square miles of marine protected areas.
Marine protected areas (MPAs) are zones of the seas and coasts designed to protect wildlife from damage and disturbance, typically managed by governments rather than by local communities. They are rapidly increasing in number as countries rush to meet international conservation commitments.
“MPAs are vital tools for marine conservation but often fall short of their potential and can have negative impacts on local fishing communities,” says lead author Steve Rocliffe, a researcher with the University of York’s environment department.
“Against this backdrop, we’re seeing coastal communities across a vast swathe of the Indian Ocean taking more responsibility for their resources by setting up conservation zones known as ‘locally managed marine areas’ or LMMAs.
“LMMAs put people at the center: It’s the fishers themselves who are making the management decisions, based on their needs, their priorities, and their traditional ecological knowledge.”
The study, which covers a region of 11 coastal and island states stretching from Somalia in the north to South Africa in the south, appears in PLOS ONE.
Biodiversity targets
The inventory summarizes information on 62 local initiatives and 74 MPAs, providing agencies, researchers, and government officials with an important baseline against which to evaluate future efforts to expand marine conservation in the region.
The researchers assessed LMMAs in terms of geography, numbers, size, and governance structures, comparing them with areas under government stewardship and evaluating their potential contributions towards Convention on Biodiversity targets to effectively conserve 10 percent of marine and coastal ecological regions by 2020.
“LMMAs have proven to be a cost-effective, scalable, resilient, and more socially acceptable alternative to more traditional ‘top-down’ methods of marine resource management. They have also shown promise as a means to safeguard food security, address coastal poverty, and help coastal communities to adapt to climate change,” says Julie Hawkins of the University of York.
“We found that although locally managed marine areas are hampered by underdeveloped legal structures and enforcement mechanisms, they are emerging as a tool of choice in mainland Tanzania and Madagascar, where they cover 3.5 and 4.2 times more area than centrally managed MPAs respectively,” says Shawn Peabody of Blue Ventures Conservation.
“The way forward now is to establish a network through which LMMA practitioners can share experiences and best practice.”
The Natural Environment Research Council, the Economic and Social Research Council, and the John D. and Catherine T. MacArthur Foundation supported the study. The survey is a collaboration among the University of York, Blue Ventures Conservation, and CORDIO, an East African research organization.
Source: University of York
The size and age of plants have more of an impact on their productivity than temperature and precipitation do, a new study suggests.
Researchers combined a new mathematical theory with data from more than 1,000 forests around the world to show that climate has a relatively minor direct effect on net primary productivity, or the amount of biomass—wood or any other plant materials—that plants produce by harvesting sunlight, water, and carbon dioxide.
“A fundamental assumption of our models for understanding how climate influences the functioning of ecosystems is that temperature and precipitation directly influence how fast plants can take up and use carbon dioxide,” says Brian Enquist, professor in the ecology and environmental biology department at University of Arizona.
“Essentially, warm and wet environments are thought to allow plant metabolism to run fast, while cold and drier environments slow down metabolism and hence lower biomass production in ecosystems.
“This assumption makes sense, as we know from countless experiments that temperature and water control how fast plants can grow. However, when applied to the scale of entire ecosystems, this assumption appears to not be correct.”
Plant math
Published in Nature, the analysis reveals a new and general mathematical relationship that governs worldwide variation in terrestrial ecosystem net primary productivity. Plant size and plant age control most of the variation in plant productivity, not temperature and precipitation as traditionally thought.
“This general relationship shows that climate doesn’t influence productivity by changing the metabolic reaction rates underlying plant growth, but instead by determining how large plants can get and how long they can live for,” says postdoctoral researcher Sean Michaletz, the study’s lead author.
“This means that plants in warm, wet environments can grow more because their larger size and longer growing season enable them to capture more resources, not because climate increases the speed of their metabolism.”
Climate matters, too
The finding does not, however, mean that climate is unimportant for plant productivity. “Climate is still an important factor, but our understanding of how it influences ecosystem functioning has now changed,” Michaletz says.
The findings suggest that mathematical models used for predicting the effects of global climate change can be improved by accounting for the effects of plant size and plant age on net primary productivity.
“Understanding exactly how climate controls net primary production is important for understanding the plant-atmosphere feedbacks that control climate change,” Michaletz says.
“In other words,” Enquist adds, “to better predict how ecosystems will change with climate, we need to understand what influences the amount of plant biomass in a given area as well as its age.”
Researchers from Fujian Normal University in China and Kenyon College also contributed to the study.
Source: University of Arizona
Persistent health disparities by race may be related, in part, to anxiety about being confronted by negative racial stereotypes while receiving healthcare.
Stereotype threat, which is the threat of being judged by or confirming a negative stereotype about a group you belong to, has already been shown to influence the outcome of standardized testing, such as performance on the SAT (the most widely used college admissions exam).
For example, when confronted with a negative stereotype about their group identity, some black students become anxious that they will perform poorly on a test and, thereby, confirm negative stereotypes about the intellectual ability of people of their race. As a consequence of the cognitive load from this performance anxiety, students actually become more likely to perform poorly.
In a similar vein, the researchers found that black women who strongly identified with their race were more likely to feel anxious in a healthcare setting—particularly if that setting included messaging that promoted negative racial stereotypes, even if inadvertently.
‘Situation and identity’
It is already well documented that black women underutilize healthcare when compared to white women—possibly hurting their health overall.
New research suggests that this underutilization could be prompted by anxiety and other socio-emotional consequences of stereotype threat.
“This may help to explain some of the as yet unaccounted for ethnic and socioeconomic differences in morbidity and mortality across the lifespan,” says Cleopatra Abdou, assistant professor in the School of Gerontology and the department of psychology at the University of Southern California.
“Historically, the discourse surrounding health and health disparities has focused on nature, nurture, and the interaction of the two. With this study, we are bringing situation and identity into the equation.”
The study appears in the journal Cultural Diversity and Ethnic Minority Psychology.
Stereotypes
Participants in the study sat in virtual doctor’s waiting and exam rooms, which displayed posters depicting black women confronting unplanned pregnancy or AIDS—conspicuous examples of negative health-relevant racial stereotypes.
Black women who reported a strong connection with their ethnicity or ethnic group experienced the highest levels of anxiety while waiting in the rooms with the posters, while white women with a strong connection to their ethnicity experienced the lowest, suggesting that a strong white identity may provide immunity from healthcare-related stereotype threat, Abdou says. Women of either group with low ethnic identity fell in the middle of the range.
“This is stereotype threat-induced anxiety,” Abdou says. “It’s important to note that this anxiety is not present when we don’t prime highly identified African American women with negative stereotypes of African American women’s health.”
This research represents the first-ever empirical test of stereotype threat in the health sciences. Although stereotype threat theory is popular in the social sciences, with hundreds of studies documenting its effects on academic and other types of performance in recent decades, Abdou and coauthor Adam Fingerhut of Loyola Marymount University are the first to experimentally apply stereotype threat theory to the domain of healthcare and health disparities more broadly.
Unexpected consequences
“This study is important as a first step to understanding how stereotypes play out in healthcare settings and affect minority individuals’ experiences with healthcare providers. Further research is needed to understand the potential downstream effects, including reduced trust in physicians and delay in seeking healthcare as a way to avoid stereotype threat, which may have long-term implications for health among blacks,” says Fingerhut.
Posters like the ones Abdou and Fingerhut used can be commonly found in doctors’ offices to promote admirable goals, such as AIDS awareness, but the use of specific ethnicities in their messaging can have unexpected negative consequences, Abdou says.
“There is value in public health messaging that captures the attention of specific groups, particularly the groups at greatest risk, but we have to be mindful of unintended byproducts of these efforts and think outside the box to circumvent them,” Abdou says.
The Michigan Center for Integrative Approaches to Health Disparities and the USC Advancing Scholarship in the Humanities and Social Sciences Initiative supported the study.
Bacteria use their bodies as well as their flagella to move through fluids, report researchers. The finding could shed new light on the evolution of cell body shape.
Many bacteria swim using flagella, corkscrew-like appendages that push or pull bacterial cells like tiny propellers. It’s long been assumed that the flagella do all the work during swimming, while the rest of the cell body is just along for the ride.
But the new research shows that in at least one species, the cell body is actively carving out a helical trajectory through the water that produces thrust and contributes to the organism’s ability to swim.
“To our knowledge, this is the first time that it’s been shown quantitatively how the cell body is involved in the swimming motion,” says senior author Kenny Breuer, professor of engineering at Brown University.
“For the most part, people thought that the body didn’t do very much—that it was literally just a drag on the cell—but here we show that it really contributes.”
The findings appear in the Proceedings of the National Academy of Sciences.
A single cell
Studying how microscopic bacteria swim requires sophisticated imaging techniques. Traditional microscopes can only show so much. The creatures swim right through the field of view in the blink of an eye, revealing few details about how they’re moving.
For this study, the researchers used a method that let them follow bacterial cells closely as they swim in real time.
The technique uses a tracking microscope built by Bin Liu, a former postdoctoral researcher at Brown who will soon begin as an assistant professor at the University of California, Merced.
Researchers can view the swimming bacteria on the microscope’s mobile stage. Once the microscope locks on to a bacterium, the stage moves according to the bacterium’s movement, keeping it in the center of the microscope’s view.
“The innovation in this paper, I think, is we follow single individuals for around 30 seconds, which is a long time in the bacterial sense—thousands of revolutions of a flagellum,” Breuer says.
“In most studies of bacteria, you look at lots of bacteria collectively and average them. But there’s a tremendous scientific value to following a single cell. We can see how much cell-to-cell variation there is. We can see how much things vary over time and so on.”
‘Kidney bean’ swimmers
The researchers used the imaging technique to look at bacteria called Caulobacter crescentus, a species with a kidney bean-shaped body and a single flagellum.
Caulobacter swim in two different ways—sometimes with the flagellum pushing from the rear, and sometimes with the flagellum pulling from the front.
When Liu, Breuer, and their colleagues watched C. crescentus swim under their microscope, they were surprised by what they saw.
“The first result, which we didn’t believe at first, was that the bacteria go faster forward than they do backward,” Breuer says. “We thought that couldn’t be true because the physics says it should be exactly the same forward and backward. So we said, ‘How can this be?’ That’s when we looked more deeply.”
In taking that closer look, the researchers noticed that the cell body traces a wobbly, helical trajectory as it moves—a trajectory that looks a bit like the body is traveling through an invisible spiral tube.
The spiral was less pronounced when the bacteria went in reverse than when they went forward. Using a mathematical model based on resistive force theory, the researchers show that the thrust produced by the different body motions accounts for the differences in swimming speed. The bacterial body, it turns out, is more than a docile passenger.
Other fluids
The finding could shed new light on the evolution of cell body shape, Breuer says.
“There are big questions as to why certain species have certain cell shapes,” he says. “This might lead to some answers there.”
The researchers also plan to use the imaging technique to look at swimming dynamics in other types of bacteria and in more complex fluids, like mucus.
“Real fluids like mucus or gel have complex properties and we want to continue this work into that realm,” Breuer says. “All of this helps to explain how bacteria are adapted to swim in certain conditions and that has application to disease propagation and even to fertility, because a lot of this applies to sperm as well.”
The National Science Foundation supported the work.
Source: Brown University
People with schizophrenia are more likely to end up in prison in states with tight Medicaid policies governing antipsychotic drugs, research shows.
The new study comes amid media scrutiny over whether cutbacks in mental health care actually save money when other costs are taken into account.
Some health plans require an extra approval step before tests or treatments can be ordered for patients. This step—called prior authorization—is intended to encourage physicians to select cost-effective options by requiring justification for the selection of more expensive options.
Likewise, prior authorization policies adopted by state Medicaid programs aim to reduce costs associated with some medications, especially those drugs used to treat schizophrenia.
However, an unintended consequence of these policies may be that more mentally ill patients are being incarcerated, raising questions about the “cost-effectiveness” of these formulary restrictions.
Penny wise, pound foolish
Published in the American Journal of Managed Care, the study reports that states requiring prior authorization for atypical antipsychotics had less serious mental illness overall but higher shares of inmates with psychotic symptoms than the national average.
The study concludes that prior authorization of atypical antipsychotics was associated with a 22 percent increase in the likelihood of imprisonment, compared with the likelihood in a state without such a requirement.
“This paper demonstrates that our policies around schizophrenia may be penny wise and pound foolish,” says Dana Goldman, director of the Leonard D. Schaeffer Center for Health Policy & Economics at the University of Southern California.
“Limiting access to effective therapy may save states some Medicaid money in the short run, but the downstream consequences—including more people in prisons and more criminal activity—could be a bad deal for society.”
Troubling concerns
The study examined survey data from 16,844 prison inmates in states with and without restrictive authorization requirements, overlaid with state Medicaid policies and usage rates of atypical antipsychotics (a newer drug class that is frequently targeted by prior authorization requirements).
The study’s findings come amid a wave of scrutiny surrounding the cost and consequences of failing to adequately provide for mental health care, including the nexus between shortchanging mental health and rising prison expenditures.
“The media has picked up on how incarcerating the mentally ill raises a range of troubling concerns, from the high cost of incarceration, to the inadequate treatment of mentally ill inmates, and the potential for self-inflicted harm among these patients,” says Darius Lakdawalla, professor and chair of pharmaceutical development and regulatory innovation.
“At the same time, the American public is increasingly worried about untreated mental illness triggering violent behavior in the community. Our study suggests state Medicaid policies may be part of the solution to these problems.”
The disastrous March 22 landslide that killed 43 people in the rural Washington state community of Oso involved the “remobilization” of a 2006 landslide on the same hillside, a new study shows.
The research indicates the landslide, the deadliest in US history, happened in two major stages. The first stage remobilized the 2006 slide, including part of an adjacent forested slope from an ancient slide, and was made up largely or entirely of deposits from previous landslides.
The first stage ultimately moved more than six-tenths of a mile across the north fork of the Stillaguamish River and caused nearly all the destruction in the Steelhead Haven neighborhood.
The second stage started several minutes later and consisted of ancient landslide and glacial deposits. That material moved into the space vacated by the first stage and moved rapidly until it reached the trailing edge of the first stage.
In the study, released Tuesday on the four-month anniversary of the slide, scientists and engineers from the Geotechnical Extreme Events Reconnaissance Association (GEER) report that intense rainfall in the three weeks before the slide was likely a major factor, but that altered groundwater migration, weakened soil consistency from previous landslides, and changes in hillside stresses also played key roles.
Rare, but not extraordinary
“Perhaps the most striking finding is that, while the Oso landslide was a rare geologic occurrence, it was not extraordinary,” says Joseph Wartman, associate professor of civil and environmental engineering at University of Washington.
“We observed several other older but very similar long-runout landslides in the surrounding Stillaguamish River Valley. This tells us these may be prevalent in this setting over long time frames. Even the apparent trigger of the event—several weeks of intense rainfall—was not truly exceptional for the region.”
Another important finding is that the spring of 2014 was not an unusually active season for landslides in Northwest Washington, says team co-leader Jeffrey Keaton, a principal engineering geologist with AMEC Americas, an engineering consulting and project management company. “The Oso landslide was the only major one that occurred in Snohomish County or the Seattle area this spring,” he says.
6,000 years of landslides
The investigating team was formed and approved within days of the landslide, but began work at the site about eight weeks later, after search and recovery activities were largely completed.
The researchers documented conditions and collected data that could be lost over time. Their report is based largely on data collected during a four-day study of the entire landslide area in late May. It focuses on data and observations directly from the site, but also considers information such as local geologic and climate conditions and eyewitness accounts.
The researchers reviewed evidence for a number of large landslides in the Stillaguamish Valley around Oso during the previous 6,000 years, many of them strongly resembling the site of the 2014 slide.
There is solid evidence, for example, of a slide just west of this year’s slide that also ran out across the valley. In addition, they reviewed published maps showing the entire valley bottom in the Oso area is made up of old landslide deposits or areas where such deposits have been reworked by the river and left on the flood plain.
Large landslides such as the March event have happened in the same area as often as every 400 years (based on 15 mapped large landslides) to every 1,500 years (based on carbon dating of what appears to be the oldest of four generations of large slides) during the last six millennia.
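The recurrence estimates quoted above come from simple division over the 6,000-year geologic record. A minimal sketch, using only the two event counts given in the article (the function and constant names are illustrative, not from the study):

```python
# Back-of-the-envelope recurrence intervals for large Oso-area slides,
# from the two lines of evidence quoted in the article. Illustrative
# arithmetic only; the actual study relies on field mapping and carbon dating.

RECORD_YEARS = 6000  # span of the geologic record examined

def mean_recurrence_interval(record_years: int, n_events: int) -> float:
    """Average years between events, assuming they span the whole record."""
    return record_years / n_events

mapped = mean_recurrence_interval(RECORD_YEARS, 15)  # 15 mapped large slides
dated = mean_recurrence_interval(RECORD_YEARS, 4)    # 4 carbon-dated generations

print(f"From mapping: one large slide every ~{mapped:.0f} years")
print(f"From dating:  one large slide every ~{dated:.0f} years")
```

Dividing the record by 15 mapped slides gives the roughly 400-year interval; dividing by the four dated generations gives the roughly 1,500-year interval, bracketing the range the researchers report.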
The size of the landslide area grew slowly starting in the 1930s until 2006, when it increased dramatically. That was followed by this year’s catastrophically larger slide.
‘Risk’ not ‘hazard’
Studies in previous decades indicated a high landslide risk for the Oso area, the researchers say, but they note it does not appear there was any publicly communicated understanding that debris from a landslide could run as far across the valley as it did in March. In addition to the fatalities, that event seriously injured at least 10 people and caused damage estimated at more than $50 million.
“For me, the most important finding is that we must think about landslides in the context of ‘risk’ rather than ‘hazard,’” Wartman says. “While these terms are often used interchangeably, there is a subtle but important difference. Landslide hazard, which was well known in the region, tells us the likelihood that a landslide will occur, whereas landslide risk tells us something far more important—the likelihood that human losses will occur as a result of a landslide.
“From a policy perspective, I think it is very important that we begin to assess and clearly communicate the risks from landslides.”
Other study conclusions
- Past landslides and associated debris deposited by water should be carefully investigated when mapping areas for zoning purposes.
- Assessments of how precipitation destabilizes a slope should consider both cumulative amounts and short-duration intensities when estimating the likelihood of initial or renewed slope movement.
- Methods to identify and delineate potential landslide runout zones need to be revisited and re-evaluated.
Other team members include David Montgomery of the University of Washington, Robert Gilbert of the University of Texas, Jean Benoit of the University of New Hampshire, Scott Anderson of the Federal Highway Administration, and John deLaChapelle of Golder Associates Inc.
GEER is funded by the National Science Foundation. Its goal is to collect perishable data immediately in the wake of extreme events such as earthquakes, hurricanes, tsunamis, landslides, or floods.
Source: University of Washington
Children who have poor language skills during their toddler years may also be unable to control their behavior—which can lead to ADHD and other disorders of inattention and hyperactivity.
“Young children use language in the form of private or self-directed speech as a tool that helps them control their behavior and guide their actions, especially in difficult situations,” says Isaac Petersen of the clinical science program in the psychological and brain sciences department at Indiana University.
“Children who lack strong language skills, by contrast, are less able to regulate their behavior and ultimately more likely to develop behavior problems.”
Early childhood development has increasingly become a focus for public policy—in debates over universal preschool, recognition of a “word gap” between rich and poor children, and new pediatric recommendations on reading to infants.
“Children’s brains are most malleable earlier on, especially for language,” says John Bates, professor in the psychological and brain sciences department and coauthor of the study.
“Children are most likely to acquire skills in language and self-regulation early on. Many of the states are starting to focus on preschool, edging toward universal preschool. But early development specialists are not necessarily available. I would have programs more readily available to families—and focused on children most at risk as early as possible.”
Self-regulation is key
Many previous studies have shown a correlation between behavior problems and language skill. Children with behavior problems, particularly those with attention deficits and hyperactivity, such as in ADHD, often have poor language skills.
Whether one of these problems precedes the other and directly causes it was until recently an open question. But in a longitudinal study published last year, researchers concluded that the arrow points decisively from poor language ability to later behavioral problems, rather than the reverse.
The current study, published in the journal Development and Psychopathology, shows that poor language ability leads to behavior problems by way of self-regulation, a multifaceted concept that includes physical, emotional, cognitive, and behavioral control.
Self-regulation is integral to children’s capacity to adapt to social situations and to direct their actions toward future goals. The absence of self-regulation abilities is a key predictor and component of future behavior problems.
A number of studies have sought to explain the role of language in the development of self-regulation in terms of the cognitive and neurological mechanisms by which they are linked. This study traces the way they unfold over time and the role of self-regulation in this process.
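The mediation logic the researchers describe, language ability shaping later behavior through self-regulation, can be sketched with simulated data and the classic product-of-coefficients approach. Everything below (effect sizes, noise levels) is illustrative, not the study's actual estimates; only the sample size of 120 matches the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # same sample size as the study; the data themselves are simulated

# Simulate a mediation chain: language -> self-regulation -> behavior problems.
# The coefficients 0.5 and -0.6 are hypothetical.
language = rng.normal(size=n)
self_regulation = 0.5 * language + rng.normal(scale=0.8, size=n)
behavior_problems = -0.6 * self_regulation + rng.normal(scale=0.8, size=n)

def slope(x, y):
    """OLS slope of y on x (single predictor plus intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Path a: language -> self-regulation
a = slope(language, self_regulation)

# Path b: self-regulation -> behavior problems, controlling for language
X = np.column_stack([np.ones(n), language, self_regulation])
b = np.linalg.lstsq(X, behavior_problems, rcond=None)[0][2]

indirect = a * b  # product-of-coefficients estimate of the mediated effect
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect = {indirect:.2f}")
```

A negative indirect effect here means better language predicts fewer behavior problems, with self-regulation carrying the effect; in the actual study this kind of path model was fit to the longitudinal measurements rather than simulated data.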
To do this, the researchers followed a group of 120 toddlers for a year, testing them at 30, 36, and 42 months of age. At each point they assessed the children’s language skills and behavioral self-regulation, using tests of verbal comprehension and spoken vocabulary, as well as three tasks measuring self-regulating abilities.
They also used parent and secondary caregiver assessments of behavioral problems. The findings suggest that language skill predicted growth in self-regulation, and self-regulation, in turn, predicted behavioral adjustment.
20 million more words
The study lends renewed force to the argument that early childhood may offer a pathway for reducing social inequality. For what makes the “developmental cascade” from language to behavior particularly troubling, the researchers point out, is that children most at risk for a deficit in language ability, those from lower-income households, are often the least likely to get the services needed to remedy the problem.
Studies, for example, have documented a “word gap”: children in affluent families hear 20 million more words by age three than their low-income counterparts, a gap that results in less developed verbal and reading skills. If, as this study suggests, poor language skills lead to problems with self-regulation and behavior, this can in turn contribute to less easily reversible and more costly social or academic problems in adolescence and, later, adulthood.
“Don’t expect all children to be at the same level early on. If their language is slow to develop and self-regulation is lacking, they are likely to catch up with proper supports,” Petersen says.
“Among those who are slow, some could develop problems. If, by the age of three and a half, a child is still lagging, it may be worth pursuing treatment for language and self-regulation skills—the earlier the better.”
Angela Staples, research assistant professor at the University of Virginia, is a coauthor of the study. The National Institute of Mental Health and the Eunice Kennedy Shriver National Institute of Child Health and Human Development supported the research.
Source: Indiana University
A new study shows that older African Americans are 24 percent less likely to fall than are whites.
“Millions of older adults living in community settings are just one bad fall away from a nursing home,” says lead author Emily Nicklett, assistant professor of social work at the University of Michigan. “Identifying risk and protective factors can inform falls prevention interventions and policies.”
Nicklett and colleague Robert Joseph Taylor, professor and faculty associate at the Institute for Social Research, examined data on falls incidence and frequency from the Health and Retirement Study from 2000 to 2010. The study followed nearly 10,500 African American, Latino, and non-Hispanic white older adults.
Functional limitations, including difficulty walking across the room or preparing meals, and health problems such as high blood pressure, cancer, and diabetes, also predicted greater odds of experiencing a fall for adults 65 and older, the study shows. The findings appear in the Journal of Aging and Health.
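Results like "24 percent less likely" typically come out of a logistic model as an odds ratio (here, about 0.76), and an odds ratio is easy to misread as a probability ratio. A minimal sketch of the conversion, using a hypothetical 30 percent baseline fall probability that is illustrative and not taken from the study:

```python
import math

# Reported result: odds of falling 24 percent lower, i.e. an odds ratio of 0.76.
odds_ratio = 0.76

# An odds ratio is not a probability ratio. Using an illustrative (hypothetical)
# baseline fall probability of 30% for the reference group:
p_ref = 0.30
odds_ref = p_ref / (1 - p_ref)    # probability -> odds
odds_grp = odds_ref * odds_ratio  # apply the 24% reduction in odds
p_grp = odds_grp / (1 + odds_grp) # odds -> probability

print(f"reference group: {p_ref:.1%}, comparison group: {p_grp:.1%}")
```

Note that a 24 percent reduction in odds corresponds to a somewhat smaller reduction in the probability of falling (here, from 30 percent to roughly 24.6 percent); the two shrink together only when the baseline risk is low.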
Previous research indicates that older African Americans are more likely to live in extended-family households. The availability of assistance at home could help older adults avoid scenarios or behaviors that could lead to falls.
Although beyond the scope of this study, the authors suggest that older whites could be at highest risk for initial and recurrent falls because they are more likely to participate in activities with a high risk of falling, such as outdoor chores and leisure-time physical activity.
Non-Hispanic white older adults are also more likely to live in suburban settings than other groups, which could account for some of the differences in falls.
The authors will next examine whether housing type and availability of support within one’s household lowers the risk of falling.
Source: University of Michigan
A brainstem circuit in mice could help explain how active movement changes the way the brain processes sensory information.
“Previous studies have examined changes in the visual cortex of mice during running. What was unknown was how do running and vision get linked together in the first place?” says Cristopher Niell, biology professor in the Institute of Neuroscience at the University of Oregon and the senior author of a paper in Neuron.
The “aha moment” that inspired the study came five years ago when Niell, as a postdoctoral fellow in Michael Stryker’s lab at the University of California, San Francisco, was examining visual perception in mice. He observed that running appeared to be changing how neurons in the brain were firing.
“We found that running turned up the magnitude of responses in the mouse’s visual cortex by about two-fold—the signals were basically twice as strong when the mouse was running,” Niell says. The initial finding demonstrated a mind-body connection in the mouse visual system. Following up on this finding, Niell’s team sought to identify neural circuits that could link movement and vision together.
The researchers focused on the brain’s mesencephalic locomotor region (MLR), which has been shown to mediate running and other forms of activity in many species. They hypothesized that neural pathways originating in the MLR could serve a dual role—sending a signal down to the spinal cord to initiate locomotion, and another up to the cortex to turn up the visual response.
Using optogenetic methods, the team created genetically sensitized neurons in the MLR region of the mouse brain that could be activated by light. The team then recorded the resulting increased visual responses in the cortex.
Their results demonstrated that the MLR can indeed drive both running and increased responsiveness in the cortex, and that these two effects could be dissociated, showing that they are conveyed via separate pathways.
Next, researchers activated the terminals of the neurons’ axons in the basal forebrain, a region that sends neuromodulatory projections to the visual cortex. Stimulation here also induced changes in the cortex, but without the intermediary step of running. Interestingly, the basal forebrain is known to use the neuromodulator acetylcholine, which is often associated with alertness and attention.
Humans, too?
It is unclear whether humans experience heightened visual perception while running, but the study adds to growing evidence that the processes governing active movement and sensory processing in the brain are tightly connected.
Similar regions have been targeted in humans for therapeutic deep-brain stimulation to treat motor dysfunction in patients with Parkinson’s disease. Activating this circuit might also provide a means to enhance neuroplasticity, the brain’s capacity to rewire itself.
“While it seems that moving and sensing are two independent processes, a lot of new research suggests that they are deeply coupled,” says lead author Moses Lee, a visiting scholar from the University of California, San Francisco. “My hope is that our study can help solidify our understanding of how the brain functions differently in ‘alert’ states.”
Other authors include researchers from Johns Hopkins School of Medicine and the University of California, Berkeley.
The National Institutes of Health supported the research.
Source: University of Oregon
In the past decade, an unexpectedly high number of calves have been born to North Atlantic right whales—a species once projected for extinction.
The baby boom is linked to a climate-induced shift in the oceanic ecosystem, one that has improved the feeding conditions for the whales in the northwest Atlantic and especially the Gulf of Maine during the first decade of the 2000s.
Relative to the lean reproductive years right whales suffered during the 1990s, the population during the first decade of the 21st century has seen a significant increase in reproduction, researchers say.
By the late 19th century, the whaling industry had all but killed off North Atlantic right whales, valued for their oil, which was used to fuel lamps and lubricate machinery.
The whale population has exhibited a gradual but bumpy recovery since the species gained protected status during the mid-20th century.
Annual surveys of the population since the 1970s have allowed researchers to study right whale population dynamics. During the 1980s, thanks to good feeding conditions, the population exhibited positive growth.
Available prey
During the early 1990s, the Arctic underwent a decade-long climate shift, resulting in an influx of freshwater into the northwest Atlantic. This “great salinity anomaly” of the 1990s freshened coastal shelf ecosystems in the northwest Atlantic, says oceanographer Charles H. Greene, a professor at Cornell University.
The ecosystem became unfavorable for the copepod species Calanus finmarchicus—the whales’ main source of nutrition. With feeding conditions for the whales especially poor when Calanus suffered a crash in 1998, the whale population, in turn, failed to reproduce in 1999 and 2000.
The subsequent resurgence of Calanus during the 2000s has been linked to a shift in the Arctic climate system back to conditions similar to the 1980s. This “decadal-scale variability in right whale reproduction may be largely driven by fluctuations in prey availability linked to climate-associated ecosystem regime shifts,” the researchers say.
“Rather than facing the prospect of eventual extinction, as was forecast at the beginning of the new millennium, the right whale population in 2010 was on a positive trajectory toward recovery,” the researchers write in the journal Oceanography.
It’s important to note, however, that “continued elevated rates of right whale calf production are contingent upon favorable future prey conditions.”
The National Science Foundation and the Department of Defense supported the research.
Source: Cornell University
A new combination drug therapy cures chronic hepatitis C in most patients also infected with HIV—and without the side effects of current treatments.
The advance is important, researchers say, because about a third of HIV patients in the United States also have hepatitis C. There are an estimated 7 million co-infected patients worldwide.
“In many settings, hepatitis C is now a leading cause of death among HIV co-infected patients,” says Mark Sulkowski, professor of medicine at Johns Hopkins University.
The new study “represents the first clinical trial to demonstrate that we can cure hepatitis C in patients with HIV co-infection without the use of interferon,” Sulkowski says. “As such, it represents a transformative step in our approach to this therapeutic area.”
Co-infected patients have been challenging to treat because they have difficulty tolerating traditional treatments for hepatitis C. Injections of interferon-alpha cause flu-like symptoms or other problems in most patients, and other drugs can interact with anti-retroviral medications used to treat HIV.
No interference
The new all-oral regimen—sofosbuvir and ribavirin—that was tested in the multicenter clinical trial is already considered “on label,” because data from the study were considered in the Food and Drug Administration’s approval of sofosbuvir in December.
For the study, published in the Journal of the American Medical Association, researchers enrolled patients from the US and Puerto Rico through 34 academic, private practice, and community health centers.
Doctors administered sofosbuvir and ribavirin for either 12 or 24 weeks to a total of 223 HIV-1 patients chronically co-infected with hepatitis C. Twelve weeks after treatment ended, researchers tested patients again for hepatitis C infection to determine whether treatment was effective.
Depending on patients’ hepatitis C genotype and history of prior treatment, the new regimen cured hepatitis C in 67 to 94 percent of cases. The drugs did not interfere with patients’ HIV treatment.
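Cure rates like "67 to 94 percent" come from subgroups of different sizes, so the uncertainty around each rate matters. One standard way to attach a confidence interval to such a proportion is the Wilson score interval; a sketch with hypothetical counts, not the trial's actual subgroup data:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Hypothetical subgroup: 84 of 100 patients with undetectable virus 12 weeks
# after treatment ended. These counts are illustrative, not from the paper.
lo, hi = wilson_ci(84, 100)
print(f"cure rate 84%, 95% CI {lo:.1%} to {hi:.1%}")
```

With a subgroup of 100, an 84 percent observed cure rate carries an interval of roughly 76 to 90 percent, which is part of why the reported range across subgroups is so wide.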
“The likelihood that a patient with chronic, long-standing hepatitis C infection would have spontaneous cure is near zero,” says Sulkowski, medical director of the Johns Hopkins Infectious Disease Center for Viral Hepatitis. “So if these patients had not been treated, none would have been cured.”
Researchers from Duke University; University of Washington; University of California, Davis; University of California, San Francisco; Quest Clinical Research; Kaiser Permanente; Philadelphia FIGHT; Gilead Sciences; Fundacion De Investigacion; and Icahn School of Medicine at Mount Sinai also contributed to the study.
Gilead Sciences Inc. provided drugs and funding for the study described in this article. Sulkowski also is a paid Scientific Advisory Board member for Gilead Sciences. The terms of this arrangement are being managed by Johns Hopkins University in accordance with its conflict of interest policies.
Source: Johns Hopkins University
When people suffer from sleep deprivation, they tend to misremember details of events, a new study finds.
Distorted memory can have serious consequences in areas such as criminal justice, where eyewitness misidentifications are thought to be the leading cause of wrongful convictions in the United States.
The study, published online in Psychological Science, finds that participants deprived of a night’s sleep were more likely to flub the details of a simulated burglary they were shown in a series of images.
“We found memory distortion is greater after sleep deprivation,” says co-investigator Kimberly Fenn, an associate professor of psychology at Michigan State University. “And people are getting less sleep each night than they ever have.”
The Centers for Disease Control and Prevention (CDC) calls insufficient sleep an epidemic and says it’s linked to vehicle crashes, industrial disasters, and chronic diseases such as hypertension and diabetes.
The researchers conducted experiments to gauge the effect of insufficient sleep on memory. The results: Participants who were kept awake for 24 hours—and even those who got five or fewer hours of sleep—were more likely to mix up event details than participants who were well rested.
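At bottom, "more likely to mix up event details" is a comparison of two proportions. A minimal sketch of such a comparison using a two-proportion z-test with hypothetical counts, not the study's actual data:

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z-test: is the error rate in group 1 higher than in group 2?"""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under the null hypothesis
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: sleep-deprived participants endorsing a false detail
# vs. well-rested participants. Illustrative numbers only.
z = two_prop_z(38, 100, 22, 100)
print(f"z = {z:.2f}")
```

A z value near 2.5, as in this made-up example, would fall beyond the conventional 1.96 threshold, which is the kind of evidence behind a claim that one group was "more likely" to misremember.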
“People who repeatedly get low amounts of sleep every night could be more prone in the long run to develop these forms of memory distortion,” Fenn says. “It’s not just a full night of sleep deprivation that puts them at risk.”
Source: Michigan State University