
Why ‘Molly’ is all the rage with some teens

Wed, 06/25/2014 - 05:05

Researchers are trying to get a handle on why some teenagers are more likely to use the drug ecstasy, known as MDMA or by street names such as Molly, Mandy, E, and X.

The drug is common at dance parties and electronic music festivals because it enhances perceptions of lights, music, and socializing.

“We found that roughly 4.4 percent of high school seniors reported use of ecstasy within the last year, with males being at particularly high risk for use,” says Joseph J. Palamar, assistant professor of population health at New York University’s Langone Medical Center. “We delineated many important sociodemographic risk factors, but the most consistent and important risk factor we found is use of other drugs.”

Hispanic and black teens

The researchers used data from Monitoring the Future (MTF), a nationwide ongoing annual study of the behaviors, attitudes, and values of American secondary school students. The MTF survey is administered in approximately 130 public and private schools across 48 US states. Roughly 15,000 high school seniors are assessed annually.


The researchers examined the most current data for high school seniors (modal age = 18), focusing on 26,504 students from 2007-2012.

A key finding of the study was that other drug use tended to drown out significant sociodemographic predictors of ecstasy use. While many sociodemographic factors are associated with ecstasy use, the researchers note that preventing other drug use may be the most important factor in preventing ecstasy use, in part because use of other drugs appears to override the protective effects of certain characteristics.

Females and religious students were consistently at lower odds for use. Black and Hispanic students, and students living with two parents, were at lower odds for ecstasy use, unless other substance use was already present.

“Odds of use were consistently increased for students with weekly income of >$50 from a job or >$10 weekly from other sources. Students residing in a city were also at increased risk, as were those who reported lifetime use of alcohol, cigarettes, marijuana, or other illicit drugs,” Palamar says. “This is among the first studies to examine how student income is related to ecstasy use.”

Trends in ecstasy use across racial groups have been shifting in recent years. Although white students have the highest prevalence of ecstasy use overall, rates of use among Hispanics are now close to those of whites, and rates among black students have increased in recent years.

Rates of ecstasy use might actually be higher than reported as some students may not know that “Molly” is the same drug as MDMA/ecstasy (as MTF did not ask about Molly specifically).

Designer drugs

Given the positive experiences many adolescents associate with ecstasy, prevention efforts should focus on educating the adolescents and young adults at highest risk about the drug's potential harmful effects, and, just as importantly, on preventing use of the drugs that typically precede ecstasy use.

“Ecstasy use also tends to precede use of other club drugs so preventing ecstasy use (e.g., among those who attend nightclubs and parties) may also prevent initiation and use of drugs such as ketamine (“Special K”) and GHB,” Palamar says.

Harm reduction messages (e.g., steps to prevent health-related risks such as dehydration) disseminated within the nightclub and dance event scenes are needed for those who reject abstinence. Prevention messages geared toward users who are not affiliated with nightlife scenes (e.g., because they are too young to attend clubs and may use ecstasy at home or at parties) may be particularly challenging, the researchers say.

In addition, Palamar warns that “Hundreds of new designer drugs have emerged in recent years, some of which were created to mimic the effects of ecstasy. Many individuals may be ingesting what they think is ecstasy, but it may in fact be an even more dangerous new substance. Likewise, today ecstasy commonly comes in powder form instead of pill form, which may even further increase the chances of receiving the drug cut with additional designer substances.”

While public health messages about ecstasy use have traditionally been calibrated toward club attendees, the researchers advocate drug education geared toward the general population that can more fully reach those at risk for ecstasy use, alongside education about other, more “traditional” drugs.

“Harm reduction education is greatly needed, regardless of where and under what circumstances the drug is taken,” Palamar says. “As ecstasy becomes increasingly popularized and often contains adulterants, those who reject abstinence need to be able to make informed decisions about use in order to minimize potential harmful effects.”

The journal Substance Use & Misuse published the findings.

Source: NYU

The post Why ‘Molly’ is all the rage with some teens appeared first on Futurity.

Fly-through brain images could unravel how we think

Wed, 06/25/2014 - 04:14

Scientists have improved on a new imaging technology that provides spectacular fly-through views of the brain and how it’s wired.

CLARITY was introduced last year and has been used by laboratories around the world to better understand the brain’s wiring.

However, two technological fixes could make it even more broadly adopted, says Karl Deisseroth, professor of bioengineering and of psychiatry and behavioral sciences at Stanford University.

The first problem was that laboratories were not set up to reliably carry out the CLARITY process. The second was that the most commonly available microscopy methods were not designed to image the whole transparent brain.

“There have been a number of remarkable results described using CLARITY,” Deisseroth says, “but we needed to address these two distinct challenges to make the technology easier to use.”


In a paper published in Nature Protocols, Deisseroth presented solutions to both of those bottlenecks.

“These transform CLARITY, making the overall process much easier and the data collection much faster,” he says.

He and coauthors, including postdoctoral fellows Raju Tomer and Li Ye and graduate student Brian Hsueh, anticipate that even more scientists will now be able to take advantage of the technique to better understand the brain at a fundamental level, and also to probe the origins of brain diseases.

Clearing out the fat

When you look at the brain, what you see is the fatty outer covering of the nerve cells within, which blocks microscopes from taking images of the intricate connections between deep brain cells. The idea behind CLARITY was to eliminate that fatty covering while keeping the brain intact, complete with all its intricate inner wiring.

The way Deisseroth and his team eliminated the fat was to build a gel within the intact brain that held all the structures and proteins in place. They then used an electric field to pull out the fat layer that had been dissolved in an electrically charged detergent, leaving behind all the brain’s structures embedded in the firm water-based gel, or hydrogel. This is called electrophoretic CLARITY.

The electric field aspect was a challenge for some labs. “About half the people who tried it got it working right away,” Deisseroth says, “but others had problems with the voltage damaging tissue.”

This kind of challenge is normal when introducing new technologies.

When he first introduced optogenetics, which allows scientists to control individual nerves using light, a similar proportion of labs were not initially set up to easily implement the new technology, and ran into challenges.

To help expand the use of CLARITY, the team devised an alternate way of pulling out the fat from the hydrogel-embedded brain—a technique they call passive CLARITY. It takes a little longer, but still removes all the fat, is much easier and does not pose a risk to the tissue.

Chemicals, a warm bath, and time

“Electrophoretic CLARITY is important for cases where speed is critical, and for some tissues,” Deisseroth says. “But passive CLARITY is a crucial advance for the community, especially for neuroscience.”

Passive CLARITY requires nothing more than some chemicals, a warm bath, and time.

Many groups have begun to apply CLARITY to probe brains donated from people who had diseases like epilepsy or autism, which might have left clues in the brain to help scientists understand and eventually treat the disease. But scientists, including Deisseroth, had been wary of trying electrophoretic CLARITY on these valuable clinical samples when there was even a very low risk of damage.

“It’s a rare and precious donated sample, you don’t want to have a chance of damage or error,” he says. “Now the risk issue is addressed, and on top of that you can get the data very rapidly.”

See fine wiring structures

The second advance had to do with this rapidity of data collection. In studying any cells, scientists often make use of probes that will go into the cell or tissue, latch onto a particular molecule, then glow green, blue, yellow, or other colors in response to particular wavelengths of light. This is what produces the colorful cellular images that are so common in biology research. Using CLARITY, these colorful structures become visible throughout the entire brain, since no fat remains to block the light.

But here’s the hitch. Those probes stop working, or get bleached, after they’ve been exposed to too much light. That’s fine if a scientist is just taking a picture of a small cellular structure, which takes little time. But to get a high-resolution image of an entire brain, the whole tissue is bathed in light throughout the time it takes to image it point by point. This approach bleaches out the probes before the entire brain can be imaged at high resolution.

The second advance of the new paper addresses this issue, making it easier to image the entire brain without bleaching the probes. “We can now scan an entire plane at one time instead of a point,” Deisseroth says. “That buys you a couple orders of magnitude of time, and also efficiently delivers light only to where the imaging is happening.”

The technique is called light sheet microscopy and has been around for a while, but previously didn’t have high enough resolution to see the fine details of cellular structures.

“We advanced traditional light sheet microscopy for CLARITY, and can now see fine wiring structures deep within an intact adult brain,” Deisseroth says.

His lab built their own microscope, but the procedures are described in the paper, and the key components are commercially available. Additionally, Deisseroth’s lab provides free training courses in CLARITY, modeled after his optogenetics courses, to help disseminate the techniques.

The work is funded by the Defense Advanced Research Projects Agency (DARPA), National Institute of Mental Health, National Science Foundation, the National Institute on Drug Abuse, the Simons Foundation, and the Wiegers Family Fund.

Source: Stanford University


Visitors leave Antarctica open to invasive species

Tue, 06/24/2014 - 08:49

Antarctica’s ice-free areas—home to most of the continent’s biodiversity—need better protection from human impact and climate change, say environmental scientists.

University of Queensland researcher Justine Shaw says most of Antarctica’s biodiversity occurs in the less than one percent of the continent that is permanently ice-free. Of that small area, only 1.5 percent falls within the Antarctic Specially Protected Areas under the Antarctic Treaty System.


A new study has found that many of the continent’s ice-free protected areas are at risk from invasive species.

Shaw says the Antarctic continent’s tiny ice-free area, where most of the native wildlife and plants are found, needs adequate and representative protected areas.

“With more research facilities being built and increasing tourism to Antarctica, the simple ecosystems are at risk from human activities including pollution, trampling, and invasive species such as insects and grass,” Shaw says.

More than 40,000 people visit Antarctica each year.

Shaw’s study found that all 55 areas designated for protection of land-based biodiversity were close to sites of human activity, with seven at high risk for biological invasion. Five of the distinct ice-free eco-regions have no protected areas.

‘A true wilderness’

The study, published in PLOS Biology, shows that Antarctica’s protected area system fell well short of the Aichi Biodiversity Targets—an international biodiversity strategy that aims to reduce threats to biodiversity, and to protect ecosystems, species, and genetic diversity.

“Many people think that Antarctica’s biodiversity is well protected because it’s isolated and no one lives there, but it is at risk,” says Shaw. “Our study found that the protected area system of Antarctica ranks in the lowest 25 percent of assessed countries.”

Hugh Possingham, of the National Environmental Research Program (NERP) Decisions Hub, says Antarctica is one of the last places on Earth without cities, agriculture, or mining.

“It is unique in this respect—a true wilderness—and if we don’t establish adequate and representative protected areas in Antarctica this unique and fragile ecosystem could be lost,” he says.

“Although our study shows that the risks to biodiversity from increasing human activity are high, they are even worse when considered together with climate change.

“The combined effect provides even more incentive for a better system of area protection in Antarctica.”

The NERP Environmental Decisions Hub, University of Queensland’s School of Biological Sciences, the Australian Antarctic Division, and Monash University’s School of Biological Sciences supported the research. Researchers from the Australian Antarctic Division and Monash University also contributed to the study.

Source: University of Queensland


Decay to fermions backs up Higgs boson discovery

Tue, 06/24/2014 - 08:37

A new study offers evidence of the direct decay of the Higgs boson to fermions, which are among the particles anticipated by the Standard Model of physics.

Published in Nature Physics, the finding fits what researchers expected to see amid the massive amount of data provided by the Large Hadron Collider (LHC). The giant collider smashed protons together in 2012, and researchers hoped the collisions would produce the short-lived Higgs boson, leaving behind traces of its decay.

“In July 2012, we knew we had discovered some sort of boson, and it looked a lot like it was a Higgs boson,” says Paul Padley, professor of physics and astronomy at Rice University. “To firmly establish it’s the Standard Model Higgs boson, there are a number of checks we have to do. This paper represents one of these fundamental checks.”

The graphic shows particle traces extending from a proton-proton collision at the Large Hadron Collider in 2012. The event shows characteristics expected from the decay of the Standard Model Higgs boson to a pair of photons. Further analysis of collisions in 2011 and 2012 has found evidence that the Higgs also decays into fermion particles, according to the new paper. (Courtesy of CERN)

‘Crime scene’ analysis

The decades-long search for the Higgs, which physicists believe gives mass to the fundamental particles, has been a primary focus of the $6 billion LHC. The Compact Muon Solenoid (CMS) is one of two main experiments at the collider. The other, ATLAS, has also found strong evidence of fermions from decaying Higgs bosons, though that team has yet to publish its results.


The collider is shut down for an upgrade to be completed next year, but the mountain of data from the first run of experiments through 2012 has yielded spectacular results, Padley says.

Sifting through the data is “like doing the analysis at a crime scene, when they look to see which gun fired the bullets. As we find more evidence, it looks more like a Standard Model Higgs boson. This paper is important because it really establishes that it’s decaying to fermions.”

Fermions are elementary particles that include quarks and leptons (a category that in turn includes electrons). They exist for only a minute fraction of a second after emerging from the decaying boson, but because they’re moving away from the collision at tremendous speed, they can be tracked.

More Higgs bosons?

Capturing their traces takes highly sophisticated equipment, says Karl Ecklund, assistant professor of physics and astronomy. “I’ve been working on the pixel detector, the innermost part of CMS. It’s a bit like a 66-million-pixel video camera that takes 40 million frames a second. We’re basically watching the collisions of the proton beams and looking for all the charged particles that come out.”

Layers of electronic sensors that surround the collider track many kinds of particles, each of which leaves a unique signature that includes its lifetime and path. “We’re able to connect the dots to see these tracks,” Ecklund says. “For the Higgs studies, particularly in the case of the fermions, we’re looking for Higgs-to-bottom quarks and Higgs-to-tau (antitau) pairs. Taus are heavy versions of the electron.”

Researchers are building and testing CMS components for the upgraded LHC, which they expect CERN will boot up next spring for the second run of experiments starting in the summer of 2015. “It’s going to be focused on new things that could appear at higher energies,” Ecklund says. “One definite target will be seeing more Higgs bosons, which should tell us a lot more about their properties.

“The main excitement is going to be that because the energy is higher, we could produce the Higgs in association with other particles,” he says. Of particular interest will be evidence of heavy top quarks and how they relate to the Higgs. “They should actually have the strongest coupling to the Higgs because the top quark and the Higgs combined make a fairly heavy thing to produce.”

‘Hints of new physics’

The upgraded LHC should provide the necessary energy to produce many more top quarks and Higgs bosons. “We’re interested in understanding how the top quark fits into the Standard Model and whether, since it’s so heavy, it could have a special role in relating to the Higgs. Maybe there are hints of new physics that aren’t in the Standard Model,” he says. “We know the Standard Model is incomplete.”

“The discovery of the Higgs boson was a beginning, not the end,” Padley says. “The first step is to measure with great precision the properties of the Higgs boson we’ve discovered, and then use it as a tool for further discovery.

“We’re trying to probe questions about the universe and dark matter. In fact, there was a big study of the priorities in particle physics, and the number one priority for the entire field is to study the properties of this boson and use it as a tool for discovery. This represents a step down that path.”

Source: Rice University


Vibrating glove could teach you Braille

Tue, 06/24/2014 - 08:16

A new wireless computing glove can help people learn to read and write Braille—and they don’t even have to be paying attention.

“The process is based on passive haptic learning (PHL),” says Thad Starner, professor at Georgia Tech. “We’ve learned that people can acquire motor skills through vibrations without devoting active attention to their hands.”

A wearable computing technology helps people learn how to read and write Braille as they concentrate on other tasks. (Credit: Georgia Tech)

In a new study, Starner and PhD student Caitlyn Seim examined how well the gloves work to teach Braille.


Each study participant wore a pair of gloves with tiny vibrating motors stitched into the knuckles. The motors vibrated in a sequence that corresponded with the typing pattern of a pre-determined phrase in Braille.

Audio cues let the users know the Braille letters produced by typing that sequence. Afterwards, everyone tried to type the phrase one time, without the cues or vibrations, on a keyboard.

The sequences were then repeated during a distraction task.

Participants played a game for 30 minutes and were told to ignore the gloves. Half of the participants felt repeated vibrations and heard the cues, while the others only heard the audio cues. When the game was over, participants tried to type the phrase without wearing the gloves.

“Those in the control group did about the same on their second attempt (as they did in their pre-study baseline test),” Starner says. “But participants who felt the vibrations during the game were a third more accurate. Some were even perfect.”

Starner had previously created a technology-enhanced glove that can teach beginners how to play piano melodies in 45 minutes. Based on the results of the piano glove study, he and Seim expected to see a wide disparity between the two groups. But they were surprised that the passive learners in the Braille study picked up an additional skill.

From typing Braille to reading it

“Remarkably, we found that people could transfer knowledge learned from typing Braille to reading Braille,” Seim says. “After the typing test, passive learners were able to read and recognize more than 70 percent of the phrase’s letters.”

No one in the study had previously typed on a Braille keyboard or knew the language. The study also didn’t include screens or visual feedback, so participants never saw what they typed. They had no indication of their accuracy throughout the study. “The only learning they received was guided by the haptic interface,” Seim says.

Seim is currently in the middle of a second study that uses PHL to teach the full Braille alphabet during four sessions. Of the eight participants so far, 75 percent of those receiving PHL reached perfect typing performance. None of the control group had zero typing errors. PHL participants have also been able to recognize and read more than 90 percent of all the letters in the alphabet after only four hours.

Nearly 40 million people worldwide are blind. However, because Braille instruction is widely neglected in schools, only 10 percent of those who are blind learn the language. Braille is also difficult to learn later in life, when people, such as diabetics, wounded veterans, or older adults, are more prone to losing their sight.

The Braille studies will be presented in Seattle this September at the 18th International Symposium on Wearable Computers (ISWC).

The National Science Foundation provided partial support for the study. Any conclusions expressed are those of the principal investigator and may not necessarily represent the official views of the NSF.

Source: Georgia Tech


20-minute ‘snapshot’ gives teachers better feedback

Tue, 06/24/2014 - 07:55

A new assessment tool offers a less subjective way to evaluate classroom instruction and only takes 20 minutes to complete, report researchers.

The assessment also provides immediate and meaningful feedback, making it an important new tool for understanding and improving instructional quality, according to psychologists from the University of Rochester and the University of North Carolina at Chapel Hill.

“Education researchers broadly agree that teachers matter,” explains coauthor Professor Edward Deci of the University of Rochester. “But there is less consensus about precisely what defines effective instruction and how to measure it.

“This assessment is able to capture the vital signs of teaching. It’s a bit like a doctor taking your blood pressure and pulse for a quick picture of your health.”

Deci, Diane Early, a scientist at the University of North Carolina at Chapel Hill, and Ronald Rogge, an associate professor of psychology at the University of Rochester, report their results in the High School Journal.

Engagement, Alignment, Rigor

In this study, the researchers asked trained observers to rate the classroom instruction of 58 math and English teachers in four high schools in Arizona using a tool developed by the Institute for Research and Reform in Education, a non-profit that uses evidence-based practices to help struggling schools.


The 15-item tool focuses on three aspects of instruction: the engagement of students, how closely schoolwork aligns with state and local standards, and whether coursework is appropriately challenging.

Called the EAR Protocol—short for Engagement, Alignment, and Rigor—the assessment already has been used in more than 100 schools, but this current study is the first to test its objectivity and its ability to predict student learning as measured by standardized tests.

The protocol is based on educational research showing that when students’ basic psychological needs are met, learning outcomes improve, explains Deci. For example, when teachers are excited about their subjects and supportive, students are more likely to be engaged.

When instructors present challenging schoolwork along with structured supports for mastering those assignments, students build a genuine sense of competence and confidence.

“It’s like learning how to play tennis. You improve when you play with someone who is just a bit better than you are,” he says.

The researchers found that higher classroom ratings for engagement, alignment, and rigor were correlated with better student outcomes on standardized tests, after controlling for prior year test scores.

Objective feedback

“The assessment captures surprisingly complex and fundamental qualities of teaching,” says Early. “It’s easy to use and 20 minutes is short enough for administrators to fit into the confines of their busy workday. And it’s adaptable for all grades and subjects, from math and English to art and physical education.”

The study also showed that observers can use the tool reliably. “Different observers of the same classroom came to the same conclusions,” explains Early.

By highlighting areas where teachers need improvement, the assessments can help identify what kinds of professional development may be most helpful, the authors write.

Follow-up assessments can then test to see if additional training enhances classroom instruction. “It’s hard to know whether you are improving if there is no objective feedback measure,” explains Early.

The assessment also helps teachers and administrators focus on the same key indicators of teaching quality: engagement, alignment, and rigor. “If adopted widely, the evaluations could provide a common language for talking about the vital signs of high-quality teaching,” says Early.

The Department of Education’s Institute of Education Sciences and the Bill & Melinda Gates Foundation supported the work.

Source: University of Rochester


Can a mouthguard sensor make football safer?

Tue, 06/24/2014 - 06:43

A new device could one day give real-time measurements of the head impacts sustained by football players.

The research could also help characterize the forces sustained in more common head traumas, such as car accidents and falls.

The debilitating effects of repeated concussions on NFL players have been well documented. What scientists still don’t clearly know is whether those injuries are the result of thousands of tiny impacts, or singular, crushing blows to the brain.

For the past few years, David Camarillo, an assistant professor of bioengineering at Stanford University, and his colleagues have been supplying Stanford football players with special mouthguards equipped with accelerometers that measure the impacts players sustain during a practice or game.

A multiple exposure shows the effect of an impact to the top of the helmet in a laboratory experiment. A head impact detection system using the mouthguard device distinguishes between these head impacts and non-impact noise, such as dropping the mouthguard on the floor. (Credit: Linda A. Cicero/ Stanford)

Previous studies have suggested a correlation between the severity of brain injuries and the biomechanics associated with skull movement from an impact.

Camarillo’s group uses a sensor-laden mouthguard because it can directly measure skull accelerations—by attaching to the top row of teeth—which is difficult to achieve with sensors attached to the skin or other tissues.

So far, the researchers have recorded thousands of these impacts, and have found that players’ heads frequently sustain accelerations of 10 g and, in rarer instances, as much as 100 g. By comparison, space shuttle astronauts experienced a maximum of 3 g on launch and reentry.

Building a better mouthguard

Although these mouthguards have provided a wealth of data, they were not very discerning: A player tossing his mouthguard to the ground can register the same force as if he had been run over by a linebacker.


This has required Camarillo’s team to spend hours going through videos of games and practices to determine whether each player’s time-stamped data matches a true impact or a spurious event, says Lyndia Wu, a bioengineering doctoral student in Camarillo’s lab and the lead author of a new paper in IEEE Transactions on Biomedical Engineering.

To overcome this dilemma, the researchers incorporated infrared proximity sensors into the mouthpiece, so that it can detect when the device is firmly seated against the player’s teeth. (Teeth have a special property whereby they absorb and scatter infrared light, allowing the sensor to be triggered when in direct contact with the teeth.)

Furthermore, machine-learning algorithms sift out additional “noisy” signals to only focus on real impacts.

Wu says that both of these improvements make it faster to collect data, which will become critical for expanding research to other subject populations and collecting a larger data set to ultimately prove what specific aspects of head acceleration cause concussions.

“We do know that sustaining a second injury right after the first injury will exacerbate the trauma, so detecting that injury is critical,” Wu says. “However, diagnosis often relies on players to self-report injuries, which often doesn’t work, for a variety of reasons.

“A player typically shakes it off, thinking he will be fine, without telling the coaches or trainers. Eventually, we hope to have a device that is able to screen for injury in real time.”

Off the field

The newly developed technology has been tested on an impact dummy in the lab, and shows 99 percent accuracy in detecting head impacts. The next step involves refining the algorithms using field data.

A common issue with some of the commercially available systems is that they can provide too many warning alerts. Camarillo says it will be critical to make sure that the mouthguard can separate true impacts from other events, such as chewing, and to also conduct further studies to understand the true physiological significance of impacts.

Camarillo’s lab is also interested in developing new helmets or other protective headgear; with the instrumented mouthguard, they can collect head biomechanics data to gain insight into injury mechanisms, which will in turn guide preventative technology design.

The work also has important implications off the football field. Although football players are at heightened risk, Camarillo says, the number one cause of traumatic brain injuries is falling, which is most common among children and the elderly.

“Our football team has been extremely cooperative and interested in helping solve this problem,” Camarillo says. “Football players willingly put themselves at risk at a well-defined point and time in space for us to carry out our research in this ‘lab.’ What we are learning from them will help lead to technologies that will one day make bike-riding and driving in your car safer, too.”

Source: Stanford University

The post Can a mouthguard sensor make football safer? appeared first on Futurity.

Social spiders die off without personality mix

Tue, 06/24/2014 - 05:20

New research finds that personality can determine a spider’s specialization—caregiver or hunter-warrior.

While most spiders are soloists, a few species, such as Anelosimus studiosus (found in Tennessee, among other places), live in groups. And unlike ants, for example, their specialization isn’t a matter of size or physical structure.

(Credit: dinesh rao/Flickr)

PhD student Colin Wright and his mentor Jonathan Pruitt, assistant professor of behavioral ecology at the University of Pittsburgh, separated the docile spiders from the aggressive by observing how much space they demanded from fellow colony members. Aggressive females demand more space than docile ones.

Related Articles On Futurity

The team ran the spiders through a series of tests, examining their performance in colony defense, prey capture, parenting skills, and web repair.

The aggressive cohort was great at defending the web, capturing prey, and repairing their web. But they were awful parents.

“We didn’t know what the docile spiders did,” Wright says. “Were they just freeloaders?” No, it turns out, they were the ones who were capable of rearing large numbers of offspring.

In a separate study, Pruitt also created all docile, all aggressive, and mixed colonies of spiders.

The docile colonies died out first. No one was there to protect them from “parasite” spiders that picked off their young and stole their prey. The all-aggressor group died off second, as they became cannibalistic toward their young.

The mixed group thrived.

The findings appear in the Proceedings of the National Academy of Sciences.

Source: University of Pittsburgh

The post Social spiders die off without personality mix appeared first on Futurity.

Viruses bloom in patients with lingering sepsis

Mon, 06/23/2014 - 14:38

A new study links prolonged episodes of sepsis—a life-threatening infection and leading cause of death in hospitals—to the reactivation of dormant viruses in ill patients.

In healthy people, latent viruses are kept in check by the immune system. But new research provides strong evidence that when sepsis lingers for more than a few days, which is common, viruses re-emerge and enter the bloodstream, signaling that the immune system has become suppressed.

This state of immune suppression may leave patients unable to fight off secondary infections, such as pneumonia.

A research team at Washington University School of Medicine in St. Louis published their findings in PLOS ONE. Their work suggests that drugs that “rev up” the immune system could be incorporated into the treatment of prolonged sepsis.

Therapy shift

“A controversy has existed over whether patients with sepsis progress to a state of immune suppression,” says co-senior author Gregory Storch, a virologist and chief of the Division of Pediatric Infectious Diseases. “The finding that critically ill patients with sepsis have a number of different viruses circulating in the bloodstream is compelling evidence they are immune-suppressed and could dramatically alter therapy for sepsis.”

Surprisingly, the researchers detected levels of viruses in sepsis patients that were on par with those seen in patients who have had organ transplants and are taking immune-suppression drugs to prevent rejection.

“This is an indicator of the degree of immune suppression in septic patients, and it tells us they are highly immune-suppressed,” explains senior author Richard S. Hotchkiss, professor of anesthesiology.

Sepsis infections

More than 200,000 patients in the United States die annually of sepsis. The condition develops when the body mounts a massive immune response to infection, triggering excessive inflammation that can lead quickly to organ failure. While some patients die soon after the condition develops, most sepsis deaths occur four or more days after onset.

Related Articles On Futurity

“We’ve gotten much better at getting sepsis patients through the initial phase,” says Hotchkiss, who also is a professor of medicine and of surgery. “But too many patients die several weeks or months after sepsis sets in, primarily of secondary infections.”

Earlier research has hinted that lingering sepsis may be linked to immune suppression. But a lack of compelling evidence has kept the debate going.

For the current study, first authors Andrew Walton and Jared Muenzer used polymerase chain reaction (PCR) testing to detect a range of viruses in blood and urine samples from 560 critically ill patients with sepsis, who were treated in the surgical and medical intensive care units at Barnes-Jewish Hospital.

As a comparison, they performed the same test on 161 critically ill patients in the hospital who did not have sepsis and 164 healthy patients who were having outpatient surgery.

Latent viruses

“We were looking for viruses that people are commonly exposed to early in life and that persist in the body in a latent form that doesn’t cause sickness,” says Storch, a professor of pediatrics. “No one had really looked at this in a comprehensive fashion before. These viruses have the potential to reactivate if the immune system is suppressed.”

Patients with lingering sepsis had markedly higher levels of viruses detectable in the blood, compared with the healthy controls and critically ill patients without sepsis. Among the sepsis patients, for example, the researchers found that 53 percent had Epstein-Barr virus, 24 percent had cytomegalovirus, 14 percent had herpes-simplex virus, and 10 percent had human herpesvirus 7.

These viruses generally don’t lead to significant illness in people who are healthy but can cause problems in patients who are immune-suppressed.

The researchers noted that 43 percent of patients with sepsis had two or more viruses detected in their blood or urine during their hospital stays. However, this finding may underestimate the frequency of viral infections because some of these patients were not tested for all viruses. In a subgroup of 209 sepsis patients tested for all viruses, 54 percent were positive for two or more.

Trigger for additional infections

Additionally, the researchers found that septic patients with higher levels of viruses detected in their blood were more likely than critically ill patients without sepsis to have more severe illness, secondary fungal and bacterial infections, and longer stays in the intensive care unit.

Also, septic patients with evidence of cytomegalovirus in plasma, a component of blood, had significantly higher 90-day death rates than septic patients who tested negative for the virus, although it is not clear whether the cytomegalovirus contributed to the additional deaths.

Viral testing

“We stumbled onto more viruses than we expected, and we don’t know yet whether some of these viruses are causing problems in their own right,” Storch says. “We think this paper will stimulate others to carry out further investigations of the role of latent viruses in sepsis.”

A further direction for researchers is to determine whether PCR testing for a panel of latent viruses could be used as a readout of the state of a person’s immunity. If so, doctors could perform PCR testing in patients with cancer, autoimmune diseases, and infections, and use the results to guide treatments.

The findings also open the door to new ways of treating sepsis. Over the years, a number of treatments have been evaluated to treat sepsis, but none has worked well. The new research indicates that, in addition to using powerful antibiotics to fight off infections in patients with sepsis, immunotherapy drugs that boost the immune system may be an effective therapy.

The team is planning several clinical trials of such drugs in sepsis patients in the near future.

The National Institutes of Health funded the research.

Source: Washington University in St. Louis

The post Viruses bloom in patients with lingering sepsis appeared first on Futurity.

Why African ranchers should let elephants gorge on poison apples

Mon, 06/23/2014 - 14:18

Wild African elephants might offer ranchers their best chance to eradicate the “Sodom apple”—a toxic invasive plant that has overrun vast swaths of East African savanna and pastureland.

Should the reference to the smitten biblical city be unclear, the Sodom apple, or Solanum campylacanthum, is a wicked plant.

Not a true apple, the relative of the eggplant smothers native grasses with its thorny stalks, while its striking yellow fruit provides a deadly temptation to sheep and cattle.

A five-year study shows that elephants and impalas, among other wild animals, can not only safely gorge themselves on the plant, but can efficiently regulate its otherwise explosive growth. Without elephants ripping the plant from the ground, or impalas devouring dozens of its fruits at a time, the shrub easily conquers the landscape.

Just as the governments of nations such as Kenya prepare to pour millions into eradicating the plant, the findings, published in Proceedings of the Royal Society B, present a method for controlling the Sodom apple that is cost-effective for humans and beneficial for the survival of African elephants, says first author Robert Pringle, assistant professor of ecology and evolutionary biology at Princeton University.

Win-win for elephants, ranchers

“The Holy Grail in ecology is these win-win situations where we can preserve wildlife in a way that is beneficial to human livelihoods,” Pringle says. Similarly, two earlier studies showed that allowing livestock to graze with wild animals such as zebras greatly improved the quality of the domesticated animals’ diet.

Related Articles On Futurity

“It’s a nice example of how conservation needn’t be about sacrifice. It often is—let’s be honest. But there are situations where you can get a win-win,” he says. “This opens the door for people whose main interest is cattle to say, ‘Maybe I do want elephants on my land.’ Elephants have a reputation as destructive, but they may be playing a role in keeping pastures grassy.”

Elephants and impalas can withstand the poison of S. campylacanthum because they belong to a class of herbivores known as “browsers” that subsist on woody plants and shrubs, many species of which pack a toxic punch. On the other hand, “grazers” such as cows, sheep, and zebras primarily eat grass, which is rarely poisonous.

Grazers easily succumb to the Sodom apple, which causes emphysema, pneumonia, bleeding ulcers, brain swelling, and death.

Ecological mayhem

As more African savanna is converted into pasture, the proliferation of the Sodom apple may only get worse, Pringle says, which means that the presence of elephants to eat it may become more vital to the ecosystem and livestock.

The Sodom apple thrives on ecological mayhem, such as the stress of overgrazing put on the land, Pringle says. “Typically, people will overload the land with more cattle than it can support. Then they remove the animals that eat the plant.”

The researchers present enough data to potentially determine the amount of pastureland that wild Sodom-apple eaters would be able to keep free of the noxious plant, says Ricardo Holdo, a savanna ecologist and assistant professor of biological sciences at the University of Missouri.

Holdo, who is familiar with the research but had no role in it, says that beyond removing the Sodom apple, animals such as elephants and impalas could potentially increase the food available to cattle. This is a departure from the conventional view in Africa that livestock and wild animals compete for the same scarce resources.

“There is enough quantitative information in this paper that they can probably model this effect in a meaningful way,” Holdo says. “When you add the wild (herbivores), they have a negative effect on the Solanum, so they’re actually promoting a higher biomass of high-quality habitat for livestock. So, it’s a win-win in the sense that you’re creating a situation in which you can both have livestock and wild animals, and probably actually increase your yield for livestock.”

Functional redundancy

Researchers say this is one of the first studies to examine “functional redundancy” in land animals. Functional redundancy refers to the situation in which one species declines or goes extinct and another species steps in to fulfill the same ecological role. This consideration helps ecologists predict the overall effect of extinction on an entire ecosystem.

In this case, the effect of large mammals such as elephants and impalas on the Sodom apple population—and perhaps the populations of other plants—is unlikely to be duplicated by another animal species.

“That’s an important question because some species are quite vulnerable to extinction and others aren’t,” Pringle says. “The ones that go first tend to be the biggest, or the tastiest, or the ones with ivory tusks. We’re trying to gauge how the world is changing, and we need to understand to what extent these threatened animals have unique ecological functions.”

The majority of studies on functional redundancy have been conducted in aquatic systems because large land animals can be hard to control in an experiment. The new study is also unusually long by ecology standards, Pringle says—the researchers observed similar patterns year after year.

Keeping elephants out

“A big part of the reason we don’t understand functional redundancy very well in terrestrial ecosystems is because it’s difficult to manipulate land species,” he says. “Doing these experiments in the kind of environment like you have in Kenya is really challenging—keeping elephants out of anything is really a huge challenge.”

Pringle was roughly three years into a study about the effects of elephants on plant diversity when he noticed that the Sodom apple was conspicuously absent from some experiment sites. He and other researchers had set up 36 exclosures—designed to keep animals out rather than in—totaling nearly 89 acres (36 hectares) at the Mpala Research Centre in Kenya, a multi-institutional research preserve with which Princeton has been long involved.

There were four types of exclosure: one open to all animals; one excluding only elephants; one excluding both elephants and impalas; and one off limits to all animals.

It was in the sites that excluded elephants and impala that the Sodom apple particularly flourished, which defied everything he knew about the plant.

Who’s eating the apple?

“This study was really fortuitous. I had always thought that these fruits were horrible and toxic, but when I saw them in the experiment, I knew some animal was otherwise eating them. I just didn’t know which one,” Pringle says. “The question became, ‘Who’s eating the apple?’ It’s a very interesting and simple question, but once you get the answer it raises a lot of other questions.”

Using the exclosures established for the original experiment, Pringle and his co-authors used cameras to document the zest with which wild African browsers will eat S. campylacanthum.

Pringle worked with Corina Tarnita, a mathematical biologist and assistant professor of ecology and evolutionary biology. They specifically observed the foraging activity of elephants, impalas, small-dog-sized antelopes known as dik-diks, and rodents, capturing about 30,000 hours of foraging with cameras focused on particular plants. The researchers also marked several hundred Sodom-apple fruits to track how many were eaten, and measured the average height, mortality, and reproduction of Sodom-apple plants in all the exclosures.

In one end, out the other

The Sodom apple proliferated as each additional group of animals was excluded. At one point, the plant’s density was three times greater in areas closed to all animals than in those open to all of them.

In February 2011, the researchers counted an average of less than one fruit per plant in the exclosure open to all animals, meaning that nearly every fruit produced by the plants was being consumed. In the plots closed to elephants, that average increased to three fruits per plant. When both impala and elephants were kept away, the average jumped to around 50 fruits per plant, and fruits were more likely to be eaten by insects rather than dik-diks or rodents.

There is a catch to the elephants’ and impalas’ appetite for the Sodom apple: When fruit goes in one end, seeds come out the other. Though some seeds are destroyed during digestion, most reemerge and are potentially able to germinate.

The researchers developed a mathematical model to conduct a sort of cost-benefit analysis of how the Sodom apple’s ability to proliferate is affected by being eaten. The model weighed the “cost” to the plant of being partially consumed against the potential benefit of having healthy seeds scattered across the countryside in an animal’s droppings. They then used the model to determine whether different animal species had an overall positive or negative influence on the population of Sodom-apple plants.

The whole plant

While elephants ate an enormous number of Solanum seeds, they also often destroyed the entire plant, ripping it out of the ground and stuffing the whole bush into their mouths. The model showed that to offset the damage an elephant wreaks on a plant, 80 percent of the seeds the animal eats would have to emerge from it unscathed. On top of that, each seed would have to be 10 times more likely to take root than one that simply fell to the ground from its parent.

Impalas, on the other hand, can have a positive overall effect on the plants. Impalas ate the majority of the fruit consumed—one impala ate 18 fruits in just a few minutes. But they don’t severely damage the parent plant while feeding and also spread a lot of seeds in their dung. Of the seeds eaten by an impala, only 60 percent would need to survive, and those seeds would have to be a mere three times more likely to sprout than a seed that simply fell from its parent.

“A model allows you to explore a space you’re not fully able to reach experimentally,” says Tarnita. “Once you’ve explored it, however, the conclusions and predictions need to be confronted with reality. This model helped us conclude that although it is theoretically possible for elephants to benefit the plant, that outcome is extremely unlikely.”
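The cost-benefit logic behind the model—weighing seed dispersal gains against damage to the parent plant—can be sketched in a few lines. This is a hypothetical simplification of the authors’ published model; the function, its parameters, and the toy numbers below are invented for illustration and only loosely echo the study’s thresholds.

```python
# Toy version of the cost-benefit comparison described above.
# All names and numbers here are hypothetical.

def net_effect(seeds_eaten: int,
               seed_survival: float,       # fraction passing the gut intact
               germination_boost: float,   # vs. a seed dropped in place
               damage_cost: float) -> float:
    """Expected change in a plant's output from being browsed, measured in
    units of 'seeds that would sprout if simply dropped under the parent'.
    Positive values mean the animal helps the plant spread on balance."""
    dispersed = seeds_eaten * seed_survival * germination_boost
    dropped = seeds_eaten   # baseline: all seeds fall beneath the parent
    return (dispersed - dropped) - damage_cost

# An elephant-style browser inflicts heavy damage, so even decent seed
# survival and a modest germination boost leave the plant worse off;
# an impala-style browser damages little and can tip the balance positive.
elephant = net_effect(100, seed_survival=0.5, germination_boost=2.0,
                      damage_cost=400)   # negative: harms the plant
impala = net_effect(100, seed_survival=0.6, germination_boost=3.0,
                    damage_cost=50)      # positive: spreads the plant
```

This framing makes the study’s conclusion concrete: for elephants, the damage term is so large that only implausibly high seed survival and germination advantages could offset it.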

Researchers from University of Wyoming, University of Florida, University of California, Davis, the Mpala Center, and University of British Columbia are coauthors of the study. The National Science Foundation, the National Sciences and Engineering Research Council of Canada, the Sherwood Family Foundation, the National Geographic Committee for Research and Exploration, and the Princeton Environmental Institute’s Grand Challenges Program supported the work.

Source: Princeton University

The post Why African ranchers should let elephants gorge on poison apples appeared first on Futurity.

Looking beyond brain plaque to treat Alzheimer’s

Mon, 06/23/2014 - 13:55

Researchers have found a new drug target to treat Alzheimer’s—one that also has the potential to serve as a diagnostic tool for the disease.

The recent failure in clinical trials of once-promising Alzheimer’s drugs under development by large pharmaceutical companies prompted the new study.

“Billions of dollars were invested in years of research leading up to the clinical trials of those Alzheimer’s drugs, but they failed the test after they unexpectedly worsened the patients’ symptoms,” says Gong Chen, a professor of biology at Penn State.

Related Articles On Futurity

The research behind those drugs had targeted the long-recognized feature of Alzheimer’s brains: the sticky buildup of the amyloid protein known as plaques, which can cause neurons in the brain to die.

“The research of our lab and others now has focused on finding new drug targets and on developing new approaches for diagnosing and treating Alzheimer’s disease,” Chen explains.

“We recently discovered an abnormally high concentration of one inhibitory neurotransmitter in the brains of deceased Alzheimer’s patients,” Chen says.

GABA gateway

The neurotransmitter, called GABA (gamma-aminobutyric acid), showed up in deformed cells called “reactive astrocytes” in a brain structure called the dentate gyrus. This structure is the gateway to the hippocampus, an area of the brain that is critical for learning and memory.

The GABA neurotransmitter was drastically increased in the deformed versions of the normally large, star-shaped “astrocyte” cells which, in a healthy individual, surround and support individual neurons in the brain.

“Our research shows that the excessively high concentration of the GABA neurotransmitter in these reactive astrocytes is a novel biomarker that we hope can be targeted in further research as a tool for the diagnosis and treatment of Alzheimer’s disease,” Chen says.

His lab also found that the high concentration of the GABA neurotransmitter in the reactive astrocytes is released through an astrocyte-specific GABA transporter, a novel drug target found in this study, to enhance GABA inhibition in the dentate gyrus.

With too much inhibitory GABA neurotransmitter, the neurons in the dentate gyrus are not fired up like they normally would be when a healthy person is learning something new or remembering something already learned.

Improved memory

Importantly, Chen says, “After we inhibited the astrocytic GABA transporter to reduce GABA inhibition in the brains of the AD mice, we found that they showed better memory capability than the control AD mice.

“We are very excited and encouraged by this result because it might explain why previous clinical trials failed by targeting amyloid plaques alone. One possible explanation is that while amyloid plaques may be reduced by targeting amyloid proteins, the other downstream alterations triggered by amyloid deposits, such as the excessive GABA inhibition discovered in our study, cannot be corrected by targeting amyloid proteins alone,” Chen says.

“Our studies suggest that reducing the excessive GABA inhibition to the neurons in the brain’s dentate gyrus may lead to a novel therapy for Alzheimer’s disease. An ultimate successful therapy may be a cocktail of compounds acting on several drug targets simultaneously.”

In addition to Chen, other members of the research team include Zheng Wu and Ziyuan Guo at Penn State and Marla Gearing at Emory University.

The National Institutes of Health and Penn State University’s Eberly College of Science Stem Cell Fund supported the study, which appears in Nature Communications.

Source: Penn State

The post Looking beyond brain plaque to treat Alzheimer’s appeared first on Futurity.

X-rays unlock milk’s fatty secrets

Mon, 06/23/2014 - 13:27

Scientists are using X-rays to take a closer look at the detailed structure of milk and how its fats interact with our digestive system. What they’re learning could provide a blueprint to create new milk products, including formula for premature babies.

“By unlocking the detailed structure of milk we have the potential to create milk loaded with fat-soluble vitamins and brain-building molecules for premature babies, or a drink that slows digestion so people feel fuller for longer,” says Stefan Salentinig of Monash University’s Institute of Pharmaceutical Sciences.

“We could even harness milk’s ability as a ‘carrier’ to develop new forms of drug delivery.”

Milk’s unique structure

By chemically recreating the digestive system in a glass beaker and adding cows’ milk, the team found that milk has a unique structure—an emulsion of fats, nutrients, and water forms a structure that enhances digestion.

Related Articles On Futurity

The researchers used specialized instruments at the Australian Synchrotron to simulate digestion. Enzymes and water were added to milk fat to break it down, and the synchrotron’s small angle X-ray scattering beam showed that when digested, the by-products of milk become highly organized.

The structure is similar to a sponge, Salentinig says.

“We knew about the building blocks of milk and that milk fat has significant influence on the flavor, texture, and nutritional value of all dairy food. But what we didn’t know was the structural arrangement of this fat during digestion.

“We found that when the body starts the digestion process, an enzyme called lipase breaks down the fat molecules to form a highly geometrically ordered structure. These small and highly organized components enable fats, vitamins, and lipid-soluble drugs to cross cell membranes and get into the circulatory system,” Salentinig says.

The next phase of the research will see the team work with nutritionists to better make the link between these new findings and dietary outcomes, and utilize these findings to design and test improved medicines.

The Australian Research Council funded the research, which appears in the journal ACS Nano.

Source: Monash University

The post X-rays unlock milk’s fatty secrets appeared first on Futurity.

Living near pesticides in pregnancy ups autism risk

Mon, 06/23/2014 - 09:03

Pregnant women living in close proximity to chemical pesticide application had a two-thirds higher risk of having a child with autism spectrum disorder or other developmental delay, according to a new study.

The associations were stronger when the exposures occurred during the second and third trimesters of the women’s pregnancies.

The large, multisite California-based study examined associations between specific classes of pesticides, including organophosphates, pyrethroids, and carbamates, applied during the study participants’ pregnancies and later diagnoses of autism and developmental delay in their offspring. It appears online in Environmental Health Perspectives.

“This study validates the results of earlier research that has reported associations between having a child with autism and prenatal exposure to agricultural chemicals in California,” says lead study author Janie F. Shelton, a University of California, Davis, graduate student who now consults with the United Nations.

Related Articles On Futurity

“While we still must investigate whether certain sub-groups are more vulnerable to exposures to these compounds than others, the message is very clear: women who are pregnant should take special care to avoid contact with agricultural chemicals whenever possible.”

California is the top agricultural producing state in the nation, grossing $38 billion in revenue from farm crops in 2010. Statewide, approximately 200 million pounds of active pesticides are applied each year, most of it in the Central Valley, north to the Sacramento Valley and south to the Imperial Valley on the California-Mexico border.

While pesticides are critical for the modern agriculture industry, certain commonly used pesticides are neurotoxic and may pose threats to brain development during gestation, potentially resulting in developmental delay or autism.

The study was conducted by examining commercial pesticide application using the California Pesticide Use Report and linking the data to the residential addresses of approximately 1,000 participants in the Northern California-based Childhood Risk of Autism from Genetics and the Environment (CHARGE) Study.

The study includes families with children between the ages of two and five who have been diagnosed with autism or developmental delay, or who are developing typically. The majority of study participants live in the Sacramento Valley, Central Valley, and the greater San Francisco Bay Area.

Twenty-one chemical compounds were identified in the organophosphate class, including chlorpyrifos, acephate, and diazinon. The second most commonly applied class of pesticides was pyrethroids, one quarter of which was esfenvalerate, followed by lambda-cyhalothrin, permethrin, cypermethrin, and tau-fluvalinate. Eighty percent of the carbamates were methomyl and carbaryl.

What the maps show

For the study, researchers used questionnaires to obtain study participants’ residential addresses during the pre-conception and pregnancy periods. The addresses then were overlaid on maps with the locations of agricultural chemical application sites based on the pesticide-use reports to determine residential proximity. The study also examined which participants were exposed to which agricultural chemicals.

“We mapped where our study participants’ lived during pregnancy and around the time of birth. In California, pesticide applicators must report what they’re applying, where they’re applying it, dates when the applications were made, and how much was applied,” says principal investigator Irva Hertz-Picciotto, a MIND Institute researcher and professor and vice chair of the department of public health sciences at UC Davis.

“What we saw were several classes of pesticides more commonly applied near residences of mothers whose children developed autism or had delayed cognitive or other skills.”

The researchers found that during the study period approximately one-third of CHARGE Study participants lived in close proximity—within 1.25 to 1.75 kilometers—of commercial pesticide application sites.
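The proximity determination described above—overlaying residential addresses on pesticide-application sites and testing whether any site falls within a buffer—could be sketched with a great-circle distance check. This is illustrative only: the study used the California Pesticide Use Report and GIS overlays, and the function names, coordinates, and the haversine shortcut below are assumptions, not the authors’ method.

```python
# Hypothetical sketch of a residential-proximity buffer test using the
# haversine great-circle distance formula.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (latitude, longitude) points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def exposed(residence, application_sites, buffer_km=1.25):
    """True if any application site lies within the buffer around the
    residence (the study used buffers of roughly 1.25 to 1.75 km)."""
    return any(haversine_km(*residence, *site) <= buffer_km
               for site in application_sites)
```

A real analysis would also weight exposures by the amount of pesticide applied and the dates of application relative to the pregnancy, both of which the California use reports record.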

Some associations were stronger among mothers living closer to application sites and weakened as distance from the sites increased, the researchers found.

Organophosphates applied over the course of pregnancy were associated with an elevated risk of autism spectrum disorder, particularly for chlorpyrifos applications in the second trimester.

Pyrethroids were moderately associated with autism spectrum disorder immediately prior to conception and in the third trimester. Carbamates applied during pregnancy were associated with developmental delay.

Pesticides and the fetal brain

Exposures to insecticides for those living near agricultural areas may be problematic, especially during gestation, because the developing fetal brain may be more vulnerable than it is in adults.

Because these pesticides are neurotoxic, in utero exposures during early development may distort the complex processes of structural development and neuronal signaling, producing alterations to the excitation and inhibition mechanisms that govern mood, learning, social interactions, and behavior.

“In that early developmental gestational period, the brain is developing synapses, the spaces between neurons, where electrical impulses are turned into neurotransmitting chemicals that leap from one neuron to another to pass messages along,” Hertz-Picciotto says.

“The formation of these junctions is really important and may well be where these pesticides are operating and affecting neurotransmission.”

Reducing exposure

Research from the CHARGE Study has emphasized the importance of maternal nutrition during pregnancy, particularly the use of prenatal vitamins to reduce the risk of having a child with autism.

While it’s impossible to entirely eliminate risks due to environmental exposures, Hertz-Picciotto says that finding ways to reduce exposures to chemical pesticides, particularly for the very young, is important.

“We need to open up a dialogue about how this can be done, at both a societal and individual level,” she says. “If it were my family, I wouldn’t want to live close to where heavy pesticides are being applied.”

The National Institute of Environmental Health Sciences and the US Environmental Protection Agency supported the work.

Source: UC Davis

The post Living near pesticides in pregnancy ups autism risk appeared first on Futurity.

‘Partners in crime’ let us indulge and keep us in check

Mon, 06/23/2014 - 09:00

People are natural accomplices who like to conspire together to enjoy a small indulgence, but also to resist temptation together when it matters most.

Researchers staged a series of experiments that confronted pairs of consumers with different temptations and gauged how closely their reactions mirrored each other and how they felt about each other afterward.

“We like moral support when the stakes are high, but we enjoy having a ‘partner in crime’ when the stakes are lower,” says Kelly L. Haws, an associate professor at Vanderbilt Owen Graduate School of Management.

Ties that bind

When researchers tracked how many pieces of candy test subjects consumed during a short film, they found that most duos ate about the same amount.

“We find evidence of a general tendency for peers to ultimately match behaviors when facing a mutual temptation,” write Haws and Michael L. Lowe of Texas A&M in the study published in the Journal of Consumer Research.

Further, test subjects who each ate a small amount of candy later reported liking their partner more than when the study began. But participants who each ate large amounts of candy reported liking their partner less than when the study began.

“We feel a greater sense of affiliation with a person when we eat or buy something considered bad, but not terrible, with a friend,” Haws says. “Likewise, we feel a stronger affiliation when a friend reaffirms a decision not to overindulge.”

Haws says this research is applicable to diverse self-control decisions from eating to spending money.

“The basic finding holds that if we’re with a friend and there’s a large amount of money at stake, it helps us feel better about the relationship if together we decline to waste a large amount of money,” Haws says.

The findings have relevance for marketers, policymakers, and consumers, the researchers say.

“Marketers can apply these findings to inform a number of important decisions related to promoting goods perceived as indulgences,” Haws says.

“Knowing that consumers prefer partners in crime when indulging on a small scale can inform decisions regarding communication strategies and messages, as well as promotional offers, perhaps by using a friends-and-family type of approach.”

On the other hand, knowing that mutual abstention is also rewarding can help policymakers who wish to combat behaviors such as overspending, drug use, and overeating, the researchers say.

“You see this idea manifested in programs such as Weight Watchers, which builds around the idea of accountability and moral support for abstention,” Haws says.

Finally, consumers can use the knowledge to their advantage as they seek to control their decisions in social settings.

Source: Vanderbilt University

Depressed young women more likely to die from heart disease

Mon, 06/23/2014 - 08:29

Women 55 and younger who are moderately or severely depressed are twice as likely to suffer a heart attack, require an artery-opening procedure, or die of heart disease.

“Women in this age group are also more likely to have depression, so this may be one of the ‘hidden’ risk factors that can help explain why women die at a disproportionately higher rate than men after a heart attack,” says study author Amit Shah, assistant professor of epidemiology and assistant professor of medicine (cardiology) at Emory University.

For the study, published in the Journal of the American Heart Association, investigators assessed symptoms of depression in 3,237 people with known or suspected heart disease (34 percent women, average age 62.5 years) scheduled for coronary angiography, an X-ray that diagnoses disease in the arteries that supply blood to the heart.

After nearly three years of follow-up, they found:
  • In women 55 and younger, after adjusting for other heart disease risk factors, each 1-point increase in symptoms of depression was associated with a 7 percent increase in the presence of heart disease.
  • In men and older women, symptoms of depression didn’t predict the presence of heart disease.
  • Women 55 and younger were 2.2 times as likely to suffer a heart attack, die of heart disease, or require an artery-opening procedure during the follow-up period if they had moderate or severe depression.
  • Women 55 and younger were 2.5 times as likely to die from any cause during the follow-up period if they had moderate or severe depression.
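
To put the per-point figure in perspective, here is a minimal sketch that assumes, purely for illustration, that the reported 7 percent per-point increase compounds multiplicatively across points on the depression scale; the study itself reports only the per-point association, so the compounding rule and function name below are this sketch's assumptions.

```python
# Hypothetical sketch: if each 1-point rise in depression symptoms carries a
# 7 percent increase in the presence of heart disease, and (by assumption,
# not a claim from the study) the increases compound multiplicatively, the
# combined multiplier for a k-point rise is 1.07 ** k.
def risk_multiplier(points, per_point_increase=0.07):
    """Combined risk multiplier for a given rise in depression score."""
    return (1 + per_point_increase) ** points

print(round(risk_multiplier(1), 2))   # → 1.07
print(round(risk_multiplier(10), 2))  # → 1.97
```

Under this assumption, a 10-point difference would roughly double the associated risk, which is one way to see why even modest per-point associations matter.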

“All people, and especially younger women, need to take depression very seriously,” Shah says. “Depression itself is a reason to take action, but knowing that it is associated with an increased risk of heart disease and death should motivate people to seek help.”

Providers should ask more questions and be aware that young women are especially vulnerable to depression, and that it may increase the risk to their heart, Shah says.

“Although the risks and benefits of routine screening for depression are still unclear, our study suggests that young women may benefit from special consideration,” says senior study author Viola Vaccarino, professor of medicine. “Unfortunately, this group has largely been understudied before.”

Source: Emory University

How nerves get neighbors to take out the trash

Mon, 06/23/2014 - 08:05

Some cells don’t recycle their own worn-out parts. Instead, they let their neighbors handle waste disposal.

Those are the findings of a new study that challenges the belief that healthy cells are universally responsible for cleaning up after themselves.

Scientists found that nerves in the eyes of mice pass off old, damaged energy-producing mitochondria to nearby support cells. The results could offer clues to the origins of glaucoma, researchers say. The findings may also offer insights into Parkinson’s and Alzheimer’s diseases, amyotrophic lateral sclerosis, and other illnesses that involve a buildup of “garbage” in brain cells.

“This was a very surprising study for us, because the findings go against the common understanding that each cell takes care of its own trash,” says Nicholas Marsh-Armstrong, research scientist at the Kennedy Krieger Institute and associate professor of neuroscience at Johns Hopkins University School of Medicine.

Cell powerhouses

Marsh-Armstrong and Mark H. Ellisman, a neuroscientist at the University of California, San Diego, had previously discovered that retinal ganglion cells, which transmit visual information from the eye to the brain, might be handing off bits of themselves to astrocytes, cells that surround and support the signal-transmitting neurons.

The retinal ganglion cells appeared in that earlier research to make the transfer to astrocytes at the optic nerve head, the beginning of the long tendril that connects the eye and the brain. The two researchers suspected that the neuronal bits being passed on to astrocytes were mitochondria, which are known as the powerhouses of the cell.

To find out if this was the case, researchers genetically modified mice so that they produced indicators that glowed in the presence of chewed-up mitochondria. They then used cutting-edge electron microscopy to reconstruct 3D images of what happened at the optic nerve head.

As reported in the June 17 online early edition of the Proceedings of the National Academy of Sciences, the astrocytes were, indeed, breaking down large numbers of mitochondria from neighboring retinal ganglion cells.

A leading cause of blindness

The location of the process at the optic nerve head is particularly interesting, Marsh-Armstrong notes. That is the site thought to be at fault in glaucoma, a condition that damages the optic nerve, resulting in vision loss.

He plans to investigate whether the mitochondria disposal process is relevant to this disease, the second leading cause of blindness worldwide.

But the implications of the results go beyond the optic nerve head, since a buildup of “garbage” inside cells causes neurodegenerative diseases such as Parkinson’s, Alzheimer’s, and ALS.

“By showing that this type of alternative disposal happens, we’ve opened up the door for others to investigate whether similar processes might be happening with other cell types and cellular parts other than mitochondria,” he says.

The National Eye Institute, the Glaucoma Research Foundation, the Melza M. and Frank Theodore Barr Foundation, the National Center for Research Resources, the National Institute on Drug Abuse’s Human Brain Project, the National Institute of General Medical Sciences, and the National Science Foundation funded the study.

Source: Johns Hopkins University

‘Magic Island’ pops up on Saturn’s moon

Mon, 06/23/2014 - 07:54

Radar images of Ligeia Mare, the second-largest sea on Saturn’s moon Titan, taken by NASA’s Cassini spacecraft reveal a bright, mysterious geologic object where one never existed before.

Scientifically speaking, this spot is considered a “transient feature,” but the astronomers also call it “Magic Island.”

Reporting in Nature Geoscience, the scientists say this may be the first observation of dynamic, geological processes in Titan’s northern hemisphere.

Ligeia Mare, the second largest sea on the Saturn moon Titan, sports its usual coastline in the top image. A mysteriously bright object appears on Ligeia Mare in the bottom image. (Credit: NASA/JPL-Caltech/Cornell)

“This discovery tells us that the liquids in Titan’s northern hemisphere are not simply stagnant and unchanging, but rather that changes do occur,” says Jason Hofgartner, a Cornell University graduate student in the field of planetary sciences and the paper’s lead author.

“We don’t know precisely what caused this ‘magic island’ to appear, but we’d like to study it further.”

Titan, the largest of Saturn’s 62 known moons, is a world of lakes and seas. The moon—smaller than our own planet—bears close resemblance to watery Earth, with wind and rain driving the creation of strikingly familiar landscapes.

Under its thick, hazy nitrogen-methane atmosphere, astronomers have found mountains, dunes, and lakes. But in lieu of water, liquid methane and ethane flow through river-like channels into seas the size of Earth’s Great Lakes.

To discover this geologic feature, the astronomers relied on an old technique—flipping. The Cassini spacecraft sent data on July 10, 2013, to the Jet Propulsion Laboratory at the California Institute of Technology for image processing.

Within a few days, Hofgartner and his colleagues flipped between older Titan images and the newly processed pictures for any hint of change. This is a long-standing method used to discover asteroids, comets, and other worlds. “With flipping, the human eye is pretty good at detecting change,” says Hofgartner.
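
As a rough illustration of what flipping automates, a change between two co-registered images can be found by differencing them pixel by pixel and flagging pixels that brightened sharply. The function name, image sizes, and threshold below are invented for this sketch and are not Cassini's actual processing pipeline.

```python
import numpy as np

def find_changes(before, after, threshold=0.5):
    """Return a boolean mask of pixels that brightened by more than threshold."""
    diff = after.astype(float) - before.astype(float)
    return diff > threshold

# A featureless "sea" in the old image, with one bright patch in the new one.
before = np.zeros((5, 5))
after = before.copy()
after[2, 2] = 1.0  # the transient bright feature

mask = find_changes(before, after)
print(int(mask.sum()))  # → 1  (one changed pixel, at row 2, column 2)
```

A human blinking between two frames does essentially this comparison by eye, which is why the technique works so well for spotting transient features.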

Prior to the July 2013 observation, that region of Ligeia Mare had been completely devoid of features, including waves.

Titan’s seasons change on a longer time scale than Earth’s. The moon’s northern hemisphere is transitioning from spring to summer. The astronomers think the strange feature may result from changing seasons.

In light of the changes, Hofgartner and his coauthors offer four possible explanations for the phenomenon:

  • Northern hemisphere winds may be kicking up and forming waves on Ligeia Mare. The radar imaging system might see the waves as a kind of “ghost” island.
  • Gases may push out from the sea floor of Ligeia Mare, rising to the surface as bubbles.
  • Sunken solids formed by a wintry freeze could become buoyant with the onset of warmer temperatures during the late Titan spring.
  • Ligeia Mare has suspended solids, which are neither sunken nor floating, but act like silt in a terrestrial delta.

“Likely, several different processes—such as wind, rain, and tides—might affect the methane and ethane lakes on Titan. We want to see the similarities and differences from geological processes that occur here on Earth,” Hofgartner says. “Ultimately, it will help us to understand better our own liquid environments here on the Earth.”

In addition to Hofgartner, Cornell authors include: Alex Hayes, assistant professor of planetary sciences; Jonathan Lunine, professor of physical sciences; and Phil Nicholson, professor of astronomy. A portion of the research took place at the Jet Propulsion Laboratory, under a contract with NASA.

Source: Cornell University

Drug cocktail might help diabetics make insulin

Mon, 06/23/2014 - 07:48

Combining two different medications could help patients with Type 1 diabetes at least partially regain the ability to produce their own insulin.

For a new study, Michael Haller, an endocrinologist at University of Florida, looked for problematic cells of the immune system that could be behind a patient’s inability to produce insulin and wiped them out with a medication called Thymoglobulin, a drug initially developed for use in organ transplantation.

Then he used a medication called Neulasta, a drug designed to improve the lives of people with certain forms of cancer, to stimulate the production of new and potentially beneficial immune cells.

“The treatment is almost like trying to hit the reset button on the immune system,” Haller says. “We’re trying to wipe out the bad cells and stimulate the good cells at the same time.”

Haller treated 17 adult Type 1 diabetes patients for two weeks with the cocktail therapy and then followed them for a year. Another eight patients were given a placebo.

By the end of the year, the patients treated with the cocktail had increased their ability to produce insulin, which indicates that the Thymoglobulin was successful in killing the bad immune system cells, and the Neulasta was successful in stimulating new, healthy immune cells.

The patients’ ability to produce insulin also indicates they had an increase in beta cells—the cells responsible for producing insulin in the pancreas.

Haller presented his findings this month at the annual meeting of the American Diabetes Association in San Francisco.

Profound results

Conventional diabetes wisdom says that within just a few months of the onset of Type 1 diabetes, there are very few of the insulin-producing beta cells left in the pancreas, says Mark Atkinson, a co-investigator of the study and member of the department of pathology, immunology, and laboratory medicine.

That the treatment seemed to stimulate insulin production in people with established Type 1 diabetes made the researchers “cautiously optimistic,” Atkinson says. “The results that Dr. Haller saw in his first study are profound.”

Another novel aspect of the study is that it enrolled patients who had been living with the disease for some time. Typically, studies examine patients who are newly diagnosed and still have a reasonable number of insulin-producing beta cells. The patients in Haller’s study had been living with Type 1 diabetes for between four months and two years.

“The model has mostly been to test therapies aimed at beta cell preservation in people who have just been diagnosed,” Haller says. “But obviously, the majority of patients living with the disease have been living with the disease for a long time, so people become disenfranchised from the research process. We’re interested in making life better for these patients.”

Combination therapies

Atkinson began considering Thymoglobulin as a treatment for diabetes nearly a decade ago. He and fellow co-investigator Desmond Schatz, associate chairman of the department of pediatrics, authored a paper advocating a combination approach to treating Type 1 diabetes.

Based on Schatz’s belief in combination therapies, the group began shepherding a cocktail of Thymoglobulin and Neulasta through early studies done with mouse models.

“Despite tremendous strides in our understanding of the natural history of Type 1 diabetes, we are as yet unable to cure and prevent the disease,” Schatz says. “This study is a step in that direction, toward a biological cure.”

The patients in Haller’s study will be followed for three to five years to see if their bodies will preserve the insulin-producing beta cells. The researchers’ next step will be to recruit patients who have been newly diagnosed with the disease to conduct a larger trial.

Haller says he hopes the approach will help patients manage their disease more easily.

“If we can confirm the results in a larger effort, the study could potentially be paradigm-shifting for our field in that it documents we should really be looking at combination therapies in treating Type 1 diabetes,” Haller says.

“Our ultimate goal is to prevent and cure this disease, but we have to crawl before we walk, and walk before we run. This study is an important step forward in our efforts to make life easier for patients with Type 1 diabetes.”

Source: University of Florida

Uterus ‘switch’ lets childbirth get going

Mon, 06/23/2014 - 07:30

A new study shows that dysregulation of a potassium ion channel called hERG in the uterus underlies difficult labor among overweight pregnant women.

Acting as a powerful electrical brake, hERG works during pregnancy to suppress contractions and prevent premature labor. However, at the onset of labor a protein acts as a switch to turn hERG off, removing the brake and ensuring that labor can take place.

Specifically, by testing the electrical signals in small samples of uterine tissue taken from women who had an elective caesarean before labor started and from women who needed an emergency caesarean during labor, the researchers showed that hERG was dysregulated in overweight women.

Pregnant women who are overweight often remain pregnant past their due date or progress slowly once labor begins.

Overweight women have higher rates of medical interventions around labor and birth, including higher rates of induction for prolonged pregnancy, and higher rates of Caesarean section as a result of failure to progress in labor.

Lead researcher Professor Helena Parkington of Monash University says this “switch” needs to be turned off to allow labor contractions to occur, but remains turned on in overweight women.

“The reason it stays on is that the ‘molecular hand’ that should turn the switch off fails to appear in sufficient quantities in the uterine muscle of overweight women when labor should be occurring. These women also respond poorly to our current methods of induction,” says Parkington.

Professor Shaun Brennecke of the University of Melbourne and Royal Women’s Hospital says the finding significantly advances understanding of how labor progresses, with implications for all women who have complicated labors.

“The clinical significance of this discovery is that, having identified the problem responsible for dysfunctional labor in overweight women, we are now able to look at developing a safe, effective, and specific treatment to correct the problem.”

“For example, a drug to turn off the switch to allow normal labor to start and progress,” he says.

The study appears in Nature Communications.

Sources: University of Melbourne, Monash University

Maps show how cocaine moves between US cities

Fri, 06/20/2014 - 11:20

Cities in the US north and northeast tend to be destination cities for cocaine trafficking, while cities in the south and along the west coast are source cities, according to a new tracking system.

Cities in other regions, like Chicago and Atlanta, are major hubs.

Researchers say law enforcement authorities need to get a better handle on cocaine trafficking patterns if they are to take control of one of the world’s largest illegal drug markets.

Siddharth Chandra, an economist at Michigan State University, studied wholesale powdered cocaine prices in 112 cities to identify city-to-city links for the transit of the drug, using data published by the National Drug Intelligence Center of the US Department of Justice from 2002 to 2011.

Field intelligence officers and local, regional, and federal law enforcement sources collected the data during drug arrests and investigations.

“These data enable us to identify suspected links between cities that may have escaped the attention of drug enforcement authorities,” says Chandra, director of the Asian Studies Center.

“By identifying patterns and locations, drug policy and enforcement agencies could provide valuable assistance to federal, state, and local governments in their decisions on where and how to allocate limited law enforcement resources to mitigate the cocaine problem.”

Spikes in cocaine prices

Chandra analyzed prices for 6,126 pairs of cities for possible links. If two cities are connected, prices will move in lockstep. So if there’s a spike in cocaine prices in a source city, where the drug originates, that spike will be transmitted to all cities dependent on that source.

Cocaine will flow from the city with the lower price to the city with the higher price. It takes a number of transactions for cocaine to reach its end price, and prices tend to go up as drugs move from one city to another, he says.

While cities in the north and northeast are destination cities for cocaine, Chandra found cities in the southern US and along the west coast are source cities. In addition, cities in other regions, like Chicago and Atlanta, are major hubs for cocaine.

Chandra also created a map of possible drug routes, which he compared with a map produced by the National Drug Intelligence Center, and found that a number of his routes hadn’t been identified.

Drug trafficking puzzle

Chandra cautions his methodology shouldn’t be used in isolation. Instead, it is one piece of the larger drug trafficking puzzle. Individual bits of the publicly available NDIC data may or may not be accurate based on whether drug smugglers tell the truth. However, the combined data could lead to better drug enforcement.

The NDIC was closed in 2012 and Chandra isn’t sure if another agency is continuing to collect data. But his research shows the importance of doing so.

“As an economist, the big takeaway is that prices carry some valuable information about trafficking in illegal goods,” Chandra says.

The study is published in the Journal of Drug Issues.

Source: Michigan State University
