
Futurity.org

Research news from top universities.

Switzerland tops list of innovative economies

Mon, 07/21/2014 - 06:48

Switzerland has the most innovative economy, followed by the United Kingdom and Sweden, according to this year’s Global Innovation Index—a survey of 143 countries that uses 81 indicators to gauge innovation capabilities and measurable results.

The United States came in sixth. The study was released July 18 in Sydney, Australia, at the B20 gathering of international business leaders.


“When reviewing the GII quality indicators, top performing middle-income economies are closing the gap with high-income economies,” says Soumitra Dutta, Dean of the Samuel Curtis Johnson Graduate School of Management at Cornell University.

“China significantly outperforms the average score of high-income economies across the combined quality indicators. To close the gap even further, middle-income economies must continue to invest in strengthening their innovation ecosystems and closely monitor the quality of their innovation indicators.”

This is the fourth consecutive year that Switzerland has led the rankings. A new entry into the top 10 this year is Luxembourg, ranked ninth. China, Brazil, and India lead the middle-income economies.

Top 10 most innovative economies:
  1. Switzerland
  2. United Kingdom
  3. Sweden
  4. Finland
  5. Netherlands
  6. United States
  7. Singapore
  8. Denmark
  9. Luxembourg
  10. Hong Kong

The survey confirms the persistence of divides in global innovation. Within the top 10 and the top 25, individual rankings have shifted, but the membership of those groups is otherwise unchanged. A divide persists because less innovative economies have difficulty keeping up with the rate of progress of higher-ranking economies, even when making gains themselves.

This can be partially explained by difficulties in growing and retaining the human resources necessary for sustained innovation, which is the focus of this year’s report.

Sub-Saharan Africa

Among low-income countries displaying above-par performance, the sub-Saharan African region now makes up nearly 50 percent of the so-called “innovation learner” economies, defined by the report’s authors as those that perform at least 10 percent higher than their peers for their level of gross domestic product.

Sub-Saharan Africa now has more “innovation learner” economies than any other region, with five countries joining that status in 2014: Burkina Faso, Gambia, Malawi, Mozambique, and Rwanda. These economies demonstrate rising levels of innovation, particularly in human capital, research, and market sophistication, the researchers say.
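
The report’s “innovation learner” rule is concrete enough to state as code. Below is a minimal sketch, assuming each economy comes with a GII score and an expected score for its GDP level; the expectation model and the numbers are illustrative assumptions, not the report’s own:

```python
# Hedged sketch of the "innovation learner" rule described above: an economy
# qualifies if it performs at least 10 percent above what its GDP level
# predicts. The scores and the GDP-based expectations here are made up.

def is_innovation_learner(score, expected_for_gdp):
    """True if `score` beats the GDP-based expectation by 10 percent or more."""
    return score >= 1.10 * expected_for_gdp

economies = {
    # name: (GII score, score expected at its GDP level) -- illustrative
    "Rwanda": (34.0, 29.0),
    "Malawi": (32.0, 28.0),
    "ExampleLand": (30.0, 31.0),
}

for name, (score, expected) in economies.items():
    status = ("innovation learner" if is_innovation_learner(score, expected)
              else "in line with peers")
    print(f"{name}: {status}")
```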

The index is a benchmarking tool for business executives, policymakers, and others seeking insight into the state of innovation around the world. INSEAD and the World Intellectual Property Organization, an agency of the United Nations, are copublishers with Cornell of the report.

Source: Cornell University


Culture sets the tone when people ‘hear voices’

Mon, 07/21/2014 - 06:42

People suffering from schizophrenia may hear “voices”—auditory hallucinations—differently depending on their cultural context, according to new research.

Anthropologist Tanya Luhrmann and her colleagues find that people with serious psychotic disorders in the United States hear voices that are harsh and threatening, while the voices heard in Africa and India are more benign and playful.

This may have clinical implications for how to treat people with schizophrenia, she suggests.

The experience of hearing voices is complex and varies from person to person, according to Luhrmann, a professor of anthropology at Stanford University and first author of the article in the British Journal of Psychiatry.

Luhrmann says that American clinicians “sometimes treat the voices heard by people with psychosis as if they are the uninteresting neurological byproducts of disease which should be ignored. Our work found that people with serious psychotic disorder in different cultures have different voice-hearing experiences. That suggests that the way people pay attention to their voices alters what they hear their voices say. That may have clinical implications.”

Cultural variations

Luhrmann says the role of culture in understanding psychiatric illnesses in depth has been overlooked.


“The work by anthropologists who work on psychiatric illness teaches us that these illnesses shift in small but important ways in different social worlds. Psychiatric scientists tend not to look at cultural variation. Someone should, because it’s important, and it can teach us something about psychiatric illness,” says Luhrmann, an anthropologist trained in psychology.

For the research, Luhrmann and her colleagues interviewed 60 adults diagnosed with schizophrenia—20 each in San Mateo, California; Accra, Ghana; and Chennai, India. Overall, there were 31 women and 29 men with an average age of 34. They were asked how many voices they heard, how often, what they thought caused the auditory hallucinations, and what their voices were like.

“We then asked the participants whether they knew who was speaking, whether they had conversations with the voices, and what the voices said. We asked people what they found most distressing about the voices, whether they had any positive experiences of voices, and whether the voice spoke about sex or God,” she says.

The findings revealed that hearing voices was broadly similar across all three cultures, according to Luhrmann. Many of those interviewed reported both good and bad voices, and conversations with those voices, as well as whispering and hissing that they could not quite place physically. Some spoke of hearing from God, while others said they felt their voices were an “assault” upon them.

Bombarding voices

The striking difference was that while many of the African and Indian subjects registered predominantly positive experiences with their voices, not one American did. Rather, the US subjects were more likely to report experiences as violent and hateful—and evidence of a sick condition.

The Americans experienced voices as bombardment and as symptoms of a brain disease caused by genes or trauma.

One participant described the voices as “like torturing people, to take their eye out with a fork, or cut someone’s head and drink their blood, really nasty stuff.” Other Americans (five of them) even spoke of their voices as a call to battle or war—”the warfare of everyone just yelling.”

Moreover, the Americans mostly did not report knowing who spoke to them, and their relationships with their voices seemed less personal, according to Luhrmann.

Among the Indians in Chennai, more than half (11) heard voices of kin or family members commanding them to do tasks. “They talk as if elder people advising younger people,” one subject said. That contrasts to the Americans, only two of whom heard family members.

Also, the Indians heard fewer threatening voices than the Americans—several heard the voices as playful, as manifesting spirits or magic, and even as entertaining. Finally, not as many of them described the voices in terms of a medical or psychiatric problem, as all of the Americans did.

In Accra, Ghana, where the culture accepts that disembodied spirits can talk, few subjects described voices in brain disease terms. When people talked about their voices, 10 of them called the experience predominantly positive; 16 of them reported hearing God audibly. “Mostly, the voices are good,” one participant remarked.

An intrusion to the self

Why the difference? Luhrmann offers an explanation: Europeans and Americans tend to see themselves as individuals motivated by a sense of self-identity, whereas outside the West, people imagine the mind and self as interwoven with others and defined through relationships.

“Actual people do not always follow social norms,” the scholars note. “Nonetheless, the more independent emphasis of what we typically call the ‘West’ and the more interdependent emphasis of other societies has been demonstrated ethnographically and experimentally in many places.”

As a result, hearing voices in a specific context may differ significantly for the person involved, they wrote. In America, the voices were an intrusion and a threat to one’s private world—the voices could not be controlled.

However, in India and Africa, the subjects were not as troubled by the voices—they seemed on one level to make sense in a more relational world. Still, differences existed between the participants in India and Africa; the former’s voice-hearing experience emphasized playfulness and sex, whereas the latter more often involved the voice of God.

Voices as relationships

The religiosity or urban nature of the culture did not seem to be a factor in how the voices were viewed, Luhrmann says.

“Instead, the difference seems to be that the Chennai (India) and Accra (Ghana) participants were more comfortable interpreting their voices as relationships and not as the sign of a violated mind,” the researchers wrote.

The research, Luhrmann observed, suggests that the “harsh, violent voices so common in the West may not be an inevitable feature of schizophrenia.” Cultural shaping of schizophrenia behavior may be even more profound than previously thought.

The findings may be clinically significant, according to the researchers. Prior research showed that specific therapies may alter what patients hear their voices say.

One new approach claims it is possible to improve individuals’ relationships with their voices by teaching them to name their voices and to build relationships with them, and that doing so diminishes their caustic qualities. “More benign voices may contribute to more benign course and outcome,” they write.

Source: Stanford University


Patients tell more secrets to virtual humans

Mon, 07/21/2014 - 06:16

Patients are more willing to disclose personal information to virtual humans than to actual ones, likely because computers don’t make judgments or look down on people the way another human might.

The findings show promise for people suffering from post-traumatic stress and other mental anguish, says Gale Lucas, a social psychologist at University of Southern California’s Institute for Creative Technologies.

Virtual human “Ellie” gave participants a sense of anonymity, making them more willing to disclose personal information in a private setting without fear of criticism. (Credit: ICT via USC)

In intake interviews, people were more honest about their symptoms, no matter how potentially embarrassing, when they believed that a human observer wasn’t in on the conversation.

“In any given topic, there’s a difference between what a person is willing to admit in person versus anonymously,” Lucas says.

The study provides the first empirical evidence that virtual humans can increase a patient’s willingness to disclose personal information in a clinical setting. It also presents compelling reasons for doctors to start using virtual humans as medical screeners. The honest answers acquired by a virtual human could help doctors diagnose and treat their patients more appropriately.

Virtual humans

For the study, which will appear in Computers in Human Behavior, researchers recruited 239 adults through Craigslist. The participants, whose ages ranged from 18 to 65, were invited to a laboratory to interact with a virtual human as if they were being admitted to a clinic or hospital.


Subjects were interviewed as part of an evaluation of SimSensei, a virtual human application that can be used to identify signals of depression and other mental health issues through real-time sensing and recognition of nonverbal behaviors. It is a research project designed to assess the potential of such technology and is based on more than 10 years of virtual human research and development at ICT.

Some participants were told the virtual human’s responses were fully automated, others were told that the responses were remotely controlled, puppet-like, by a human observer in another room. In reality, all participants were randomly assigned a fully or semi-automated virtual human, regardless of what they were told, and their video interviews were later analyzed by a (human) researcher.
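
Because belief and reality were crossed, the effect of the framing could be analyzed independently of what actually drove the virtual human. A minimal sketch of that 2×2 random assignment (the labels are mine, not the study’s):

```python
import random

# Hedged sketch of the design described above: what participants are told
# ("automated" vs. "human-operated") is randomized independently of the
# condition that actually runs. All names here are illustrative.

FRAMINGS = ["told_automated", "told_human_operated"]
CONDITIONS = ["fully_automated", "semi_automated"]

def assign(participant_id, rng=random):
    return {
        "id": participant_id,
        "framing": rng.choice(FRAMINGS),      # what the participant believes
        "condition": rng.choice(CONDITIONS),  # what actually drives "Ellie"
    }

assignments = [assign(i) for i in range(239)]  # 239 adults were recruited
print(assignments[0])
```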

The mere belief that participants were interacting with only a computer made them more open and honest, researchers found, even when the virtual human asked personal questions such as, “What’s something you feel guilty about?” or “Tell me about an event, or something that you wish you could erase from your memory.”

‘Way better than talking to a person’

In addition, video analysis of the study subjects’ facial expressions showed that they were also more likely to show more intense signs of sadness—perhaps the most vulnerable of expressions—when they thought only pixels were present.

One participant who thought the virtual human was automated said the experience was “way better than talking to a person. I don’t really feel comfortable talking about personal stuff to other people.”

In contrast, a participant who thought the conversation was being observed told researchers, “I wish you hadn’t told me that other people were in the other room listening in. It was weird, like, I don’t even know these people. I would have said a lot more stuff if they weren’t there.”

Sense of anonymity

The virtual human named Ellie opened the conversation with questions such as “Where are you from originally?” and incorporated feedback (“I’m sorry to hear that”) and empathetic listening through facial expressions and nodding. In addition, “her” virtual nature gave participants a sense of anonymity, making them more willing to disclose personal information in a private setting without fear of criticism.

“We know that developing a rapport and feeling free of judgment are two important factors that affect a person’s willingness to disclose personal information,” says coauthor Jonathan Gratch, director of virtual humans research and a professor of computer science.

“The virtual character delivered on both these fronts and that is what makes this a particularly valuable tool for obtaining information people might feel sensitive about sharing.”

The researchers were careful to emphasize that the virtual human could supplement—not replace—trained clinicians. Still, the implications of the findings are plentiful both in terms of reducing costs and improving care. Several are being explored in developing projects, including virtual humans to help detect signs of depression, provide screening services for patients in remote areas, or act as role-playing partners for training health professionals.

The Defense Advanced Research Projects Agency and the US Army funded the research.

Source: USC


Walking on all fours isn’t ‘backward’ evolution

Mon, 07/21/2014 - 05:37

Five siblings in a family who live in a remote corner of Turkey walk exclusively on their hands and feet. Since 2005, scientists have debated the nature of their disability, with speculation that they represent a “backward” stage of evolution.

A new study, published online this month in PLOS ONE, shows that contrary to previous claims, people with the family members’ condition, called Uner Tan Syndrome (UTS), do not walk in the diagonal pattern characteristic of nonhuman primates such as apes and monkeys.

Adaptive or devolving?


According to a theory developed by Uner Tan of Cukurova University in Turkey, people with UTS are a human model for reverse evolution, or “devolution,” offering new insights into the human transition from four-legged to two-legged walking.


Previous research countering this view has proposed that the quadrupedalism associated with UTS is simply an adaptive response to impaired bipedal walking in individuals with a genetic mutation. The new study is the first to disprove claims that this form of walking resembles that of nonhuman primates.

Straight-line strides

As part of the study, the researchers analyzed 518 quadrupedal walking strides from several videos of people with various forms of UTS, including footage from the BBC2 documentary about the five Turkish siblings, The Family That Walks on All Fours. They compared these walking strides to previous studies of the walking patterns of healthy adults who were asked to move around a laboratory on all fours.

According to the findings, nearly all of the human subjects (in 98 percent of the total strides) walked in lateral sequences, meaning they placed a foot down and then a hand on the same side and then moved in the same sequence on the other side. Apes and other nonhuman primates, however, walk in a diagonal sequence, in which they put down a foot on one side and then a hand on the other side, continuing that pattern as they move along.

Biomechanical principles

“Although it’s unusual that humans with UTS habitually walk on four limbs, this form of quadrupedalism resembles that of healthy adults and is thus not at all unexpected,” says Liza Shapiro, an anthropologist at The University of Texas at Austin. “As we have shown, quadrupedalism in healthy adults or those with a physical disability can be explained using biomechanical principles rather than evolutionary assumptions.”

The study also shows that Tan and his colleagues appeared to have misidentified the walking patterns among people with UTS as primate-like by confusing diagonal sequence with diagonal couplets.

Sequence refers to the order in which the limbs touch the ground, while couplets (independent of sequence) indicate the timing of movement between pairs of limbs. People with UTS more frequently use diagonal couplets than lateral couplets, but the sequence associated with the couplets is almost exclusively lateral.
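
The sequence distinction is easy to see in code. Here is a minimal sketch, using a hypothetical data format, that classifies a stride from the order in which the limbs touch down:

```python
# Minimal sketch (hypothetical data format): classify a stride's footfall
# *sequence* as lateral or diagonal. Limbs: LH/RH = left/right hind limbs
# ("feet"), LF/RF = left/right fore limbs ("hands"). Input maps each limb to
# its touchdown time; in a walking gait, footfalls alternate hind/fore.

def sequence_type(touchdowns):
    order = sorted(touchdowns, key=touchdowns.get)  # limbs in touchdown order
    first_hind = next(limb for limb in order if limb[1] == "H")
    next_limb = order[(order.index(first_hind) + 1) % 4]
    same_side = next_limb[0] == first_hind[0]       # "L" vs. "R"
    return "lateral" if same_side else "diagonal"

# People with UTS and healthy adults (98 percent of strides): lateral sequence.
print(sequence_type({"LH": 0.0, "LF": 0.25, "RH": 0.5, "RF": 0.75}))  # lateral
# Apes and monkeys: diagonal sequence.
print(sequence_type({"LH": 0.0, "RF": 0.25, "RH": 0.5, "LF": 0.75}))  # diagonal
```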

“Each type of couplet has biomechanical advantages, with lateral couplets serving to avoid limb interference, and diagonal couplets providing stability,” Shapiro says. “The use of diagonal couplets in adult humans walking quadrupedally can thus be explained on the basis of biomechanical considerations, not reverse evolution.”

The study includes coauthors from Northeast Ohio Medical University, the University of Arizona, and New York University.

Source: University of Texas at Austin


How a missile detector can stop malaria ‘in its tracks’

Fri, 07/18/2014 - 08:51

Scientists have found a new use for an anti-tank Javelin missile detector: to identify malaria parasites in blood.

Originally developed for Javelin heat-seeking missiles, a special imaging device known as a focal plane array (FPA) gives highly detailed information on a sample area in minutes.

The technique is based on Fourier transform infrared spectroscopy, which provides information on how molecules vibrate. Scientists say the novel idea, published in the journal Analyst, could set a new gold standard for malaria testing.

Missiles and parasites

The heat-seeking detector, which is coupled to an infrared imaging microscope, allowed the team to detect the earliest stages of the malaria parasite in a single red blood cell.

The infrared signature from the fatty acids of the parasites enabled the scientists to detect the parasite at an earlier stage, and crucially determine the number of parasites in a blood smear.
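
As a rough illustration of how such a screen can work: fatty acids absorb strongly in the CH2 stretch region of the infrared spectrum (near 2850 and 2920 cm⁻¹), so one simple approach is to integrate absorbance over that band for every cell imaged by the FPA and flag outliers. The band window, threshold, and data below are illustrative assumptions, not the published method:

```python
import numpy as np

# Hedged sketch: flag likely parasite-bearing red blood cells by integrating
# absorbance over the lipid CH2 stretch region. Window and threshold are
# illustrative, not taken from the paper.

def lipid_band_area(wavenumbers, absorbance, lo=2840.0, hi=2930.0):
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)
    return np.trapz(absorbance[mask], wavenumbers[mask])

def flag_infected(spectra, wavenumbers, threshold):
    """spectra: one absorbance spectrum per cell/pixel of the FPA image."""
    return [lipid_band_area(wavenumbers, s) > threshold for s in spectra]

# Synthetic demo: a flat baseline vs. a spectrum with an extra lipid peak.
wn = np.linspace(2700, 3000, 301)
clean = np.full_like(wn, 0.01)
infected = clean + 0.5 * np.exp(-((wn - 2850) ** 2) / 50)
print(flag_infected([clean, infected], wn, threshold=5.0))  # [False, True]
```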

Lead researcher Bayden Wood, an associate professor at Monash University, says to reduce mortality and prevent the overuse of antimalarial drugs, a test that can catch malaria at its early stages is critical.

“Our test detects malaria at its very early stages, so that doctors can stop the disease in its tracks before it takes hold and kills. We believe this sets the gold standard for malaria testing,” Wood says.

“There are some excellent tests that diagnose malaria. However, the sensitivity is limited and the best methods require hours of input from skilled microscopists, and that’s a problem in developing countries where malaria is most prevalent,” he adds.

Four-minute countdown

As well as being highly sensitive, the new test has a number of advantages—it gives an automatic diagnosis within four minutes, doesn’t require a specialist technician and can detect the parasite in a single blood cell.


The disease, which is caused by the malaria parasite, kills 1.2 million people every year. Existing tests look for the parasite in a blood sample. However, the parasites can be difficult to detect in the early stages of infection. As a result, the disease is often spotted only when the parasites have developed and multiplied in the body.

Professor Leann Tilley from the University of Melbourne says the test could make an impact in large-scale screening of malaria parasite carriers who do not present the classic fever-type symptoms associated with the disease.

“In many countries only people who display signs of malaria are treated. But the problem with this approach is that some people don’t have typical flu-like symptoms associated with malaria, and this means a reservoir of parasites persists that can reemerge and spread very quickly within a community,” she says.

“Our test works because it can detect the malaria parasite at the very early stages and can reliably detect it in an automated manner in a single red blood cell. No other test can do that,” Tilley adds.

Source: Monash University


Your ‘bestie’ is probably your distant cousin

Fri, 07/18/2014 - 08:33

People tend to pick friends who resemble them genetically. In fact, according to a new study, close friends are the genetic equivalent of fourth cousins, on average.
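
For a sense of scale: the expected coefficient of relatedness for k-th cousins falls off geometrically, r = (1/2)^(2k+1). The study’s genome-wide kinship statistic is computed differently, but the arithmetic below shows just how distant fourth cousins are:

```python
# Back-of-the-envelope relatedness for k-th cousins: r = (1/2) ** (2k + 1).
# (The study's kinship coefficient is estimated from genome-wide data and is
# defined differently; this is only to show the scale.)
for k in range(1, 5):
    r = 0.5 ** (2 * k + 1)
    print(f"cousins of degree {k}: r = {r:.4f} ({r:.2%})")
# degree 4 -> r ≈ 0.0020: fourth cousins share about 0.2 percent by descent
```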

“This gives us a deeper accounting of the origins of friendship,” says Nicholas Christakis, professor of sociology, evolutionary biology, and medicine at Yale University. “Not only do we form ties with people superficially like ourselves, we form ties with people who are like us on a deep genetic level. They’re like our kin, though they’re not.”


Christakis and James Fowler, professor of medical genetics and political science at University of California, San Diego, looked at some 1.5 million gene variants in order to do their analysis. They drew from a dataset known as the Framingham Heart Study, which offered details on both the friendships and genetics of its participants.

Using 1,932 unique subjects, researchers genetically compared pairs of friends with pairs of strangers from the same population. None of the pairs involved people who were related to each other.

Same sense of smell

What they found was that friends share about 1 percent of the same gene variants—a highly significant number in the eyes of geneticists. They also uncovered telling details within that analysis.


For example, friends are quite similar in gene variants having to do with the sense of smell. They are less likely to share gene variants relating to immunity against specific diseases.

The researchers say this might indicate that while it’s advantageous for friends to prefer the same smells, it might also be beneficial for them to have genetic shields against different disease threats.

In essence, the study says, friends are “functional kin.” The researchers even developed a “friendship score” that suggests how likely two people are to become friends, based on their genetics. The research is published in the Proceedings of the National Academy of Sciences.

Christakis says one of the more interesting aspects of the study has to do with the rate of evolution for genes shared by friends. Those gene variants are seeing the most evolutionary activity overall, meaning that friendship may play a role in the speed of human evolution.

“The fitness of many genes may depend on whether similar genes are in evidence in people we befriend,” Christakis says. “My genetic fitness depends on my own genes and my friends’ genes.”

Researchers emphasized that the study is not a statement about ethnicity or race. The study’s dataset, they say, was dominated by people of the same European extraction. In other words, even within an ethnically similar population, people tended to choose friends with a closer genetic profile. “These results are evident above and beyond any tendency people might have to associate with other people of the same ethnic or racial group,” Fowler says.

The study also points to the need for further scientific examination of the role of friendship.

“Human beings are one of the few species who form long-term, non-reproductive relationships with other members of our species,” Fowler noted. “This role of affiliation is important. It ties into the success of our species.”

Source: Yale University


Love or lust? The eyes tell all

Fri, 07/18/2014 - 08:17

Where someone’s gaze falls could indicate almost instantly whether attraction is based on feelings of love or of lust.

Scientists say if the gaze is focused on a stranger’s face, then love is possible, but if the gaze focuses more on the stranger’s body, then the attraction is more sexual in nature. That automatic judgment can occur in as little as half a second, producing different gaze patterns.

“Although little is currently known about the science of love at first sight or how people fall in love, these patterns of response provide the first clues regarding how automatic attentional processes, such as eye gaze, may differentiate feelings of love from feelings of desire toward strangers,” says lead author Stephanie Cacioppo, director of the High-Performance Electrical NeuroImaging Laboratory at the University of Chicago.


Previous research by Cacioppo has shown that different networks of brain regions are activated by love and sexual desire. In this study, the team performed two experiments to test visual patterns in an effort to assess two different emotional and cognitive states that are often difficult to disentangle from one another—romantic love and sexual desire.

Male and female students from the University of Geneva viewed a series of black-and-white photographs of persons they had never met. In part one of the study, participants viewed photos of young, adult heterosexual couples who were looking at or interacting with each other. In part two, participants viewed photographs of attractive individuals of the opposite sex who were looking directly at the camera/viewer. None of the photos contained nudity or erotic images.

In both experiments, participants were placed before a computer and asked to look at different blocks of photographs and decide as rapidly and precisely as possible whether they perceived each photograph or the persons in the photograph as eliciting feelings of lust or romantic love.

Quick as a wink

The study, published in the journal Psychological Science, showed no significant difference in the time it took subjects to identify romantic love versus sexual desire, which suggests how quickly the brain can process both emotions, the researchers note.

But analysis of the eye-tracking data from the two studies revealed marked differences in eye movement patterns, depending on whether the subjects reported feeling sexual desire or romantic love.

People tended to visually fixate on the face, especially when they said an image elicited a feeling of romantic love. However, with images that evoked sexual desire, the subjects’ eyes moved from the face to fixate on the rest of the body. The effect was found for male and female participants.
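
A minimal sketch of how such fixations can be scored: given fixation points with durations and a face area of interest (AOI), compute the share of fixation time that lands on the face. The coordinates and AOI bounds are hypothetical, not the study’s:

```python
# Hedged sketch of an area-of-interest (AOI) analysis: what fraction of
# fixation time falls inside the face box? All numbers below are made up.

def in_box(point, box):
    (x, y), (x0, y0, x1, y1) = point, box
    return x0 <= x <= x1 and y0 <= y <= y1

def face_share(fixations, face_box):
    """fixations: list of ((x, y), duration_ms) tuples."""
    total = sum(d for _, d in fixations)
    on_face = sum(d for p, d in fixations if in_box(p, face_box))
    return on_face / total if total else 0.0

FACE = (300, 50, 500, 250)  # x0, y0, x1, y1 in screen pixels (hypothetical)
fixations = [((400, 120), 350), ((410, 180), 220), ((380, 600), 150)]
print(f"face share: {face_share(fixations, FACE):.0%}")
# a high face share matches the pattern reported for romantic love
```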

“By identifying eye patterns that are specific to love-related stimuli, the study may contribute to the development of a biomarker that differentiates feelings of romantic love versus sexual desire,” says coauthor John Cacioppo, director of the Center for Cognitive and Social Neuroscience. “An eye-tracking paradigm may eventually offer a new avenue of diagnosis in clinicians’ daily practice or for routine clinical exams in psychiatry and/or couple therapy.”

Coauthor Mylene Bolmont, a graduate student at the University of Geneva, Switzerland, contributed to the study.

Source: University of Chicago


Plant ‘library’ could help protect food supply

Fri, 07/18/2014 - 08:17

The first comprehensive library of genetic switches in plants will be available to scientists around the world.

The collection contains about 2,000 clones of plant transcription factors—nature’s genetic on-off switches. The research that made the library possible, an eight-year process, appears in Cell Reports.

“They’re like smart missiles that go into the nucleus and bind to specific sequences of DNA,” corresponding author Steve A. Kay of University of Southern California says of transcription factors. “They will regulate genes, switching them on or off, according to how that cell needs to respond to its environment.”

Clones of these “master switches” are contained in the “wells” of microtiter plates. The library will be sent to stock centers, which will distribute the plates to national and international scientists studying plants.

“We wanted to make a high-fidelity, gold-standard collection of transcription factors that’s going to serve the plant community all over the world,” says Kay, who researches circadian rhythms, or the biological clock, in plants.

The availability of these clones has great implications for scientists such as Kay, whose research sets the stage for designing more robust plants for future food security.

An ‘instruction manual’ to plants

“Ultimately, this collection will help us understand at the molecular level the mechanisms of how plants work,” says Jose Pruneda-Paz, co-first author on the paper. Pruneda-Paz helped to create the library as a postdoctoral researcher, first in Kay’s laboratory at The Scripps Research Institute, then at the University of California, San Diego, where he is now a faculty member.


“By manipulating those transcription factors, we will be able to ultimately improve plant traits such as stress resistance or seed quantity and quality,” Pruneda-Paz says. “This is the larger goal.”

Kay elaborates: “Along the way we are going to understand the wiring—the instruction manual—for how plants grow and develop. From that knowledge base comes all the translational opportunities.”

The clones were taken from Arabidopsis, a flowering plant related to cabbage and mustard.

“You can think of Arabidopsis as the mouse of the botanical world,” Kay says. “In the same way we learn a great deal about human biology from flies and mice, we learn a huge amount about clock biology because Arabidopsis is a great plant to grow in the lab.”

The collection will help researchers in the underfunded field of plant research. The federal government spends only about one percent of its biomedical research funding on plant research.

“Given how important food is to human health, that’s rather concerning,” Kay says. “Most people in the US aren’t worried about starvation, but they are worried about dying of cancer. To those of us who stand in the aisle of the supermarket, it’s hard to believe there could be anything like a food shortage. It’s like climate change—it’s not often right in front of your eyes.”

‘Reverse genetics’

Pruneda-Paz and Ghislain Breton, another former postdoctoral researcher in Kay’s lab and co-first author of the Cell Reports study, recall what led to the idea of creating a library in 2006.

Researching how plants adapt to light/dark cycles, they zeroed in on the production of CCA1 and LHY, transcription factors that regulate Arabidopsis clock genes. Traditionally, figuring out the function of a gene meant mutating it and trying to decipher whether those mutations were responsible for a certain phenotype.

But they were unable to find answers this way.

“So we decided to use this reverse genetics approach,” Pruneda-Paz says. “Rather than trying to do a mutation to find a gene, we could clone all the transcription factors and then figure out which transcription factors bind and regulate the expression of CCA1. For this, we started this collection.”
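
A schematic of that kind of screen, with a stand-in for the actual assay (function names, readout, and threshold are hypothetical, not the paper’s):

```python
# Hedged sketch of the "reverse genetics" screen described above: walk the
# arrayed TF clone library and record which clones activate a CCA1 promoter
# reporter. `measure_reporter` stands in for the real assay.

def screen_library(tf_clones, measure_reporter, threshold=2.0):
    """Return clones whose reporter signal is >= `threshold`-fold over control."""
    hits = []
    for plate, well, tf_id in tf_clones:       # library is arrayed by plate/well
        fold_change = measure_reporter(tf_id)  # assay one clone against CCA1
        if fold_change >= threshold:
            hits.append((plate, well, tf_id, fold_change))
    return hits

# Toy demo with a fake assay:
library = [(1, "A01", "TF001"), (1, "A02", "TF002"), (1, "A03", "TF003")]
fake_assay = {"TF001": 0.9, "TF002": 3.4, "TF003": 1.1}.get
print(screen_library(library, fake_assay))  # [(1, 'A02', 'TF002', 3.4)]
```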

Breton and Pruneda-Paz oversaw the initial construction of the library. In 2010, when Breton left for an assistant professorship at the University of Texas in Houston, Pruneda-Paz completed the project.

Pruneda-Paz and Kay finished the project at UCSD and USC, where they were aided by a robotic platform that can conduct thousands of experiments per day. Grippers at the end of a computer-controlled “arm” located at the center of this platform can grab a microtiter plate and move it to various stations necessary for an experiment.

“So the goal behind this was to build something that can be the foundation of many other projects,” Breton says, adding that already 70 research projects in the US and Europe have resulted from the library. “It will not only be used for circadian clock research, but for many other plant biology projects in the future.”

Out in the cold

One study made possible by the library was published in Current Biology on July 7. In the research, Kay’s laboratory learned how plants regulate their gene expression in the cold. Using the library, they conducted tests isolating an interaction between two key genes—LUX and CBF1—now known to be responsible for freezing tolerance in plants.

The research showed how plants adapt to temperature changes during the normal course of the day-night cycle, and to extreme temperature change such as frost.

“We had very little idea how cold intersected with the clock, and this really reinforced the idea that transcriptional regulation is key,” says Brenda Chow, first author on the Current Biology study, who will soon start a position at GenBank, a genetic sequence database in Maryland.

“The library has been very useful across the plant community,” she says. “Specifically for my project, it was a unique way to identify the interaction between CBF1 and LUX. It would have been very difficult to identify this any other way.”

Source: USC


The world’s first predator had a simple, wormlike brain

Fri, 07/18/2014 - 07:53

The world’s earliest known predator, which lived about 520 million years ago, had a brain that was far less complex than those found in some of its prey.

Scientists made the discovery by studying a fossilized brain from one of the animals, known as anomalocaridids—which translates to “abnormal shrimp.”

The fierce-looking arthropods were first discovered as fossils in the late 19th century but not properly identified until the early 1980s. They still have scientists arguing over where they belong in the tree of life.

“Our discovery helps to clarify this debate,” says Nicholas Strausfeld, professor of neuroscience at the University of Arizona.

“It turns out the top predator of the Cambrian had a brain that was much less complex than that of some of its possible prey and that looked surprisingly similar to a modern group of rather modest wormlike animals.”

The brain in the fossil is from a new species given the name Lyrarapax unguispinus. Analysis suggests the creatures are related to a branch of animals whose living descendants are known as onychophorans or velvet worms. These wormlike animals are equipped with stubby unjointed legs that end in a pair of tiny claws.

Onychophorans, which are also exclusively predators, grow to no more than a few inches in length and are mostly found in the Southern Hemisphere, where they roam the undergrowth and leaf litter in search of beetles and other small insects. Two long feelers extend from the head, attached in front of a pair of small eyes.

Related to velvet worms

The anomalocaridid fossil resembles the neuroanatomy of today’s onychophorans in several ways, Strausfeld says. Onychophorans have a simple brain located in front of the mouth and a pair of ganglia—a collection of nerve cells—located in the front of the optic nerve and at the base of their long feelers.

“And—surprise, surprise—that is what we also found in our fossil,” Strausfeld says, pointing out that anomalocaridids had a pair of clawlike grasping appendages in front of the eyes.

“These top predators in the Cambrian are defined by just their single pair of appendages, wicked-looking graspers, extending out from the front of their head,” he says. “These are totally different from the antennae of insects and crustaceans. Such frontally disposed appendages are not found in any other living animals with the exception of velvet worms.”

The similarities of their brains and other attributes suggest that the anomalocaridid predators could have been very distant relatives of today’s velvet worms.

Grasping appendages

“This is another contribution towards the new field of research we call neuropaleontology,” says Xiaoya Ma of the Natural History Museum in London, a coauthor on the paper. “These grasping appendages are a characteristic feature of this most celebrated Cambrian animal group, whose affinity with living animals has troubled evolutionary scientists for almost a century. The discovery of preserved brain in Lyrarapax resolves specific anatomical correspondences with the brains of onychophorans.”


“Being able to directly associate appendages with parts of the brain in Cambrian animals is a huge advantage,” says Gregory Edgecombe, also at the Natural History Museum and another coauthor of the paper that is published in the journal Nature.

“For many years now paleontologists have struggled with the question of how different kinds of appendages in Cambrian fossils line up with each other and with what we see in living arthropods. Now for the first time, we didn’t have to rely just on the external form of the appendages and their sequence in the head to try and sort out segmental identities, but we can draw on the same tool kit we use for extant arthropods—the brain.”

The researchers recently presented evidence of the oldest known fossil of a brain belonging to arthropods related to insects and crustaceans and another belonging to a creature related to horseshoe crabs and scorpions.

“With this paper and our previous reports in Nature, we have identified the three principal brain arrangements that define the three groups of arthropods that exist today,” Strausfeld says. “They appear to have already coexisted 520 million years ago.”

Did predators drive brain evolution?

The Lyrarapax fossil was found in 2013 by coauthor Peiyun Cong near Kunming in the Chinese province of Yunnan. Coauthors Ma and Edgecombe participated in the analysis, as did Xianguang Hou—who discovered the Chengjiang fossil beds in 1984—at the Yunnan Key Laboratory for Paleobiology at the University of Yunnan.

“Because its detailed morphology is exquisitely preserved, Lyrarapax is amongst the most complete anomalocaridids known so far,” Cong says.

Just over five inches long, Lyrarapax was dwarfed by some of the larger anomalocaridids, which reached more than three feet in length. Paleontologists excavating lower Cambrian rocks in southern Australia found that some anomalocaridids had huge compound eyes, up to 10 times larger than the biggest dragonfly eye, befitting what must have been a highly efficient hunter, Strausfeld says.

The fact that the brain of the earliest known predator appears much simpler in shape than the previously unearthed brains of its contemporaries raises intriguing questions, one of which is whether it is possible that predators drove the evolution of more complex brains.

“With the evolution of dedicated and highly efficient predators, the pressure was on other animals to be able to detect and recognize potential danger and rapidly coordinate escape movements,” Strausfeld says. “These requirements may have driven the evolution of more complex brain circuitry.”

Source: University of Arizona

 


Rover images hint at a warm and wet Mars

Fri, 07/18/2014 - 07:03

Images from the Mars rover Curiosity show ancient fossilized Earth-like soils that suggest the possibility of microbial life.

The data and images gathered at the bottom of the Gale impact crater date to some 3.7 billion years ago and provide evidence that Mars was once a much warmer and wetter place.

Image of soil from Gale Crater. (Credit: University of Oregon)

NASA rovers have shown Martian landscapes littered with loose rocks from impacts or layered by catastrophic floods, rather than the smooth contours of soils that soften landscapes on Earth.

However, Curiosity’s recent data show Earth-like soil profiles with cracked surfaces lined with sulfate, ellipsoidal hollows and concentrations of sulfate comparable with soils in Antarctic Dry Valleys and Chile’s Atacama Desert.

For a new analysis published online in the journal Geology, scientists studied mineral and chemical data published by researchers closely tied with the Curiosity mission.

Not proof of life

“The pictures were the first clue, but then all the data really nailed it,” says Gregory Retallack, professor of geological sciences at University of Oregon.


“The key to this discovery has been the superb chemical and mineral analytical capability of the Curiosity Rover, which is an order of magnitude improvement over earlier generations of rovers.

“The new data show clear chemical weathering trends, and clay accumulation at the expense of the mineral olivine, as expected in soils on Earth. Phosphorus depletion within the profiles is especially tantalizing, because it is attributed to microbial activity on Earth.”

The ancient soils don’t prove that Mars once contained life, but they do add to growing evidence that an early wetter and warmer Mars was more habitable than the planet has been in the past 3 billion years.

Familiar ground

The Curiosity rover is now exploring topographically higher and geologically younger layers within the crater, where the soils appear less conducive to life. For a record of older life and soils on Mars, new missions will be needed to explore older and more clayey terrains, Retallack says.

Surface cracks in the deeply buried soils suggest typical soil clods. Vesicular hollows, or rounded holes, and sulfate concentrations, he says, are both features of desert soils on Earth.

“None of these features is seen in younger surface soils of Mars,” Retallack says. “The exploration of Mars, like that of other planetary bodies, commonly turns up unexpected discoveries, but it is equally unexpected to discover such familiar ground.”

The newly discovered soils provide more benign and habitable soil conditions than known before on Mars. Their dating to 3.7 billion years ago puts them into a time of transition from “an early benign water cycle on Mars to the acidic and arid Mars of today,” Retallack says.

Mars microbes?

Life on Earth is believed to have emerged and begun diversifying about 3.5 billion years ago, but some scientists have theorized that potential evidence that might take life on Earth farther back was destroyed by plate tectonics, which did not occur on Mars.

The potential discovery of these fossilized soils in the Gale Crater dramatically increases the possibility that Mars has microbes, says Malcolm Walter of the Australian Centre for Astrobiology, who was not involved in the research. “There is a real possibility that there is or was life on Mars.”

Steven Benner of the Westheimer Institute of Science and Technology in Florida has speculated that life is more likely to have originated on a soil planet like Mars than a water planet like Earth, Retallack says.

In an email, Benner writes that Retallack’s paper “shows not only soils that might be direct products of an early Martian life, but also the wet-dry cycles that many models require for the emergence of life.”

Source: University of Oregon


Satellite tracking puts duck migration on the map

Fri, 07/18/2014 - 06:22

Researchers have used satellite tracking technology to monitor Mallard duck migration from Canada to the American Midwest and back again.

Their findings show that the ducks made extensive use of public and private wetland conservation areas during their 2011-2012 migrations.

Scientists now have baseline information for future research into what influences the ducks’ migration flight paths, landing site selection, and foraging behavior. The data will also be useful to conservationists looking for ways to ensure healthy duck populations into the future.

The research shows private lands enrolled in the USDA’s Wetland Reserve Program (WRP) have become a critical component of the ducks’ migrations, says Lisa Webb, cooperative assistant professor of wildlife at University of Missouri and research ecologist with the USGS Cooperative Research Unit. The WRP provides landowners with technical and financial support for restoring and maintaining wetland areas that have conservation benefits.

Migratory ducks also make extensive use of sanctuaries on public lands, such as state wildlife management areas and the National Wildlife Refuge System.

A mallard with a tracking device on its back lands on a Missouri wetland. (Credit: Mike Wintroath/Arkansas Game and Fish Commission)

Tracking ducks from orbit

The project attached small solar-powered tracking devices to the ducks, which transmitted their locations every four hours. Data bounced off a satellite and then went to the researchers, who monitored the ducks’ progress in real time.


“This allowed us to evaluate their behavior and biology on an exceptionally detailed scale throughout their annual migration cycle,” says Dylan Kesler, co-investigator and assistant professor of fisheries and wildlife.

“Previously, we only knew when the birds left and when they arrived and little else. Now, we have an extensive dataset from which to understand the role of different habitats and other factors in migratory populations. We can begin to understand migration in a way that it has never been understood before.”

Kesler points out that scientists also now know more about the pre-migratory feeding habits of the ducks. The ducks’ ability to gain weight before their long flights is an important factor in how many will make it to their northern breeding grounds and southern wintering stations.

Identifying ways to support duck populations is important because more than 50 percent of American wetlands have been lost since 1800, says Kesler. This loss has affected migratory bird populations and migration timing and routes, he says.

The satellite tracker followed the ducks from Canada to Mississippi. This is the first time that this data has been captured. (Credit: U. Missouri)

The mallard is probably the most familiar and abundant duck in North America. The male has a green head and chestnut breast while females are mottled brown. Both sexes have a blue speculum (wing patch) bordered on both sides by white.

The mallard nests in the spring throughout Canada and the northern US, with eggs hatching in April and May. The birds migrate to a winter home in the Midwestern and southern US in September and October, then head back to Canada in February and March.

The National Wildlife Refuge System is the largest protected area network in North America specifically designated for wildlife conservation and includes more than 150 million acres stretching from Alaska to the Caribbean.

The program is designed to maintain wetlands and other wildlife habitat in areas where surrounding lands have been converted to golf courses, cities, and agriculture.

Better conservation areas

The new dataset is so extensive that William Beatty, postdoctoral fellow of University of Missouri Fisheries and Wildlife Sciences, is using the information as the basis for a computer model to understand future migrations and the effect that human populations, physical obstructions, agriculture, and conservation areas may have on migratory bird populations.

The satellite tracking data showed the researchers something not known until now—during their migrations ducks forage for food up to 20 miles away from their roosting areas. Beatty says this discovery shows how the conservation areas being used by the ducks can be improved.
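
Distances like these fall straight out of pairs of satellite fixes. A minimal sketch using the standard haversine great-circle formula, with made-up coordinates:

```python
from math import radians, sin, cos, asin, sqrt

# Hedged sketch: roost-to-foraging distance from two location fixes. The
# coordinates are invented for illustration.

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

roost = (39.00, -92.30)   # hypothetical Missouri roost site
forage = (39.25, -92.10)  # hypothetical foraging fix four hours later
print(f"{haversine_miles(*roost, *forage):.1f} miles from roost")  # ~20 miles
```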

He recommends establishing multiple types of conservation areas in selected locations to promote wetland habitat diversity to give birds a variety of food choices.

“The role of habitat conditions during the non-breeding season—fall migration, winter and spring migration—is not fully understood,” Webb says.

“Now that scientists and conservationists better understand which habitat features are important to mallards during the non-breeding season, we can begin to consider how results may help conservation agencies as they decide where to restore new wetland habitat for waterfowl.”

The research results appear in Biological Conservation.

Source: University of Missouri


Prehistoric teeth show people ate medicinal weeds

Thu, 07/17/2014 - 08:48

Plaque on prehistoric human teeth offers a whole new perspective on our ancestors’ diet and their relationship with plants.

The research suggests that prehistoric people living in Central Sudan may have understood both the nutritional and medicinal qualities of many plants, including the purple nut sedge (Cyperus rotundus), regarded as a nuisance weed today.


The research was carried out at Al Khiday, a prehistoric site on the White Nile in Central Sudan. For at least 7,000 years, beginning before the development of agriculture and continuing after agricultural plants became available, the people of Al Khiday ate purple nut sedge. The plant is a good source of carbohydrates and has many useful medicinal and aromatic qualities.

“Purple nut sedge is today considered to be a scourge in tropical and sub-tropical regions and has been called the world’s most expensive weed due to the difficulties and high costs of eradication from agricultural areas,” says lead author Karen Hardy, a professor at the Universitat Autònoma de Barcelona and an honorary research associate at the University of York.

“By extracting material from samples of ancient dental calculus we have found that rather than being a nuisance in the past, its value as a food, and possibly its abundant medicinal qualities were known. More recently, it was also used by the ancient Egyptians as perfume and as medicine.

“We also discovered that these people ate several other plants and we found traces of smoke, evidence for cooking, and for chewing plant fibers to prepare raw materials. These small biographical details add to the growing evidence that prehistoric people had a detailed understanding of plants long before the development of agriculture.”

Burial ground

Al Khiday is a complex of five archaeological sites near Omdurman. One of the sites is predominantly a burial ground of pre-Mesolithic, Neolithic, and Later Meroitic age. As a multi-period cemetery, it gave the researchers a useful long-term perspective on the material recovered.

“Al Khiday is a unique site in the Nile valley, where a large population lived for many thousands of years. This study demonstrates that they made good use of the locally available wild plant as food, as raw materials, and possibly even as medicine,” says Donatella Usai, from the Instituto Italiano per l’Africa e l’Oriente in Rome, who led the excavation.

The researchers found ingestion of the purple nut sedge in both pre-agricultural and agricultural periods. The plant’s ability to inhibit Streptococcus mutans, a bacterium that contributes to tooth decay, may have contributed to the unexpectedly low level of cavities found in the agricultural population.

The findings are detailed in a paper published in the journal PLOS ONE.

Beyond meat and protein

“The evidence for purple nut sedge was very clear in samples from all the time periods we looked at. This plant was evidently important to the people of Al Khiday, even after agricultural plants had been introduced,” says Stephen Buckley, a research fellow at the University of York’s BioArCh research facility, who conducted the chemical analyses.

“The development of studies on chemical compounds and microfossils extracted from dental calculus will help to counterbalance the dominant focus on meat and protein that has been a feature of pre-agricultural dietary interpretation, up until now,” says Hardy.

“The new access to plants ingested, which is provided by dental calculus analysis, will increase, if not revolutionize, the perception of ecological knowledge and use of plants among earlier prehistoric and pre-agrarian populations.”

The Italian Ministry of Foreign Affairs, the Istituto Italiano per l’Africa e l’Oriente, the Centro Studi Sudanesi e Sub-Sahariani, and the Universities of Milano, Padova, and Parma funded the fieldwork. The National Corporation for Antiquities and Museums (NCAM) of Sudan also supported the research.

Source: University of York


These mutant worms can’t get drunk

Thu, 07/17/2014 - 07:41

Neuroscientists have used human molecules to create mutant worms that don’t get drunk on alcohol.

“This is the first example of altering a human alcohol target to prevent intoxication in an animal,” says Jon Pierce-Shimomura, assistant professor at University of Texas at Austin.

An alcohol target is any neuronal molecule that binds alcohol, of which there are many.

One important aspect of this modified alcohol target, a neuronal channel called the BK channel, is that the mutation only affects its response to alcohol.

The BK channel typically regulates many important functions, including activity of neurons, blood vessels, the respiratory tract, and bladder. The alcohol-insensitive mutation does not disrupt these functions at all.

“We got pretty lucky and found a way to make the channel insensitive to alcohol without affecting its normal function,” says Pierce-Shimomura, who is corresponding author of the study published in the Journal of Neuroscience.

The research has potential applications for treating people addicted to alcohol, he says. “Our findings provide exciting evidence that future pharmaceuticals might aim at this portion of the alcohol target to prevent problems in alcohol abuse disorders. However, it remains to be seen which aspects of these disorders would benefit.”

Unlike a drug such as cocaine, which has a specific target in the nervous system, alcohol acts on many targets across the brain, and its effects on the body are complex. The various other aspects of alcohol addiction, such as tolerance, craving, and the symptoms of withdrawal, may be influenced by different alcohol targets.

The worms used in the study, Caenorhabditis elegans, model intoxication well. Alcohol causes the worms to slow their crawling and exhibit less wriggling from side to side. The intoxicated worms also stop laying eggs, which build up in their bodies and can be easily counted.
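
Those readouts are straightforward to quantify from tracked video. A minimal sketch of the crawl-speed measure, run on a made-up centroid track:

```python
import math

# Hedged sketch: mean crawl speed from a worm's tracked centroid positions,
# one of the intoxication readouts described above. Track data are made up;
# units are mm and seconds.

def crawl_speed(track, dt=1.0):
    """Mean speed from (x, y) positions sampled every `dt` seconds."""
    dist = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    return dist / (dt * (len(track) - 1))

control = [(0.0, 0.0), (1.0, 0.4), (2.0, -0.4), (3.0, 0.4)]
ethanol = [(0.0, 0.0), (0.2, 0.05), (0.4, -0.05), (0.6, 0.05)]
print(f"control: {crawl_speed(control):.2f} mm/s")  # ~1.21 mm/s
print(f"ethanol: {crawl_speed(ethanol):.2f} mm/s")  # ~0.22 mm/s, slowed crawl
```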

‘James Bond’ drug

Unfortunately, C. elegans is not well suited to studying these other aspects of alcohol addiction, but mice make an excellent model.

The modified human BK channel used in the study, which is based on a mutation discovered by lead author and graduate student Scott Davis, could be inserted into mice. These modified mice would allow scientists to investigate whether this particular alcohol target also affects tolerance, craving, and other symptoms relevant to humans.

Pierce-Shimomura speculated that their research could even be used to develop a “James Bond” drug someday that would enable a spy to drink his opponent under the table without getting drunk himself. Such a drug could potentially be used to treat alcoholics because it would counteract the intoxicating and potentially addicting effects of the alcohol.

Research associate Luisa Scott and undergraduate student Kevin Hu were also coauthors of the study.

The ABMRF/The Foundation for Alcohol Research, the National Institute on Alcohol Abuse and Alcoholism, and the Waggoner Center for Alcohol and Addiction Research at the University of Texas at Austin provided funding for the study.

Source: University of Texas at Austin

The post These mutant worms can’t get drunk appeared first on Futurity.

Fewer strokes for older Americans, but experts worry

Thu, 07/17/2014 - 07:16

The number of older Americans who suffer strokes and the number of younger Americans who die from strokes are both on the decline, but growing rates of obesity threaten to reverse the trend, experts say.

A recent study found a 24 percent overall decline in first-time strokes in each of the last two decades and a 20 percent overall drop per decade in deaths after stroke.
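
Those per-decade figures compound. As a rough back-of-the-envelope illustration (our arithmetic, not the study's), assuming the 24 percent decline applied multiplicatively in each decade:

    # Cumulative effect of a 24 percent per-decade decline in first-time
    # strokes, compounded over two decades (illustrative arithmetic only).
    per_decade_decline = 0.24
    remaining = (1 - per_decade_decline) ** 2  # fraction of the original rate left
    print(f"Remaining after two decades: {remaining:.0%}")  # ~58%
    print(f"Cumulative decline: {1 - remaining:.0%}")       # ~42%

By the same logic, the 20 percent per-decade drop in post-stroke deaths would work out to a cumulative decline of roughly 36 percent.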

The decline in stroke risk was concentrated mainly in the over-65 set, however, with little progress in reducing strokes among younger people.

In contrast, the drop in stroke-related deaths was primarily found among those under age 65, with mortality rates holding firm in older people.

“We can congratulate ourselves that we are doing well, but stroke is still the No. 4 cause of death in the United States,” says coauthor Josef Coresh, professor of epidemiology at Johns Hopkins University.

“This research points out the areas that need improvement. It also reminds us that there are many forces threatening to push stroke rates back up and, if we don’t address them head-on, our gains may be lost.”

Obesity and stroke

Coresh says he worries about what the obesity epidemic, which began in the 1990s, will mean for stroke trends. As millions more people are diagnosed with hypertension and diabetes—which often go hand-in-hand with obesity—they will face increased risk for stroke, he says.

For the study, published in the Journal of the American Medical Association, researchers used data from the Atherosclerosis Risk in Communities study, which tracked residents of four US communities who were between the ages of 45 and 64 when the study began in the late 1980s.

In this analysis, they followed 14,357 participants who had no history of stroke in 1987. The research team looked for all stroke hospitalizations and deaths from then to the end of 2011.

Seven percent of the study sample (1,051 people) had a stroke during that period. Of those, 10 percent died within 30 days, 21 percent within one year, and 40 percent within five years; 58 percent had died by the end of the study in 2011.
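
As a rough sanity check (our illustration; the study reports percentages, not these counts), the figures translate into approximate numbers of the 1,051 stroke patients:

    # Approximate death counts implied by the reported cumulative mortality
    # percentages among the 1,051 stroke cases (illustrative only).
    stroke_cases = 1051
    for label, pct in [("30 days", 0.10), ("1 year", 0.21),
                       ("5 years", 0.40), ("end of study (2011)", 0.58)]:
        print(f"Deaths within {label}: ~{round(stroke_cases * pct)}")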

800,000 strokes a year

Each decade, the number of deaths occurring within 10 years of a stroke was reduced by roughly eight deaths per 100 cases. The decrease was not across the board; it was primarily the result of stroke victims under the age of 65 surviving longer.

While they varied by age, the results were similar across race and gender, a finding that encouraged researchers since a previous study suggested African-American stroke rates were not improving.

Decreases in stroke incidence and mortality are partly due to more successful control of risk factors such as blood pressure or smoking and to the wide use of statin medications to control cholesterol. An increase in diabetes likely acted in the opposite direction, however, pushing stroke rates back up to some extent.

Stroke severity and improvements in treatment likely also impacted the data, though the study could not measure the exact role they played.

The age disparities suggest areas where physicians and researchers may want to focus in the future to prevent strokes in those under 65 and reduce deaths in those over 65. Nearly 800,000 Americans suffer strokes each year; of those, about 600,000 are first-time strokes.

Pay attention to subgroups

“Stroke is not only one of the main causes of death, but a leading cause of long-term disability in adults. Therefore, prevention is the best strategy,” says study leader Silvia Koton, a visiting faculty member at the Johns Hopkins Bloomberg School of Public Health and incoming nursing chair at Tel Aviv University.

The number of US death certificates listing stroke as the underlying cause of death has been falling for many years. Only studies such as this one, however, can distinguish whether the decline is due to a reduction in the number of strokes or to people living longer after having them, researchers say.

The new study also confirmed the occurrence of each stroke by reviewing each medical chart using uniform criteria.

“Since rates are not equally falling across the board, physicians and policymakers need to pay closer attention to specific subgroups,” Koton says. “These data are also helpful in monitoring the results of how we care for people of all ages, hopefully helping them even before they have a stroke.”

The National Heart, Lung, and Blood Institute supported the research.

Source: Johns Hopkins University

The post Fewer strokes for older Americans, but experts worry appeared first on Futurity.

2 drugs work better than 1 to stop cancer

Thu, 07/17/2014 - 06:16

A new combination drug dramatically slows tumor growth in mice with few side effects.

Researchers combined two drugs: a COX-2 inhibitor, similar to the one in Celebrex, and a soluble epoxide hydrolase (sEH) inhibitor, which stops blood vessels from forming. The combined effect is much more potent than using either drug individually at higher doses, according to a recent study.

“We’ve been studying the effects of COX and sEH inhibitors, both by themselves and in combination, for several years,” says Bruce Hammock, professor at University of California, Davis, and senior author of a study published in the Proceedings of the National Academy of Sciences.

“We were surprised to find that the dual inhibitor was more active than higher doses of each compound, either individually or together.

“By combining the two molecules into one we got much greater potency against several diseases and completely unique effects in terms of blocking tumor growth and metastasis.”

Lung and breast tumors

Both COX and sEH enzymes control lipid signaling, which has long been associated with inflammation, cell migration, proliferation, hypertension, and other processes.

COX inhibitors block production of inflammatory and pain-inducing lipids, while sEH inhibitors preserve anti-hypertensive, anti-inflammatory, and analgesic compounds. Separate COX and sEH inhibitors were previously found to work together in reducing inflammation and neuropathic pain.

After testing individual COX-2 and sEH inhibitors, the team synthesized the drug (PTUTB), the first combined COX-2/sEH inhibitor. They then tested it against human lung and breast tumors, both in vitro and in mice.

They found that the new drug blocked the growth of endothelial cells, which help blood vessels form. This reduced lung and breast tumor growth by 70 to 83 percent.

Minimal side effects

“This represents a new mechanism to control blood vessel and tumor growth,” says Hammock, who notes that side effects were minimal, with no cardiovascular or gastrointestinal effects.

“This is particularly important when administering COX-2 inhibitors, which have well-known cardiovascular risks. However, the added sEH inhibitor appears to block COX-2’s side effects.”

Though the research was focused exclusively on cancer, the dual compound could benefit other conditions, such as macular degeneration, Hammock says.

“If we move beyond cancer, this drug combination could block a number of pathologies, ranging from cardiac hypertrophy to neuropathic pain. The compound looks quite powerful for a number of conditions.”

Other researchers from UC Davis and UC San Diego contributed to the study.

Source: UC Davis

The post 2 drugs work better than 1 to stop cancer appeared first on Futurity.

Practice makes you better, but not perfect

Wed, 07/16/2014 - 13:18

No matter how much you do it, practice may never make you an expert. But it probably will make you better.

“This question is the subject of a long-running debate in psychology,” says Fred Oswald, professor and chair of psychology at Rice University. “Why do so few people who are involved in sports such as golf, musical instruments such as the violin, or careers such as law or medicine ever reach an expert level of performance?”

For a new study published in Psychological Science, Oswald and colleagues reviewed 88 previous studies (11,135 total participants) published through 2014 that investigated how well practice predicts performance in music, games, sports, educational, and occupational domains.

Within each domain, the researchers averaged the reported results across all relevant studies and found that “deliberate practice”—defined as engagement in structured activities created specifically to improve performance in a specific field—explained 26 percent of the variance in performance for games, 21 percent for music, 18 percent for sports, 4 percent for education, and less than 1 percent for professions.
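
Because variance explained is the square of the correlation coefficient, even the strongest of those figures implies only a moderate practice-performance correlation. A quick sketch of the conversion (our illustration; the figure for professions is treated as an upper bound):

    import math

    # Convert reported variance-explained values (r squared) into the
    # implied practice-performance correlations (illustrative only).
    variance_explained = {
        "games": 0.26,
        "music": 0.21,
        "sports": 0.18,
        "education": 0.04,
        "professions": 0.01,  # reported as "less than 1 percent"
    }
    for domain, r2 in variance_explained.items():
        print(f"{domain}: r = {math.sqrt(r2):.2f}")

Even for games, then, the implied correlation is about 0.51, leaving most of the variance in performance to other factors.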

Nature vs. nurture

“Deliberate practice was a strong overall predictor of success in many performance domains, and not surprisingly, people who report practicing a lot generally tend to perform at a higher level than people who practice less,” Oswald says.

“However, perhaps the more important contribution of our study is that no matter how strongly practice predicted performance in our findings, there was always statistical room for other personal factors to predict learning a skill and performing successfully, including basic abilities.”

A substantial body of research has already identified basic abilities as important predictors of performance, but some researchers tend to minimize them and treat practice as the sole determinant of performance.

“Other factors matter as well, but even so, no one says that practice will ever hurt you; but be careful if you are walking tightropes,” Oswald adds.

Rice University, Princeton University, and Michigan State University funded the study.

Source: Rice University

The post Practice makes you better, but not perfect appeared first on Futurity.

Tough girls don’t make marriages crumble

Wed, 07/16/2014 - 12:45

Researchers say “not so fast” to claims that daughters cause divorce. They argue that girls may be hardier than boys, even in the womb, and more likely to survive pregnancies stressed by a troubled marriage.

In the United States, couples with daughters are somewhat more likely to divorce than couples with sons. Many scholars have read those numbers as evidence that daughters cause divorce.

Previous studies have argued that fathers prefer boys and are more likely to stay in marriages that produce sons. Conversely, the argument runs, men are more likely to leave a marriage that produces daughters. That scholarly claim has been around for decades, and has gained a following in popular culture.

“Many have suggested that girls have a negative effect on the stability of their parents’ union,” says Duke University economist Amar Hamoudi, who co-authored the new study with Jenna Nobles, a University of Wisconsin-Madison sociologist. “We are saying: ‘Not so fast.’”

Robust females

Hamoudi, who teaches in the Sanford School of Public Policy and is a fellow of the Duke Center for Child and Family Policy, points to a very different potential explanation for differing divorce rates: the robustness of female embryos.

Throughout the life course, girls and women are generally hardier than boys and men. At every age from birth to age 100, boys and men die in greater proportions than girls and women.

Epidemiological evidence also suggests that the female survival advantage actually begins in utero. These more robust female embryos may be better able to withstand stresses to pregnancy, including stresses caused by relationship conflict.

Based on an analysis of longitudinal data from a nationally representative sample of US residents from 1979 to 2010, Hamoudi and Nobles say a couple’s level of relationship conflict predicts their likelihood of subsequent divorce. Their findings are published in the journal Demography.

Conflict predicts sex of children

Strikingly, the authors found that a couple’s level of relationship conflict at a given time also predicted the sex of children born to that couple at later points in time. Women who reported higher levels of marital conflict were more likely in subsequent years to give birth to girls rather than boys.

“Girls may well be surviving stressful pregnancies that boys can’t survive,” Hamoudi says. “Thus girls are more likely than boys to be born into marriages that were already strained.”

Hamoudi and Nobles also make a broader point that reaches beyond the issue of divorce. Population studies typically begin at birth, Hamoudi says. Yet if demographers and other social scientists want to fully understand how family dynamics affect populations, they need to consider the months before birth as well.

“It’s time for population studies to shine a light on the period of pregnancy,” Hamoudi says. “The clock does not start at birth.”

The Center for Demography of Health and Aging at the University of Wisconsin-Madison helped support the research.

Source: Duke University

The post Tough girls don’t make marriages crumble appeared first on Futurity.

Coral skeletons record changes in ocean temps

Wed, 07/16/2014 - 08:38

Just as growth rings from trees offer clues to past climate change, corals can do the same for changes in the ocean.

Scientists know that ice sheets wax and wane as the concentration of CO2 in the atmosphere falls and rises. Researchers believe that the deep ocean—which stores 60 times more inorganic carbon than the atmosphere—must play a vital role in this shift.

To investigate this, the researchers analyzed the calcium carbonate skeletons of corals collected from deep in the North Atlantic Ocean. The corals grew between 11,000 and 18,000 years ago, building their skeletons from CO2 dissolved in the ocean.

“We used a new technique that has been developed at Caltech, called clumped isotope thermometry, to determine what the temperature of the ocean was in the location where the coral grew,” says Nivedita Thiagarajan, a postdoctoral scholar in geochemistry at California Institute of Technology (Caltech) and lead author of the paper published in the journal Nature.

“We also used radiocarbon dating and uranium-series dating to estimate the deep-ocean ventilation rate during this time period.”

Warm water under cold water

The deep ocean started warming before the start of a rapid climate change event about 14,600 years ago. During this time, the Earth was transitioning from a glacial period—when ice sheets covered a large portion of Earth—to the current interglacial period.

“We found that a warm-water-under-cold-water scenario developed around 800 years before the largest signal of warming in the Greenland ice cores, called the Bølling–Allerød,” says Jess F. Adkins, professor of geochemistry and global environmental science.

“CO2 had already been rising in the atmosphere by this time, but we see the deep-ocean reorganization brought on by the potential energy release as the pivot point for the system to switch from a glacial state, where the deep ocean can hold onto CO2, to an interglacial state, where it lets out CO2.”

“Studying Earth’s climate in the past helps us understand how different parts of the climate system interact with each other,” says Thiagarajan. “Figuring out these underlying mechanisms will help us predict how climate will change in the future.”

Other authors of the paper are from Caltech and from UC Irvine.

Source: Caltech

The post Coral skeletons record changes in ocean temps appeared first on Futurity.

Neutered golden retrievers have more health problems

Wed, 07/16/2014 - 07:27

Golden retrievers are more vulnerable than Labrador retrievers to the long-term health effects of neutering. Goldens tend to have higher rates of certain joint disorders and devastating cancers, new research shows.

“We found in both breeds that neutering before the age of 6 months, which is common practice in the United States, significantly increased the occurrence of joint disorders—especially in the golden retrievers,” says lead investigator Benjamin Hart.

“The data, however, showed that the incidence rates of both joint disorders and cancers at various neuter ages were much more pronounced in golden retrievers than in the Labrador retrievers,” says Hart, a distinguished professor emeritus in the University of California, Davis, School of Veterinary Medicine. Results of the study appear online in the journal PLOS ONE.

Hart notes that the findings not only offer insights for researchers in both human and veterinary medicine, but are also important for breeders and dog owners contemplating when, and if, to neuter their dogs. Dog owners in the United States are overwhelmingly choosing to neuter their dogs, in large part to prevent pet overpopulation or avoid unwanted behaviors.

This new comparison of the two breeds was prompted by the research team’s earlier study, reported in February 2013, which found a marked increase in the incidence of two joint disorders and three cancers in golden retrievers that had been neutered.

Comparing breeds

The golden retriever and the Labrador retriever were selected for this study because both are popular breeds that have been widely accepted as family pets and service dogs. The two breeds also are similar in body size, conformation, and behavioral characteristics.

The study was based on 13 years of health records from the UC Davis School of Veterinary Medicine for neutered and non-neutered male and female Labrador retrievers and golden retrievers between 1 and 8 years of age. These records included 1,015 golden retriever cases and 1,500 Labrador retriever cases.

The researchers compared the two breeds according to the incidence of three cancers: lymphosarcoma, hemangiosarcoma, and mast cell tumor. They also calculated the incidence for each breed of three joint disorders: hip dysplasia, cranial cruciate ligament tear, and elbow dysplasia.

The researchers also noted whether the dogs had been neutered before the age of 6 months, between 6 and 11 months, between 12 and 24 months, or between 2 and 9 years of age.

Neutering and joint disorders

In terms of joint disorders, the researchers found that non-neutered males and females of both breeds experienced a 5 percent rate of one or more joint disorders. Neutering before the age of 6 months was associated with a doubling of that rate, to 10 percent, in Labrador retrievers.

In golden retrievers, however, the impact of neutering appeared to be much more severe. Neutering before the age of 6 months in goldens increased the incidence of joint disorders to what Hart called an “alarming” four-to-five times that of non-neutered dogs of the same breed.
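
In absolute terms (our arithmetic, based on the rates quoted above), those multipliers work out roughly as follows:

    # Implied joint-disorder rates from the reported 5 percent baseline and
    # the early-neutering multipliers (illustrative arithmetic only).
    baseline = 0.05                    # non-neutered dogs of both breeds
    labrador_early = baseline * 2      # reported doubling in Labradors
    golden_early = (baseline * 4, baseline * 5)  # reported 4-5x in goldens
    print(f"Labradors neutered before 6 months: {labrador_early:.0%}")
    print(f"Goldens neutered before 6 months: "
          f"{golden_early[0]:.0%}-{golden_early[1]:.0%}")

That is, roughly one in four or five goldens neutered early developed a joint disorder, versus one in ten Labradors.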

Male goldens experienced the greatest increase in joint disorders in the form of hip dysplasia and cranial cruciate ligament tear, while the increase for Labrador males occurred in the form of cranial cruciate ligament tear and elbow dysplasia.

“The effects of neutering during the first year of a dog’s life, especially in larger breeds, undoubtedly reflect the vulnerability of their joints to the delayed closure of long-bone growth plates when neutering removes the gonadal, or sex, hormones,” Hart says.

Neutering and cancers

The data also revealed important differences between the breeds in relation to the occurrence of cancers. In non-neutered dogs of both breeds, the incidence of one or more cancers ranged from 3 to 5 percent, except in male goldens, where cancer occurred at an 11-percent rate.

Neutering appeared to have little effect on the cancer rate of male goldens. However, in female goldens, neutering at any point beyond 6 months elevated the risk of one or more cancers to three to four times the level of non-neutered females.

Neutering in female Labradors increased the cancer incidence rate only slightly.

“The striking effect of neutering in female golden retrievers, compared to male and female Labradors and male goldens, suggests that in female goldens the sex hormones have a protective effect against cancers throughout most of the dog’s life,” Hart says.

The American Kennel Club Canine Health Foundation and the Center for Companion Animal Health at UC Davis funded the research.

Source: UC Davis

The post Neutered golden retrievers have more health problems appeared first on Futurity.

Spinach leaves vibrate to kick off photosynthesis

Wed, 07/16/2014 - 07:18

Vibrations deep within spinach leaves enhance the efficiency of photosynthesis—the energy conversion process that powers life on our planet.

The discovery could potentially help engineers make more efficient solar cells and energy storage systems. It also injects new evidence into an ongoing “quantum biology” debate over exactly how photosynthesis manages to be so efficient.

Through photosynthesis, plants and some bacteria turn sunlight, water, and carbon dioxide into food for themselves and into oxygen for animals to breathe. It’s perhaps the most important biochemical process on Earth, and scientists don’t yet fully understand how it works.

The study’s findings identify specific molecular vibrations that help enable charge separation—the process of kicking electrons free from atoms in the initial steps of photosynthesis.

“Both biological and artificial photosynthetic systems take absorbed light and convert it to charge separation. In the case of natural photosynthesis, that charge separation leads to biochemical energy,” explains Jennifer Ogilvie, an associate professor of physics and biophysics at the University of Michigan and lead author of a paper published in Nature Chemistry.

“In artificial systems, we want to take that charge separation and use it to generate electricity or some other usable energy source such as biofuels.”

Charge separation

It takes about one-third of a second to blink your eye. Charge separation happens in roughly one-hundredth of a billionth of that amount of time. Ogilvie and her research group developed an ultrafast laser pulse experiment that can match the speed of these reactions. By using carefully timed sequences of ultrashort laser pulses, Ogilvie and coworkers were able to initiate photosynthesis and then take snapshots of the process in real time.
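
To put that comparison in concrete units (our conversion; the article gives only the ratio):

    # Rough timescale of charge separation implied by the blink comparison
    # (illustrative conversion only).
    blink_seconds = 1 / 3                      # about one-third of a second
    ratio = 1e-2 * 1e-9                        # one-hundredth of a billionth
    charge_separation = blink_seconds * ratio  # ~3e-12 s, a few picoseconds
    print(f"Charge separation: ~{charge_separation:.1e} seconds")

In other words, charge separation plays out over a few trillionths of a second, which is why laser pulses of comparable brevity are needed to observe it.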

The researchers worked with Charles Yocum, a professor emeritus, to extract what are called photosystem II reaction centers from the leaves.

Located in the chloroplasts of plant cells, photosystem II is the group of proteins and pigments that does the photosynthetic heavy lifting. It’s also the only known natural enzyme that uses solar energy to split water into hydrogen and oxygen.

Spinach leaves

To get a sample, the researchers bought a bag of spinach leaves from a grocery store. “We removed the stems and veins, put it in the blender, and then performed several extraction steps to gently remove the protein complexes from the membrane while keeping them intact,” Ogilvie says.

“This particular system is of great interest to people because the charge separation process happens extremely efficiently,” she says. “In artificial materials, we have lots of great light absorbers and systems that can create charge separation, but it’s hard to maintain that separation long enough to extract it to do useful work. In the photosystem II reaction center, that problem is nicely solved.”

The researchers used their unique spectroscopic approach to excite the photosystem II complexes and examine the signals that were produced.

Exciting molecules

“We can carefully track what’s happening,” Ogilvie says. “We can look at where the energy is transferring and when the charge separation has occurred.”

The spectroscopic signals they recorded contained long-lasting echoes, of sorts, that revealed specific vibrational motions that occurred during charge separation.

“What we’ve found is that when the gaps in energy level are close to vibrational frequencies, you can have enhanced charge separation,” Ogilvie says. “It’s a bit like a bucket brigade: how much water you transport down the line of people depends on each person getting the right timing and the right motion to maximize the throughput. Our experiments have told us about the important timing and motions that are used to separate charge in the photosystem II reaction center.”

She envisions using this information to reverse engineer the process—to design materials that have appropriate vibrational and electronic structure to mimic this highly efficient charge separation process.

The US Department of Energy, the National Science Foundation, the University of Michigan Center for Solar and Thermal Energy Conversion, and the Research Council of Lithuania funded the research.

Source: University of Michigan

The post Spinach leaves vibrate to kick off photosynthesis appeared first on Futurity.

