Office work and human interaction in general are increasingly happening online. And just as with in-person teams, organizations can benefit from being able to predict online group performance.
In past research, Anita Woolley coined the term “collective intelligence.” It describes a measure of the general effectiveness of a group on a wide range of tasks.
Woolley, assistant professor of organizational behavior and theory at Carnegie Mellon University’s Tepper School of Business, has conducted a new study demonstrating that the same key factors that influence collective intelligence in face-to-face teams also apply to online groups.
“Our previous research was able to identify factors that correlate with collective intelligence,” Woolley says. “For instance, we found that having a lot of smart people in a group does not necessarily make the group smarter.
“However, we also found a significant correlation between the individuals’ ability to reason about the mental states of others—an ability called Theory of Mind—and the collective intelligence of the group.”
One way Theory of Mind is measured is with the “Reading the Mind in the Eyes” test, in which participants infer the mental states of others from photographs of their eyes.
Woolley and her colleagues divided study participants into 68 distinct groups, some restricted to communicating only online and others allowed to communicate face-to-face. Individual participants were given a Theory of Mind test, and then the groups performed a series of tasks together to measure their collective intelligence.
“Our findings reveal that the same key factors predict collective intelligence in both face-to-face and online teams,” Woolley says.
“Theory of Mind abilities are just as important to group effectiveness in online environments as they are in office environments. We hope that this insight will give organizational managers a new tool in predicting the success of online teams.”
The study also mirrors findings from previous research demonstrating that collective intelligence was significantly correlated with the number of women in the group: groups with more women had higher collective intelligence.
There is also a negative correlation with how speaking turns are distributed among group members. Groups in which a few individuals dominated the conversation scored lower on collective intelligence than groups with more evenly shared discussions, whether those discussions took place in a room or online.
The study, published in PLOS ONE, includes coauthors at MIT and Union College.
Source: Carnegie Mellon University
For children with type 1 diabetes, balance is key. Too little blood sugar can lead to seizures or coma. But new research shows that too much can lead to slower growth in some areas of the brain.
For the study, which is published in the journal Diabetes, researchers tracked brain structure and cognitive function over 18 months in 144 young children with type 1 diabetes and a comparison group of 72 children without diabetes.
MRI scans showed that the brains of both groups of kids were growing, but growth was slower in several areas of the brain in the children with type 1 diabetes.
Growth was slowest in children with the highest blood sugar levels and in children whose levels showed the most fluctuation.
“This study shows we need to strike a balance between high blood sugar levels and low sugar levels, and avoid those extremes,” says Eva Tsalikian, a pediatric endocrinologist at University of Iowa Children’s Hospital. “The better we control those levels, the less likely that a child’s brain development will be affected.”
“New technology, such as continuous blood sugar monitors, may help prevent large swings in blood sugar levels,” says Michael Tansey, who is also a pediatric endocrinologist.
The researchers also tested the children’s brain function with standard tests of IQ, learning and memory, and mood and behavior, but they found no significant differences between the two groups.
The children will be followed for another five years to see if they develop differences in brain structure and function.
The National Institutes of Health funded the study.
Source: University of Iowa
As one year ends, we ask ourselves what changes we want to make for the next. Ted Fischer, an anthropologist at Vanderbilt University and wellbeing advisor to the World Health Organization, has some ideas about where to begin.
“It’s not just money, and I think we’re realizing that more and more,” Fischer says. “But that’s a big realization because for a long time we’ve thought that money is the answer.”
Fischer is the author of The Good Life: Aspiration, Dignity and the Anthropology of Wellbeing (Stanford University Press, 2014).
For The Good Life, Fischer studied German supermarket shoppers and Guatemalan coffee farmers to discover what hopes and dreams they share, and how anthropology can tell us about what the “good life” means for all of us.
Fischer describes the good life not as a goal in and of itself, but a journey. The good life entails having realistic aspirations to direct that journey, sufficient opportunity to realize those aspirations, a sense of dignity, and being able to pursue a life with purpose.
Fischer found that these principles hold true for both middle-class Germans and poor Guatemalan Mayans. Understanding how wellbeing is defined across cultures can give us a better idea of how to make the best of our own lives and livelihoods, and how to make more effective public policy decisions.

4 principles to the ‘good life’

1. We want more
“I’ve been working with Mayan farmers in Guatemala for many years, and I’ve long been struck by how similar what they would like out of life is to what we want out of life,” Fischer says.
We tend to assume the poor are exclusively driven by need, while wealthier people are driven by desire, he says, but the reality is that once a person’s basic needs are addressed, everyone tends to want the same sort of things.
The scale and details may differ—the Mayan farmer may aspire to send their child to the private Catholic school and buy a new truck, while a comparatively wealthy German supermarket shopper may yearn to make more expensive improvements—but at root they are strikingly similar.
They want to improve their lots, and they want their children to have better lives than they had.

2. Give us a chance
While we may all aspire to something better, aspirations don’t mean much without adequate opportunity to realize them, Fischer says.
The Mayan farmers are a good example of this. Guatemala has been a major coffee producer since the 19th century, but until very recently, most of it was grown on large, low-altitude plantations that mass-produced coffee for major coffee companies. The Maya, who lived and farmed food crops at higher altitudes, would come down from the mountains to work as laborers on the plantations in order to make ends meet. It was a job of last resort—the work was backbreaking, they were poorly treated, and the pay was low.
Recently, however, a market has emerged for coffee grown at very high altitudes, allowing the Maya to grow coffee on their own land instead of down on the plantations. This shift in the market gave the Maya the opportunity to transform from laborers to entrepreneurs—a transformation that has had tangible economic and social benefits for Maya communities.
Aspirations without opportunity lead to frustrations, even societal upheaval, as was seen with the Arab Spring, Fischer says.

3. Living with dignity
The desire to live with dignity is universal. In Guatemala, the high-end coffee market allows the Maya to support their families by owning their own labor and working on their own land. Additionally, they take pride in being able to produce a luxury product that is in high demand.
In Germany, workers’ dignity is very important. Many trades have guilds regulating and credentialing tradecrafts, formalizing the expertise necessary to become a master baker or bike mechanic.
Additionally, there is a strong distinction between work and personal time in Germany—as inconvenient as it is for shoppers, stores close at 6 on weeknights and 2 on Saturdays in order to preserve work-life balance for sales staff.

4. A larger purpose
Being able to live according to a greater purpose is the final component of the good life. While aspirations tend to be smaller, individual goals, purpose encompasses the big-picture ideals to which we dedicate our lives.
“It could be big things like religion; it could be small things like our trade or craft–but we want to be committed to something bigger than ourselves,” Fischer says.
In Germany there is a strong belief that markets should be moral—that products should be conscientiously produced and that workers’ rights be protected. So when most Germans state a preference for fair-trade, organic, humanely raised eggs, they are advancing the cause of creating moral markets.

The big picture
An economist will point out that a lot of Germans don’t actually end up buying the conscientiously produced eggs they say they prefer, but rather opt for the cheaper alternatives. But Fischer says that understanding that the preference exists in the first place is as important to good public policy as understanding what people ultimately end up doing. Anthropologists are the ones who ask those questions.
Using anthropology to inform public policy is a valuable way for policymakers to finally confront the thornier issues of wellbeing. “The question we have to ask,” Fischer says, “is ‘What kind of society do we want to live in?'”
Source: Vanderbilt University
A recently discovered 1776 deathbed manuscript contains the historically significant writings of two noted Mohegan tribal cultural leaders.
The document, a religious discussion between a Mohegan Indian woman and her mother, is historically valuable in its own right, but after subsequent research and analysis, historians found a bigger surprise.
The document narrative was written by the Reverend Samson Occom, an 18th century Mohegan minister, and notations made on it more than 60 years later were penned by Fidelia Hoscott Fielding (1827–1908), the last known speaker of the Mohegan Pequot language.
Occom figures prominently in the history of New England and its Indian communities, as well as American religious history, and his journals from the time period of the newly discovered document had been considered lost.
The document is one of the few recorded Native American deathbed statements from this time period. In addition, the notations by Fielding are valuable given her importance in Mohegan history and culture and given the scarcity of known writings by her.
“The Mohegan Council of Elders, the tribal historian, and other tribal representatives were keenly interested in the materials from these two noted tribal cultural figures,” says Paul Grant-Costa of the Yale Indian Papers Project at Yale University.
“Occom’s journal from this time period has been missing, with most scholars believing it was lost. This document suggests otherwise, and it may be a key to finding more Occom material.
“With respect to the Fielding portion of the document, other than a journal written when she was elderly, most, if not all, of her other manuscripts were burned in a fire back in the 1930s. This would be the earliest of her writings.”
During a recent visit to the archives of the Thomas Leffingwell House & Museum in Norwich, Connecticut, Grant-Costa and Tobias Glaza of YIPP examined the manuscript at the suggestion of Leffingwell archivist Richard Guidebeck.

American Revolution
In the deathbed conversation, it’s clear that the dying woman has been away from the Mohegan community and has recently returned. She mentions that she most likely will not see her father (apparently away with his brother on a missionary tour in the Province of New York) until they meet in heaven.
The woman also describes a vision she experienced before her return to the family home in which four angels were ready to carry her away—before a different group of angels intervened and told her that she must not go yet because she still has work to do.
She also tells her mother that “no one who resorts to violence will enter the Kingdom of Heaven, nor anyone who carries sharp weapons, for the first Christians did not fight.”
This could be a commentary on the violence of the American Revolution and its effect on Native communities, including the dying woman’s own relatives, Grant-Costa says.
Just before she dies, the woman asks her mother if she (the mother) has made up with God. The mother replies, “I cannot tell you.” The dying woman then says, “I have made up with God, or we are reconciled.”
“From our historical reconstruction, the dying woman appears to be the niece of Samuel Ashbow, an important Mohegan minister and missionary and mentor to Samson Occom,” says Grant-Costa.
“We believe that Occom wrote the piece for him and his brother, Robert (the woman’s father), so that they would know their religious work had succeeded and the woman had died a true Christian.”
Efforts are underway to make the document and associated transcriptions and annotations available to the public on YIPP’s website.
Based at Yale Divinity School, the Yale Indian Papers Project is a collaborative research initiative that locates, digitizes, transcribes, and annotates materials by or about New England Indians, publishing them as an online resource, “The New England Indian Papers Series.”
The Yale University Library, Yale Divinity School, and the National Endowment for the Humanities support the project.
Source: Yale University
An improved gene therapy strategy using modified human stem cells shows promise in animal models as a functional cure for HIV.
The achievement, which involves an improved technique to purify populations of HIV-resistant stem cells, opens the door for human clinical trials that were recently approved by the US Food and Drug Administration.
“We have devised a gene therapy strategy to generate an HIV-resistant immune system in patients,” says Joseph Anderson, principal investigator of the study and assistant professor of internal medicine at University of California, Davis.
“We are now poised to evaluate the effectiveness of this therapy in human clinical trials.”
Anderson and his colleagues modified human stem cells with genes that resist HIV infection and then transplanted a near-purified population of these cells into immunodeficient mice. The mice subsequently resisted HIV infection, maintaining signs of a healthy immune system.
The findings are now online and will be published in the journal Stem Cells.

3 HIV-resistant genes
Using a viral vector, the researchers inserted three different genes that confer HIV resistance into the genome of human hematopoietic stem cells—cells destined to develop into immune cells in the body.
The vector also contains a gene that tags the surface of the HIV-resistant stem cells. This allows the gene-modified stem cells to be purified so that only the ones resistant to HIV infection are transplanted. The stem cells were then delivered into the animal models, with the genetically engineered human stem cells generating an HIV-resistant immune system in the mice.
The three HIV-resistant genes act on different aspects of HIV infection—one prevents HIV from exposing its genetic material when inside a human cell; another prevents HIV from attaching to target cells; and the third eliminates the function of a viral protein critical for HIV gene expression.
In combination, the genes protect against different HIV strains and provide defense against HIV as it mutates.
After exposure to HIV infection, the mice given the bioengineered cells avoided two important hallmarks of HIV infection: a drop in human CD4+ cell levels and a rise in HIV virus in the blood.
CD4+ is a glycoprotein found on the surface of white blood cells, which are an important part of the normal immune system. CD4+ cells in patients with HIV infection are carefully monitored by physicians so that therapies can be adjusted to keep them at normal levels: if levels are too low, patients become susceptible to the opportunistic infections characteristic of AIDS.
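As a concrete illustration of the monitoring described above, here is a minimal sketch of how a CD4+ count might be bucketed clinically. The thresholds are widely used clinical conventions (for example, counts below 200 cells per microliter are AIDS-defining per the CDC case definition); they are illustrative and not taken from this study.

```python
def cd4_status(cells_per_ul: int) -> str:
    """Rough interpretation of a CD4+ T-cell count (cells per microliter).

    Thresholds follow common clinical conventions, not values from the
    UC Davis study: roughly 500+ is considered normal, and below 200
    is the AIDS-defining cutoff used by the CDC.
    """
    if cells_per_ul >= 500:
        return "normal range"
    if cells_per_ul >= 200:
        return "reduced; monitor closely and adjust therapy"
    return "severely low; high risk of opportunistic infections"

print(cd4_status(800))  # -> normal range
```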
In the experiments, mice that received the genetically engineered stem cells and were infected with two different strains of HIV were still able to maintain normal CD4+ levels. The mice also showed no evidence of HIV virus in their blood.

Adding a ‘handle’
Although other HIV investigators had previously bioengineered stem cells to be resistant to HIV and conducted clinical trials in human patients, efforts were stymied by technical problems in developing a pure population of the modified cells to be transplanted into patients.
During the process of genetic engineering, a significant percentage of stem cells remain unmodified, leading to poor resistance when the entire population of modified cells is transplanted into humans or animal models. In the current investigation, the team introduced a “handle” onto the surface of the bioengineered cells so that the cells could be recognized and selected.
This development achieved a population of HIV-resistant stem cells that was greater than 94 percent pure.
“Developing a technique to purify the population of HIV-resistant stem cells is the most important breakthrough of this research,” says Anderson, whose laboratory is based at the UC Davis Institute for Regenerative Cures. “We now have a strategy that shows great promise for offering a functional cure for the disease.”

The ‘Berlin patient’
A “functional” cure of HIV means that the virus is no longer detectable in the blood and the patient has no signs or symptoms of the disease. Viruses may still be hiding in cells in the body, but it is believed they are no longer causing harm.
This line of research is inspired by the so-called “Berlin patient,” a man who was HIV-positive and developed acute myeloid leukemia, requiring a stem cell transplant involving complete replacement of his immune system. The donor supplying the stem cells had a mutation known to resist HIV infection. Since undergoing the transplant, the Berlin patient has been free of disease despite being off antiretroviral therapy.
The new cellular therapy method was also found to be safe: at six months, no signs of toxicity or tumor formation were found, and the animal models provided with the genetically modified cells appeared to generate a normal immune system.
For the upcoming human clinical trials, the stem cells will be autologous—taken from the bone marrow of a patient’s own body, which avoids the common transplantation risk of rejection.
Other study authors are from the UC Davis Department of Internal Medicine and the Scripps Research Institute in La Jolla, California.
The James B. Pendleton Charitable Trust and the Campbell Foundation supported the work.
Source: UC Davis
New research shows how cells interact over long distances within fibrous tissue, like that associated with many diseases of the liver, lungs, and other organs.
By developing mathematical models of how the collagen matrix that connects cells in tissue stiffens, the researchers are providing insights into the pathology of fibrosis, cirrhosis of the liver, and certain cancers.
Tissue stiffness has long been known to be clinically relevant in these diseases, but the underlying changes that alter the mechanics of tissues are poorly understood. Consisting of a complex network of fibers, tissues have proven difficult to simulate and model beyond local, neighbor-to-neighbor interactions.
Developing a better understanding of the large-scale mechanical changes that occur over longer distances, specifically the process by which the extracellular matrix is pulled into compact, highly aligned “bridges,” could eventually form the basis of treatments for related diseases.

Cell mechanics
Vivek Shenoy, professor in the department of materials science and engineering in the University of Pennsylvania’s School of Engineering and Applied Science, has led an interdisciplinary research team to tackle this problem. Their two papers are published in Biophysical Journal.
The first paper involved developing simulations that extrapolated the overall remodeling of the extracellular matrix based on the behavior of neighboring pairs of cells.
The second paper took a more mathematical approach, producing a coarse-grained model of this remodeling that could be more broadly applied to fibrotic tissue.
“We’re trying to understand how force is transmitted in tissues,” Shenoy says. “Cells are the ones that generate force, and it has to be transmitted through what surrounds the cell, the extracellular matrix, or ECM.
“But imagine trying to model the ECM by trying to keep track of each collagen fibril in your liver; there are tens of millions of those. So we’re taking what we learn from simulating those networks to turn it into a model that captures the main features with only a few parameters.
“The key here is the mechanics,” he says. “In particular, how does ECM, as a fibrous material, differ from solids, gels, and other materials that are better studied?”

In the liver
Rebecca Wells, an associate professor in Penn’s Perelman School of Medicine and a coauthor of the second paper, provided insight into the clinical relevance of the mechanics that characterize ECM-related disorders.
“Fibrosis occurs when you have an injury and the tissue responds by depositing ECM, forming scar tissue,” Wells says.
“In liver fibrosis, the liver can stiffen by up to an order of magnitude, so measuring stiffness is a common diagnostic test for the disease. Increased stiffness also occurs in cancer, where tumors are typically stiffer than the surrounding tissue.”
Existing experimental evidence showed that mechanical forces were at play in the changes in both fibrosis and cancer and that these forces were important to their development and progression but could not explain the long-ranging changes cells were able to produce to change their environments.
When put in tissue-simulating gels, cells can deform their immediate surroundings but are unable to pull on more distant cells. In real, ECM-linked tissue, however, cells’ range of influence can be up to 20 times their own diameter.
“If you look at a normal tissue,” Shenoy says, “you see the cells are more rounded, and the network of ECM fibers is more random. But as cancer progresses, you see more elliptical cells, more ECM, and you see that the ECM fibers are more aligned. The cells are the ones generating force, so they’re contracting and pulling the fibers, stretching them out into bridges.”
“That’s also the pathology of cirrhosis,” Wells says. “My group had been looking at the early mechanical changes associated with liver fibrosis, which progresses to cirrhosis, but then, by collaborating with Vivek, we started to wonder if these large scale changes in the architecture of the liver could have a mechanical basis and if something similar to what is seen in gels might be occurring in the liver.
“This is a new way of approaching the problem, which has largely been thought of as biochemical in origin. And there are other tissues where it is probably the same thing, the lung, for example.”

Building ‘bridges’
The researchers found that the critical difference between the existing models and ECM’s long-range behavior was rooted in its elastic properties. Materials with linear elasticity cannot transmit force over the distances observed, but the team’s simulations showed that nonlinear elasticity could arise from the ECM’s fibrous structure.
“In our model, every component is linearly elastic,” Shenoy says, “but the collective behavior is nonlinear; it emerges because of the connectivity. When you deform the network, it’s easy to bend the ‘sticks’ that represent collagen fibers but hard to stretch them.
“When you deform it to a small extent, it’s all the bending of the fibers, but, as you deform further, it can’t accommodate bending any more and moves over to stretching, forming the bridges we see in the tissue.”
Such simulations can’t predict which fibers will end up in which bridge, necessitating the coarser-grained model the researchers described in their second paper. By showing the point at which linear elasticity gives way to its nonlinear counterpart, the team produced a more complete picture of how the alignment of collagen bridges under tension transmits force between distant cells.
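The papers’ exact formulation isn’t reproduced here, but the bending-versus-stretching competition Shenoy describes is commonly written as a fiber-network energy of roughly this form (a generic sketch; the symbols are illustrative, not taken from the papers):

```latex
E \;=\; \sum_{\text{fiber segments}} \left( \frac{k_s}{2}\,\delta\ell^2 \;+\; \frac{\kappa_b}{2}\,\theta^2 \right),
\qquad k_s \gg \kappa_b,
```

where \(\delta\ell\) is the stretch of a segment and \(\theta\) the bending angle between adjacent segments. Because stretching (\(k_s\)) is far stiffer than bending (\(\kappa_b\)), small deformations are absorbed by cheap bending and the network responds softly; at larger strains the fibers align and further deformation must stretch them, so the effective stiffness rises sharply even though each individual element is linearly elastic.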
Further studies are needed to elucidate the feedback loops between ECM stiffening and cell contraction strength. The team is conducting physical experiments to confirm and refine their in silico findings.
“Right now,” Wells says, “we’re hypothesizing that the mechanical interactions modeled by the Shenoy lab explain aspects of cancer and fibrosis, and we’re developing the experimental systems to confirm it with real cells.”
The research team included scientists from Penn and Boston University.
The National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health and the National Science Foundation funded the work.
Source: University of Pennsylvania
Many of the worst winter floods to hit the US West Coast have combined heavy rain with melting snow. The combined runoff washes down the mountains, breaching riverbanks, washing out roads, and flooding buildings.
These events are tough to forecast, but they will become more common as the planet warms and more winter precipitation falls as rain rather than snow.
Now, mountain hydrology experts are turning to the physics behind these events to better predict the risks.
“One of the main misconceptions is that either the rain falls and washes the snow away, or that heat from the rain is melting the snow,” says Nicholas Wayand, a doctoral student in civil and environmental engineering at the University of Washington. He presented his research December 18 at the annual meeting of the American Geophysical Union.
Most of the largest floods on record in the western US are associated with rain falling on snow. But it’s not that the rain is melting or washing away the snow.
Instead, it’s the warm, humid air surrounding the drops that is most to blame for the melting, Wayand says.
Moisture in the air condenses on the cold snow, just as water droplets form on a cold drink can, and the energy released by that condensation is absorbed by the snow. The other main factor is that rainstorms bring warmer air, which blows across the snow and melts its surface. His work supports previous research showing that these processes provide 60 to 90 percent of the energy for melting.

Expensive damage
Places prone to rain-on-snow flooding include cities on rivers that begin in the mountains, such as Sacramento, California, and Centralia, Washington.
In the 1997 New Year’s Day flood in Northern California, melting snow exacerbated flooding, which broke levees and caused millions of dollars in damage. The biggest recent rain-on-snow event in Washington was the 2009 flood in the Snoqualmie basin. And the Calgary flood in the summer of 2013 included snowmelt from the Canadian Rockies that helped push rivers over their banks.
The researchers developed a model by recreating the 10 worst rain-on-snow flooding events between 1980 and 2008 in three regions: the Snoqualmie basin in Washington state, the upper San Joaquin basin in central California, and the East North Fork of the Feather River basin in northern California.

Trees make a big difference
Their results allow them to gauge the risks for any basin and any incoming storm. The three factors that matter most, they found, are the shape of the basin, the elevation of the rain-to-snow transition before and during the storm, and the amount of tree cover. Basins most vulnerable to snowmelt are treeless basins with a lot of area within the rain-snow transition zone, where the precipitation can fall as snow and then rain.
Trees reduce the risk of flooding because they slow the storm’s winds.
“If you’ve ever been in a forest on a windy day, it’s a lot calmer,” Wayand says. That slows the energy transferred from condensation and from contact with warm air to the snowpack.
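The outsized role of condensation comes down to latent heats. A rough back-of-envelope sketch, using standard textbook constants rather than figures from the study:

```python
# Latent heat released when water vapor condenses, versus the latent
# heat needed to melt ice (both approximate textbook values).
L_CONDENSATION = 2.5e6  # joules released per kg of vapor condensed
L_FUSION = 3.34e5       # joules required per kg of ice melted

# Each kilogram of vapor condensing onto the snowpack releases enough
# energy to melt roughly this many kilograms of snow:
snow_melted_per_kg_vapor = L_CONDENSATION / L_FUSION
print(round(snow_melted_per_kg_vapor, 1))  # -> 7.5
```

In other words, every gram of moisture that condenses out of warm, humid storm air can melt roughly seven times its own mass in snow, which is why condensation and warm-air contact, rather than the rainwater itself, dominate the melt energy budget.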
Simulations also show that meltwater accounted for up to about a quarter of the total flooding. That supports earlier research showing that snow is not the main contributor to rain-on-snow floods, but cannot be neglected since it adds water to an already heavy winter rainstorm.
The complexity of mountain weather also plays a role.
“The increase in precipitation with elevation is much greater than usual for some of these storms,” says Jessica Lundquist, an associate professor of civil and environmental engineering. “Higher flows can result from heavier rainfall rates at higher elevations, rather than from snowmelt.”
The other collaborator on the work is Martyn Clark at the National Center for Atmospheric Research in Colorado.
Source: University of Washington
Taking a daily multivitamin can help pregnant women in developing nations prevent pre-term births, increase their babies’ birth weights, and deliver infants who are healthier overall, a study in Bangladesh suggests.
In the large randomized trial, a supplement with 15 essential micronutrients was superior to the current standard in many developing countries—daily supplements containing only iron and folic acid.
“Our study shows that women in undernourished societies should be given a multiple micronutrient supplement during pregnancy,” says study leader Keith P. West Jr., professor of infant and child nutrition at Johns Hopkins University.
“It increases birth size because the babies stay in the womb longer, and when that happens they are born a little larger and better equipped to handle life outside the womb. There is clear evidence of benefit.”
Inadequate diets are a serious public health problem in many parts of the world where many pregnant women lack micronutrients critical to the growth and development of their fetuses. That sets these children back even before their lives outside the womb have begun.

Bigger babies
For the JiVitA Project, reported in the Journal of the American Medical Association, researchers recruited roughly 45,000 pregnant women in rural Bangladesh beginning in December 2007 and assigned them to receive either a daily multivitamin or an iron-folic acid supplement.
The women were followed through their pregnancies and, for those who gave birth, at one, three, and six months after their children were born. There were roughly 14,000 live births in each group in the trial, with other pregnancies lost to miscarriage, abortion, or stillbirth.
Women who received the larger number of micronutrients were 15 percent less likely to give birth prematurely, prior to 37 weeks of gestation. Pre-term birth is a leading cause of infant mortality in many parts of the world.
Babies in the multivitamin group were also 12 percent less likely to record a low birth weight (under 2.5 kilograms, or 5 pounds, 8 ounces) and 11 percent less likely to be stillborn. On average, infants born to mothers in the multivitamin group were born two to three days later than those in the iron-folic acid group, giving them more time to bulk up before birth, and were an average of 55 grams (roughly 2 ounces) heavier.

More costly vitamins
While infant mortality rates at 6 months of age were roughly the same in each group, the research suggests that girls born to mothers receiving the vitamin and mineral preparation may have survived better than girls whose mothers received only iron and folic acid. The same was not seen in boys; further data analysis is needed to understand why.
“In countries like the United States, where there is already better vitamin and mineral nutrition, women often start taking micronutrient supplements as soon as they become pregnant, if not before,” says West, director of the Johns Hopkins Center for Human Nutrition.
“But they don’t in the developing world. Vitamin and mineral supplements are more costly—probably several cents per tablet more—so in cultures where families make only a few dollars a day we need to be able to show that the investment is worthwhile in terms of having an impact on the health of mothers and their children. This study provides the needed evidence.”
The Bill and Melinda Gates Foundation provided funding for the JiVitA-3 Trial, which also received support from Sight and Life Global Nutrition Research Institute. DSM and Beximco Pharmaceuticals provided supplements.
Source: Johns Hopkins University
A bunch of antennas, held aloft by a balloon, is listening for radio bursts made by particles from outer space shooting through the atmosphere and plowing into the Antarctic ice sheet.
On December 17, Martin Israel, professor of physics at Washington University in St. Louis, emailed to say that the stratospheric balloon carrying ANITA III, a high-energy astrophysics experiment, was about to be released into the polar vortex above Antarctica.
Co-investigators Israel and W. Robert Binns, research professor of physics, had been anxiously monitoring the launch site from St. Louis.

What is it?
A non-scientist looking at a photo of ANITA might be flummoxed. What could this possibly be? The torch for a cubist Statue of Liberty? A high-concept chandelier? A monster sound system?
Actually, any guess having to do with sound would be close. ANITA III consists of 48 horn antennas—flaring metal waveguides that direct sound into a beam—that are listening for short radio bursts generated by the interaction of “hot” particles from outer space with Earth’s atmosphere and the Antarctic ice sheet.
The first incarnation of the instrument, ANITA I, which flew from Antarctica in the austral summer of 2006-2007, was looking for ultra-high-energy neutrinos. “It didn’t see any, but to our complete surprise, it did see 16 ultra-high-energy cosmic rays,” Binns says.
“So the major objective of the project has changed. ANITA III will look for ultra-high-energy cosmic rays, recording neutrinos if it’s lucky enough to see any,” Binns adds.
By “high energy,” the scientists mean really high, on the order of 10^19 to 10^21 electron volts (eV). For comparison, visible-light photons have energies of 1.5 to 3.5 eV, and high-energy X-rays for medical imaging have energies of 2 × 10^5 eV.
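Those exponents are easier to grasp as ratios. A quick back-of-the-envelope comparison, using only the energies quoted above (the midpoint values chosen here are illustrative):

```python
# Orders-of-magnitude comparison using the energies quoted above (in eV).
visible_photon = 2.5       # middle of the 1.5-3.5 eV visible range
medical_xray = 2e5         # high-energy X-ray for medical imaging
anita_cosmic_ray = 1e20    # middle of ANITA's 1e19-1e21 eV target range

print(f"Cosmic ray vs. visible photon: {anita_cosmic_ray / visible_photon:.0e}x")
print(f"Cosmic ray vs. medical X-ray:  {anita_cosmic_ray / medical_xray:.0e}x")
```

A single ANITA-range cosmic ray carries tens of millions of billions of times the energy of a medical X-ray photon.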
By sampling neutrinos and cosmic rays, the scientists can probe the cosmos in much the same way that astronomers probe the visible universe by sampling light. Neutrinos and cosmic rays just obey different rules and thus reveal other truths.

Just the right amount of wind
The Columbia Scientific Balloon Facility (CSBF) team that releases stratospheric balloons in Antarctica waits to launch until conditions are nearly perfect. Before ANITA finally flew December 17, five launch attempts had to be scrubbed because of wind.
Antarctica is the coldest, but also the windiest, continent on the planet. “To launch a balloon the surface winds have to be below about 8 knots, which is about 9 miles an hour,” says Binns, the veteran of many balloon campaigns.
“You also have to have low winds up to about 800 feet,” Binns says. “They use a small weather balloon, called a pi-ball (pilot balloon), to check those. Sometimes those winds will be stiff even if it is calm on the ground. You’ll see the pi-ball go up and then all of a sudden it will take off sideways.”
“When these big stratospheric balloons are launched but not yet fully inflated, they are about 600 feet long. So if they go up and hit wind shear, they can be torn apart,” he says.
Given the constraints of time and weather, every successful launch is a thrill.

Why listen for radio bursts?
When an ultra-high-energy neutrino interacts with the ice, it produces a giant shower of electrons and positrons, Binns says. The shower in turn creates a radio burst that refracts up through the ice to the air and travels through the air to ANITA’s receivers.
Ultra-high-energy cosmic rays also produce radio bursts, but they interact with Earth’s atmosphere rather than the ice, producing radio bursts that reflect off the ice and up to ANITA.
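In both cases the emission happens because the shower front outruns light’s phase velocity in the surrounding medium, that is, v > c/n. A small sketch of that threshold, using assumed textbook refractive indices (deep ice n ≈ 1.78, air n ≈ 1.0003; these values are not from the article):

```python
# Radiation threshold: shower particles must move faster than light's
# phase velocity in the medium, v > c/n.
C = 299_792_458  # speed of light in vacuum, m/s

def phase_velocity(n):
    """Speed of light in a medium with refractive index n."""
    return C / n

# Assumed refractive indices: deep Antarctic ice ~1.78, air ~1.0003.
for medium, n in [("ice", 1.78), ("air", 1.0003)]:
    v = phase_velocity(n)
    print(f"In {medium} (n={n}): shower must exceed {v / C:.4f} c ({v:.3e} m/s)")
```

In ice the threshold is only about 56 percent of c, which ultra-relativistic shower particles easily exceed; in air the margin is razor-thin, but the particles are traveling at essentially c.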
Radio bursts are generated because the particle showers are traveling faster than the speed of light in the ice or the air.

Missing neutrinos
What motivated the original search for ultra-high-energy neutrinos?
“Because neutrinos don’t interact with much of anything, they let us see far, far away,” Israel says. “Any other type of radiation is limited by one thing or another, but with neutrinos we could probe the whole universe.”
“So that’s one thing that makes them interesting. They provide a window on the entire universe,” he says. “But the second reason we went looking for neutrinos is we knew they had to be there.
“We know that, at least in our cosmic neighborhood, there’s a steep drop in cosmic rays with energies above 5 × 10^19 eV,” Israel continues. “That’s called the GZK cut-off, and it’s there because cosmic rays with higher energies interact with the cosmic microwave background (the thermal radiation left over from the Big Bang) and lose enough energy that they don’t reach Earth.
“But the interactions between these ultra-high-energy cosmic rays and the microwave background produce ultra-high-energy neutrinos. So the neutrinos have to be there,” Israel says.
But the first two ANITA flights didn’t see any ultra-high-energy neutrinos. If they’re “guaranteed,” why weren’t they there?
“I tell people they’re guaranteed, but you have to read the fine print,” Binns says.
“They’re guaranteed if most of the ultra-high-energy cosmic rays are protons, but if they’re heavier nuclei, it turns out that you don’t see the neutrinos. That’s beginning to look likely,” says Binns, “although nobody knows for sure.”
“Another possibility is that we’re wrong about the interaction cross section of the neutrinos,” Israel says, “that is, the probability that they’ll interact with the ice.
“What is the interaction probability for a 10^20 eV neutrino in the ice? It’s not something we’ve ever checked in an accelerator, because we don’t make 10^20 eV anything in accelerators, so it involves some theoretical calculations—which are reasonably well understood but include uncertainties,” Israel says.
The early calculations of the interaction cross section, in other words, may have been too optimistic.

Super-TIGER
As it happens, Washington University’s Cosmic Ray Group has a second team on the ice waiting to retrieve Super-TIGER, a balloon-borne cosmic-ray experiment that was brought down in West Antarctica, about 200 miles from the Transantarctic Mountains, two years ago.
Both ANITA and Super-TIGER are designed to detect cosmic rays, a catch-all term for the immensely energetic nuclei of atoms that have been stripped of their electrons by violent cosmic processes.
Super-TIGER, however, collected cosmic rays with energies of about 10^9 or 10^10 eV, a hundred million times lower than the energies of the particles ANITA seeks.
The enormous difference in energy implies a different generation mechanism, Binns says. “We believe the particles Super-TIGER detected were generated in our galaxy, whereas the ultra-high-energy ones ANITA is looking for come from outside the galaxy.
“We’re pretty sure those in our galaxy were accelerated by shockwaves from supernovae explosions,” Binns says. “Nobody really knows how the ultra-high-energy cosmic rays are generated.”
Because cosmic rays are charged particles, they are deflected by the magnetic fields that thread the galaxy and the universe, making it impossible to trace them back to their source.
But, Israel says, a cosmic ray with an energy of 10^20 eV or more might fly straight. The only problem is that there’s only one of those per square kilometer per century, he says.
“But from balloon altitude, ANITA’s radio receivers are looking at roughly a million square kilometers of ice. We have such a big detector, we will get to see these rare events,” Israel says.
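The arithmetic behind that optimism is simple: multiply the quoted flux by the quoted viewing area. The sketch below does just that, with an assumed 30-day flight length (the article does not state the flight duration), and ignores detection efficiency and duty cycle, so it is a geometric upper bound rather than a predicted event count:

```python
# Back-of-the-envelope event rate from the figures quoted above.
flux_per_km2_per_year = 1 / 100   # ~1 cosmic ray per km^2 per century
viewed_area_km2 = 1_000_000       # ~1 million km^2 of ice seen from altitude
flight_days = 30                  # assumed flight length (not from article)

events_per_year = flux_per_km2_per_year * viewed_area_km2
expected = events_per_year * flight_days / 365
print(f"~{events_per_year:.0f} events/year over the viewed area")
print(f"~{expected:.0f} geometric upper bound for a {flight_days}-day flight")
```

Even a tiny fraction of that geometric rate surviving the detection cuts would make the events observable, which is the point of flying such a large effective detector.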
A new way to get timely updates on the health of public beaches could help you return home from a trip to the seashore with only a healthy glow—not the stomach flu.
Researchers say the relatively easy-to-use predictive modeling systems offer an improvement over current monitoring methods, and will give beachgoers a better chance at avoiding waterborne ailments such as gastroenteritis, respiratory illness, skin rashes, and ear, nose, and throat infections.
Getting gastroenteritis—the “stomach flu,” which often comes with diarrhea, vomiting, and fever—is one of several ailments that can affect people infected by water polluted with fecal bacteria from sewage.
“The current approach warns the public of the potential health risks of swimming at polluted beaches based on yesterday’s news,” says Alexandria Boehm, associate professor of civil and environmental engineering at Stanford University and coauthor of the research paper published in Environmental Science & Technology.
“We wanted to find a way to better protect the public health of the more than 150 million people who visit California beaches every year,” adds coauthor Amanda Griesbach, a beach water quality scientist with Heal the Bay.
Currently, for financial and logistical reasons, most beach managers analyze swimming waters only once a week. These tests, which involve analyzing water samples for fecal indicator bacteria, generally take 18 to 24 hours; even the fastest available method can take up to six hours. In the meantime, people keep swimming, and water conditions can change within a few hours, leaving the lab results out of date.
“We know for sure that the method used now is not accurate,” says Boehm, a senior fellow at the Stanford Woods Institute for the Environment.

Beach testing in an hour
By contrast, predictive models based on a range of site-specific data can return highly accurate water characterizations within about an hour.
To develop these models, researchers worked with beach managers at 25 Southern California beaches to collect site-specific archival data such as rainfall amounts, tide levels, and pollution concentrations.
“Once we are familiar with the data availability and beach characteristics, we will know what types of input variables are probably useful in water quality prediction, and thus we can develop the predictive models quite efficiently,” says study lead author Anthony Thoe, a postdoctoral scholar in civil and environmental engineering at Stanford University.
Using publicly available software, the researchers developed and tested more than 700 models. They used five different types of statistical models for each beach and determined which type could best predict beach advisories.

Beach quality apps
In the future, beach managers may be able to run models, specifically tailored to their beaches’ characteristics, by simply entering data into an Excel spreadsheet, says Thoe, who worked on a similar modeling project for beaches in Hong Kong. Dubbed Project WATERMAN, that effort resulted in a range of web tools and apps anyone can download to monitor water quality at any time.
“We believe with these apps and web pages, the public can feel more engaged and can easily access prediction results that may help them determine whether they want to swim at a particular beach on that day,” Thoe says.
Although Thoe has yet to create apps for the United States, the study’s second phase this summer will likely include a public notification component.
The researchers plan to work with managers at three Southern California beaches—Doheny State Beach, Santa Monica State Beach, and Arroyo Burro Beach—to better understand the nonscientific obstacles to implementation of predictive models for issuing beach advisories.
Along the way, the researchers will further refine their models by, for example, adding data on factors such as salinity and water cloudiness.

Like a weather report
The US Environmental Protection Agency has recommended the use of predictive models to manage beaches, but states, which have jurisdiction over beach management, have not rushed to embrace the approach. In the United States, the approach is used only at a few beaches along the Great Lakes. This may be because until now there has been no large proof-of-concept-type study on predictive modeling.
It may also be because there is little federal or state legal guidance on using predictive models. California’s Assembly Bill 411, for example, exhaustively covers recreational water quality sampling, but says nothing about predictive models, Boehm says.
Researchers say they hope the study—the first systematic assessment of predictive modeling on a range of beaches with varying geographic and pollution characteristics—is an important step toward adoption.
“Soon, we’ll be able to use beach water quality models like the weather report,” says coauthor Mark Gold, acting director of the UCLA Institute of the Environment and Sustainability. “Swimmers and surfers will be able to know about water quality before they go to their favorite beach.”
The California State Water Resources Control Board funded the study.
Source: Stanford University
Activating the amygdala, an almond-shaped part of the brain that processes emotions, can give rats an addictive, intense desire for sweets, say researchers.
Most people encounter and consume highly delicious foods, such as chocolate chip cookies and candy, and addictive substances like alcohol, nicotine, and caffeine on a regular basis. For many people, these rewards act as pleasurable treats that are both wanted and liked, but for the most part consumed in moderation.
“One reason they can be so problematic for certain individuals is their ability to become almost the sole focus of their daily lives, at the cost of one’s health, job, family, and general well-being,” says lead author Mike Robinson, a former postdoctoral fellow at the University of Michigan, now an assistant professor of psychology at Wesleyan University in Connecticut.
Robinson says it is this moderation and balance of reward avenues that allows people to lead and maintain a healthy lifestyle.
However, for a small portion of vulnerable individuals, these rewards progressively become intensely craved, skewing their normal balance of desires and leading to addiction, he says.
“Understanding what part of the brain is involved in causing intense narrowing of focus to make one reward valued at the detriment of all others might provide crucial insights into treating addiction and excessive/compulsive consumption disorders,” Robinson says.

Rats and rewards
In the study, whenever the rats pushed a lever to earn a particular sugary reward, a laser light painlessly activated the amygdala in their brains for a few seconds, making neurons in it fire more excitedly. Their amygdala was never activated when the rats earned an identical sugary reward by pressing a separate lever.
Then, a simultaneous opportunity to earn both sugary rewards occurred. Faced with a choice, the rats focused only on earning the particular sugary reward that had previously excited their amygdala, while completely ignoring the other.
The rats also were willing to work much harder to earn the sweet reward associated with amygdala activation than to earn the other sweet reward.
The amygdala activation focused the rats’ desire on the sweet reward with which it was associated. By itself, the amygdala-stimulating laser appeared worthless to the rats, who didn’t seem to care if the amygdala-stimulation was on or off, unless the sugary reward was also present.
Robinson says the results suggest a role for the amygdala in generating focused and almost exclusive desire as seen in addiction.
“Understanding the pathways involved in addictive-like behavior could provide new therapeutic avenues for treating addiction and other compulsive disorders,” he says.
The findings appear in the Journal of Neuroscience.
Source: University of Michigan
Natural disasters like earthquakes and floods can have a significant impact on cardiovascular events, including heart attack and stroke.
New research shows that in the two weeks following Hurricane Sandy in 2012, there was a 22 percent increase in the number of heart attacks and strokes in the high-impact areas of New Jersey.
For the study, published in the Journal of the American Heart Association, researchers used the Myocardial Infarction Data Acquisition System (MIDAS) to examine changes in the incidence of, and mortality from, myocardial infarctions (heart attacks) and strokes from 2007 to 2012 for the two weeks before and after October 29, the date of Hurricane Sandy.
MIDAS is an administrative database containing hospital records of all patients discharged from non-federal hospitals in New Jersey with a cardiovascular disease diagnosis or invasive cardiovascular procedure.
The research shows that in the two weeks following Hurricane Sandy, there was a 22 percent increase in heart attacks in the eight counties determined to be high-impact areas, as compared with the same time period in the previous five years.
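The comparison behind a figure like that 22 percent is straightforward to sketch: count events in the two-week window after the storm and compare against the average count for the same calendar window in the prior years. The counts below are invented for illustration; the actual MIDAS totals are not given in the article:

```python
# Sketch of the before/after comparison. Counts are invented.
prior_years = [500, 520, 490, 510, 505]  # heart attacks, same 2-week window
post_storm = 620                          # two weeks after the hurricane

baseline = sum(prior_years) / len(prior_years)
pct_increase = 100 * (post_storm - baseline) / baseline
excess = post_storm - baseline
print(f"Baseline: {baseline:.0f}, increase: {pct_increase:.1f}%, "
      f"excess events: {excess:.0f}")
```

The same window-versus-baseline logic, applied to deaths rather than events, yields excess-mortality estimates like the 69 extra deaths cited below.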
In the low-impact areas (the remaining 13 counties), the increase was less than one percent. The 30-day mortality from heart attacks also increased by 31 percent in the high-impact areas.

Health and extreme weather
“We estimate that there were 69 more deaths from myocardial infarction during the two weeks following Sandy than would have been expected. This is a significant increase over typical non-emergency periods,” says Joel Swerdel, an epidemiologist at Rutgers.
“Our hope is that the research may be used by the medical community, particularly emergency medical services, to prepare for the change in volume and severity of health incidents during extreme weather events.”
In regard to stroke, the investigators found a seven percent increase compared to the same time period in the prior five years in areas of the state impacted the most. There was no change in the incidence of stroke in low-impact areas, and no change in the 30-day mortality rate due to stroke in either the high- or low-impact areas.
“Hurricane Sandy had unprecedented environmental, financial, and health consequences on New Jersey and its residents, all factors that can increase the risk of cardiovascular events,” says John B. Kostis, director of the Cardiovascular Institute of New Jersey and associate dean for cardiovascular research at the Robert Wood Johnson Medical School.
“Increased stress and physical activity, dehydration, and a decreased attention to or ability to manage one’s own medical needs probably caused cardiovascular events during natural disasters or extreme weather. The disruption of communication services, power outages, gas shortages, and road closures were also factors that made it harder to obtain medical care efficiently.”
The Robert Wood Johnson Foundation and the Cardiovascular Institute of New Jersey at Rutgers Robert Wood Johnson Medical School funded the research.
A NASA spacecraft circling Mercury will soon be running on fumes, quite literally. The clever strategy helps squeeze a few more weeks of valuable science from a mission already nearly three years into overtime.
After more than 10 years traveling in space, nearly four of those orbiting Mercury, the Messenger probe is just about out of hydrazine fuel for its thrusters. It has been on track to drop from orbit and crash on the innermost planet at the end of March.
But engineers at the Johns Hopkins Applied Physics Laboratory have devised a way to generate thrust using the non-propellant helium gas that keeps the propulsion system pressurized. That trick will keep Messenger aloft for as long as another month and allow scientists to collect even more data about the planet closest to the Sun.
“The team continues to find inventive ways to keep Messenger going, all while providing an unprecedented vantage point for studying Mercury,” says APL’s Stewart Bushman, lead propulsion engineer for the mission. “To my knowledge this is the first time that helium pressurant has been intentionally used as a cold-gas propellant through hydrazine thrusters.”
Helium will be a less-efficient propellant, but it will work—and it’s just about all that’s left on a spacecraft that’s continued its mission far longer than originally expected.
“Propellant, though a consumable, is usually not the limiting life factor on a spacecraft, as generally something else goes wrong first,” Bushman says. “As such, we had to become creative with what we had available. Helium, with its low atomic weight, is preferred as a pressurant, because it’s light, but rarely as a cold-gas propellant, because its low mass doesn’t get you much bang for your buck.”

Getting closer
Messenger’s orbits are already decaying; by January 21, the low point in its swings around Mercury will be down to about 25 kilometers [just over 15 miles]. The helium-based maneuver scheduled for that day should raise altitude at closest approach to just over 80 kilometers [almost 50 miles].
That will give scientists extra time to explore Mercury at close range. This past summer, the team launched low-altitude observations, seeking the highest-resolution Mercury images ever.
Those images are enabling scientists to search for volcanic flow fronts, small-scale tectonic features, layering in crater walls, locations of impact melt, and new aspects of hollows.
The detailed views are expected to provide a new understanding of Mercury’s geological evolution.
“During the additional period of operations, up to four weeks, Messenger will measure variations in Mercury’s internal magnetic field at shorter horizontal scales than ever before,” says APL’s Haje Korth, instrument scientist for the magnetometer.
“Combining these observations with those obtained earlier in the mission at slightly higher altitudes will allow the depths of the sources of these variations to be determined.”
The spacecraft’s neutron spectrometer should be able to study water ice deposits within individual impact craters in the planet’s high northern latitudes, Korth says.
Messenger is an acronym for Mercury Surface, Space Environment, Geochemistry and Ranging. It is the first space mission to orbit the planet closest to the sun.
The spacecraft was launched on August 3, 2004, and entered orbit on March 18, 2011, to begin a yearlong study. The orbital phase has since been extended twice and should now last a total of more than four years.
Sean C. Solomon of Columbia University’s Lamont-Doherty Earth Observatory is principal investigator. APL built and operates the Messenger spacecraft for NASA.
Source: Johns Hopkins
Life expectancies for black people are shorter and more uncertain, on average, than those of whites, but the reason why may not be as clear as once thought.
Higher rates of certain kinds of death, such as murder, are often used to explain the lifespan variability, but new research shows that higher rates of other causes of death among young white people appear to offset this.
“We initially suspected that the greater variance in lifespan for blacks would be a result of differences in the causes of death, for instance, the higher homicide rates among blacks,” says Glenn Firebaugh, professor of sociology and demography at Penn State.
“But, as we looked closer, we saw that suicides and deaths due to drug poisoning—deaths that are more common among whites—offset the higher homicide rates for blacks.”
Black homicide rates account for about 38 percent of the greater variance for blacks. However, the higher rates of white suicide and death due to drug overdose nearly cancel out the homicide effect.
Taken all together, differences in causes of death account for only about 13 percent of the difference in lifespan variability between blacks and whites.

Target sex, not just race
“If you could magically change everything and make it so blacks and whites died of exactly the same causes, that would have surprisingly little effect on the difference in lifespan variability,” Firebaugh says. “About 87 percent of the overall difference would persist.”
In addition to behavioral reasons for lifespan uncertainty, researchers have also theorized that differences in medical treatment of and health prevention strategies for black and white patients may explain some of the difference in the variability of lifespan.
The researchers suggest that interventions to reduce this disparity may be more effective if they target sex, as well as race.
“With regard to policy, our results indicate the importance of sex-specific intervention to reduce racial disparities,” researchers write in the study, which appears in the journal Demography.
“In the case of HIV/AIDS, for example, there is greater potential for significant reduction of the racial gap when men are targeted. The opposite is true for heart disease and diabetes, where interventions focused on women are more likely to narrow the gap.”

Lifespan uncertainty
Focusing on preventing specific causes of death, such as homicide and HIV/AIDS deaths, would help significantly cut the disparity. Eliminating the difference in deaths due to those two causes would cut the black-white disparity in lifespan variance by half, researchers say.
Lifespan uncertainty is important because it may have a ripple effect for people and society, Firebaugh says.
“This isn’t part of the study, of course, but the uncertainty in lifespan could lead to uncertainty about the future. One could imagine, for instance, that a consequence of people facing more uncertain lifespans would be that they change the way they plan, or don’t plan, for health or education, for example, or for investing in retirement.”
The researchers used information from the National Center for Health Statistics 2010 Multiple Cause of Death data archive. Claudia Nau, a postdoctoral fellow in public health at Johns Hopkins University, contributed to the study.
The Eunice Kennedy Shriver National Institute of Child Health and Human Development supported the work.
Source: Penn State
A new device for building large tissues from living components of 3D microtissues could someday build replacement human organs the way electronics are assembled today: with precise picking and placing of parts.
In this case, the parts are not resistors and capacitors, but 3D microtissues containing thousands to millions of living cells that need a constant stream of fluid to bring them nutrients and to remove waste.
The new device is called “BioP3” for pick, place, and perfuse. A team of researchers, including co-leader Jeffrey Morgan, a bioengineer at Brown University, introduces BioP3 in a new paper in the journal Tissue Engineering Part C.
Because it allows assembly of larger structures from small living microtissue components, says Morgan, future versions of BioP3 may finally make possible the manufacture of whole organs, like a liver, pancreas, or kidney.
“For us it’s exciting because it’s a new approach to building tissues, potentially organs, layer by layer with large, complex living parts,” says Morgan, professor of molecular pharmacology, physiology and biotechnology.
“In contrast to 3D bioprinting that prints one small drop at a time, our approach is much faster because it uses pre-assembled living building parts with functional shapes and a thousand times more cells per part.”
Morgan’s research has long focused on making individual microtissues in various shapes such as spheres, long rods, donut rings, and honeycomb slabs. He uses a novel micromolding technique to direct the cells to self-assemble and form these complex shapes. He is a founder of the Providence startup company MicroTissues Inc., which sells such culture-making technology.
Now, the new paper shows, there is a device to build even bigger tissues by combining those living components.
“This project was particularly interesting to me since it is a novel approach to large-scale tissue engineering that hasn’t been previously described,” says co-leader Andrew Blakely, a surgery fellow at Rhode Island Hospital and the Warren Alpert Medical School.

Donut rings and honeycomb stacks
The BioP3, made mostly from parts available at Home Depot for less than $200, seems at first glance to be a small, clear plastic box with two chambers: one side for storing the living building parts and one side where a larger structure can be built with them.
It’s what rests just above the box that really matters: a nozzle connected to some tubes and a microscope-like stage that allows an operator using knobs to precisely move it up, down, left, right, out and in.
The plumbing in those tubes allows a peristaltic pump to create fluid suction through the nozzle’s finely perforated membrane. That suction allows the nozzle to pick up, carry, and release the living microtissues without doing any damage to them, as shown in the paper.
Once a living component has been picked, the operator can then move the head from the picking side to the placing side to deposit it precisely. In the paper, the team shows several different structures Blakely made including a stack of 16 donut rings and a stack of four honeycombs. Because these are living components, the stacked microtissues naturally fuse with each other to form a cohesive whole after a short time.
Because each honeycomb slab had about 250,000 cells, the stack of four achieved a proof-of-concept, million-cell structure more than 2 millimeters thick.

Growing organs?
That’s not nearly enough cells to make an organ such as a liver (an adult’s has about 100 billion cells), Morgan says, but the stack did have a density of cells consistent with that of human organs. In 2011, Morgan’s lab reported that it could make honeycomb slabs 2 centimeters wide, with 6 million cells each. Complex stacks with many more cells are certainly attainable, Morgan says.
If properly nurtured, stacks of these larger structures could hypothetically continue to grow, Morgan says. That’s why the BioP3 keeps a steady flow of nutrient fluid through the holes of the honeycomb slabs to perfuse nutrients and remove waste. So far, the researchers have shown that stacks survive for days.
In the paper the team made structures with a variety of cell types including H35 liver cells, KGN ovarian cells, and even MCF-7 breast cancer cells (building large tumors could have applications for testing of chemotherapeutic drugs or radiation treatments). Different cell types can also be combined in the microtissue building parts. In 2010, for example, Morgan collaborated on the creation of an artificial human ovary unifying three cell types into a single tissue.

Making it faster
Because version 1.0 of the BioP3 is manually operated, it took Blakely about 60 minutes to stack the 16 donut rings around a thin post, but he and Morgan have no intention of keeping it that way.
In September, Morgan received a $1.4-million, three-year grant from the National Science Foundation in part to make major improvements, including automating the movement of the nozzle to speed up production.
“Since we now have the NSF grant, the Bio-P3 will be able to be automated and updated into a complete, independent system to precisely assemble large-scale, high-density tissues,” Blakely says.
In addition, the grant will fund more research into living building parts—how large they can be made and how they will behave in the device over longer periods of time. Those studies include how their shape will evolve and how they function as a stack.
“We are just at the beginning of understanding what kinds of living parts we can make and how they can be used to design vascular networks within the structures,” Morgan says. “Building an organ is a grand challenge of biomedical engineering. This is a significant step in that direction.”
Brown has sought a patent on the BioP3.
In addition to Blakely and Morgan, the paper’s other authors are biology graduate student Kali Manning and Anubhav Tripathi, professor of engineering, who co-directs Brown’s Center for Biomedical Engineering with Morgan.
The National Institutes of Health and the NSF have supported the research.
Source: Brown University
Scientists can put groups of cells in just the right position for analysis—without the risk of changing or damaging them. The secret is sound, according to a team of researchers who are using surface acoustic waves to manipulate cell spacing and contact.
“Optical tweezers are the gold-standard technique in the field,” says Tony Jun Huang, professor of engineering science and mechanics at Penn State. “They can trap two cells in place, but because of their high power they tend to affect the integrity of cells, and sometimes damage them.
“Acoustic tweezers use the same low-power acoustic waves as those used in existing ultrasound machines, so they are gentle and can preserve cell integrity.”
The researchers manipulate cells so that they can examine direct contact between two cell membranes, or precisely control and maintain a range of distances between cells, in order to determine how cells communicate.
“The value of acoustic tweezers for studying cell-to-cell information transfer is their ability to separate the cells to a precise distance or to bring them to a predetermined contact,” says Stephen J. Benkovic, professor and chair in chemistry. “Optical tweezers can do this to some extent but suffer from heating of the sample.”

Trapped cells
The acoustic tweezers device that the researchers envision is no larger than a cell phone and can achieve a throughput of thousands of cells.
By altering the acoustic field, the cells can be precisely manipulated without damage. Because the acoustic tweezers operate in a vertical channel that holds the cell-containing liquid, the researchers can trap the cells in suspension or allow them to settle onto the surface of the substrate.
The researchers place four acoustic sources around the substrate, in two opposing pairs. When opposing devices send out surface acoustic waves, they set up a grid of nodes where the sound pressure cancels out. Cells become trapped at those nodes.
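The node spacing follows directly from the standing-wave picture: two counter-propagating waves of equal amplitude cancel at intervals of half a wavelength. The sketch below illustrates this arithmetic; the frequency, wave speed, and channel length are illustrative assumptions, not values reported in the study.

```python
import numpy as np

def node_positions(freq_hz, wave_speed, length):
    """Positions along one axis where two counter-propagating
    surface acoustic waves cancel (pressure nodes).

    p(x, t) = A sin(kx - wt) + A sin(kx + wt) = 2A sin(kx) cos(wt),
    so nodes sit wherever sin(kx) = 0, i.e. x = n * wavelength / 2.
    """
    wavelength = wave_speed / freq_hz
    n_nodes = int(length // (wavelength / 2)) + 1
    return np.arange(n_nodes) * wavelength / 2

# Illustrative numbers only: a 19.4 MHz surface acoustic wave on a
# lithium niobate substrate (wave speed ~3,900 m/s) over a 1 mm span.
nodes = node_positions(19.4e6, 3900.0, 1e-3)
spacing = nodes[1] - nodes[0]  # half a wavelength, roughly 100 microns
```

Modulating the source frequency shifts the wavelength, and hence the whole node grid, which is how a trapped cell can be walked across the substrate.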
By modulating the power and frequencies of the acoustic sources, the researchers can manipulate the number of cells and also their position. Two cells can be moved to touch each other or to almost touch each other with a variety of separation distances.

Making patterns
The cells can also be positioned in patterns including lines of multiple cells, daisy-like clumps of cells, or even triangles of cells.
“With present technologies, the generation of a desired cell-to-cell contact is often random or limited in number,” says Benkovic. “With standing acoustic waves, precise positioning of cells can be achieved on a multi-cellular level so that planned patterns of cellular arrays can be achieved.
“One can imagine a study of a cell’s infection by a bacterium as well as the creation of a long cellular assembly, for example the formation of a nerve from neurons.”
Because the acoustic tweezers can be created on a substrate that is transparent, the researchers can use microscopes to view the resulting cell alignments.
Huang, Benkovic, and colleagues put fluorescent dye into one of a pair of almost touching cells and watched the dye move into the neighboring cell through tiny protein channels established between them, demonstrating how chemical communication might be tracked using this device.
The researchers report their findings in the Proceedings of the National Academy of Sciences.
The National Institutes of Health and the National Science Foundation supported this work.
Source: Penn State
Vast dunes march across the surface of Saturn’s largest moon, Titan, as they do across the Sahara.
New research from a refurbished NASA wind tunnel reveals the physics of how particles move in Titan’s methane-laden winds. The findings, published in Nature, could help to explain why Titan’s dunes form in the way they do.
“Conditions on Earth seem natural to us, but models from Earth won’t work elsewhere,” says Bruce White, professor of mechanical and aerospace engineering at the University of California, Davis, and a coauthor of the study. “This paper gives us the thresholds to work out what models for Titan would look like.”
Earth’s dunes are made of silica sand, while Titan’s dunes, revealed by the Cassini space probe, are made of coated grains of crystalline water. Titan’s atmosphere is 95 percent nitrogen, 5 percent methane, and about half again as thick as that of Earth.
When a fluid flows over a layer of particles, there is a threshold speed at which the particles start to move. On Earth, air blowing over sand will start to kick up grains when it reaches a wind speed of about four meters [about 13 feet] per second. But flowing water, which is closer in density to silica, will move sand at much lower speeds.
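The density dependence can be illustrated with a simplified Bagnold-style threshold formula. This is a rough textbook sketch, not the model fitted in the paper, and the coefficient and grain values below are generic assumptions.

```python
import math

def threshold_friction_speed(rho_particle, rho_fluid, grain_diameter,
                             gravity=9.81, coeff=0.1):
    """Bagnold-style fluid threshold: the friction speed at which
    grains of a given size and density begin to move.

        u*_t = A * sqrt(((rho_p - rho_f) / rho_f) * g * d)

    A ~ 0.1 is an empirical coefficient for air; this simplified form
    ignores cohesion, which matters for very small grains.
    """
    density_ratio = (rho_particle - rho_fluid) / rho_fluid
    return coeff * math.sqrt(density_ratio * gravity * grain_diameter)

# 250-micron silica sand (2,650 kg/m^3) in air versus in water:
u_air = threshold_friction_speed(2650.0, 1.2, 250e-6)
u_water = threshold_friction_speed(2650.0, 1000.0, 250e-6)
# The denser fluid moves the same sand at a far lower friction speed,
# which is why flowing water entrains sand more easily than wind.
```

The same logic explains why Titan needed its own measurements: ice grains in a thick nitrogen-methane atmosphere sit at a particle-to-fluid density ratio far from anything in this terrestrial pair.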
White and colleagues used a wind tunnel in the Planetary Aeolian Laboratory at NASA’s Ames Research Center to establish threshold wind speeds at which grains would start to move on Titan. They found that the threshold was higher than predicted from models based on terrestrial systems.
They were able to reconcile their experiments with the models by allowing for the low ratio of density between particles and atmosphere on Titan.

Particle flows
The new results should help in understanding atmospheric forces on other icy moons and planets with very thin or thick atmospheres, such as Neptune’s moon Triton and Pluto, or on comets.
The findings can also help us better understand movement of particles in fluids in general. Particle flows are important in a wide range of situations, including coal-mine or grain-elevator dust explosions, environmental pollution, and lubricants.
For White, it’s a return to work he did almost 40 years ago, before joining the faculty at UC Davis. After completing his doctoral research at Iowa State University on the physics of Martian dust storms, White helped the late Ron Greeley at NASA Ames build a wind tunnel for research on Mars and Venus, which became known as the Planetary Aeolian Laboratory. The facility was mothballed in the mid-1980s, but recently refurbished by a group led by Devon Burr at the University of Tennessee-Knoxville, who is first author of the paper.
“It’s very pleasing to be able to hand this on to a new generation of researchers,” White says.
Other coauthors of the paper are from Johns Hopkins University; the SETI Institute in Mountain View, California; Arizona State University; and the University of Tennessee-Knoxville. NASA supported the work.
Source: UC Davis
Changes in light patterns during Christmas in the United States and the Muslim holy month of Ramadan in the Middle East are visible from space.
Satellite observations show that light intensity increases by as much as 50 percent in some US regions between Thanksgiving and New Year’s Day.
In an analysis of light output from 70 cities during 2012 and 2013, researchers found that light intensity increased 30 to 50 percent during the Christmas season in most suburbs and major city outskirts. Central urban areas were 20 to 30 percent brighter.
The researchers observed similar trends in Middle Eastern cities during the holy month of Ramadan, when people tend to push back social gatherings and meals until after daytime fasting. Those observations, however, varied from place to place depending on cultural and social dynamics of communities.
“The peaks that we observed are really ubiquitous and occur during the holidays,” says Eleanor Stokes, a PhD candidate at the Yale School of Forestry and Environmental Studies and a NASA Jenkins Graduate Fellow. “[But] in the Middle East, we found a lot of variation between cities, with these lighting patterns tracking cultural variations.”

Vary by neighborhood
In some locations, including the Saudi Arabian cities of Riyadh and Jeddah, light usage increased by as much as 100 percent throughout the month. But in other regions there was little or no increase.
In Cairo, the variations could be seen even at the neighborhood level, which corresponded with cultural or socioeconomic trends.
For instance, in wealthier and more liberal districts, they found that light usage increased throughout the month. But in poorer, and more devout, neighborhoods, people tended to observe Ramadan without significant increases in light use until the Eid al-Fitr celebration that marks the end of the holiday.
“We saw that, at least during these short-term patterns, energy use is very connected to the cultural and social contexts in which someone lives,” Stokes says.
The data was collected by an instrument aboard the NOAA/NASA National Polar-orbiting Partnership (Suomi NPP) satellite capable of detecting the glow of cities and towns worldwide. Using an advanced algorithm developed by NASA’s Goddard Space Flight Center, researchers were able to filter out moonlight, clouds, and airborne particles.
The co-leader of the research was Miguel Román, a research physical scientist at NASA Goddard and member of the Suomi NPP Land Discipline Team.
The findings provide critical insights into how broad societal forces impact energy decisions, Stokes says. And with the UN Intergovernmental Panel on Climate Change (IPCC) suggesting that energy efficiency and conservation will play a key role in reducing greenhouse gas emissions, these insights have become increasingly important, she adds.
NASA and the National Oceanic and Atmospheric Administration (NOAA) supported the project.
Source: Yale University
Researchers believe they’re on track to solve the mystery of weight gain—and it has nothing to do with overeating.
They discovered that a protein, Thy1, has a fundamental role in controlling whether a primitive cell decides to become a fat cell, making Thy1 a possible therapeutic target, according to a study published in the FASEB Journal.
The research brings a new, biological angle to a problem that’s often viewed as behavioral, says lead author Richard P. Phipps, a professor at the University of Rochester.
Although Thy1 was discovered 40 years ago and has been studied in other contexts, its true molecular function has never been known. Phipps’ laboratory reported for the first time that expression of Thy1 is lost during the development of fat cells, suggesting obesity could be treated by restoring Thy1.
They’re also working towards developing an anti-obesity drug, a Thy1-peptide, and have applied for an international patent to protect the invention. Phipps and colleagues are trying to identify a company to help with commercializing the patent asset and bring a new obesity treatment to the marketplace.
“Our goal is to prevent or reduce obesity and in this paper we’ve shown how to do this in principle,” says Phipps. “We believe that weight gain is not necessarily just a result of eating more and exercising less. Our focus is on the intricate network involved in fat cell development.”

Fatter mice
Researchers studied mice and human cell lines to confirm that a loss of Thy1 function promotes the formation of more fat cells. Mice lacking the Thy1 protein and fed a high-fat diet gained more weight, and gained it faster, than normal mice in a control group fed the same high-fat diet.
In addition, the fatter mice without Thy1 had more than twice the blood level of resistin, a biomarker for severe obesity and insulin resistance or diabetes.
Experiments using human fatty tissue from the abdomen and eyes showed similar results.

Born that way?
Phipps and colleagues, including key researcher Collynn Woeller, a research assistant professor of environmental medicine, are continuing to investigate why cells with the potential to turn into fat cells lose the Thy1 protein, and why fat accumulates faster when Thy1 shuts off.
It’s not clear whether Thy1 levels are different in people at birth, or whether they change with time and exposure to various environmental agents.
To address the latter question, Phipps’ laboratory is separately studying whether chemicals known as obesogens, such as bisphenol A (BPA), flame retardants, and phthalates, reduce Thy1 expression in human cells and promote obesity. That study is funded by the National Institute of Environmental Health Sciences. The work reported in FASEB was funded by the National Institutes of Health, as well as grants from the Rochester/Finger Lakes Eye & Tissue Bank and the Research to Prevent Blindness Foundation.
Source: University of Rochester
Cars that run on natural gas are touted as efficient and environmentally friendly, but getting enough gas onboard to make them practical is a hurdle.
Tiny synthetic molecules called metal organic frameworks (MOFs) are one possible storage solution.
MOFs are nanoscale compounds of metal ions or clusters known as secondary building units (SBUs) and organic binding ligands, or linkers. These linkers hold the SBUs together in a spongy network that can capture and store methane molecules in a tank under pressure. As the pressure is relieved, the network releases the methane for use.
Because there are tens of thousands of possible MOFs, it’s a daunting task to synthesize them for testing. Researchers have turned to using computers to model candidates with the right qualities.
A team led by Rice University bioengineer Michael Deem went a step further: they used a custom algorithm to quickly design new MOF configurations that not only store compressed natural gas (which is mostly methane) with a high “deliverable capacity,” but can also be reliably synthesized from commercial precursor molecules.
And here’s a handy bonus: the algorithm also keeps track of the routes to synthesis.
Deem and his colleagues at Rice, the Lawrence Berkeley National Laboratory, and the University of California-Berkeley reported their results in the Journal of Physical Chemistry C.
“MOFs are being commercialized for methane storage in vehicles now,” Deem says.

Fill up at home?
The advantages of using MOFs as a storage medium are many, starting with increased capacity over the heavy, high-pressure cylinders in current use. The study found 48 MOFs that beat the best currently available, a compound called MOF-5, by as much as 8 percent.
The program adhered to standard US Department of Energy conditions that an ideal MOF would store methane at 65 bar (atmospheric pressure at sea level is one bar) and release it at 5.8 bar, all at 298 kelvins (about 77 degrees Fahrenheit). That pressure is significantly lower than in standard CNG tanks, and the temperature is far higher than in liquid natural gas tanks, which must be cooled to minus 260 degrees F.
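Under those DOE conditions, deliverable capacity is simply the uptake at the 65-bar fill pressure minus the uptake left stranded at the 5.8-bar release pressure. A minimal sketch using a single-site Langmuir isotherm; the saturation capacity and affinity values below are hypothetical illustrations, not measured MOF data.

```python
def langmuir_uptake(pressure_bar, q_max, b):
    """Single-site Langmuir isotherm: q = q_max * b*P / (1 + b*P)."""
    return q_max * b * pressure_bar / (1 + b * pressure_bar)

def deliverable_capacity(q_max, b, p_store=65.0, p_release=5.8):
    """Methane delivered per fill/release cycle at the DOE
    conditions quoted in the text (65 bar in, 5.8 bar out)."""
    return (langmuir_uptake(p_store, q_max, b)
            - langmuir_uptake(p_release, q_max, b))

# Hypothetical framework: q_max = 300 cm^3(STP)/cm^3, b = 0.02 per bar.
dc = deliverable_capacity(300.0, 0.02)
```

One design consequence worth noting: binding methane more strongly (a larger b) raises uptake at both pressures, so it can shrink rather than grow the deliverable window, which is why screening targets deliverable capacity rather than raw uptake.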
Lower pressures mean tanks can be lighter and made to fit cars better, Deem says. They may also offer the possibility that customers can tank up from household gas supply lines.
The Deem group’s algorithm was adapted from an earlier project to identify zeolites. The researchers ran Monte Carlo calculations on nearly 57,000 precursor molecules, modifying them with synthetic chemistry reactions via the computer to find which would make MOFs with the best deliverable capacity—the amount of fuel that can be practically stored and released for use.
“Our work differs from previous efforts because we’re searching the space of possible MOF linkers specifically for this deliverable capacity,” Deem says.
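A Monte Carlo search of this kind can be sketched as a Metropolis-style walk over a discrete candidate space. Everything below, including the uniform "mutation" step and the toy scoring function, is an illustrative stand-in for the chemistry-aware moves and capacity calculations the team actually used.

```python
import math
import random

def monte_carlo_search(candidates, score, n_steps=10_000, temperature=1.0,
                       seed=0):
    """Metropolis-style search over a discrete candidate space.

    `candidates` stands in for linker variants and `score` for a
    deliverable-capacity estimate. Moves to a worse candidate are
    accepted with probability exp(delta / temperature), letting the
    walk escape local optima while drifting toward high scores.
    """
    rng = random.Random(seed)
    current = rng.choice(candidates)
    best = current
    for _ in range(n_steps):
        proposal = rng.choice(candidates)  # stand-in for a chemical mutation
        delta = score(proposal) - score(current)
        if delta >= 0 or rng.random() < math.exp(delta / temperature):
            current = proposal
        if score(current) > score(best):
            best = current
    return best

# Toy demonstration: maximize a single-peaked score over integer "linkers".
best = monte_carlo_search(list(range(100)), lambda x: -(x - 42) ** 2)
```

In the real problem the candidate space is built by applying synthetic-chemistry reactions to commercial precursors, which is what lets the algorithm log a synthesis route for every structure it proposes.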
The researchers hope to begin real-world testing of their best MOFs.
“We’re very keen to work with experimental groups, and happy to collaborate,” Deem says. “We have joint projects under way, so we hope some of these predicted materials will be synthesized very soon.”
The DOE Office of Basic Energy Sciences supported the research.
Source: Rice University