Persimmons belong to a small group of plants that are either male or female. Researchers say knowing how sex is determined could open up new possibilities for breeding plants.
Most plants have both male and female sex organs in the same individual. Some, like tomato, rice, beans, and other cultivated species, cast pollen from male to female organs in the same flower. Others employ ingenious schemes to ensure that one individual pollinates the flower of another.
Only about 5 percent of plant species have separate sexes, a condition called dioecy, or “two houses.”
“Think of it as nature’s best trick to ensure that reproduction involves two individuals, thus maximizing the mixing of genes,” says Luca Comai, professor of plant biology at the University of California, Davis.
“Persimmon, pistachio, wild grapevine, kiwi, hops, spinach, and even marijuana are dioecious.”
In mammals, sex is determined by X and Y chromosomes: males have an X and a Y; females have two Xs. A single gene on the Y is responsible for triggering the development of male traits.

Molecular scissors
Most dioecious plants resemble the human system, with XY males and XX females. What gene may be responsible for determining plant sex has been a long-standing mystery.
For a new study, published in the journal Science, researchers worked on a family of persimmon trees (Diospyros lotus) established at Kyoto University. They combed through the genomes of some of these trees looking for genes that were exclusive to males and found an unusual gene they called OGI (Japanese for male tree).
Unlike most genes, OGI does not encode a protein—instead, it codes for a very small piece of RNA that acts as “molecular scissors,” cutting down expression of another gene, called MeGI (Japanese for female tree).
In females, MeGI builds to high levels and acts like a neutering agent, repressing pollen formation. In males, OGI prevents accumulation of MeGI.

Plant hybrids
Regulation by RNA scissors can be fickle, and this may help explain why plants that are genetically one sex but functionally another can arise in dioecious species.
Discovery of the OGI-MeGI system in persimmon provides a comparison for parallel studies in other dioecious plant species, Comai says.
“Because separate sexes evolved independently many times in plants, we can effectively replay the evolutionary game and ask whether plants invent different solutions to the same problem or whether the same regulatory system is recruited over and over,” he says.
The findings may also have practical applications.
“Separate sexes are the most effective way to produce plant hybrids, and hybrids are key to agricultural productivity,” says Isabelle Henry.
“In the future, we may be able to breed dioecy into new species and facilitate hybrid production through exploitation of a natural system.”
The work was supported by the Japan Society for the Promotion of Science, the US Department of Energy, and the UC Davis Genome Center.
Source: UC Davis
Both natural and engineered types of cartilage get stronger in a low-oxygen environment, researchers report.
Lab-grown tissues could one day provide new treatments for injured or damaged joints, tendons, and ligaments.
Cartilage, for example, is a hard material that caps the ends of bones and allows joints to work smoothly. Biomedical engineers, exploring ways to toughen up engineered cartilage and keep natural tissues strong outside the body, report new developments in the Proceedings of the National Academy of Sciences.
“The problem with engineered tissue is that the mechanical properties are far from those of native tissue,” says first author Eleftherios Makris, a postdoctoral researcher in the biomedical engineering department at the University of California, Davis.
While engineered cartilage has yet to be tested or approved for use in humans, a current method for treating serious joint problems is with transplants of native cartilage. But it is well known that this method is not sufficient as a long-term clinical solution, Makris says.
The major component of cartilage is a protein called collagen, which also provides strength and flexibility to the majority of our tissues, including ligaments, tendons, skin, and bones. Collagen is produced by the cells and made up of long fibers that can be cross-linked together.

Stronger tissue
The team has been maintaining native cartilage in the lab and culturing cartilage cells, or chondrocytes, to produce engineered cartilage.
“In engineered tissues the cells produce initially an immature matrix, and the maturation process makes it tougher,” Makris says.
Knee joints are normally low in oxygen, so the researchers looked at the effect of depriving native or engineered cartilage of oxygen. In both cases, low oxygen led to more cross-linking and stronger material.
They also found that an enzyme called lysyl oxidase, which is triggered by low oxygen levels, promoted cross-linking and made the material stronger.
“The ramifications of the work presented in the PNAS paper are tremendous with respect to tissue grafts used in surgery, as well as new tissues fabricated using the principles of tissue engineering,” says Kyriacos A. Athanasiou, a professor of biomedical engineering and orthopedic surgery, and chair of the biomedical engineering department, who oversaw the work.
Grafts from cadavers such as cartilage, tendons, or ligaments—notorious for losing their mechanical characteristics in storage—can now be treated with this new process to make them stronger and fully functional, he says.
Athanasiou also envisions that many tissue engineering methods will now be altered to take advantage of this strengthening technique.
The National Institutes of Health funded the work.
Source: UC Davis
Women who undergo rapid menopause brought on by the surgical removal of ovaries may have fewer hot flashes and night sweats when there are young children at home.
The process of menopause, when ovaries no longer produce eggs and menstruation stops, varies widely. Some women have almost no bothersome symptoms, while some women experience almost crippling ones. A small subset of women experience very severe effects longer than would be expected.
For a new study, researchers recruited 117 women: 69 were menopausal or postmenopausal at the time of their surgery, 29 of whom had at least one child at home; 48 were premenopausal, 28 of whom had at least one child at home.
Researchers measured hot flashes and night sweats just before the surgery and then again at two months, six months, and 12 months post-surgery.
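The study design above can be sketched as a simple data structure (a minimal illustration using only the counts and time points reported in the article; the group labels are our own):

```python
# Cohort breakdown from the article: 117 women, split by menopausal status
# at the time of surgery and by whether a young child lived at home.
cohort = {
    "menopausal_or_postmenopausal": {"total": 69, "child_at_home": 29},
    "premenopausal": {"total": 48, "child_at_home": 28},
}

# Hot flashes and night sweats were measured at four time points
# (months relative to surgery; 0 = just before surgery).
measurement_schedule_months = [0, 2, 6, 12]

total_enrolled = sum(group["total"] for group in cohort.values())
print(total_enrolled)  # matches the reported enrollment of 117
```

Laying the counts out this way makes the two comparisons in the findings below easy to track: menopausal status on one axis, presence of a young child on the other.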
“These are intriguing findings,” says Tierney Lorenz, postdoctoral fellow at the Kinsey Institute at Indiana University Bloomington. “For women who were menopausal when our study began, those with young children at home actually showed more symptoms of hot flashes.
“But the women who underwent rapid menopause because of the surgical removal of their ovaries showed a dramatic reduction of symptoms.”
Previous studies on menopause have generated little consensus, Lorenz says, leaving women with a wide range of questionable treatments, such as supplements, hormonal treatments, and even hot yoga.

Oxytocin’s role
The new study, published in the journal Menopause, is one of the first involving social interaction and menopause symptoms to control for the age of the women and also for the type of relationship—only relationships with young children were considered.
The study got its start with an interest in the evolutionary role of social structures—grandmothering in this case, an institution that crosses cultures.
But is it really necessary for the survival of the species? Is there an immediate benefit to the women? Is it a coincidence that women often undergo the physiological change of menopause at an age when they might have young grandchildren on hand?
The findings cannot be generalized to all women, particularly since menopause affects women so differently, Lorenz says. But they point to a need to examine the hormone oxytocin more carefully because of its possible role in the results.
Oxytocin is associated with nurturing care and a wide range of effects across the body, including interactions involved in regulating body temperature. It also can affect mood and sleeping patterns, which can be disturbed during menopause.
Lorenz says the fact that the benefits only involve young children may also be significant.
“The fact the effects observed were limited to only women with children younger than 13 years suggests that parity was not sufficient to produce changes in flashes and points instead to the increased nurturance needs of young children,” the authors write. “Presence of young children at home may moderate development of hot flashes during the menopausal transition.”
The Fred Hutchinson Cancer Research Center supported the research.
Bonnie A. McGregor, a researcher at the University of Washington, and Virginia J. Vitzthum, professor of anthropology and senior research scientist at The Kinsey Institute, are coauthors of the study.
Source: Indiana University
Scientists have put three unusually motionless patients through the CT scanner: Egyptian mummies.
One of the mummies was already known to have a brain, but scans revealed she also still has lungs. In most mummies, the lungs were removed prior to burial.
The scientists—radiologists with the Washington University Mallinckrodt Institute of Radiology—discovered that the same mummy also has an array of small objects around her head. It appears to be a headdress or embellished shroud, but other possibilities include packing material or debris.
The scientists were surprised to find that a second mummy appeared to be significantly shorter than his sarcophagus. Further scanning revealed that his head had been dislodged from his body, perhaps when grave robbers ransacked his tomb.
They also found an item on his chest that may have been a burial amulet missed by grave robbers. They hope to use the scanning data to reconstruct the item with a 3D printer.
The mummies’ burial containers and wrappings identify each by name. The Saint Louis Art Museum’s mummy is Amen-Nestawy-Nakht, a male; the Kemper Art Museum mummies are Pet-Menekh, also a male, and Henut-Wedjebu, a female.
Karen K. Butler, associate curator of the Kemper Art Museum, says Pet-Menekh and Henut-Wedjebu were donated to the university in 1896 by Charles Parsons, a St. Louis banker and prominent art collector. Working with a curator from the Egyptian Museum in Cairo, Parsons acquired them shortly after excavation through the Antiquities Service of Egypt.

Two priests and a lady
Henut-Wedjebu, the oldest of the mummies, was discovered in a cave-like tomb near the ruins of the Egyptian city of Thebes. Her name means “singer of Amun and lady of the house,” and her elaborately gilded coffin, decorated with texts from the Book of the Dead, is one of only eight such objects to survive from the reign of Amenhotep III (1390-1353 BCE).
Pet-Menekh—or “he whom the excellent one has given”—is thought to have been a priest of the god Chem during the Ptolemaic period (c. 300 BCE). He died in his 30s or 40s, possibly of sudden trauma or acute disease. His coffin—likely found at the Necropolis of El-Hawawish in Akhmim—is richly decorated with hundreds of hieroglyphics as well as images of the goddesses Isis and Nut.
Amen-Nestawy-Nakht—or “Amun (Lord) of the Thrones of the Two Lands is Strong”—acquired by the Saint Louis Art Museum in 1980, was a priest of Amun during the 22nd Dynasty (945–712 BCE). His coffin is thought to have been discovered in the Necropolis of Thebes. A painted cartonnage—a kind of funerary case made of linen and plaster—covers the body and illustrates the panoply of deities charged with escorting him into the afterlife.
“The technical sophistication of all three mummies suggests that these were well-off individuals,” says Lisa Çakmak, assistant curator for ancient art at the Saint Louis Art Museum, who initiated the project.
“We would expect to see that reflected in the condition of their teeth and skeletons. The CT scan helps us to better understand their lifestyles.”

‘Time capsules’
Mummies present art experts and scientists with a formidable challenge: They are incredible time capsules from human societies that vanished thousands of years ago, but opening the capsules would desecrate human remains and possibly destroy unique cultural treasures.
Modern medical imaging techniques offer ways to peer into these time capsules without physically opening them. Scientists scanned Amen-Nestawy-Nakht two decades ago, but imaging technology has advanced significantly since then.
Çakmak, Butler, and others approached Gil Jost, then director of the Mallinckrodt Institute, about the possibility of getting the mummies scanned. Jost enlisted Sanjeev Bhalla, professor of radiology and chief of cardiothoracic imaging, to lead the research team.

How to scan mummies
The scientists considered scanning the mummies with a magnetic resonance imaging (MRI) unit, but it was impossible to guarantee that the mummies were free of any metals. Metal is prohibited in MRIs because the strong magnets in the scanners can damage the equipment and the subject being scanned.
More importantly, though, mummies are free of water as a result of the mummification process, and the images created with MRI scanning are dependent on the water content of tissue.
The researchers instead brought the mummies to a powerful and recently installed computerized tomography (CT) scanner. The unit uses X-rays to virtually slice a solid object, producing detailed 3D images of its interior.
“This new CT scanner has higher spatial resolution and quickly can assemble slices in a variety of ways, providing more medical details about the mummies,” Bhalla says.
In living patients, Bhalla and his colleagues often inject contrast agents that help make different types of cells and tissues stand out. This was not an option for the mummies, but researchers scanned them at two different energy levels to enhance details.

Hearts and teeth
Among other goals, the researchers are analyzing the data for signs of artery hardening in the mummies. Indicators of heart disease have been detected in prior mummy scans, but it’s not clear yet if this is reflective of the elite lifestyle of anyone rich enough to be mummified or if heart disease was a common problem in ancient Egyptian society.
The researchers also will take a close look at the mummies’ teeth. The degree of wear on the teeth helps scientists more precisely estimate a mummy’s age at the time of death. They also will search for evidence of what caused the mummies’ deaths.
Logistically, the scans were complicated. The mummies had to be carefully removed from their display cases, packed, and prepared for transport in custom-built boxes. The team took precise measurements to be sure each mummy would fit into the scanner.

‘These were human beings’
But Bhalla viewed another aspect of the scans as the greatest challenge of the project.
“It was very important for us to remember that these were human beings we were scanning,” he says. “We had to do the scanning in an atmosphere of spiritual and physical respect, and with the help of museum staff who acted as a kind of surrogate family for the mummies, we did that.”
“Mummification was a difficult and expensive process. It’s really very poignant. Each of these people was beloved by someone,” adds Çakmak.
The researchers and other medical center staff volunteered their time, and the School of Medicine and Barnes-Jewish Hospital donated time on the scanner and the computing resources necessary to process the results. The Saint Louis Art Museum paid for transporting the mummies.
The brains of children with dyslexia may be structured differently, according to neuroimaging of the thalamus, the part of the brain that serves as its connector.
The behavioral characteristics of dyslexia—a reading disorder that affects up to 17 percent of the population—are well documented, including struggling to recognize and decode words as well as trouble with comprehension and reading aloud.
The thalamus serves as the brain’s connector—relaying sensory and motor signals back to the cerebral cortex via nerve fibers that are part of the brain’s “white matter.” The thalamus also regulates alertness, consciousness, and sleep.
Evaluating 40 children ages 8 to 17 years, evenly divided between typically developing readers and those with developmental dyslexia, the researchers used diffusion tensor imaging to visually map the structure of the brain in an effort to better understand the role of the thalamus in reading behavior.
“A different pattern of thalamic connectivity was found in the dyslexic group in the sensorimotor and lateral prefrontal cortices,” says Laurie Cutting, professor of special education and professor of psychology and human development, radiology, and pediatrics at Vanderbilt University.
“These results suggest that the thalamus may play a key role in reading behavior by mediating the functions of task-specific cortical regions. Such findings lay the foundation for future studies to investigate further neurobiological anomalies in the development of thalamo-cortical connectivity in individuals with dyslexia.”

Different connections
In a related study, researchers examined connectivity patterns in a cortical region known to be especially important for reading: the left occipito-temporal region, sometimes referred to as the visual word form area.
While there have been many functional MRI studies examining this region, there is not a consensus on the region’s functionalities, and studies of the visual word form area’s structural connectivity are relatively new.
Cutting and her colleagues used diffusion MRI to study the structural connectivity patterns in the left occipito-temporal region and surrounding areas of the brain in 55 children.
“Findings suggest that the architecture of the left occipito-temporal region connectivity is fundamentally different between children who are typically developing readers and those with dyslexia,” Cutting says.
The typically developing readers showed greater connectivity to linguistic regions than the dyslexic group. Those with dyslexia showed greater connectivity to visual and parahippocampal (memory encoding and retrieval) regions.
The data were collected at Johns Hopkins University School of Medicine’s Kennedy Krieger Institute and the Vanderbilt University Institute of Imaging Science at Vanderbilt University Medical Center. The work was conducted in part using the resources of the Advanced Computing Center for Research and Education at Vanderbilt University.
Source: Vanderbilt University
Scientists have figured out how to convert human skin cells directly into a specific type of brain cell without passing through a stem cell phase, which avoids the production of multiple cell types.
The researchers demonstrate that these converted cells survived at least six months after injection into the brains of mice and behaved similarly to native cells in the brain.
“Not only did these transplanted cells survive in the mouse brain, they showed functional properties similar to those of native cells,” says senior author Andrew S. Yoo, assistant professor of developmental biology at the Washington University School of Medicine in St. Louis.
“These cells are known to extend projections into certain brain regions. And we found the human transplanted cells also connected to these distant targets in the mouse brain. That’s a landmark point about this paper.”

Spiny neurons and Huntington’s disease
The investigators produced a specific type of brain cell called medium spiny neurons, which are important for controlling movement.
They are the primary cells affected in Huntington’s disease, an inherited genetic disorder that causes involuntary muscle movements and cognitive decline usually beginning in middle adulthood. Patients with the condition live about 20 years following the onset of symptoms, which steadily worsen over time.
The research involved adult human skin cells, rather than more commonly studied mouse cells or even human cells at an earlier stage of development.
In regard to potential future therapies, the ability to convert adult human cells presents the possibility of using a patient’s own skin cells, which are easily accessible and won’t be rejected by the immune system.

microRNA is key
To reprogram these cells, Yoo and his colleagues put the skin cells in an environment that closely mimics the environment of brain cells. They knew from past work that exposure to two small molecules of RNA, a close chemical cousin of DNA, could turn skin cells into a mix of different types of neurons.
In a skin cell, the DNA instructions for how to be a brain cell, or any other type of cell, are neatly packed away, unused. In past research published in Nature, Yoo and his colleagues showed that exposure to two microRNAs called miR-9 and miR-124 altered the machinery that governs packaging of DNA.
Though the investigators still are unraveling the details of this complex process, these microRNAs appear to be opening up the tightly packaged sections of DNA important for brain cells, allowing expression of genes governing development and function of neurons.
Knowing exposure to these microRNAs alone could change skin cells into a mix of neurons, the researchers then started to fine tune the chemical signals, exposing the cells to additional molecules called transcription factors that they knew were present in the part of the brain where medium spiny neurons are common.
“We think that the microRNAs are really doing the heavy lifting,” says co-first author Matheus B. Victor, a graduate student in neuroscience. “They are priming the skin cells to become neurons. The transcription factors we add then guide the skin cells to become a specific subtype, in this case medium spiny neurons. We think we could produce different types of neurons by switching out different transcription factors.”

Brain cells ‘behave’
Yoo also explains that the microRNAs, but not the transcription factors, are important components for the general reprogramming of human skin cells directly to neurons.
His team, including co-first author Michelle C. Richner, senior research technician, showed that when the skin cells were exposed to the transcription factors alone, without the microRNAs, the conversion into neurons wasn’t successful.
The researchers performed extensive tests to demonstrate that these newly converted brain cells did indeed look and behave like native medium spiny neurons. The converted cells expressed genes specific to native human medium spiny neurons and did not express genes for other types of neurons. When transplanted into the mouse brain, the converted cells showed morphological and functional properties similar to native neurons.
To study the cellular properties associated with the disease, the investigators now are taking skin cells from patients with Huntington’s disease and reprogramming them into medium spiny neurons using the approach described in the new paper. They also plan to inject healthy reprogrammed human cells into mice with a model of Huntington’s disease to see if this has any effect on the symptoms.
The work appears in the journal Neuron.
Funding came from a National Science Foundation Graduate Research fellowship, a fellowship from Cognitive, Computation and Systems Neuroscience Pathway, grants from the National Institutes of Health, and awards from the Mallinckrodt Jr. Foundation, Ellison Medical Foundation, and Presidential Early Career Award for Scientists and Engineers.
Pumpkins may or may not fend off evil spirits on Halloween, but scientists suspect they do possess medicinal properties that could help treat diabetes.
Materials inside pumpkins, such as the fruit pulp, oil from ungerminated seeds, and protein from germinated seeds, have hypoglycemic properties.
These biologically active ingredients—polysaccharides, para-aminobenzoic acid, vegetable oils, sterol, proteins, and peptides—could assist in maintaining healthy blood sugar levels, researchers suspect.
Gary Adams from the School of Health Sciences at the University of Nottingham is investigating the effect these ingredients have on blood sugar and diabetes.
“There are many different types of insulins available to treat diabetes, but there are still physiological consequences for such use. Alternatives are, therefore, required and this includes herbal preparations as well as dietary plants in the form of curcubitaceae (pumpkin).”
Adams is working with colleagues at Abant Izzet Baysal University in Turkey, where he is a visiting professor, to characterize the interactions of macromolecules. Their latest findings were published in Critical Reviews in Food Science and Nutrition.
“Both the pulp—the soft flesh—and seeds of naturally sourced pumpkins are extracted from cucurbits (pumpkins) and the components within these gourds are characterized/examined using specialized instruments in our laboratory.
“By using these instruments, we are able, in part, to determine the actual components that might be responsible for reducing blood sugar especially in patients presenting with diabetes.”
Source: University of Nottingham
New research shows how drugs in the Dominican Republic’s tourist areas present barriers to preventing HIV.
The Caribbean has the second highest HIV prevalence in the world, after Sub-Saharan Africa, with HIV/AIDS as the leading cause of death among people aged 20–59 years within the region.
Particularly hard-hit are the Dominican Republic (DR) and Haiti, on the island of Hispaniola, which account for approximately 70 percent of all people living with HIV in the Caribbean region.
The intersection of drugs and tourism as a contributing factor to the region’s elevated HIV/AIDS risk hasn’t gotten enough attention.
Caribbean studies have almost exclusively focused on drug transportation. The roles that drugs play in tourism areas, which may be fueling the Caribbean HIV/AIDS epidemic, seldom come up.

Sex, booze, and risk
Now a new study in Global Public Health addresses this gap with in-depth interviews of 30 drug users in Sosúa, a major sex tourism destination in the DR.
The study’s results suggest three themes: (1) local demand shifts drug routes to tourism areas, (2) drugs shape local economies, and (3) drug use facilitates HIV risk behaviors in tourism areas.
“We know that the DR is located on a primary drug transportation route and is also the Caribbean country with the most tourist arrivals, receiving over 4.5 million visitors in 2012,” says Vincent Guilamo-Ramos, professor of social work and global public health and a co-director at NYU’s Center for Drug Use and HIV Research (CDUHR) and NYU’s Center for Latino Adolescent and Family Health.
“Tourism areas represent distinct ecologies of risk often characterized by sex work, alcohol consumption, and population mixing between lower and higher risk groups.”
The researchers sought to document drug use in tourism areas of the DR and its impact on HIV risk behaviors, potentially informing public health policies and programmatic efforts to address local drug use and improve HIV prevention efforts within tourism areas.
The participants were recruited from randomly selected alcohol-serving venues and locations of identified high drug use. Key findings from the in-depth interviews with drug users in Sosúa, DR include:
- Drug users commonly work in jobs related to tourism, and the vast majority of drug users in tourism areas indicated having sexual intercourse with someone in exchange for money, drugs, or another good (90 percent).
- Cocaine (86 percent) and marijuana (83 percent) were the most commonly used illegal drugs, with amphetamine use much less common (3 percent).
- Drugs have become linked to the tourism economy and are perceived to facilitate greater profitability for locals working in the area. As such, tourism environments provide opportunities for locals and tourists to engage in high-risk behaviors involving sex and drug use.
- The majority of participants in the study (79 percent) agreed that drug use is a serious public health problem in the area and attributed it to the tourism economy.
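As a rough sanity check, the reported percentages can be mapped back to approximate participant counts (illustrative only, assuming all 30 interviewees answered each item; the paper reports percentages, not raw counts):

```python
# Key findings from the 30 in-depth interviews, as reported percentages.
n = 30
findings = {
    "exchanged sex for money, drugs, or another good": 0.90,
    "used cocaine": 0.86,
    "used marijuana": 0.83,
    "used amphetamines": 0.03,
    "called drug use a serious public health problem": 0.79,
}

for label, share in findings.items():
    # round() gives the nearest whole-participant count consistent
    # with each reported percentage
    print(f"{label}: {share:.0%} (~{round(share * n)} of {n})")
```

With n = 30, even the rarest behavior reported (amphetamine use, 3 percent) corresponds to about one participant, a reminder of how small the sample behind these figures is.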
According to Guilamo-Ramos and his team, the study supports the need for targeted research and intervention efforts for HIV prevention that address local drug use within the context of tourism areas and their role on HIV risk behaviors.
“Our analysis suggests that local demands shift drug routes to tourism areas, drugs influence the local economies, and drug use facilitates HIV risk behaviors in tourist towns,” says Guilamo-Ramos.
“This study is important because it indicates the need for drug policies to address the structural factors of the tourism economy involving local drug transport with specific impact on HIV transmission. The current failure to provide a local level response in tourist areas of high sexual risk behaviors potentially exacerbates HIV transmission.”
The researchers stress that, given the estimated 20 million tourists traveling to the Caribbean region annually and the documented elevated rates of HIV within prime tourism destinations, research in such contexts matters to locals and visitors alike, and to both the Caribbean region and the countries from which the tourists originate.
The Global Public Health Research Challenge Fund (GPHRCF) at New York University supported the work.
Additional researchers from NYU, Columbia University, Pontificia Universidad Católica Madre y Maestra in the Dominican Republic, and Purdue University contributed to the study.
A medication used to treat heart failure, called digoxin, could be adaptable for treating amyotrophic lateral sclerosis (ALS), the paralyzing condition known as Lou Gehrig’s disease.
ALS destroys the nerve cells that control muscles. This leads to loss of mobility, difficulty breathing and swallowing, and eventually death. Riluzole, the sole medication approved to treat the disease, has only marginal benefits in patients.
But in a new study conducted in cell cultures and in mice, scientists showed that when they reduced the activity of an enzyme or limited cells’ ability to make copies of the enzyme, the disease’s destruction of nerve cells stopped. The enzyme maintains the proper balance of sodium and potassium in cells.
“We blocked the enzyme with digoxin,” says senior author Azad Bonni. “This had a very strong effect, preventing the death of nerve cells that are normally killed in a cell culture model of ALS.”
Mouse model of ALS
The results stem from Bonni’s studies of brain cells’ stress responses in a mouse model of ALS. The mice have a mutated version of a gene that causes an inherited form of the disease and develop many of the same symptoms seen in humans with ALS, including paralysis and death.
Efforts to monitor the activity of a stress response protein in the mice unexpectedly led the scientists to another protein: sodium-potassium ATPase. This enzyme ejects charged sodium particles from cells and takes in charged potassium particles, allowing cells to maintain an electrical charge across their outer membranes.
Maintenance of this charge is essential for the normal function of cells. The particular sodium-potassium ATPase highlighted by Bonni’s studies is found in nervous system cells called astrocytes. In the ALS mice, levels of the enzyme are higher than normal in astrocytes.
Astrocyte trouble
Bonni’s group found that the increase in sodium-potassium ATPase led the astrocytes to release harmful factors called inflammatory cytokines, which may kill motor neurons.
Recent studies have suggested that astrocytes may be crucial contributors to neurodegenerative disorders such as ALS, and Alzheimer’s, Huntington’s, and Parkinson’s diseases. For example, placing astrocytes from ALS mice in culture dishes with healthy motor neurons causes the neurons to degenerate and die.
“Even though the neurons are normal, there’s something going on in the astrocytes that is harming the neurons,” says Bonni, the professor of neurobiology and head of the department of anatomy and neurobiology at the Washington University School of Medicine in St. Louis.
How this happens isn’t clear, but Bonni’s results suggest the sodium-potassium ATPase plays a key role. When he conducted the same experiment but blocked the enzyme in ALS astrocytes using digoxin, the normal motor nerve cells survived. Digoxin blocks the ability of sodium-potassium ATPase to eject sodium and bring in potassium.
Longer survival
In mice with the mutation for inherited ALS, those with only one copy of the gene for sodium-potassium ATPase survived an average of 20 days longer than those with two copies of the gene. When one copy of the gene is gone, cells make less of the enzyme.
“The mice with only one copy of the sodium-potassium ATPase gene live longer and are more mobile,” Bonni says. “They’re not normal, but they can walk around and have more motor neurons in their spinal cords.”
Many important questions remain about whether and how inhibitors of the sodium-potassium ATPase enzyme might be used to slow progressive paralysis in ALS, but Bonni says the findings offer an exciting starting point for further studies.
Funding from the Edward R. and Anne G. Lefler Foundation and the National Research Service Award supported this research. The findings appear online in Nature Neuroscience.
Ebola has a lot of company, researchers say. Since 1980, the world has seen an increasing number of infectious diseases, including enterovirus, tuberculosis, cholera, measles, and various strains of the flu and hepatitis.
Menacing as that may sound, preliminary findings also reveal an encouraging trend.
On a per capita basis, the impact of the outbreaks is declining. In other words, even though the globe faces more outbreaks from more pathogens, they tend to affect a shrinking proportion of the world population.
“We live in a world where human populations are increasingly interconnected with one another and with animals—both wildlife and livestock—that host novel pathogens,” says Katherine Smith, assistant professor of biology at Brown University.
“These connections create opportunities for pathogens to switch hosts, cross borders, and evolve new strains that are stronger than what we have seen in the past.”
The analysis shows that animals are the major source of what ails us. Sixty-five percent of diseases in the dataset were “zoonoses,” meaning they come from animals. Ebola, for instance, may have come from bats. In all, such diseases caused 56 percent of outbreaks since 1980.
Smith and colleagues have compiled their work into a database that is now available online.
12,102 outbreaks
The researchers developed a “bioinformatics pipeline” to automate the creation of a database comprising 12,102 outbreaks of 215 infectious diseases involving 44 million cases in 219 countries between 1980 and 2013. The analysis is based on data from the prose reports of outbreaks stored in the Global Infectious Disease and Epidemiology Online Network (GIDEON).
The raw numbers reveal a steep rise in the number of outbreaks globally.
“GIDEON defines an outbreak as an increase in the number of cases of disease beyond what would normally be expected in a defined community, geographical area, or season,” says coauthor Sohini Ramachandran, assistant professor of biology.
Between 1980 and 1985 there were well under 1,000 such instances, but for 2005-10 the number surged to nearly 3,000. Over those same timeframes, the number of unique diseases causing the trouble climbed from fewer than 140 to about 160.
The researchers reasoned the increase could be due to factors such as better reporting of outbreaks and information sharing. To account for that, they paired the outbreak data with data on each country’s GDP, press freedom, population size, population density, and even Internet use (after 1990).
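The covariate adjustment described above can be sketched with ordinary least squares on synthetic data. This is our own illustration under stated assumptions, not the study's actual pipeline or data: the variable names and numbers are invented stand-ins.

```python
import numpy as np

# Illustrative sketch only: synthetic data standing in for GIDEON's annual
# outbreak counts, showing how a time trend can be estimated while
# controlling for reporting-related covariates (the study used GDP, press
# freedom, population size and density, and internet use; here two random
# stand-ins suffice).
rng = np.random.default_rng(0)

years = np.arange(1980, 2014)        # 1980-2013, the study's window
n = len(years)
gdp = rng.normal(size=n)             # stand-in covariate
press_freedom = rng.normal(size=n)   # stand-in covariate

# Synthetic outcome: a genuine upward trend plus covariate effects and noise
outbreaks = (
    100 + 1.5 * (years - 1980) + 2.0 * gdp + 1.0 * press_freedom
    + rng.normal(scale=1.0, size=n)
)

# Ordinary least squares: intercept, year-since-1980, and the controls
X = np.column_stack([np.ones(n), years - 1980, gdp, press_freedom])
beta, *_ = np.linalg.lstsq(X, outbreaks, rcond=None)

print(f"estimated trend per year, net of controls: {beta[1]:.2f}")
```

A trend coefficient that stays significantly positive after the controls are included is the pattern the researchers report: the rise in outbreaks is not explained away by better reporting.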
Even after controlling for those factors, the numbers of outbreaks and unique causes rose significantly over 33 years. Latitude was also included because previous studies had shown there are more infectious diseases in lower latitudes.
The top 10
From the analysis, the researchers were not only able to track trends in the total number of outbreaks in each country and around the world, but they could also analyze the host source of the outbreaks. They compiled top 10 lists for each decade of diseases causing the most outbreaks.
For zoonoses in 2000-10, Salmonella topped the list, followed by E. coli, influenza A, hepatitis A, anthrax, dengue fever, shigellosis, tuberculosis, chikungunya, and trichinosis.
Notably, chikungunya, a painful mosquito-borne virus that has afflicted much of the Caribbean and Central America, was a newcomer in the decade. So was influenza A.
Meanwhile, diseases that were top 10 scourges of earlier decades dropped off the list: campylobacteriosis, cryptosporidiosis, and hepatitis E.
Among human-specific infections, gastroenteritis led the 2000s list, trailed by cholera, measles, enterovirus, bacterial meningitis, legionellosis, typhoid and enteric fever, rotavirus, mumps, and pertussis (whooping cough). Notable “newcomers” were mumps and pertussis, but adenoviruses and rubella had fallen out of their former prominence.
The good news, Smith says, is that although the world seems to face an increasing number of infectious flare-ups from a widening range of tiny foes, our public health and medical defenses are improving as well.
“Our data suggest that, despite an increase in overall outbreaks, global improvements in prevention, early detection, control, and treatment are becoming more effective at reducing the number of people infected,” the authors write.
Smith says she is particularly interested in how global infectious disease patterns will shift with climate and land use change.
“A warmer world, a world with altered landscapes, and a more urban world will undoubtedly have a new disease-scape to consider.”
Brown’s Institute for the Study of Environment and Society funded the work, which was published in the Journal of the Royal Society Interface.
Cici Bauer, assistant professor of biostatistics, graduate student Samantha Rosenthal, adjunct lecturer Lynn Carlson, and 2013 Brown graduates Michael Goldberg and Jane Chen are coauthors of the study.
Source: Brown University
Otherwise healthy girls at high risk for depression may be aging at a faster rate than their peers.
A new study shows that girls with a family history of depression respond to stress by releasing much higher levels of the hormone cortisol.
They also have telomeres that are shorter by the equivalent of six years in adults. Telomeres are caps on the ends of chromosomes. Every time a cell divides the telomeres get a little shorter.
Telomere length is like a biological clock corresponding to age. Telomeres also shorten as a result of exposure to stress.
Previous studies have uncovered links in adults between shorter telomeres and premature death, more frequent infections, and chronic diseases.
“I did not think that these girls would have shorter telomeres than their low-risk counterparts—they’re too young,” says Ian Gotlib, professor of psychology at Stanford University.
Six years older
For the study, published in Molecular Psychiatry, researchers recruited 10- to 14-year-old healthy girls with a family history of depression and compared them to healthy girls without that background.
Researchers measured the girls’ response to stress tests, asking them to count backward from 100 by 7, and interviewing them about stressful situations. Before and after the test, the team measured the girls’ cortisol levels and also analyzed DNA samples for telomere length.
“No one had examined telomere length in young children who are at risk for developing depression,” before the study, Gotlib says.
Healthy but high-risk 12-year-old girls had significantly shorter telomeres, a sign of premature aging.
“It’s the equivalent in adults of six years of biological aging,” Gotlib says, but “it’s not at all clear that that makes them 18, because no one has done this measurement in children.”
What to do?
Exercise has been shown to delay telomere shortening in adults, so high-risk girls should learn stress reduction techniques, Gotlib says.
Other studies show that neurofeedback and attention bias training (redirecting attention toward the positive) seem promising. Other investigators are studying techniques based on mindfulness training.
Gotlib says he and colleagues are continuing to monitor the girls from the original study. “It’s looking like telomere length is predicting who’s going to become depressed and who’s not.”
The National Institute of Mental Health supported the study.
Source: Stanford University
When political races are competitive, both Democrat and Republican voters favor candidates who are more strongly conservative or liberal, new research shows.
The findings contradict the conventional wisdom that close elections swing appeal toward candidates with a moderate or centrist ideology, and may explain why voters in the United States have elected so many polarizing candidates in recent elections.
For a study published in the Journal of Experimental Social Psychology, researchers conducted a series of three experiments.
The first presented subjects with a hypothetical primary election in the United States and observed whether participants’ preference for an extreme leader was altered by their perception of how competitive the election would be.
When an electoral district was described as “hotly contested” between Republicans and Democrats, Democrat participants were more likely to choose an extreme liberal from a lineup of primary candidates than when the district was described as being “safe.”
A second study, which was conducted shortly before the 2012 Presidential election, found that both Democrat and Republican participants were more supportive of a more ideologically extreme version of Barack Obama or Mitt Romney, respectively, when they were told that the presidential election was likely to be competitive than when they thought the election was not close.
The final experiment tested the reasoning behind people’s choice of extreme candidates in competitive elections.
Extreme leaders
“We found that when people believe that there is greater competition between their group and another group, they want to make clear what the differences are between the two groups,” says Rosalind Chow, associate professor of organizational behavior and theory at Carnegie Mellon University.
“To make the distinction between the two groups clear, they choose leaders who are extreme in their views.”
The findings suggest that when the media focuses on the competition between two groups, such as political parties, it leads members of both parties to lean away from the center in favor of more ideologically extreme representatives.
Party nominees who hold more extreme views inevitably produce elected officials with more extreme views, creating wide ideological divisions in legislative bodies, such as Congress, and making the efficient operation of government more difficult, Chow says.
“Ironically, it is the perception that elections are close that is creating such stark ideological divisions.”
Source: Carnegie Mellon University
It’s taken more than half a century, but scientists have proved that a new frog species exists in New York City. In fact, the new species is living in wetlands from Connecticut to North Carolina.
“Even though he was clearly on to something, the claim Carl Kauffeld made in his 1937 paper fell short,” says Rutgers doctoral candidate Jeremy Feinberg.
“We had the benefits of genetic testing and bioacoustic analysis that simply weren’t available to Kauffeld to prove that even though this frog might look like the two other leopard frogs in the area, it was actually a third and completely separate species.”
In the paper in PLOS ONE, Feinberg and a team of seven other researchers reveal the scientific name for the new species: Rana kauffeldi.
The leopard frog, first encountered by Feinberg on Staten Island six years ago not far from the Statue of Liberty, will be commonly referred to as the Atlantic Coast Leopard Frog.
Overdue credit
During his career, Kauffeld, who died in 1974 at age 63, worked as the director of the Staten Island Zoo and at the American Museum of Natural History. He wrote many books about amphibians and reptiles and is considered to have been an authority on the subject.
Still, although Kauffeld’s research was initially recognized by some of his colleagues, Feinberg says Kauffeld faced considerable scrutiny and failed to gain any lasting support for his proposal.
“After some discussion, we agreed that it just seemed right to name the species after Carl Kauffeld,” says Feinberg. “We wanted to acknowledge his work and give credit where we believe it was due even though it was nearly 80 years after the fact.”
Feinberg, the lead author, encountered the new species six years ago in one of the most developed, heavily populated areas in the world. Two years ago, he and colleagues—who had worked together to show that this frog was a brand new species—made the initial announcement.
Today, the new research paper completes that discovery. The paper provides the critical evidence needed to formally describe and name the new frog and also presents information on the distribution, ecology, and conservation status of this species.
Look-alike frogs
Historically, the new frog was confused with two closely related species—including one to the north and one to the south—because it looks so similar. As a result, it was not noticed as a distinct species.
But after Feinberg’s encounter in 2008, modern technology stepped in. Using molecular and bioacoustic techniques to examine the genetics and mating calls of leopard frogs from various parts of the Northeast, the scientists were able to positively determine that the frog found living in the marshes of Staten Island was, in fact, a new species that might also be hiding in ponds and wetlands beyond New York and New Jersey.
The news, Feinberg says, became a call to arms to biologists, hobbyists, and frog enthusiasts from Massachusetts to Virginia to go out, look, and listen in order to determine if the new frog—mint-gray to light olive green with medium to dark spots—could be found beyond the New York metropolitan area.
Over the last two years, many frog-lovers, including some involved with the North American Amphibian Monitoring Project—a government project that observes frog habitats to determine if populations are declining—have provided crucial information about where the frogs are living, what they look like and how they sound.
One volunteer, in fact, noticed the new species’ unusual and distinct “chuck” call, and provided information that ultimately helped confirm populations of the new species in both Virginia and North Carolina.
“If there is a single lesson to take from this study, it’s that those who love nature and want to conserve it need to shut down their computers, get outside, and study the plants and animals in their own backyards,” says coauthor Brad Shaffer, professor in UCLA’s department of ecology and evolutionary biology.
‘Hiding in plain sight’
Shaffer describes the discovery as biological detective work. Although the work is fun and satisfying, he says, the goal is to protect the biodiversity of the planet.
Scientists say it is remarkable that this new species—which brings the total number of leopard frogs in the world to 19—remained under the radar in a highly populated range stretching 485 miles across eight East Coast states and several major North American cities.
“It is incredible and exciting that a new species of frog could be hiding in plain sight in New York City and existing from Connecticut to North Carolina,” says Joanna Burger, professor in the department of cell biology and neuroscience and Feinberg’s advisor at Rutgers.
“The process of recognizing, identifying, and documenting a new species is long and arduous but it is important for our understanding of the wide ranging wildlife in urban as well as other environments.”
Scientists from Rutgers, UCLA, UC Davis, the University of Alabama, Yale, Louisiana State University, SUNY College of Environmental Science and Forestry, and the New Jersey Division of Fish and Wildlife contributed to the work.
A widely used treatment for prostate cancer may cause more harm than good for some patients, a new study reports.
For decades, many men diagnosed with prostate cancer were treated with androgen deprivation therapy (ADT), injections that suppressed testosterone production.
A new study published in the journal Onco Targets and Therapy shows this is the wrong approach for selected men with localized disease, as it provides no added survival benefit and may be associated with other serious health issues.
“Men with advanced disease, or certain men with aggressive disease confined to the prostate gland, are potential ADT candidates,” says Oliver Sartor, medical director of the Tulane University Cancer Center.
“Testosterone suppression can increase radiation cure rates for certain aggressive cancers, and it is standard of care for metastatic disease.”
For years, though, many doctors used ADT for men with low-grade, prostate-confined cancers.
More harm than good
“We now have good evidence this treatment may cause more harm than good for these individuals,” Sartor says, “especially patients with slow-growing tumors who are not likely to die of their disease.”
The research shows ADT can potentially lead to health issues, including hot flashes, loss of libido, fracture risk, muscle loss, fatigue, depression, diabetic risk, erectile dysfunction, and weight gain.
Of the 240,000 new cases of prostate cancer diagnosed in the US each year, more than half are early stage and low risk. So what’s the bottom line?
“Many men diagnosed with prostate cancer may not need to be treated,” Sartor says. Instead, “active surveillance” may be a better option for men with low-grade, localized disease.
Source: Tulane University
Chimpanzees will find a place to sleep that’s on the way to breakfast sites, report researchers. The chimps will also risk travel in the dark when predators are active to get more desired, less abundant fruits, such as figs.
“As humans we are familiar with the race against birds for our cherries, or against squirrels for our walnuts and pecans,” says study coauthor Leo Polansky, “but this race is carried out amongst competitors of all kinds of species in locations all over the world.”
The study provides evidence that chimpanzees flexibly plan their breakfast time, type, and location after weighing multiple disparate pieces of information.
“Being able to reveal the role of environmental complexity in shaping cognitive-based behavior is especially exciting,” says Polansky, an associate researcher in the anthropology department at the University of California, Davis.
“Long-term, detailed information from the field can reveal the value of high levels of cognition and behavioral flexibility for efficiently obtaining critical food resources in complex environments.”
Researchers recorded when and where five adult female chimpanzees spent the night and acquired food for 275 days during three fruit-scarce periods.
The research took place in the Taï National Park in Côte d’Ivoire and was led by researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, where Polansky was a postdoctoral researcher.
The study appears in the Proceedings of the National Academy of Sciences.
Source: UC Davis
Assessing the damage caused by the 2010 Deepwater Horizon oil spill in the Gulf of Mexico has been a challenge. The location of two million barrels of submerged oil thought to be trapped in the deep ocean remains an unsolved part of the puzzle.
Now scientists have traced the oil’s path to create a footprint on the deep ocean floor.
For the new study, published in the Proceedings of the National Academy of Sciences, scientists used data from the Natural Resource Damage Assessment process conducted by the National Oceanic and Atmospheric Administration.
The United States government estimates the Macondo well’s total discharge—from the spill in April 2010 until the well was capped that July—to be 5 million barrels.
By analyzing data from more than 3,000 samples collected at 534 locations over 12 expeditions, researchers identified a 1,250-square-mile patch of the deep sea floor upon which 2 to 16 percent of the discharged oil was deposited.
The fallout of oil to the sea floor created thin deposits, most intense to the southwest of the Macondo well. The oil was most concentrated within the top half inch of the sea floor and was patchy even at the scale of a few feet.
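Converting the 2 to 16 percent range into barrels is simple arithmetic. The calculation below is our own, using only the figures quoted in the article; the study itself does not report these derived numbers.

```python
# Back-of-envelope check of the figures in the article. The 5-million-barrel
# discharge estimate and the 2-16 percent deposition range come from the
# text; turning the range into barrels is our arithmetic, not the study's.
total_discharge = 5_000_000   # barrels, US government estimate for Macondo
submerged = 2_000_000         # barrels thought trapped in the deep ocean

low = 0.02 * total_discharge
high = 0.16 * total_discharge
print(f"deposited on the mapped patch: {low:,.0f} to {high:,.0f} barrels")
print(f"as a share of the submerged oil: {low / submerged:.0%} to {high / submerged:.0%}")
```

The share works out to roughly 5 to 40 percent of the submerged oil, which is consistent with the later remark that around 70 percent remains unaccounted for.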
The investigation focused primarily on hopane, a nonreactive hydrocarbon that served as a proxy for the discharged oil. Researchers analyzed the spatial distribution of hopane in the northern Gulf of Mexico and found it was most concentrated in a thin layer at the sea floor within 25 miles of the ruptured well, clearly implicating Deepwater Horizon as the source.
Damaged corals
“Based on the evidence, our findings suggest that these deposits come from Macondo oil that was first suspended in the deep ocean and then settled to the sea floor without ever reaching the ocean surface,” says David Valentine, professor of earth science and biology at University of California, Santa Barbara.
“The pattern is like a shadow of the tiny oil droplets that were initially trapped at ocean depths around 3,500 feet and pushed around by the deep currents. Some combination of chemistry, biology, and physics ultimately caused those droplets to rain down another 1,000 feet to rest on the sea floor.”
Valentine and his colleagues were able to identify hotspots of oil fallout in close proximity to damaged deep-sea corals. According to the researchers, this data supports the previously disputed finding that these corals were damaged by the Deepwater Horizon spill.
“The evidence is becoming clear that oily particles were raining down around these deep sea corals, which provides a compelling explanation for the injury they suffered,” Valentine says. “The pattern of contamination we observe is fully consistent with the Deepwater Horizon event but not with natural seeps—the suggested alternative.”
Patchy oil
While the study examined a specified area, the scientists argue that the observed oil represents a minimum value. They suggest that oil deposition likely occurred outside the study area as well but has so far largely evaded detection because of its patchiness.
“This analysis provides us with, for the first time, some closure on the question ‘Where did the oil go and how?'” says Don Rice, program director in the National Science Foundation’s Division of Ocean Sciences. “It also alerts us that this knowledge remains largely provisional until we can fully account for the remaining 70 percent.”
“These findings should be useful for assessing the damage caused by the Deepwater Horizon spill as well as planning future studies to further define the extent and nature of the contamination,” Valentine says.
“Our work can also help to assess the fate of reactive hydrocarbons, test models of oil’s behavior in the ocean, and plan for future spills.”
Researchers from University of California, Irvine were coauthors of the study. The National Science Foundation provided funding.
Source: UC Santa Barbara
We tend to think of arsenic as a poison, but new research links high levels of the element in drinking water to a 50 percent drop in breast cancer deaths.
The study, published in the journal EBioMedicine, presents results of breast cancer mortality data from a region in Chile where residents were inadvertently exposed to high levels of arsenic, which occurs naturally in many minerals.
Instead of an increase in mortality, as with many other cancer sites, the study found that breast cancer deaths were cut in half during the period that coincided with high arsenic exposure. The effect was more pronounced among women under age 60, with mortality in these women reduced by 70 percent.
“What we found was astonishing,” says study lead author Allan Smith, professor of epidemiology at University of California, Berkeley, and director of the Arsenic Health Effects Research Program.
“We’ve been studying the long-term effects of arsenic in this population for many years, focusing on increased disease and mortality attributed to the historical exposure to arsenic in this population.”
In 1958, the northern Chilean city of Antofagasta switched to a geothermal water source originating in the Andes Mountains. Years later, it was discovered that the water sources contained more than 800 micrograms per liter of arsenic—80 times higher than the levels recommended by the World Health Organization.
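For scale, the 80-fold figure lines up with the WHO drinking-water guideline for arsenic of 10 micrograms per liter. The guideline value is our addition here; the article quotes only the measured level and the ratio.

```python
# Quick check of the exposure figure. The measured level comes from the
# article; the 10 µg/L WHO guideline is our assumed reference value,
# consistent with the quoted "80 times higher."
measured_ug_per_l = 800       # Antofagasta water supply
who_guideline_ug_per_l = 10   # WHO drinking-water guideline for arsenic

ratio = measured_ug_per_l / who_guideline_ug_per_l
print(f"{ratio:.0f} times the WHO guideline")  # 80 times the WHO guideline
```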
An arsenic removal plant was installed in 1970 after toxic effects from arsenic exposure became apparent in some residents.
Clinical trials
As part of the study, researchers at the Stanford Cancer Institute found that human breast cancer cells grown in lab cultures are killed by arsenic, while normal breast cells are more resistant to it.
The medicinal use of arsenic is not entirely new. Arsenic trioxide was approved in 2000 by the Food and Drug Administration as an effective treatment for a rare type of leukemia.
So should arsenic now be used to treat breast cancer?
“Not yet,” says Smith. “We do not know if the treatment will work, but carefully designed clinical trials should take place as soon as possible based on this new evidence.”
Smith and collaborators in Chile are proceeding to design clinical trials in which some advanced breast cancer patients would be given arsenic treatment.
Additional coauthors are from Pontifical Catholic University of Chile, UC Berkeley, and Stanford University.
Source: UC Berkeley
A simple blood test might diagnose early onset Alzheimer’s disease with increased accuracy and much sooner than currently possible.
Previous research has found that changes in the brain occur two decades before patients show signs of dementia. These changes can be detected through expensive brain imaging procedures.
The blood test has the potential to improve prediction of Alzheimer’s disease to 91 percent accuracy. But because it’s a progressive disease, more testing is needed in a larger population over several years.
In an initial trial group using the blood test, one in five healthy participants with no memory complaints tested positive.
On further medical investigation using brain-imaging techniques, these patients showed signs of degeneration in the brain resembling Alzheimer’s disease features.
The findings are published in the journal Molecular Psychiatry.
The blood test would significantly advance efforts to find new treatments for the degenerative disease and could lead to better preventative measures prior to diagnoses, says lead researcher Andrew Hill from the biochemistry and molecular biology department and Bio21 Institute at the University of Melbourne.
“This blood test would be crucial to the development of therapeutic and preventative drugs for Alzheimer’s disease. It can be used to identify patients for clinical drugs and monitoring improvement on treatment.”
Genetic bubbles
The blood test’s high accuracy comes from its ability to harvest protected bubbles of genetic material containing microRNA that circulate in the bloodstream. People with Alzheimer’s carry a distinctive set of microRNAs that sets them apart from healthy people.
The test is an accessible method for patients to accurately predict their susceptibility to Alzheimer’s, says Lesley Cheng from the biochemistry, molecular, and cell biology department and Bio21 Institute.
“This test provides the possibility of early detection of Alzheimer’s disease by using a simple blood test which has been designed to also be cost-effective. Furthermore, it is highly accessible for patients and physicians compared to organizing a brain scan or undergoing a neuropsychological test.
“Patients with a family history of Alzheimer’s disease or those with memory concerns could be tested during a standard health check at a medical clinic.
“This test could ease concerns for patients experiencing normal memory problems due to natural aging. Those identified as high risk could then be monitored by their doctor.”
The research was conducted in collaboration with The Florey Institute of Neuroscience and Mental Health, the CSIRO, and Austin Health and Australian Imaging Biomarker and Lifestyle study of Aging.
Source: University of Melbourne
When researchers exposed overfed mice to UV light, the mice gained less weight. They also displayed fewer warning signs linked to diabetes, such as abnormal glucose levels and resistance to insulin.
Next the researchers applied a cream containing nitric oxide to the overfed mice and found it had the same effect of curbing weight gain as exposure to UV light. Nitric oxide is released by the skin after exposure to sunlight.
Vitamin D, which is produced by the body in response to sunlight and often lauded for its health benefits, does not appear to play a role.
“These observations further indicate that the amounts of nitric oxide released from the skin may have beneficial effects not only on heart and blood vessels but also on the way our body regulates metabolism,” says Martin Feelisch, professor of experimental medicine and integrative biology at the University of Southampton.
Previous studies in people have shown that nitric oxide can lower blood pressure after exposure to UV lamps.
The researchers say the results should be interpreted cautiously, because mice are nocturnal animals covered in fur and not usually exposed to much sunlight. Studies are needed to confirm whether sunshine exposure has the same effect on weight gain and risk of diabetes in people.
“Our findings are important as they suggest that casual skin exposure to sunlight, together with plenty of exercise and a healthy diet, may help prevent the development of obesity in children,” says Shelley Gorman of the Telethon Kids Institute and lead author of the study published in the journal Diabetes.
“We know from epidemiology studies that sun-seekers live longer than those who spend their lives in the shade,” says Richard Weller, senior lecturer in dermatology at the University of Edinburgh.
“Studies such as this one are helping us to understand how the sun can be good for us. We need to remember that skin cancer is not the only disease that can kill us and should perhaps balance our advice on sun exposure.”
Source: University of Southampton
Scientists have discovered just how DEET repels mosquitoes.
They also have identified a plant defensive compound that might mimic DEET, a discovery that could pave the way for better and more affordable insect repellents.
More than 200 million people worldwide use DEET, developed by scientists at the US Department of Agriculture and patented by the US Army in 1946.
“Mosquitoes are considered the most deadly animals on the planet, but unfortunately, not everyone who needs this repellent can afford to use it, and not all who can afford it can use it due to its undesirable properties such as an unpleasant odor,” says lead author Professor Walter Leal of the molecular and cellular biology department.
“Vector-borne diseases are major health problems for travelers and people living in endemic regions,” Leal says. “Among the most notorious vectors are mosquitoes that transmit the protozoan parasites causing malaria and viruses that cause infections, such as dengue, yellow fever, chikungunya, and encephalitis.”
How mosquitoes smell
Mosquitoes detect scents with olfactory receptors on their antennae. The researchers examined two families of olfactory receptors of the southern house mosquito, Culex quinquefasciatus, which transmits diseases such as West Nile virus.
One receptor group, “ionotropic receptors,” normally detects acids, bases, and other water-soluble compounds. The researchers discovered, however, that a receptor from the odorant receptor group is directly activated by DEET.
They also detected a link between DEET and the compound methyl jasmonate, suggesting that DEET might work by mimicking a defensive chemical found in plants.
Dan Strickman, senior program officer for vector control at the Bill and Melinda Gates Foundation’s Global Health Program, says, “We are at a very exciting time for research on insect repellents.” (The Gates Foundation was not involved in the study.)
“For decades, the field concentrated on screening compounds for activity, with little or no understanding of how chemicals interacted with mosquitoes to discourage biting. Use of modern techniques that combine molecular biology, biochemistry, and physiology has generated evidence on how mosquitoes perceive odors,” Strickman says.
Findings from the study appear in the Proceedings of the National Academy of Sciences.
Mosquito researcher Anthony Cornel, an associate professor in the UC Davis entomology and nematology department based at the Kearney Agricultural Research and Extension Center in Parlier, provided mosquitoes that allowed the Leal lab to establish a duplicate colony at UC Davis. Richard Benton of the University of Lausanne, Switzerland, shared Drosophila plasmids that were also part of the research.
The National Institute of Allergy and Infectious Diseases of the National Institutes of Health supported the work.
Source: UC Davis