
Groundwater pumping leads to sea level rise, cancels out effect of dams

 

As people pump groundwater for irrigation, drinking water, and industrial uses, the water doesn’t just seep back into the ground — it also evaporates into the atmosphere, or runs off into rivers and canals, eventually emptying into the world’s oceans. This water adds up, and a new study calculates that by 2050, groundwater pumping will cause a global sea level rise of about 0.8 millimeters per year.

 


 

“Other than ice on land, the excessive groundwater extractions are fast becoming the most important terrestrial water contribution to sea level rise,” said Yoshihide Wada, with Utrecht University in the Netherlands and lead author of the study. In the coming decades, he noted, groundwater contributions to sea level rise are expected to become as significant as those of melting glaciers and ice caps outside of Greenland and the Antarctic.

 

Between around 1970 and 1990, sea level rise caused by groundwater pumping was cancelled out as people built dams, trapping water in reservoirs so the water wouldn’t empty into the sea, Wada said. His research shows that starting in the 1990s, that changed as populations started pumping more groundwater and building fewer dams.

 

The researchers looked not only at the contribution of groundwater pumping, which they had investigated before, but also at other factors that influence the amount of terrestrial water entering the oceans, including marsh drainage, forest clearing, and new reservoirs. Wada and his colleagues calculate that by mid-century, the net effect of these other factors adds another 0.05 mm per year of sea level rise, on top of the contribution from groundwater pumping alone.

 

The research team’s article is being published today in Geophysical Research Letters, a journal of the American Geophysical Union.

 

The last report of the United Nations Intergovernmental Panel on Climate Change in 2007 addressed the effect on sea level rise of melting ice on land, including glaciers and ice caps, Wada said. But it didn’t quantify the future contribution from other terrestrial water sources, such as groundwater, reservoirs, wetlands and more, he said, because the report’s authors thought the estimates for those sources were too uncertain.

 

“They assumed that the positive and negative contribution from the groundwater and the reservoirs would cancel out,” Wada said. “We found that wasn’t the case. The contribution from the groundwater is going to increase further, and outweigh the negative contribution from reservoirs.”

 

In the current study, the researchers estimated the impact of groundwater depletion since 1900 using data from individual countries on groundwater pumping, model simulations of groundwater recharge, and reconstructions of how water demand has changed over the years. They also compared and corrected those estimates with observations from sources such as the GRACE satellite, which uses gravity measurements to determine variations in groundwater storage.

 

With these groundwater depletion rates, Wada and his colleagues estimate that in 2000, people pumped about 204 cubic kilometers (49 cubic miles) of groundwater, most of which was used for irrigation. Most of this, in turn, evaporates from plants, enters the atmosphere and rains back down. Taking into account the seepage of groundwater back into the aquifers, as well as evaporation and runoff, the researchers estimated that groundwater pumping resulted in sea level rise of about 0.57 mm in 2000 — much greater than the annual contribution of about 0.035 mm in 1900.
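As a rough cross-check of those numbers, spreading the pumped volume over the ocean surface reproduces the reported figure. The sketch below assumes the entire 204 cubic kilometers ultimately reaches the ocean and uses a commonly cited ocean surface area of about 3.6 x 10^8 square kilometers; the study's own accounting of seepage, evaporation and runoff is more detailed.

```python
# Back-of-the-envelope check of the figures above (illustrative only).
# Assumptions: the depleted groundwater ultimately ends up in the ocean,
# and the ocean covers roughly 3.6e8 km^2 (a commonly cited value).
OCEAN_AREA_KM2 = 3.6e8        # assumed global ocean surface area
pumped_km3 = 204              # groundwater pumped in 2000, per the study

rise_km = pumped_km3 / OCEAN_AREA_KM2
rise_mm = rise_km * 1e6       # 1 km = 1e6 mm
print(f"Equivalent sea level rise: {rise_mm:.2f} mm")   # ~0.57 mm
```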

 

The researchers also projected groundwater depletion, reservoir storage, and other impacts for the rest of the century, using climate models and projected population growth and land use changes. The increase in groundwater depletion between 1900 and 2000 is due mostly to increased water demands, the researchers find. But the increase projected between 2000 and 2050 is mostly due to climate-related factors like decreased surface water availability and irrigated agricultural fields that dry out faster in a warmer climate.

 

If things continue as projected, Wada estimates that by 2050, the net, cumulative effect of these non-ice, land-based water sources and reservoirs — including groundwater pumping, marsh drainage, dams, and more — will have added 31 mm to sea level rise since 1900.

 

The new study assumes that, where there is groundwater, people will find a way to extract it, Wada said, but some of his colleagues are investigating the limits of groundwater extraction. One way to decrease groundwater’s contribution to sea level rise, he noted, is to improve water efficiency in agriculture — to grow more with less groundwater.

 

 

 

Source: American Geophysical Union

 

 

Published on 10th May 2012

 

Advanced genetic screening method may speed vaccine development

 

Infectious diseases—both old and new—continue to exact a devastating toll, causing some 13 million fatalities per year around the world.


Vaccines remain the best line of defense against deadly pathogens. Now Kathryn Sykes and Stephen Johnston, researchers at Arizona State University’s Biodesign Institute, along with co-author Michael McGuire from the University of Texas Southwestern Medical Center, are using clever functional screening methods to speed into production new vaccines that are both safer and more potent.

 

In a recent study appearing in the journal Proteome Science, the group used high-throughput methods to identify a modulator of immune activity that exists naturally in an unusual pathogen belonging to the Poxviridae family of viruses.

 

Parapoxvirus infection causes immune cell accumulation at the site of infection; direct screening in the host for this biological activity enabled the isolation of an immunomodulator—labeled B2.  Indeed, B2 by itself causes immune cell accumulation at the site of skin injection. When added to a traditional influenza vaccine, B2 improves the vaccine’s protective capacity. The immunomodulator also demonstrated the ability to shrink cancerous tumors, even in the absence of any accompanying specific antigen.

 

In the past, the process of vaccine discovery involved the random selection of naturally attenuated strains of viruses and bacteria, which were found to provide protection in humans. Examples of this approach include the use of vaccinia to protect against smallpox and attenuated Mycobacterium bovis (BCG) to protect against tuberculosis.

 

In recent years, many vaccines have been developed using only selected portions of a given pathogen to confer immunity. These so-called subunit vaccines have several advantages over whole pathogen vaccines. Genetic components that allow a given pathogen to elude immune detection, for example, may be screened out, as well as any factors causing unwanted vaccine side effects. Through careful screening, just those elements responsible for eliciting protective immune responses in the host can be extracted from the pathogen and reassembled into an effective, safer subunit vaccine.

 

In practice, the process of narrowing the field of promising subunit candidates from the whole genome of a pathogen has often been time consuming, laborious and perplexing. In the current study, the researchers extend their earlier strategy, known as expression library immunization, into a scheme for finding the protein-encoding segments—known as open reading frames (ORFs)—in a pathogenic genome that have any biological function of interest.

 

This simple, yet powerful technique uses the host’s immune system itself to rapidly reduce any pathogenic genome (viral, fungal, bacterial or parasitic) to a handful of antigens capable of conferring protection in the host.

 

The advantage of this in vivo technique is that it offers a means of rapidly screening entire genomes, with the search returning only those elements that display the desired immunogenic traits. The mode of entry of vaccines designed in this way closely resembles the natural infection process of host cells—an improvement over live attenuated vaccines.

 

This promising approach has been used effectively to engineer a vaccine against hepatitis and may provide a new avenue for the development of protective agents against pathogens that have thus far eluded traditional vaccine efforts, including HIV and Ebola.

 

“We had developed a method for screening for protective subunits against a specific disease,” Sykes says. “However, this type of safer vaccine design is notoriously less potent than the whole pathogen designs. What we needed was a method to find generally useful vaccine components that would serve to enhance and control immunity.”

 

The group chose the pathogen parapoxvirus ovis (known as the Orf virus) for the current set of experiments, in which expression library immunization techniques were used to screen for an immunogenic factor buried in the pathogen’s genome.

 

Parapoxvirus ovis causes a highly infectious disease known as Orf, which is prevalent in sheep and goats and may be transmitted cutaneously to humans handling these animals, causing pustular lesions and scabs.

 

Once the group had sequenced the full genome of parapoxvirus, PCR was used to amplify all the viral open reading frames, which code for all of the virus’s proteins. Each ORF was compiled into a unique high-throughput expression construct, together forming a library of genomic components, and these constructs were randomly distributed into sub-library pools. The pools were directly delivered into sets of mice for in vivo expression. Functional testing for the desired activity identified B2 as the immune cell accumulator.
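The pooled screening logic can be caricatured in a few lines of code. The toy simulation below is hypothetical (the ORF count, pool sizes and the assay function are invented for illustration and are not the authors' laboratory protocol); it simply shows how random sub-library pools combined with a functional readout converge on the single ORF responsible for an activity.

```python
# Toy illustration of pooled expression-library screening (hypothetical
# numbers; not the authors' protocol). ORFs are split into random pools,
# each pool is "assayed" for the activity of interest, and members of
# positive pools are re-pooled in smaller groups until one ORF remains.
import random

random.seed(1)
orfs = [f"ORF{i:03d}" for i in range(130)]   # a poxvirus-sized set of ORFs
hit = "ORF042"                               # the ORF carrying the activity

def assay(pool):
    """Stand-in for the in vivo functional test (e.g. immune cell accumulation)."""
    return hit in pool

def screen(candidates, pool_size=10):
    rounds = 0
    while len(candidates) > 1:
        random.shuffle(candidates)
        pools = [candidates[i:i + pool_size]
                 for i in range(0, len(candidates), pool_size)]
        # keep only the members of pools that test positive
        candidates = [orf for pool in pools if assay(pool) for orf in pool]
        pool_size = max(1, pool_size // 2)   # smaller pools each round
        rounds += 1
    return candidates[0], rounds

found, n_rounds = screen(list(orfs))
print(f"identified {found} after {n_rounds} rounds of pooled screening")
```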

 

In further experiments, the team co-delivered B2L as an additive or adjuvant for an influenza gene vaccine, to see if it could improve survival rates in mice challenged with the influenza virus. The co-immunized mice indeed displayed full protection against influenza compared with 50 percent protection of the control group, immunized with influenza vaccine alone.

 

In addition to infectious agents like Orf, non-infectious diseases including cancer may be amenable to vaccine defense. Thus far, however, the discovery of tumor-specific antigens has been frustrating. One approach may lie in using non-specific immunogenic factors like B2.

 

In the current study, two forms of cancer were investigated in a mouse model following the administration of B2 alone, in the absence of a disease antigen. The experiments evaluated B2’s ability to enhance survival and shrink tumor size. In the case of an aggressive melanoma, tumor size was significantly reduced and survival rate improved. Administration of B2 to a tumor induced by a breast cancer cell line also showed a modest but measurable reduction in tumor size.

 

With the growing popularity of sub-unit vaccines, the need arises for more effective adjuvants, which may be used to compensate for the reduced immunogenicity of such vaccines compared with their whole-pathogen counterparts. Techniques similar to those applied here to isolate and evaluate B2 could potentially permit the screening of virtually any genome for any gene-encoded activity testable in an organism.

 

The original article was written by Richard Harth

 

Source: Arizona State University

 

Published on 10th May 2012

 

Scientists Gain New Understanding of Alzheimer’s Trigger

 

A highly toxic beta-amyloid – a protein that exists in the brains of Alzheimer’s disease victims – has been found to greatly increase the toxicity of other more common and less toxic beta-amyloids, serving as a possible “trigger” for the advent and development of Alzheimer’s, researchers at the University of Virginia and German biotech company Probiodrug have discovered.

 


 

The finding, reported in the May 2 online edition of the journal Nature, could lead to more effective treatments for Alzheimer’s. Already, Probiodrug AG, based in Halle, Germany, has completed phase 1 clinical trials in Europe with a small molecule that inhibits an enzyme, glutaminyl cyclase, that catalyzes the formation of this hypertoxic version of beta-amyloid.

 

“This form of beta-amyloid, called pyroglutamylated (or pyroglu) beta-amyloid, is a real bad guy in Alzheimer’s disease,” said principal investigator George Bloom, a U.Va. professor of biology and cell biology in the College of Arts & Sciences and School of Medicine, who is collaborating on the study with scientists at Probiodrug. “We’ve confirmed that it converts more abundant beta-amyloids into a form that is up to 100 times more toxic, making this a very dangerous killer of brain cells and an attractive target for drug therapy.”

 

Bloom said the process is similar to various prion diseases, such as mad cow disease or chronic wasting disease, where a toxic protein can “infect” normal proteins that spread through the brain and ultimately destroy it.

 

In the case of Alzheimer’s, severe dementia occurs over the course of years prior to death.

 

“You might think of this pyroglu beta-amyloid as a seed that can further contaminate something that’s already bad into something much worse – it’s the trigger,” Bloom said. Just as importantly, the hypertoxic mixtures that are seeded by pyroglu beta-amyloid exist as small aggregates, called oligomers, rather than as much larger fibers found in the amyloid plaques that are a signature feature of the Alzheimer’s brain.

 

And the trigger fires a “bullet,” as Bloom puts it. The bullet is a protein called tau that is stimulated by beta-amyloid to form toxic “tangles” in the brain that play a major role in the onset and development of Alzheimer’s. Using mice bred to have no tau genes, the researchers found that without the interaction of toxic beta-amyloids with tau, the Alzheimer’s cascade cannot begin. The pathway by which pyroglu beta-amyloid induces the tau-dependent death of neurons is now the target of further investigation to understand this important step in the early development of Alzheimer’s disease.

 

“There are two matters of practical importance in our discovery,” Bloom said. “One is the new insights we have as to how Alzheimer’s might actually progress – the mechanisms which are important to understand if we are to try to prevent it from happening; and second, it provides a lead into how to design drugs that might prevent this kind of beta-amyloid from building up in the first place.”

 

Said study co-author Hans-Ulrich Demuth, a biochemist and chief scientific officer at Probiodrug, “This publication further adds significant evidence to our hypothesis about the critical role pyroglu beta-amyloid plays in the initiation of Alzheimer’s Disease. For the first time we have found a clear link in the relationship between pyroglu beta-amyloid, oligomer formation and tau protein in neuronal toxicity.”

 

Bloom and his collaborators are now looking for other proteins that are needed for pyroglu beta-amyloid to become toxic. Any such proteins they discover are potential targets for the early diagnosis and/or treatment of Alzheimer’s disease.

 

 

 

Source: University of Virginia

 

Published on 3rd May 2012

 

 


 

 

Tiny ‘spherules’ reveal details about Earth’s asteroid impacts

 


 

Researchers are learning details about asteroid impacts going back to the Earth’s early history by using a new method for extracting precise information from tiny “spherules” embedded in layers of rock. The spherules were created when asteroids crashed into Earth, vaporizing rock that expanded as a giant vapor plume. Small droplets of molten rock in the plume condensed and solidified, falling back to the surface as a thin layer. This sample was found in Western Australia and formed 2.63 billion years ago in the aftermath of a large impact. (Credit: Oberlin College photo/Bruce M. Simonson)

 

Researchers are learning details about asteroid impacts going back to the Earth’s early history by using a new method for extracting precise information from tiny “spherules” embedded in layers of rock.


The spherules were created when asteroids crashed into the Earth, vaporizing rock that expanded into space as a giant vapor plume. Small droplets of molten and vaporized rock in the plume condensed and solidified, falling back to Earth as a thin layer. The round or oblong particles were preserved in layers of rock, and now researchers have analyzed them to record precise information about asteroids impacting Earth from 3.5 billion to 35 million years ago.


“What we have done is provide the foundation for understanding how to interpret the layers in terms of the size and velocity of the asteroid that made them,” said Jay Melosh, an expert in impact cratering and a distinguished professor of earth and atmospheric sciences, physics and aerospace engineering at Purdue University.


The findings, which support a theory that the Earth endured an especially heavy period of asteroid bombardment early in its history, are detailed in a research paper appearing online in the journal Nature on Wednesday (April 25). The paper was written by Purdue physics graduate student Brandon Johnson and Melosh. The findings, based on geologic observations, support a theoretical study in a companion paper in Nature by researchers at the Southwest Research Institute in Boulder, Colo.


The period of heavy asteroid bombardment – from 4.2 to 3.5 billion years ago – is thought to have been influenced by changes in the early solar system that altered the trajectory of objects in an asteroid belt located between Mars and Jupiter, sending them on a collision course with Earth.


“That’s the postulate, and this is the first real solid evidence that it actually happened,” Melosh said. “Some of the asteroids that we infer were about 40 kilometers in diameter, much larger than the one that killed off the dinosaurs about 65 million years ago that was about 12-15 kilometers. But when we looked at the number of impactors as a function of size, we got a curve that showed a lot more small objects than large ones, a pattern that matches exactly the distribution of sizes in the asteroid belt. For the first time we have a direct connection between the crater size distribution on the ancient Earth and the sizes of asteroids out in space.”


Because craters are difficult to study directly, impact history must be inferred either by observations of asteroids that periodically pass near the Earth or by studying craters on the moon. Now, the new technique using spherules offers a far more accurate alternative to chronicle asteroid impacts on Earth, Melosh said.


“We can look at these spherules, see how thick the layer is, how big the spherules are, and we can infer the size and velocity of the asteroid,” Melosh said. “We can go back to the earliest era in the history of the Earth and infer the population of asteroids impacting the planet.”


For asteroids larger than about 10 kilometers in diameter, the spherules are deposited in a global layer.


“Some of these impacts were several times larger than the Chicxulub impact that killed off the dinosaurs 65 million years ago,” Johnson said. “The impacts may have played a large role in the evolutionary history of life. The large number of impacts may have helped simple life by introducing organics and other important materials at a time when life on Earth was just taking hold.”


A 40-kilometer asteroid would have wiped out everything on the Earth’s surface, whereas the one that struck 65 million years ago killed only land animals weighing more than around 20 kilograms.


“Impact craters are the most obvious indication of asteroid impacts, but craters on Earth are quickly obscured or destroyed by surface weathering and tectonic processes,” Johnson said. “However, the spherule layers, if preserved in the geologic record, provide information about an impact even when the source crater cannot be found.”


The Purdue researchers studied the spherules using computer models that harness mathematical equations developed originally to calculate the condensation of vapor.


“There have been some new wrinkles in vapor condensation modeling that motivated us to do this work, and we were the first to apply it to asteroid impacts,” Melosh said.


The spherules are about a millimeter in diameter.


The researchers also are studying a different type of artifact similar to spherules but found only near the original impact site. Whereas the globally distributed spherules come from the condensing vaporized rock, these “melt droplets” are from rock that’s been melted and not completely vaporized.


“Before this work, it was not possible to distinguish between these two types of formations,” Melosh said. “Nobody had established criteria for discriminating between them, and we’ve done that now.”


One of the authors of the Southwest Research Institute paper, David Minton, is now an assistant professor of earth and atmospheric sciences at Purdue.


Findings from the research may enable Melosh’s team to enhance an asteroid impact effects calculator he developed to estimate what would happen if asteroids of various sizes were to hit the Earth. The calculator, “Impact: Earth!” allows anyone to calculate potential comet or asteroid damage based on the object’s mass.
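As a rough illustration of the quantities involved, the sketch below converts an impactor's diameter and speed into kinetic energy expressed in megatons of TNT. The density and velocity are assumed typical values chosen for illustration; this is not the model behind "Impact: Earth!", which uses much more detailed scaling relations to estimate damage.

```python
# Rough kinetic-energy estimate for an asteroid impact (illustration only;
# not the scaling model used by the "Impact: Earth!" calculator).
import math

def impact_energy_megatons(diameter_km, velocity_km_s=20.0, density_kg_m3=3000.0):
    """Kinetic energy of a spherical impactor, in megatons of TNT.

    Velocity and density are assumed typical values for illustration.
    """
    radius_m = diameter_km * 1e3 / 2
    mass_kg = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3
    energy_j = 0.5 * mass_kg * (velocity_km_s * 1e3) ** 2
    return energy_j / 4.184e15          # 1 megaton TNT = 4.184e15 J

for d in (12, 40):   # roughly the dinosaur-killer vs. the largest inferred impactors
    print(f"{d} km impactor: ~{impact_energy_megatons(d):.1e} Mt TNT")
```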


The research has been funded by NASA.

 

 

 

Source: Purdue University

 

Published on 3rd May 2012

 

24 new species discovered on Caribbean islands are close to extinction


 


 

 

An Anguilla Bank skink. Blair Hedges and his team have discovered and scientifically named 24 new species of lizards known as skinks. (Credit: Karl Questel)

 

 

 

In a single new scientific publication, 24 new species of lizards known as skinks, all from islands in the Caribbean, have been discovered and scientifically named. According to Blair Hedges, professor of biology at Penn State University and the leader of the research team, half of the newly added skink species already may be extinct or close to extinction, and all of the others on the Caribbean islands are threatened with extinction. The researchers found that the loss of many skink species can be attributed primarily to predation by the mongoose — an invasive predatory mammal that was introduced by farmers to control rats in sugarcane fields during the late 19th century. The research team reports on the newly discovered skinks in a 245-page article published today (April 30) in the journal Zootaxa.

About 130 species of reptiles from all over the world are added to the global species count each year in dozens of scientific articles. However, not since the 1800s have more than 20 reptile species been added at one time. Primarily through examination of museum specimens, the team identified a total of 39 species of skinks from the Caribbean islands, including six species currently recognized, and another nine named long ago but considered invalid until now. Hedges and his team also used DNA sequences, but most of the taxonomic information, such as counts and shapes of scales, came from examination of the animals themselves.

“Now, one of the smallest groups of lizards in this region of the world has become one of the largest groups,” Hedges said. “We were completely surprised to find what amounts to a new fauna, with co-occurring species and different ecological types.”

He said some of the new species are six times larger in body size than other species in the new fauna.

Hedges also explained that these New World skinks, which arrived in the Americas about 18 million years ago from Africa by floating on mats of vegetation, are unique among lizards in that they produce a human-like placenta, which is an organ that directly connects the growing offspring to the maternal tissues that provide nutrients.

“While there are other lizards that give live birth, only a fraction of the lizards known as skinks make a placenta and gestate offspring for up to one year,” Hedges said.

He also speculated that the lengthy gestational period may have given predators a competitive edge over skinks, since pregnant females are slower and more vulnerable.

“The mongoose is the predator we believe is responsible for many of the species’ close-to-extinction status in the Caribbean,” Hedges said. “Our data show that the mongoose, which was introduced from India in 1872 and spread around the islands over the next three decades, has nearly exterminated this entire reptile fauna, which had gone largely unnoticed by scientists and conservationists until now.”

According to Hedges, the “smoking gun” is a graph included in the scientific paper showing a sharp decline in skink populations that occurred soon after the introduction of the mongoose. Hedges explained that the mongoose originally was brought to the New World to control rats, which had become pests in the sugarcane fields in Cuba, Hispaniola, Puerto Rico, Jamaica and the Lesser Antilles. While this strategy did help to control infestations of some pests, such as the Norway rat, it also had the unintended consequence of reducing almost all skink populations.

“By 1900, less than 50 percent of those mongoose islands still had their skinks, and the loss has continued to this day,” Hedges said.

This newly discovered skink fauna will increase dramatically the number of reptiles categorized as “critically endangered” by the International Union for Conservation of Nature in their “Red List of Threatened Species,” which is recognized as the most comprehensive database evaluating the endangerment status of various plant and animal species.

“According to our research, all of the skink species found only on Caribbean islands are threatened,” Hedges said. “That is, they should be classified in the Red List as either vulnerable, endangered, or critically endangered. Finding that all species in a fauna are threatened is unusual, because only 24 percent of the 3,336 reptile species listed in the Red List have been classified as threatened with extinction. Most of the 9,596 named reptile species have yet to be classified in the Red List.”

Hedges explained that there are two reasons why such a large number of species went unnoticed for so many years, in a region frequented by scientists and tourists.

“First, Caribbean skinks already had nearly disappeared by the start of the 20th century, so people since that time rarely have encountered them and therefore have been less likely to study them,” he said. “Second, the key characteristics that distinguish this great diversity of species have been overlooked until now.”

Hedges also noted that many potential new species of animals around the world have been identified in recent years with DNA data. However, much more difficult is the task of following up DNA research with the work required to name new species and to formally recognize them as valid, as this team did with Caribbean skinks.

The other member of the research team, Caitlin Conn, now a researcher at the University of Georgia and formerly a biology major in Penn State’s Eberly College of Science and a student in Penn State’s Schreyer Honors College at the time of the research, added that researchers might be able to use the new data to plan conservation efforts, to study the geographic overlap of similar species, and to study in more detail the skinks’ adaptation to different ecological habitats or niches. The research team also stressed that, while the mongoose introduction by humans now has been linked to these reptile declines and extinctions, other types of human activity, especially the removal of forests, are to blame for the loss of other species in the Caribbean.

Funding for the research comes from the National Science Foundation.

 

 

 


Source: Pennsylvania State University

 

Published on 2nd May 2012

 

Clinical Decline in Alzheimer’s Requires Plaque and Proteins

 

According to a new study, the neuron-killing pathology of Alzheimer’s disease (AD), which begins before clinical symptoms appear, requires the presence of both amyloid-beta (a-beta) plaque deposits and elevated levels of an altered protein called p-tau.

 


 

Without both, progressive clinical decline associated with AD in cognitively healthy older individuals is “not significantly different from zero,” reports a team of scientists at the University of California, San Diego School of Medicine in the April 23 online issue of the Archives of Neurology.

 

“I think this is the biggest contribution of our work,” said Rahul S. Desikan, MD, PhD, research fellow and resident radiologist in the UC San Diego Department of Radiology and first author of the study.  “A number of planned clinical trials – and the majority of Alzheimer’s studies – focus predominantly on a-beta. Our results highlight the importance of also looking at p-tau, particularly in trials investigating therapies to remove a-beta. Older, non-demented individuals who have elevated a-beta levels, but normal p-tau levels, may not progress to Alzheimer’s, while older individuals with elevated levels of both will likely develop the disease.”

 

The findings also underscore the importance of p-tau as a target for new approaches to treating patients with conditions ranging from mild cognitive impairment (MCI) to full-blown AD. An estimated 5.4 million Americans have AD. It’s believed that 10 to 20 percent of Americans age 65 and older have MCI, a risk factor for AD. Some current therapies appear to delay clinical AD onset, but the disease remains irreversible and incurable.

 

“It may be that a-beta initiates the Alzheimer’s cascade,” said Desikan. “But once started, the neurodegenerative mechanism may become independent of a-beta, with p-tau and other proteins playing a bigger role in the downstream degenerative cascade. If that’s the case, prevention with anti-a-beta compounds may prove efficacious against AD for older, non-demented individuals who have not yet developed tau pathology.  But novel, tau-targeting therapies may help the millions of individuals who already suffer from mild cognitive impairment or Alzheimer’s disease.”

 

The new study involved evaluations of healthy, non-demented elderly individuals participating in the ongoing, multi-site Alzheimer’s Disease Neuroimaging Initiative, or ADNI. Launched in 2003, ADNI is a longitudinal effort to measure the progression of mild cognitive impairment and early-stage AD.

 

The researchers studied samples of cerebrospinal fluid (CSF) taken from ADNI participants.

 

“In these older individuals, the presence of a-beta alone was not associated with clinical decline,” said Anders M. Dale, PhD, professor of radiology, neurosciences, and psychiatry at UC San Diego and senior author of the study. “However, when p-tau was present in combination with a-beta, we saw significant clinical decline over three years.”

 

A-beta proteins have several normal responsibilities, including activating enzymes and protecting cells from oxidative stress. It is not known why a-beta proteins form plaque deposits in the brain. Similarly, the origins of p-tau are not well understood. One hypothesis, according to Desikan, is that a-beta plaque deposits trigger hyperphosphorylation of nearby tau proteins, which normally help stabilize the structure of brain cells. Hyperphosphorylation occurs when phosphate groups attach to a protein in excess numbers, altering its normal function. Hyperphosphorylated tau – or p-tau – can then exacerbate the toxic effects of a-beta plaque upon neurons.

 

The discovery of p-tau’s heightened role in AD neurodegeneration suggests it could be a specific biomarker for the disease before clinical symptoms appear. While high levels of another tau protein – t-tau – in cerebrospinal fluid have been linked to neurologic disorders, such as frontotemporal dementia and traumatic brain injury, high levels of p-tau correlate specifically with increased neurofibrillary tangles in brain cells, which are seen predominantly with AD.

 

“These results are in line with another ADNI study of healthy controls and MCI participants that found progressive atrophy in the entorhinal cortex – one of the areas of the brain first affected in AD – only in amyloid-positive individuals who also showed evidence of elevated p-tau levels,” said Linda McEvoy, PhD, assistant professor of radiology and study co-author.

 

“One of the exciting dimensions of this paper was the combined use of cerebrospinal fluid markers and clinical assessments to better elucidate the neurodegenerative process underlying Alzheimer’s disease in individuals who do not yet show clinical signs of dementia,” added co-author James Brewer, MD, PhD, an associate professor of radiology and neurosciences at UC San Diego School of Medicine.  “We do not have an animal model that works very well for studying this disease, so the ability to examine the dynamics of neurodegeneration in living humans is critical.”

 

Nonetheless, the scientists say more research is needed. They note that CSF biomarkers provide only an indirect assessment of amyloid and neurofibrillary pathology and may not fully reflect the underlying biological processes of AD.

 

“This study highlights the complex interaction of multiple pathologies that likely contribute to the clinical symptomatology of Alzheimer’s disease,” said co-author Reisa Sperling, MD, a neurologist at Massachusetts General Hospital and Brigham and Women’s Hospital. “It suggests we may be able to intervene in the preclinical stages of AD before there is significant neurodegeneration and perhaps prevent the onset of symptoms.”

 

Other co-authors are Wesley K. Thompson, Department of Psychiatry; and Dominic Holland and Paul S. Aisen, Department of Neuroscience, UC San Diego School of Medicine.

 

Funding for this research came, in part, from the National Institutes of Health and the Alzheimer’s Disease Neuroimaging Initiative.

 

 

 

Source: University of California, San Diego


Published on 24th April 2012

 

COMPRESSED SENSING ALLOWS SUPER-RESOLUTION MICROSCOPY IMAGING OF LIVE CELL STRUCTURES

 

Single Molecule Identification

 


 

 

 

Image shows single-molecule identification. The green cross signs show the locations of single molecules using the super-resolution technique. (Credit: Lei Zhu and Bo Huang)


 

 

Researchers from the Georgia Institute of Technology and University of California San Francisco have advanced scientists’ ability to view a clear picture of a single cellular structure in motion. By identifying molecules using compressed sensing, this new method provides needed spatial resolution plus a faster temporal resolution than previously possible.

 

Despite many achievements in the field of super-resolution microscopy in the past few years with spatial resolution advances, live-cell imaging has remained a challenge because of the need for high temporal resolution.

 

Now, Lei Zhu, assistant professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering, and Bo Huang, assistant professor in UCSF’s Department of Pharmaceutical Chemistry and Department of Biochemistry and Biophysics, have developed an advanced approach using super-resolution microscopy to resolve cellular features an order of magnitude smaller than what could be seen before. This allows the researchers to tap previously inaccessible information and answer new biological questions.

 

The research, published April 22, 2012 in the journal Nature Methods, is funded by the National Institutes of Health, the UCSF Program for Breakthrough Biomedical Research, a Searle Scholarship and a Packard Fellowship for Science and Engineering.

 

The previous technology using the single-molecule-switching approach for super-resolution microscopy depends on spreading single molecule images sparsely into many, often thousands of, camera frames. It is extremely limited in its temporal resolution and does not provide the ability to follow dynamic processes in live cells.

 

“We can now use our discovery using super-resolution microscopy with seconds or even sub-second temporal resolution for a large field of view to follow many more dynamic cellular processes,” said Zhu. “Much of our knowledge of the life of a cell comes from our ability to see the small structures within it.”

 

Huang noted, “One application, for example, is to investigate how mitochondria, the power house of the cell, interact with other organelles and the cytoskeleton to reshape the structure during the life cycle of the cell.”

 

Currently, light microscopy, especially in the modern form of fluorescence microscopy, is still used frequently by many biologists. However, the authors say, conventional light microscopy has one major limitation: the inability to resolve two objects closer than half the wavelength of the light because of the phenomenon called diffraction. With diffraction, the images look blurry and overlapped no matter how high the magnification that is used.
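To put a number on that limit: for green light the half-wavelength figure works out to a couple of hundred nanometers. The short calculation below uses an example wavelength and numerical aperture (not values taken from the study) and also evaluates the standard Abbe criterion for comparison.

```python
# Rough numbers for the diffraction limit of light microscopy
# (example wavelength and numerical aperture; not values from the study).
wavelength_nm = 550          # green light
na = 1.4                     # high-end oil-immersion objective

half_wavelength_limit = wavelength_nm / 2     # the rule of thumb in the text
abbe_limit = wavelength_nm / (2 * na)         # Abbe's criterion, d = lambda / (2 NA)

print(f"half-wavelength limit: ~{half_wavelength_limit:.0f} nm")   # ~275 nm
print(f"Abbe limit (NA=1.4):   ~{abbe_limit:.0f} nm")              # ~196 nm
```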

 

“The diffraction limit has long been regarded as one of the fundamental constraints for light microscopy until the recent inventions of super-resolution fluorescence microscopy techniques,” said Zhu. Super-resolution microscopy methods, such as stochastic optical reconstruction microscopy (STORM) or photoactivated localization microscopy (PALM), rely on the ability to record light emission from a single molecule in the sample.

 

Using probe molecules that can be switched between a visible and an invisible state, STORM/PALM determines the position of each molecule of interest. These positions ultimately define a structure.

 

The new finding is significant, said Zhu and Huang, because they have shown that the technology allows for following the dynamics of a microtubule cytoskeleton with a three-second time resolution, which would allow researchers to study the active transports of vesicles and other cargos inside the cell.

 

Using the same optical system and detector as in conventional light microscopy, super-resolution microscopy naturally requires longer acquisition time to obtain more spatial information, leading to a trade-off between its spatial and temporal resolution. In super-resolution microscopy methods based on STORM/PALM, each camera image samples a very sparse subset of probe molecules in the sample.

 

An alternative approach is to increase the density of activated fluorophores so that each camera frame samples more molecules. However, this high density of fluorescent spots causes them to overlap, invalidating the widely used single-molecule localization method.

 

The authors said that a number of methods have been reported recently that can efficiently retrieve single-molecule positions even when the single fluorophore signals overlap. These methods are based on fitting clusters of overlapped spots with a variable number of point-spread functions (PSFs) with either maximum likelihood estimation or Bayesian statistics. The Bayesian method has also been applied to the whole image set.

 

In the new work, Zhu and Huang present an approach based on global optimization using compressed sensing, which does not involve estimating or assuming the number of molecules in the image. They show that compressed sensing can work with much higher molecule densities than other methods, and they demonstrate live-cell imaging of fluorescent protein-labeled microtubules with three-second temporal resolution.
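The idea can be illustrated with a one-dimensional toy reconstruction. The sketch below (using numpy and scikit-learn) is not the authors' implementation; it models overlapping fluorophore images as a sparse, non-negative set of point sources on a fine grid, blurred by a Gaussian point-spread function and binned into coarse camera pixels, and recovers their positions with an L1-regularized fit without assuming how many molecules are present. The grid spacing, PSF width and regularization strength are invented example values.

```python
# 1-D toy of compressed-sensing localization (illustrative only; not the
# authors' implementation). Overlapping fluorophore spots are modeled as
# sparse, non-negative point sources on a fine grid; an L1-regularized,
# non-negative fit recovers their positions from one coarse camera frame.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

fine = 200          # candidate source positions (fine grid, e.g. 10 nm spacing)
pix = 20            # camera pixels (e.g. 100 nm pixels)
psf_sigma = 15.0    # PSF standard deviation in fine-grid units

# Measurement matrix A: column j is the camera image of a unit source at
# fine-grid position j (Gaussian blur, then binned into camera pixels).
x_fine = np.arange(fine)
A = np.zeros((pix, fine))
for j in range(fine):
    spot = np.exp(-0.5 * ((x_fine - j) / psf_sigma) ** 2)
    A[:, j] = spot.reshape(pix, fine // pix).sum(axis=1)

# Ground truth: three molecules, two of them close enough that their
# camera-frame spots overlap heavily.
truth = np.zeros(fine)
truth[[60, 75, 140]] = [1.0, 0.8, 1.2]
y = A @ truth + rng.normal(0, 0.02, pix)    # one noisy camera frame

# Sparse non-negative recovery (the L1 penalty plays the compressed-sensing role).
fit = Lasso(alpha=0.01, positive=True, max_iter=100_000).fit(A, y)
found = np.flatnonzero(fit.coef_ > 0.1)
print("true source positions :", [60, 75, 140])
print("recovered grid indices:", found.tolist())   # clusters around the true positions
```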

 

The STORM experiment used by the authors, with immunostained microtubules in Drosophila melanogaster S2 cells, demonstrated that nearby microtubules can be resolved by compressed sensing using as few as 100 camera frames, whereas they were not discernible by the single-molecule fitting method. They have also performed live STORM on S2 cells stably expressing tubulin fused to mEos2.

 

At the commonly used camera frame rate of 56.4 Hertz, a super-resolution movie was constructed with a time resolution of three seconds (169 frames) and a Nyquist resolution of 60 nanometers, much faster than previously reported, said Zhu and Huang. These results have proven that compressed sensing can enable STORM to monitor live cellular processes with second-scale time resolution, or even sub-second-scale resolution if a faster camera can be used.

 

 

 

Source: Georgia Institute of Technology

 

Published on 24th April 2012

 

 

Strange cousins: molecular alternatives to DNA, RNA offer new insight into life’s origins

 

Living systems owe their existence to a pair of information-carrying molecules: DNA and RNA. These fundamental chemical forms possess two features essential for life: they display heredity, meaning they can encode and pass on genetic information, and they can adapt over time through processes of Darwinian evolution.

 


 

A long-debated question is whether heredity and evolution could be performed by molecules other than DNA and RNA.

 

John Chaput, a researcher at ASU’s Biodesign Institute who recently published an article in Nature Chemistry describing the evolution of threose nucleic acids, joined a multidisciplinary team of scientists from England, Belgium and Denmark to extend these properties to other so-called xeno-nucleic acids, or XNAs.

 

The group demonstrates for the first time that six of these unnatural nucleic acid polymers are capable of sharing information with DNA. One of these XNAs, a molecule referred to as anhydrohexitol nucleic acid or HNA, was capable of undergoing directed evolution and folding into biologically useful forms.

 

Their results appear in the current issue of Science.

 

The work sheds new light on questions concerning the origins of life and provides a range of practical applications for molecular medicine that were not previously available.

 

Nucleic acid aptamers, which have been engineered through in vitro selection to bind with various molecules, act in a manner similar to antibodies—latching onto their targets with high affinity and specificity. “This could be great for building new types of diagnostics and new types of biosensors,” Chaput says, pointing out that XNAs are hardier molecules, not recognized by the natural enzymes that tend to degrade DNA and RNA. New therapeutics may also arise from experimental xenobiology.

 

Both RNA and DNA embed data in their sequences of four nucleotides—information vital for conferring hereditary traits and for supplying the coded recipe essential for building proteins from the 20 naturally occurring amino acids. Exactly how (and when) this system got its start however, remains one of the most intriguing and hotly contested areas of biology.

 

According to one hypothesis, the simpler RNA molecule preceded DNA as the original informational conduit. The RNA world hypothesis proposes that the earliest examples of life were based on RNA and simple proteins. Because of RNA’s great versatility—it is not only capable of carrying genetic information but also of catalyzing chemical reactions like an enzyme—it is believed by many to have supported pre-cellular life.

 

Nevertheless, the spontaneous arrival of RNA through a sequence of purely random mixing events of primitive chemicals was, at the very least, an unlikely occurrence. “This is a big question,” Chaput says. “If the RNA world existed, how did it come into existence? Was it spontaneously produced, or was it the product of something that was even simpler than RNA?”

 

This pre-RNA world hypothesis has been gaining ground, largely through investigations into XNAs, which provide plausible alternatives to the current biological regime and could have acted as chemical stepping-stones to the eventual emergence of life. The current research strengthens the case that something like this may have taken place.

 

Threose nucleic acid or TNA for example, is one candidate for this critical intermediary role. “TNA does some interesting things,” Chaput says, noting the molecule’s capacity to bind with RNA through antiparallel Watson-Crick base pairing. “This property provides a model for how XNAs could have transferred information from the pre-RNA world to the RNA world.”

 

Nucleic acid molecules, including DNA and RNA, consist of three chemical components: a sugar group, a triphosphate backbone and combinations of the four nitrogenous bases. By tinkering with these structural elements, researchers can engineer XNA molecules with unique properties. However, in order for any of these exotic molecules to have acted as a precursor to RNA in the pre-biotic epoch, they would need to have been able to transfer and recover their information from RNA. To do this, specialized enzymes, known as polymerases, are required.

 

Nature has made DNA and RNA polymerases capable of reading, transcribing and reverse transcribing normal nucleic acid sequences. For XNA molecules, however, no naturally occurring polymerases exist. So the group, led by Phil Holliger at the MRC in England, painstakingly evolved synthetic polymerases that could copy DNA into XNA and other polymerases that could copy XNA back into DNA. In the end, polymerases were discovered that transcribe and reverse-transcribe six different genetic systems: HNA, CeNA, LNA, ANA, FANA and TNA. The experiments demonstrated that DNA sequences could be rendered into the various XNAs when the polymerases were fed the appropriate XNA substrates.

 

Using these enzymes as tools for molecular evolution, the team evolved the first example of an HNA aptamer through iterative rounds of selection and amplification. Starting from a large pool of DNA sequences, a synthetic polymerase was used to copy the DNA library into HNA. The pool of HNA molecules was then incubated with an arbitrary target. The small fraction of molecules that bound the target were separated from the unbound pool, reverse transcribed back into DNA with a second synthetic enzyme and amplified by PCR. After many repeated rounds, HNAs were generated that bound the binding targets used: HIV trans-activating response RNA (TAR) and hen egg lysozyme (HEL). “This is a synthetic Darwinian process,” Chaput says. “The same thing happens inside our cells, but this is done in vitro.”
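The selection-and-amplification cycle can be caricatured with a small simulation. The sketch below is purely illustrative (pool size, binding probabilities and the number of rounds are invented, and it does not model the actual HNA chemistry); it only shows why repeated rounds of binding selection followed by amplification enrich rare, tight-binding sequences out of a large random library.

```python
# Toy simulation of iterative selection and amplification (illustrative
# numbers only; not the actual HNA aptamer selection experiment).
import random

random.seed(0)
POOL_SIZE = 100_000

# Each library member gets a "binding probability" to the target; a rare
# minority binds well, most bind poorly.
library = [random.betavariate(0.5, 20) for _ in range(POOL_SIZE)]

def selection_round(pool):
    # Selection: each molecule is retained with its own binding probability.
    bound = [p for p in pool if random.random() < p]
    # Amplification (e.g. PCR after reverse transcription): copy the
    # survivors back up to the original pool size.
    return [random.choice(bound) for _ in range(POOL_SIZE)]

pool = library
for rnd in range(1, 7):
    pool = selection_round(pool)
    print(f"round {rnd}: mean binding probability = {sum(pool)/len(pool):.3f}")
```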

 

The method for producing XNA polymerases draws on the path-breaking work of Holliger, one of the lead authors of the current study. The elegant technique uses cell-like synthetic compartments of water/oil emulsion to conduct directed evolution of enzymes, particularly polymerases. By isolating self-replication reactions from each other, the process greatly improves the accuracy and efficiency of polymerase evolution and replication. “What nobody had really done before,” Chaput says, “is to take those technologies and apply them to unnatural nucleic acids.”

 

Chaput also underlines the importance of an international collaboration for carrying out this type of research, particularly for the laborious effort of assembling the triphosphate substrates needed for each of the six XNA systems used in the study:

 

“What happened here is that a community of scientists came together and organized around this idea that we could find polymerases that could be used to open up biology to unnatural polymers. It would have been a tour de force for any lab to try to synthesize all the triphosphates, as none of these reagents are commercially available.”

 

The study advances the case for a pre-RNA world, while revealing a new class of XNA aptamers capable of fulfilling myriad useful roles. Although many questions surrounding the origins of life persist, Chaput is optimistic that solutions are coming into view: “Further down the road, through research like this, I think we’ll have enough information to begin to put the pieces of the puzzle together.”

 

The research group consisted of investigators from the Medical Research Council (MRC) Laboratory of Molecular Biology, Cambridge, led by Philipp Holliger; Katholieke Universiteit Leuven, Belgium, led by Piet Herdewijn; the Nucleic Acid Center, Department of Physics and Chemistry, University of Southern Denmark, led by Jesper Wengel; and the Biodesign Institute at Arizona State University, led by John Chaput.

 

In addition to his appointment at the Biodesign Institute, John Chaput is an associate professor in the Department of Chemistry and Biochemistry, in the College of Liberal Arts & Sciences.

 

 

 

 

The original article was written by Richard Harth, Science Writer, The Biodesign Institute.

 

 

Source: Arizona State University

Published on 22nd April 2012

Modest Alcohol Use Lowers Risk and Severity of Some Liver Disease

 

People with nonalcoholic fatty liver disease (NAFLD) who consume alcohol in modest amounts – no more than one or two servings per day – are half as likely to develop hepatitis as non-drinkers with the same condition, reports a national team of scientists led by researchers at the University of California, San Diego School of Medicine.

 


 

The findings are published in the April 19, 2012 online issue of The Journal of Hepatology.

 

NAFLD is the most common liver disease in the United States, affecting up to one third of American adults. It’s characterized by abnormal fat accumulation in the liver. The specific cause or causes are not known, though obesity and diabetes are risk factors. Most patients with NAFLD have few or no symptoms, but in its most progressive form, known as nonalcoholic steatohepatitis or NASH, there is a significantly heightened risk of cirrhosis, liver cancer and liver-related death.

 

NAFLD is also a known risk factor for cardiovascular disease (CVD). Patients with NAFLD are approximately two times more likely to die from coronary heart disease than from liver disease. The study’s authors wanted to know if the well-documented heart-healthy benefits of modest alcohol consumption outweighed alcohol’s negative effects.

 

“We know a 50-year-old patient with NAFLD has a higher risk of CVD,” said Jeffrey Schwimmer, MD, associate professor of clinical pediatrics at UC San Diego, director of the Fatty Liver Clinic at Rady Children’s Hospital-San Diego and senior author. “Data would suggest modest alcohol consumption would be beneficial (in reducing the patient’s CVD risk) if you don’t take liver disease into account. When you do take liver disease into account, however, the usual medical recommendation is no alcohol whatsoever.”

 

Schwimmer and colleagues discovered that the benefits of modest alcohol consumption were compelling, at least in terms of reducing the odds of patients with NAFLD from developing more severe forms of the disease. Patients with NASH are 10 times more likely to progress to cirrhosis, the final phase of chronic liver disease. Cirrhosis is the 12th leading cause of death in the U.S., killing an estimated 27,000 Americans annually.

 

“Our study showed that those people with modest alcohol intake – two drinks or less daily – had half the odds of developing NASH than people who drank no alcohol,” said Schwimmer. “The reasons aren’t entirely clear. It’s known that alcohol can have beneficial effects on lipid levels, that it increases ‘good’ cholesterol, which tends to be low in NAFLD patients. Alcohol may improve insulin sensitivity, which has a role in NAFLD. And depending upon the type of alcohol, it may have anti-inflammatory effects.”

 

The study also found that in patients with NAFLD, modest drinkers experienced less severe liver scarring than did lifelong non-drinkers.

 

The study did not evaluate the effects of different types of alcohol, such as beer or spirits. Schwimmer said to do so would require a much larger study. Also, the study’s findings do not apply to children. All of the participants in the study were age 21 and older.

 

The current paper is based on analyses of 600 liver biopsies of patients with NAFLD by a national panel of pathologists who had no identifying clinical information about the samples. The study excluded anyone who averaged more than two alcoholic drinks per day or who reported consuming five or more drinks in a day (binge drinking) at least once a month. All of the patients were at least 21 years of age.

 

Schwimmer said the findings indicate patients with liver disease should be treated individually, with nuance.

 

“For a patient with cirrhosis or viral hepatitis, the data says even small amounts of alcohol can be bad. But that may not be applicable to all forms of liver disease. Forty million Americans have NAFLD. Physicians need to look at their patient’s overall health, their CVD risk, their liver status, whether they’re already drinking modestly or not. They need to put all of these things into a framework to determine risk. I suspect modest alcohol consumption will be an appropriate recommendation for many patients, but clearly not all.”

 

Co-authors are Winston Dunn, departments of Pediatrics and Medicine, UC San Diego and Gastroenterology and Hepatology, Department of Medicine, University of Kansas Medical Center; Arun J. Sanyal, Division of Gastroenterology, Hepatology and Nutrition, Department of Internal Medicine, Virginia Commonwealth University Medical Center; Elizabeth M. Brunt, John Cochran VA Medical Center, Saint Louis and Division of Gastroenterology, Saint Louis University School of Medicine; Aynur Unalp-Arida, Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health; Michael Donohue, Division of Biostatistics and Bioinformatics, Department of Family and Preventive Medicine, UC San Diego; and Arthur J. McCullough, Department of Gastroenterology and Hepatology, Cleveland Clinic.

 

Funding for this research came, in part, from the National Institute of Diabetes and Digestive and Kidney Diseases, the National Institute of Child Health and Human Development and the National Cancer Institute.

 

 

 

Source: University of California, San Diego


Published on 22nd April 2012

 

USF researchers find that Alzheimer’s precursor protein controls its own fate

 

A research team led by the University of South Florida’s Department of Psychiatry & Behavioral Neurosciences has found that a fragment of the amyloid precursor protein (APP) — known as sAPP-α and associated with Alzheimer’s disease — appears to regulate its own production.  The finding may lead to ways to prevent or treat Alzheimer’s disease by controlling the regulation of APP.

 


 

Their preclinical study is published online today in Nature Communications.

 

“The purpose of this study was to help better understand why, in most cases of Alzheimer’s disease, the processing of APP becomes deregulated, which leads to the formation of protein deposits and neuron loss,” said study senior author Dr. Jun Tan, professor of psychiatry and the Robert A. Silver Chair, Rashid Laboratory for Developmental Neurobiology at the USF Silver Child Development Center.   “The many risk factors for Alzheimer’s disease can change the way APP is processed, and these changes appear to promote plaque formation and neuron loss.”

 

 

Co-localization of amyloid precursor protein fragment and the APP-converting enzyme BACE

 

Microscopic image showing the merging of the amyloid precursor protein fragment, sAPP-α, and the APP-converting enzyme BACE1, in neuronal cells. This co-localization suggests that sAPP-α may serve as the body’s mechanism to inhibit BACE1 activity and thus lower production of the toxic amyloid beta characteristic of Alzheimer’s disease. (Credit: University of South Florida)


 

 

An estimated 30 million people worldwide and 5 million in the U.S. have Alzheimer’s.  With the aging of the “Baby Boom” generation, the prevalence of the debilitating disease is expected to increase dramatically in the U.S. in the coming years.  Currently, there are no disease-modifying treatments to prevent, reverse or halt the progression of Alzheimer’s disease, only medications that may improve symptoms for a short time.

 

“For the first time, we have direct evidence that a secreted portion of APP itself, the so-called ‘sAPP-α,’ acts as an essential stop-gap mechanism,” said the study’s lead author Dr. Demian Obregon, a resident specializing in research in the Department of Psychiatry & Behavioral Neurosciences at USF Health. “Risk factors associated with Alzheimer’s disease lead to a decline in sAPP-α levels, which results in excessive activity of a key enzyme in Aβ formation.”

 

In initial studies using cells, and in follow-up studies using mice genetically engineered to mimic Alzheimer’s disease, the investigators found that the neutralization of sAPP-α leads to enhanced Aβ formation. This activity depended on sAPP-α’s ability to associate with the APP-converting enzyme, BACE1. When this interaction was blocked, Aβ formation was restored.

 

The authors suggest that through monitoring and correcting low sAPP-α levels, or through enhancing its association with BACE, Alzheimer’s disease may be prevented or treated.

 

Source: University of South Florida Health
Published on 12th April 2012

 

 


 

 

 
