
Scientists Gain New Understanding of Alzheimer’s Trigger

 

Researchers at the University of Virginia and the German biotech company Probiodrug have discovered that a highly toxic form of beta-amyloid – a protein found in the brains of Alzheimer’s disease patients – greatly increases the toxicity of other, more common and less toxic beta-amyloids, serving as a possible “trigger” for the onset and progression of Alzheimer’s.

The finding, reported in the May 2 online edition of the journal Nature, could lead to more effective treatments for Alzheimer’s. Probiodrug AG, based in Halle, Germany, has already completed phase 1 clinical trials in Europe with a small molecule that inhibits glutaminyl cyclase, the enzyme that catalyzes the formation of this hypertoxic version of beta-amyloid.

 

“This form of beta-amyloid, called pyroglutamylated (or pyroglu) beta-amyloid, is a real bad guy in Alzheimer’s disease,” said principal investigator George Bloom, a U.Va. professor of biology and cell biology in the College of Arts & Sciences and School of Medicine, who is collaborating on the study with scientists at Probiodrug. “We’ve confirmed that it converts more abundant beta-amyloids into a form that is up to 100 times more toxic, making this a very dangerous killer of brain cells and an attractive target for drug therapy.”

 

Bloom said the process is similar to various prion diseases, such as mad cow disease or chronic wasting disease, where a toxic protein can “infect” normal proteins that spread through the brain and ultimately destroy it.

 

In the case of Alzheimer’s, severe dementia occurs over the course of years prior to death.

 

“You might think of this pyroglu beta-amyloid as a seed that can further contaminate something that’s already bad into something much worse – it’s the trigger,” Bloom said. Just as importantly, the hypertoxic mixtures that are seeded by pyroglu beta-amyloid exist as small aggregates, called oligomers, rather than as much larger fibers found in the amyloid plaques that are a signature feature of the Alzheimer’s brain.

 

And the trigger fires a “bullet,” as Bloom puts it. The bullet is a protein called tau, which is stimulated by beta-amyloid to form toxic “tangles” in the brain that play a major role in the onset and development of Alzheimer’s. Using mice bred to have no tau genes, the researchers found that without the interaction of toxic beta-amyloids with tau, the Alzheimer’s cascade cannot begin. The pathway by which pyroglu beta-amyloid induces the tau-dependent death of neurons is now the target of further investigation into this important step in the early development of Alzheimer’s disease.

 

“There are two matters of practical importance in our discovery,” Bloom said. “One is the new insight we have into how Alzheimer’s might actually progress – mechanisms that are important to understand if we are to try to prevent it from happening; and second, it provides a lead into how to design drugs that might prevent this kind of beta-amyloid from building up in the first place.”

 

Said study co-author Hans-Ulrich Demuth, a biochemist and chief scientific officer at Probiodrug, “This publication further adds significant evidence to our hypothesis about the critical role pyroglu beta-amyloid plays in the initiation of Alzheimer’s Disease. For the first time we have found a clear link in the relationship between pyroglu beta-amyloid, oligomer formation and tau protein in neuronal toxicity.”

 

Bloom and his collaborators are now looking for other proteins that are needed for pyroglu beta-amyloid to become toxic. Any such proteins they discover are potential targets for the early diagnosis and/or treatment of Alzheimer’s disease.

Source: University of Virginia

 

Published on 3rd May 2012


Tiny ‘spherules’ reveal details about Earth’s asteroid impacts

The spherules were created when asteroids crashed into Earth, vaporizing rock that expanded as a giant vapor plume. Small droplets of molten rock in the plume condensed and solidified, falling back to the surface as a thin layer. This sample was found in Western Australia and formed 2.63 billion years ago in the aftermath of a large impact. (Credit: Oberlin College photo/Bruce M. Simonson)

 

Researchers are learning details about asteroid impacts going back to the Earth’s early history by using a new method for extracting precise information from tiny “spherules” embedded in layers of rock.


The spherules were created when asteroids crashed into the Earth, vaporizing rock that expanded into space as a giant vapor plume. Small droplets of molten and vaporized rock in the plume condensed and solidified, falling back to Earth as a thin layer. The round or oblong particles were preserved in layers of rock, and now researchers have analyzed them to record precise information about asteroids impacting Earth from 3.5 billion to 35 million years ago.


“What we have done is provide the foundation for understanding how to interpret the layers in terms of the size and velocity of the asteroid that made them,” said Jay Melosh, an expert in impact cratering and a distinguished professor of earth and atmospheric sciences, physics and aerospace engineering at Purdue University.


The findings, which support a theory that the Earth endured an especially heavy period of asteroid bombardment early in its history, are detailed in a research paper appearing online in the journal Nature on Wednesday (April 25). The paper was written by Purdue physics graduate student Brandon Johnson and Melosh. The findings, based on geologic observations, support a theoretical study in a companion paper in Nature by researchers at the Southwest Research Institute in Boulder, Colo.


The period of heavy asteroid bombardment – from 4.2 to 3.5 billion years ago – is thought to have been influenced by changes in the early solar system that altered the trajectory of objects in an asteroid belt located between Mars and Jupiter, sending them on a collision course with Earth.


“That’s the postulate, and this is the first real solid evidence that it actually happened,” Melosh said. “Some of the asteroids that we infer were about 40 kilometers in diameter, much larger than the one that killed off the dinosaurs about 65 million years ago, which was about 12-15 kilometers. But when we looked at the number of impactors as a function of size, we got a curve that showed a lot more small objects than large ones, a pattern that matches exactly the distribution of sizes in the asteroid belt. For the first time we have a direct connection between the crater size distribution on the ancient Earth and the sizes of asteroids out in space.”


Because craters are difficult to study directly, impact history must be inferred either by observations of asteroids that periodically pass near the Earth or by studying craters on the moon. Now, the new technique using spherules offers a far more accurate alternative to chronicle asteroid impacts on Earth, Melosh said.


“We can look at these spherules, see how thick the layer is, how big the spherules are, and we can infer the size and velocity of the asteroid,” Melosh said. “We can go back to the earliest era in the history of the Earth and infer the population of asteroids impacting the planet.”

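The inversion Melosh describes can be sketched with a toy model. Everything below is a hypothetical placeholder: the cube-law exponent and the constant `K` are invented for illustration and are not the scaling relations derived in the paper. It only shows the shape of the calculation: a forward model maps impactor size to layer thickness, and a measured thickness lets you run it backwards.

```python
# Toy model only: assume global spherule-layer thickness T grows with
# impactor diameter D as T = K * D**3. Both the exponent and the constant
# K are hypothetical placeholders, NOT the relations published in Nature.
K = 1e-4  # placeholder constant (mm of layer per cubic km of impactor)

def layer_thickness_mm(diameter_km):
    """Forward toy model: global layer thickness from impactor diameter."""
    return K * diameter_km ** 3

def infer_diameter_km(thickness_mm):
    """Invert the toy scaling: impactor diameter from measured thickness."""
    return (thickness_mm / K) ** (1.0 / 3.0)

print(infer_diameter_km(layer_thickness_mm(40.0)))  # recovers the 40-km input
```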

For asteroids larger than about 10 kilometers in diameter, the spherules are deposited in a global layer.


“Some of these impacts were several times larger than the Chicxulub impact that killed off the dinosaurs 65 million years ago,” Johnson said. “The impacts may have played a large role in the evolutionary history of life. The large number of impacts may have helped simple life by introducing organics and other important materials at a time when life on Earth was just taking hold.”


A 40-kilometer asteroid would have wiped out everything on the Earth’s surface, whereas the one that struck 65 million years ago killed only land animals weighing more than around 20 kilograms.


“Impact craters are the most obvious indication of asteroid impacts, but craters on Earth are quickly obscured or destroyed by surface weathering and tectonic processes,” Johnson said. “However, the spherule layers, if preserved in the geologic record, provide information about an impact even when the source crater cannot be found.”


The Purdue researchers studied the spherules using computer models that harness mathematical equations developed originally to calculate the condensation of vapor.


“There have been some new wrinkles in vapor condensation modeling that motivated us to do this work, and we were the first to apply it to asteroid impacts,” Melosh said.


The spherules are about a millimeter in diameter.


The researchers also are studying a different type of artifact similar to spherules but found only near the original impact site. Whereas the globally distributed spherules come from the condensing vaporized rock, these “melt droplets” are from rock that’s been melted and not completely vaporized.


“Before this work, it was not possible to distinguish between these two types of formations,” Melosh said. “Nobody had established criteria for discriminating between them, and we’ve done that now.”


One of the authors of the Southwest Research Institute paper, David Minton, is now an assistant professor of earth and atmospheric sciences at Purdue.


Findings from the research may enable Melosh’s team to enhance an asteroid impact effects calculator he developed to estimate what would happen if asteroids of various sizes were to hit the Earth. The calculator, “Impact: Earth!” allows anyone to calculate potential comet or asteroid damage based on the object’s mass.


The research has been funded by NASA.

Source: Purdue University

 

Published on 3rd May 2012

 

24 new species discovered on Caribbean islands are close to extinction


An Anguilla Bank skink. Blair Hedges and his team have discovered and scientifically named 24 new species of lizards known as skinks. (Credit: Karl Questel)

In a single new scientific publication, 24 new species of lizards known as skinks, all from islands in the Caribbean, have been discovered and scientifically named. According to Blair Hedges, professor of biology at Penn State University and the leader of the research team, half of the newly added skink species already may be extinct or close to extinction, and all of the others on the Caribbean islands are threatened with extinction. The researchers found that the loss of many skink species can be attributed primarily to predation by the mongoose — an invasive predatory mammal that was introduced by farmers to control rats in sugarcane fields during the late 19th century. The research team reports on the newly discovered skinks in a 245-page article published today (April 30) in the journal Zootaxa.

About 130 species of reptiles from all over the world are added to the global species count each year in dozens of scientific articles. However, not since the 1800s have more than 20 reptile species been added at one time. Primarily through examination of museum specimens, the team identified a total of 39 species of skinks from the Caribbean islands, including six species currently recognized, and another nine named long ago but considered invalid until now. Hedges and his team also used DNA sequences, but most of the taxonomic information, such as counts and shapes of scales, came from examination of the animals themselves.

“Now, one of the smallest groups of lizards in this region of the world has become one of the largest groups,” Hedges said. “We were completely surprised to find what amounts to a new fauna, with co-occurring species and different ecological types.”

He said some of the new species are six times larger in body size than other species in the new fauna.

Hedges also explained that these New World skinks, which arrived in the Americas about 18 million years ago from Africa by floating on mats of vegetation, are unique among lizards in that they produce a human-like placenta, which is an organ that directly connects the growing offspring to the maternal tissues that provide nutrients.

“While there are other lizards that give live birth, only a fraction of the lizards known as skinks make a placenta and gestate offspring for up to one year,” Hedges said.

He also speculated that the lengthy gestational period may have given predators a competitive edge over skinks, since pregnant females are slower and more vulnerable.

“The mongoose is the predator we believe is responsible for many of the species’ close-to-extinction status in the Caribbean,” Hedges said. “Our data show that the mongoose, which was introduced from India in 1872 and spread around the islands over the next three decades, has nearly exterminated this entire reptile fauna, which had gone largely unnoticed by scientists and conservationists until now.”

According to Hedges, the “smoking gun” is a graph included in the scientific paper showing a sharp decline in skink populations that occurred soon after the introduction of the mongoose. Hedges explained that the mongoose originally was brought to the New World to control rats, which had become pests in the sugarcane fields in Cuba, Hispaniola, Puerto Rico, Jamaica and the Lesser Antilles. While this strategy did help to control infestations of some pests, such as the Norway rat, it also had the unintended consequence of reducing almost all skink populations.

“By 1900, less than 50 percent of those mongoose islands still had their skinks, and the loss has continued to this day,” Hedges said.

This newly discovered skink fauna will increase dramatically the number of reptiles categorized as “critically endangered” by the International Union for Conservation of Nature in their “Red List of Threatened Species,” which is recognized as the most comprehensive database evaluating the endangerment status of various plant and animal species.

“According to our research, all of the skink species found only on Caribbean islands are threatened,” Hedges said. “That is, they should be classified in the Red List as either vulnerable, endangered, or critically endangered. Finding that all species in a fauna are threatened is unusual, because only 24 percent of the 3,336 reptile species listed in the Red List have been classified as threatened with extinction. Most of the 9,596 named reptile species have yet to be classified in the Red List.”

Hedges explained that there are two reasons why such a large number of species went unnoticed for so many years, in a region frequented by scientists and tourists.

“First, Caribbean skinks already had nearly disappeared by the start of the 20th century, so people since that time rarely have encountered them and therefore have been less likely to study them,” he said. “Second, the key characteristics that distinguish this great diversity of species have been overlooked until now.”

Hedges also noted that many potential new species of animals around the world have been identified in recent years with DNA data. However, much more difficult is the task of following up DNA research with the work required to name new species and to formally recognize them as valid, as this team did with Caribbean skinks.

The other member of the research team, Caitlin Conn, now a researcher at the University of Georgia and formerly a biology major in Penn State’s Eberly College of Science and a student in Penn State’s Schreyer Honors College at the time of the research, added that researchers might be able to use the new data to plan conservation efforts, to study the geographic overlap of similar species, and to study in more detail the skinks’ adaptation to different ecological habitats or niches. The research team also stressed that, while the mongoose introduction by humans now has been linked to these reptile declines and extinctions, other types of human activity, especially the removal of forests, are to blame for the loss of other species in the Caribbean.

Funding for the research comes from the National Science Foundation.


Source: Pennsylvania State University

 

Published on 2nd May 2012

Compressed sensing allows super-resolution microscopy imaging of live cell structures

Image shows single-molecule identification. The green cross signs show the locations of single molecules using the super-resolution technique. (Credit: Lei Zhu and Bo Huang)

Researchers from the Georgia Institute of Technology and University of California San Francisco have advanced scientists’ ability to view a clear picture of a single cellular structure in motion. By identifying molecules using compressed sensing, this new method provides needed spatial resolution plus a faster temporal resolution than previously possible.

 

Despite many advances in the spatial resolution of super-resolution microscopy in the past few years, live-cell imaging has remained a challenge because of the need for high temporal resolution.

 

Now, Lei Zhu, assistant professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering, and Bo Huang, assistant professor in UCSF’s Department of Pharmaceutical Chemistry and Department of Biochemistry and Biophysics, have developed an advanced approach using super-resolution microscopy to resolve cellular features an order of magnitude smaller than what could be seen before. This allows the researchers to tap previously inaccessible information and answer new biological questions.

 

The research was published April 22, 2012, in the journal Nature Methods. It was funded by the National Institutes of Health, the UCSF Program for Breakthrough Biomedical Research, a Searle Scholarship and a Packard Fellowship for Science and Engineering.

 

The previous single-molecule-switching approach to super-resolution microscopy depends on spreading single-molecule images sparsely across many camera frames, often thousands of them. It is extremely limited in its temporal resolution and cannot follow dynamic processes in live cells.

 

“We can now use our discovery using super-resolution microscopy with seconds or even sub-second temporal resolution for a large field of view to follow many more dynamic cellular processes,” said Zhu. “Much of our knowledge of the life of a cell comes from our ability to see the small structures within it.”

 

Huang noted, “One application, for example, is to investigate how mitochondria, the powerhouse of the cell, interact with other organelles and the cytoskeleton to reshape the structure during the life cycle of the cell.”

 

Light microscopy, especially in its modern form of fluorescence microscopy, is still the tool most frequently used by biologists. However, the authors say, conventional light microscopy has one major limitation: it cannot resolve two objects closer than half the wavelength of the light because of the phenomenon called diffraction. With diffraction, the images look blurry and overlapped no matter how high the magnification used.

 

“The diffraction limit has long been regarded as one of the fundamental constraints for light microscopy until the recent inventions of super-resolution fluorescence microscopy techniques,” said Zhu. Super-resolution microscopy methods, such as stochastic optical reconstruction microscopy (STORM) or photoactivated localization microscopy (PALM), rely on the ability to record light emission from a single molecule in the sample.

 

Using probe molecules that can be switched between a visible and an invisible state, STORM/PALM determines the position of each molecule of interest. These positions ultimately define a structure.

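The localization step can be sketched with a toy example (not the authors' code; real STORM/PALM software typically fits a Gaussian model to each spot, while this sketch uses the simpler centroid estimate): even though a single molecule's image is blurred by diffraction, the center of that blur can be estimated with far higher precision than the blur width.

```python
import numpy as np

# Simulate one diffraction-blurred single-molecule spot (2-D Gaussian PSF)
# on a small pixel grid, then localize it from its center of mass.
sigma = 2.0                    # PSF width in pixels (stands in for the diffraction limit)
true_x, true_y = 12.3, 8.7     # sub-pixel emitter position (arbitrary choice)
yy, xx = np.mgrid[0:24, 0:24]
spot = np.exp(-((xx - true_x) ** 2 + (yy - true_y) ** 2) / (2 * sigma ** 2))

# Centroid localization: precision well below one pixel despite the blur
est_x = (spot * xx).sum() / spot.sum()
est_y = (spot * yy).sum() / spot.sum()
print(est_x, est_y)  # very close to (12.3, 8.7)
```

With photon noise included, the attainable precision scales roughly with the PSF width divided by the square root of the number of detected photons, which is why brighter probes localize better.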
 

The new finding is significant, said Zhu and Huang, because it shows that the technology can follow the dynamics of a microtubule cytoskeleton with a three-second time resolution, which would allow researchers to study the active transport of vesicles and other cargoes inside the cell.

 

Using the same optical system and detector as in conventional light microscopy, super-resolution microscopy naturally requires longer acquisition time to obtain more spatial information, leading to a trade-off between its spatial and temporal resolution. In super-resolution microscopy methods based on STORM/PALM, each camera image samples a very sparse subset of probe molecules in the sample.

 

An alternative approach is to increase the density of activated fluorophores so that each camera frame samples more molecules. However, this high density of fluorescent spots causes them to overlap, invalidating the widely used single-molecule localization method.

 

The authors said that a number of methods have been reported recently that can efficiently retrieve single-molecule positions even when the single fluorophore signals overlap. These methods are based on fitting clusters of overlapped spots with a variable number of point-spread functions (PSFs) with either maximum likelihood estimation or Bayesian statistics. The Bayesian method has also been applied to the whole image set.

 

In the new work, Zhu and Huang present an approach based on global optimization using compressed sensing, which does not involve estimating or assuming the number of molecules in the image. They show that compressed sensing can work at much higher molecule densities than other methods, and they demonstrate live-cell imaging of fluorescent protein-labeled microtubules with three-second temporal resolution.

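The core idea can be sketched in one dimension (illustrative only, not the authors' implementation; their method solves a similar L1-regularized problem on an oversampled 2-D grid with the microscope's measured PSF): model the camera frame as a blurred sum of emitters on a fine grid, then recover a sparse set of emitter positions by iterative soft-thresholding.

```python
import numpy as np

# 1-D toy sketch of compressed-sensing localization. Two emitters sit
# about two PSF widths apart, so their spots overlap on the camera.
n = 64
centers = np.arange(n)          # fine grid of candidate emitter positions
pix = np.arange(n)              # camera pixels
sigma = 3.0                     # PSF width in pixels

x_true = np.zeros(n)
x_true[20], x_true[26] = 1.0, 0.8

# Each column of A is the PSF of an emitter at one fine-grid position.
A = np.exp(-((pix[:, None] - centers[None, :]) ** 2) / (2 * sigma ** 2))
y = A @ x_true                  # one (noiseless) camera frame

# ISTA: iterative soft-thresholding for min ||A x - y||^2 + lam * ||x||_1
lam = 0.05
L = np.linalg.norm(A, 2) ** 2   # step size from the Lipschitz constant
x = np.zeros(n)
for _ in range(3000):
    x = x - (A.T @ (A @ x - y)) / L                         # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # sparsity step

peaks = np.nonzero(x > 0.1)[0]  # recovered emitter positions
```

Because the L1 penalty drives most grid weights to zero, the recovery identifies emitters near both true positions even though the simulated spots overlap, which is exactly the regime where per-spot single-molecule fitting breaks down.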
 

The STORM experiment used by the authors, with immunostained microtubules in Drosophila melanogaster S2 cells, demonstrated that nearby microtubules can be resolved by compressed sensing using as few as 100 camera frames, whereas they were not discernible by the single-molecule fitting method. They have also performed live STORM on S2 cells stably expressing tubulin fused to mEos2.

 

At the commonly used camera frame rate of 56.4 Hertz, a super-resolution movie was constructed with a time resolution of three seconds (169 frames) and a Nyquist resolution of 60 nanometers, much faster than previously reported, said Zhu and Huang. These results have proven that compressed sensing can enable STORM to monitor live cellular processes with second-scale time resolution, or even sub-second-scale resolution if a faster camera can be used.

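The three-second figure follows directly from the numbers quoted above:

```python
frame_rate_hz = 56.4     # camera frame rate from the article
frames_per_image = 169   # raw frames combined into one super-resolution image
time_resolution_s = frames_per_image / frame_rate_hz
print(round(time_resolution_s, 2))  # 3.0 seconds per reconstructed image
```
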
Source: Georgia Institute of Technology

 

Published on 24th April 2012

Clinical Decline in Alzheimer’s Requires Plaque and Proteins

 

According to a new study, the neuron-killing pathology of Alzheimer’s disease (AD), which begins before clinical symptoms appear, requires the presence of both amyloid-beta (a-beta) plaque deposits and elevated levels of an altered protein called p-tau.

Without both, progressive clinical decline associated with AD in cognitively healthy older individuals is “not significantly different from zero,” reports a team of scientists at the University of California, San Diego School of Medicine in the April 23 online issue of the Archives of Neurology.

 

“I think this is the biggest contribution of our work,” said Rahul S. Desikan, MD, PhD, research fellow and resident radiologist in the UC San Diego Department of Radiology and first author of the study.  “A number of planned clinical trials – and the majority of Alzheimer’s studies – focus predominantly on a-beta. Our results highlight the importance of also looking at p-tau, particularly in trials investigating therapies to remove a-beta. Older, non-demented individuals who have elevated a-beta levels, but normal p-tau levels, may not progress to Alzheimer’s, while older individuals with elevated levels of both will likely develop the disease.”

 

The findings also underscore the importance of p-tau as a target for new approaches to treating patients with conditions ranging from mild cognitive impairment (MCI) to full-blown AD. An estimated 5.4 million Americans have AD. It’s believed that 10 to 20 percent of Americans age 65 and older have MCI, a risk factor for AD. Some current therapies appear to delay clinical AD onset, but the disease remains irreversible and incurable.

 

“It may be that a-beta initiates the Alzheimer’s cascade,” said Desikan. “But once started, the neurodegenerative mechanism may become independent of a-beta, with p-tau and other proteins playing a bigger role in the downstream degenerative cascade. If that’s the case, prevention with anti-a-beta compounds may prove efficacious against AD for older, non-demented individuals who have not yet developed tau pathology.  But novel, tau-targeting therapies may help the millions of individuals who already suffer from mild cognitive impairment or Alzheimer’s disease.”

 

The new study involved evaluations of healthy, non-demented elderly individuals participating in the ongoing, multi-site Alzheimer’s Disease Neuroimaging Initiative, or ADNI. Launched in 2003, ADNI is a longitudinal effort to measure the progression of mild cognitive impairment and early-stage AD.

 

The researchers studied samples of cerebrospinal fluid (CSF) taken from ADNI participants.

 

“In these older individuals, the presence of a-beta alone was not associated with clinical decline,” said Anders M. Dale, PhD, professor of radiology, neurosciences, and psychiatry at UC San Diego and senior author of the study. “However, when p-tau was present in combination with a-beta, we saw significant clinical decline over three years.”

 

A-beta proteins have several normal responsibilities, including activating enzymes and protecting cells from oxidative stress. It is not known why a-beta proteins form plaque deposits in the brain. Similarly, the origins of p-tau are not well understood. One hypothesis, according to Desikan, is that a-beta plaque deposits trigger hyperphosphorylation of nearby tau proteins, which normally help stabilize the structure of brain cells. Hyperphosphorylation occurs when excess phosphate groups attach to a protein, altering its normal function. Hyperphosphorylated tau – or p-tau – can then exacerbate the toxic effects of a-beta plaque upon neurons.

 

The discovery of p-tau’s heightened role in AD neurodegeneration suggests it could be a specific biomarker for the disease before clinical symptoms appear. While high levels of another tau protein – t-tau – in cerebrospinal fluid have been linked to neurologic disorders such as frontotemporal dementia and traumatic brain injury, high levels of p-tau correlate specifically with increased neurofibrillary tangles in brain cells, which are seen predominantly in AD.

 

“These results are in line with another ADNI study of healthy controls and MCI participants that found progressive atrophy in the entorhinal cortex – one of the areas of the brain first affected in AD – only in amyloid-positive individuals who also showed evidence of elevated p-tau levels,” said Linda McEvoy, PhD, assistant professor of radiology and study co-author.

 

“One of the exciting dimensions of this paper was the combined use of cerebrospinal fluid markers and clinical assessments to better elucidate the neurodegenerative process underlying Alzheimer’s disease in individuals who do not yet show clinical signs of dementia,” added co-author James Brewer, MD, PhD, an associate professor of radiology and neurosciences at UC San Diego School of Medicine.  “We do not have an animal model that works very well for studying this disease, so the ability to examine the dynamics of neurodegeneration in living humans is critical.”

 

Nonetheless, the scientists say more research is needed. They note that CSF biomarkers provide only an indirect assessment of amyloid and neurofibrillary pathology and may not fully reflect the underlying biological processes of AD.

 

“This study highlights the complex interaction of multiple pathologies that likely contribute to the clinical symptomatology of Alzheimer’s disease,” said co-author Reisa Sperling, MD, a neurologist at Massachusetts General Hospital and Brigham and Women’s Hospital. “It suggests we may be able to intervene in the preclinical stages of AD before there is significant neurodegeneration and perhaps prevent the onset of symptoms.”

 

Other co-authors are Wesley K. Thompson, Department of Psychiatry; and Dominic Holland and Paul S. Aisen, Department of Neuroscience, UC San Diego School of Medicine.

 

Funding for this research came, in part, from the National Institutes of Health and the Alzheimer’s Disease Neuroimaging Initiative.

Source: University of California, San Diego


Published on 24th April 2012

 

Strange cousins: molecular alternatives to DNA, RNA offer new insight into life’s origins

 

Living systems owe their existence to a pair of information-carrying molecules: DNA and RNA. These fundamental chemical forms possess two features essential for life: they display heredity, meaning they can encode and pass on genetic information, and they can adapt over time through processes of Darwinian evolution.

A long-debated question is whether heredity and evolution could be performed by molecules other than DNA and RNA.

 

John Chaput, a researcher at ASU’s Biodesign Institute who recently published an article in Nature Chemistry describing the evolution of threose nucleic acids, joined a multidisciplinary team of scientists from England, Belgium and Denmark to extend these properties to other so-called xeno-nucleic acids, or XNAs.

 

The group demonstrates for the first time that six of these unnatural nucleic acid polymers are capable of sharing information with DNA. One of these XNAs, a molecule referred to as anhydrohexitol nucleic acid, or HNA, was capable of undergoing directed evolution and folding into biologically useful forms.

 

Their results appear in the current issue of Science.

 

The work sheds new light on questions concerning the origins of life and provides a range of practical applications for molecular medicine that were not previously available.

 

Nucleic acid aptamers, which have been engineered through in vitro selection to bind with various molecules, act in a manner similar to antibodies – latching onto their targets with high affinity and specificity. “This could be great for building new types of diagnostics and new types of biosensors,” Chaput says, pointing out that XNAs are hardier molecules, not recognized by the natural enzymes that tend to degrade DNA and RNA. New therapeutics may also arise from experimental xenobiology.

 

Both RNA and DNA embed data in their sequences of four nucleotides—information vital for conferring hereditary traits and for supplying the coded recipe essential for building proteins from the 20 naturally occurring amino acids. Exactly how (and when) this system got its start however, remains one of the most intriguing and hotly contested areas of biology.

 

According to one hypothesis, the simpler RNA molecule preceded DNA as the original informational conduit. The RNA world hypothesis proposes that the earliest examples of life were based on RNA and simple proteins. Because of RNA’s great versatility—it is not only capable of carrying genetic information but also of catalyzing chemical reactions like an enzyme—it is believed by many to have supported pre-cellular life.

 

Nevertheless, the spontaneous arrival of RNA through a sequence of purely random mixing events of primitive chemicals was, at the very least, an unlikely occurrence. “This is a big question,” Chaput says. “If the RNA world existed, how did it come into existence? Was it spontaneously produced, or was it the product of something that was even simpler than RNA?”

 

This pre-RNA world hypothesis has been gaining ground, largely through investigations into XNAs, which provide plausible alternatives to the current biological regime and could have acted as chemical stepping-stones to the eventual emergence of life. The current research strengthens the case that something like this may have taken place.

 

Threose nucleic acid, or TNA, for example, is one candidate for this critical intermediary role. “TNA does some interesting things,” Chaput says, noting the molecule’s capacity to bind with RNA through antiparallel Watson-Crick base pairing. “This property provides a model for how XNAs could have transferred information from the pre-RNA world to the RNA world.”

 

Nucleic acid molecules, including DNA and RNA, consist of three chemical components: a sugar group, a phosphate backbone and combinations of the four nucleotide bases. By tinkering with these structural elements, researchers can engineer XNA molecules with unique properties. However, in order for any of these exotic molecules to have acted as a precursor to RNA in the pre-biotic epoch, they would need to have been able to transfer and recover their information from RNA. To do this, specialized enzymes, known as polymerases, are required.

 

Nature has made DNA and RNA polymerases capable of reading, transcribing and reverse transcribing normal nucleic acid sequences. For XNA molecules, however, no naturally occurring polymerases exist. So the group, led by Phil Holliger at the MRC in England, painstakingly evolved synthetic polymerases that could copy DNA into XNA and other polymerases that could copy XNA back into DNA. In the end, polymerases were discovered that transcribe and reverse-transcribe six different genetic systems: HNA, CeNA, LNA, ANA, FANA and TNA. The experiments demonstrated that DNA sequences could be rendered into the various XNAs when the polymerases were fed the appropriate XNA substrates.

 

Using these enzymes as tools for molecular evolution, the team evolved the first example of an HNA aptamer through iterative rounds of selection and amplification. Starting from a large pool of DNA sequences, a synthetic polymerase was used to copy the DNA library into HNA. The pool of HNA molecules was then incubated with an arbitrary target. The small fraction of molecules that bound the target were separated from the unbound pool, reverse transcribed back into DNA with a second synthetic enzyme and amplified by PCR. After many repeated rounds, HNAs were generated that bound HIV trans-activating response RNA (TAR) and hen egg lysozyme (HEL), which were used as binding targets. “This is a synthetic Darwinian process,” Chaput says. “The same thing happens inside our cells, but this is done in vitro.”
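The select-and-amplify cycle described above can be mimicked in silico. The following Python sketch is a toy model of that Darwinian loop, with a made-up string-matching “affinity” standing in for physical binding; the target motif, pool sizes and mutation rate are all invented for illustration and have nothing to do with the study’s actual reagents:

```python
import random

random.seed(0)

TARGET = "GATTACAGATTACA"  # hypothetical binding motif (illustrative only)
BASES = "ACGT"

def affinity(seq):
    # Toy stand-in for binding strength: length of the longest substring
    # of seq that also appears contiguously in the target motif.
    best = 0
    for i in range(len(seq)):
        for j in range(len(TARGET)):
            k = 0
            while i + k < len(seq) and j + k < len(TARGET) and seq[i + k] == TARGET[j + k]:
                k += 1
            best = max(best, k)
    return best

def mutate(seq, rate=0.05):
    # PCR-style amplification with occasional copying errors.
    return "".join(random.choice(BASES) if random.random() < rate else b for b in seq)

# Round 0: a large random starting library of 20-mers.
pool = ["".join(random.choice(BASES) for _ in range(20)) for _ in range(500)]

for rnd in range(10):
    # Selection: keep the small fraction that "binds" the target best.
    pool.sort(key=affinity, reverse=True)
    survivors = pool[:50]
    # Amplification: copy survivors back up to full pool size, with mutations.
    pool = [mutate(random.choice(survivors)) for _ in range(500)]

best = max(pool, key=affinity)
```

Over repeated rounds the pool drifts toward high-affinity sequences, which is the essence of the laboratory process; the wet-lab version simply replaces the scoring function with real binding and the copying step with polymerases and PCR.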

 

The method for producing XNA polymerases draws on the path-breaking work of Holliger, one of the lead authors of the current study. The elegant technique uses cell-like synthetic compartments of water-in-oil emulsion to conduct directed evolution of enzymes, particularly polymerases. By isolating self-replication reactions from each other, the process greatly improves the accuracy and efficiency of polymerase evolution and replication. “What nobody had really done before,” Chaput says, “is to take those technologies and apply them to unnatural nucleic acids.”

 

Chaput also underlines the importance of an international collaboration for carrying out this type of research, particularly for the laborious effort of assembling the triphosphate substrates needed for each of the six XNA systems used in the study:

 

“What happened here is that a community of scientists came together and organized around this idea that we could find polymerases that could be used to open up biology to unnatural polymers. It would have been a tour de force for any lab to try to synthesize all the triphosphates, as none of these reagents are commercially available.”

 

The study advances the case for a pre-RNA world, while revealing a new class of XNA aptamers capable of fulfilling myriad useful roles. Although many questions surrounding the origins of life persist, Chaput is optimistic that solutions are coming into view: “Further down the road, through research like this, I think we’ll have enough information to begin to put the pieces of the puzzle together.”

 

The research group consisted of investigators from the Medical Research Council (MRC) Laboratory of Molecular Biology, Cambridge, led by Philipp Holliger; Katholieke Universiteit Leuven, Belgium, led by Piet Herdewijn; the Nucleic Acid Center, Department of Physics and Chemistry, University of Southern Denmark, led by Jesper Wengel; and the Biodesign Institute at Arizona State University, led by John Chaput.

 

In addition to his appointment at the Biodesign Institute, John Chaput is an associate professor in the Department of Chemistry and Biochemistry, in the College of Liberal Arts & Sciences.

 

 

 

 

The original article was written by Richard Harth

Science Writer: The Biodesign Institute

 

 

Source: Arizona State University

Published on 22nd April 2012

Modest Alcohol Use Lowers Risk and Severity of Some Liver Disease

 

People with nonalcoholic fatty liver disease (NAFLD) who consume alcohol in modest amounts – no more than one or two servings per day – are half as likely to develop steatohepatitis as non-drinkers with the same condition, reports a national team of scientists led by researchers at the University of California, San Diego School of Medicine.

 


 

The findings are published in the April 19, 2012 online issue of The Journal of Hepatology.

 

NAFLD is the most common liver disease in the United States, affecting up to one third of American adults. It’s characterized by abnormal fat accumulation in the liver. The specific cause or causes are not known, though obesity and diabetes are risk factors. Most patients with NAFLD have few or no symptoms, but in its most progressive form, known as nonalcoholic steatohepatitis or NASH, there is a significantly heightened risk of cirrhosis, liver cancer and liver-related death.

 

NAFLD is also a known risk factor for cardiovascular disease (CVD). Patients with NAFLD are approximately two times more likely to die from coronary heart disease than from liver disease. The study’s authors wanted to know if the well-documented heart-healthy benefits of modest alcohol consumption outweighed alcohol’s negative effects.

 

“We know a 50-year-old patient with NAFLD has a higher risk of CVD,” said Jeffrey Schwimmer, MD, associate professor of clinical pediatrics at UC San Diego, director of the Fatty Liver Clinic at Rady Children’s Hospital-San Diego and senior author. “Data would suggest modest alcohol consumption would be beneficial (in reducing the patient’s CVD risk) if you don’t take liver disease into account. When you do take liver disease into account, however, the usual medical recommendation is no alcohol whatsoever.”

 

Schwimmer and colleagues discovered that the benefits of modest alcohol consumption were compelling, at least in terms of reducing the odds that patients with NAFLD will develop more severe forms of the disease. Patients with NASH are 10 times more likely to progress to cirrhosis, the final phase of chronic liver disease. Cirrhosis is the 12th leading cause of death in the U.S., killing an estimated 27,000 Americans annually.

 

“Our study showed that those people with modest alcohol intake – two drinks or less daily – had half the odds of developing NASH compared with people who drank no alcohol,” said Schwimmer. “The reasons aren’t entirely clear. It’s known that alcohol can have beneficial effects on lipid levels, that it increases ‘good’ cholesterol, which tends to be low in NAFLD patients. Alcohol may improve insulin sensitivity, which has a role in NAFLD. And depending upon the type of alcohol, it may have anti-inflammatory effects.”
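Note that “half the odds” is an odds ratio, not a straight halving of risk. A minimal sketch of the calculation, using a hypothetical 2×2 table (the counts below are invented for illustration and are not the study’s data):

```python
# Hypothetical counts of NASH vs. no NASH among modest drinkers and non-drinkers.
nash_drinkers, no_nash_drinkers = 20, 180
nash_nondrinkers, no_nash_nondrinkers = 40, 160

# Odds of NASH in each group, and their ratio.
odds_drinkers = nash_drinkers / no_nash_drinkers            # 20/180 ≈ 0.111
odds_nondrinkers = nash_nondrinkers / no_nash_nondrinkers   # 40/160 = 0.25

odds_ratio = odds_drinkers / odds_nondrinkers  # ≈ 0.44: under half the odds
```

For rare outcomes the odds ratio approximates the risk ratio, but when the outcome is common, as NASH is in biopsy cohorts, the two can diverge substantially, which is why the distinction matters when reading figures like these.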

 

The study also found that in patients with NAFLD, modest drinkers experienced less severe liver scarring than did lifelong non-drinkers.

 

The study did not evaluate the effects of different types of alcohol, such as beer or spirits. Schwimmer said to do so would require a much larger study. Also, the study’s findings do not apply to children. All of the participants in the study were age 21 and older.

 

The current paper is based on analyses of 600 liver biopsies of patients with NAFLD by a national panel of pathologists who had no identifying clinical information about the samples. The study excluded anyone who averaged more than two alcoholic drinks per day or who reported consuming five or more drinks in a day (binge-drinking) at least once a month. All of the patients were at least 21 years of age.

 

Schwimmer said the findings indicate patients with liver disease should be treated individually, with nuance.

 

“For a patient with cirrhosis or viral hepatitis, the data says even small amounts of alcohol can be bad. But that may not be applicable to all forms of liver disease. Forty million Americans have NAFLD. Physicians need to look at their patient’s overall health, their CVD risk, their liver status, whether they’re already drinking modestly or not. They need to put all of these things into a framework to determine risk. I suspect modest alcohol consumption will be an appropriate recommendation for many patients, but clearly not all.”

 

Co-authors are Winston Dunn, departments of Pediatrics and Medicine, UC San Diego and Gastroenterology and Hepatology, Department of Medicine, University of Kansas Medical Center; Arun J. Sanyal, Division of Gastroenterology, Hepatology and Nutrition, Department of Internal Medicine, Virginia Commonwealth University Medical Center; Elizabeth M. Brunt, John Cochran VA Medical Center, Saint Louis and Division of Gastroenterology, Saint Louis University School of Medicine; Aynur Unalp-Arida, Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health; Michael Donohue, Division of Biostatistics and Bioinformatics, Department of Family and Preventive Medicine, UC San Diego; and Arthur J. McCullough, Department of Gastroenterology and Hepatology, Cleveland Clinic.

 

Funding for this research came, in part, from the National Institute of Diabetes and Digestive and Kidney Diseases, the National Institute of Child Health and Human Development and the National Cancer Institute.

 

 

 

Source: University of California, San Diego


Published on 22nd April 2012

 

The Neurogenics of Niceness

 

Study finds people’s relative niceness may reside in their genes

 

 

 

It turns out that the milk of human kindness is evoked by something besides mom’s good example.

 

Research by psychologists at the University at Buffalo and the University of California, Irvine, has found that at least part of the reason some people are kind and generous is that their genes nudge them toward it.

 


 

Michel Poulin, PhD, assistant professor of psychology at UB, is the principal author of the study “The Neurogenics of Niceness,” published this month in Psychological Science, a journal of the Association for Psychological Science.

 

The study, co-authored by Anneke Buffone of UB and E. Alison Holman of the University of California, Irvine, looked at the behavior of study subjects who have versions of receptor genes for two hormones that, in laboratory and close relationship research, are associated with niceness. Previous laboratory studies have linked the hormones oxytocin and vasopressin to the way we treat one another, Poulin says.

 

In fact, they are known to make us nicer people, at least in close relationships. Oxytocin promotes maternal behavior, for example, and in the lab, subjects exposed to the hormone demonstrate greater sociability. An article in the usually staid Science magazine even used the terms “love drug” and “cuddle chemical” to describe oxytocin, Poulin points out.

 

Poulin says this study was an attempt to apply previous findings to social behaviors on a larger scale, to learn if these chemicals provoke in us other forms of pro-social behavior: the urge to give to charity, for instance, or to more readily participate in such civic endeavors as paying taxes, reporting crime, giving blood or sitting on juries.

 

He explains that hormones work by binding to our cells through receptors that come in different forms. There are several genes that control the function of oxytocin and vasopressin receptors.

 

Study subjects took part in an Internet survey with questions about civic duty, such as whether people have a duty to report a crime or pay taxes; how they feel about the world, such as whether people are basically good or whether the world is more good than bad; and about their own charitable activities, like giving blood, working for charity or going to PTA meetings.

 

Of those surveyed, 711 subjects provided a sample of saliva for DNA analysis, which showed what form they had of the oxytocin and vasopressin receptors.

 

“The study found that these genes combined with people’s perceptions of the world as a more or less threatening place to predict generosity,” Poulin says.

 

“Specifically, study participants who found the world threatening were less likely to help others — unless they had versions of the receptor genes that are generally associated with niceness,” he says.

 

These “nicer” versions of the genes, says Poulin, “allow you to overcome feelings of the world being threatening and help other people in spite of those fears.

 

“The fact that the genes predicted behavior only in combination with people’s experiences and feelings about the world isn’t surprising,” Poulin says, “because most connections between DNA and social behavior are complex.

 

“So if one of your neighbors seems like a really generous, caring, civic-minded kind of person, while another seems more selfish, tight-fisted and not as interested in pitching in, their DNA may help explain why one of them is nicer than the other,” he says.

 

“We aren’t saying we’ve found the niceness gene,” he adds. “But we have found a gene that makes a contribution. What I find so interesting is the fact that it only makes a contribution in the presence of certain feelings people have about the world around them.”
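The gene-by-environment pattern Poulin describes can be pictured with a toy linear model, where helping behavior depends on the product of genotype and perceived threat. All names and coefficients below are invented for illustration and are not the study’s estimates:

```python
# Toy linear model of helping behavior with a gene-by-environment
# interaction term. Coefficients are made up for illustration.
def predicted_helping(nice_allele, threat):
    # nice_allele: 1 if the "nicer" receptor variant is present, else 0
    # threat: perceived threat of the world, standardized (higher = scarier)
    b0, b_gene, b_threat, b_interaction = 0.5, 0.05, -0.30, 0.25
    return (b0 + b_gene * nice_allele
            + b_threat * threat
            + b_interaction * nice_allele * threat)

# Under high perceived threat, the variant makes a large difference ...
without_variant = predicted_helping(0, 1.0)  # ≈ 0.20
with_variant = predicted_helping(1, 1.0)     # ≈ 0.50

# ... but under low threat, the genotypes barely differ.
low_threat_gap = predicted_helping(1, 0.0) - predicted_helping(0, 0.0)  # ≈ 0.05
```

The interaction term is what makes the genotype matter mainly when threat is high, which mirrors the finding that the genes predicted behavior only in combination with people’s feelings about the world.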

 

 

 

Source: University at Buffalo

 

Published on 12th April 2012

USF researchers find that Alzheimer’s precursor protein controls its own fate

 

A research team led by the University of South Florida’s Department of Psychiatry & Behavioral Neurosciences has found that a fragment of the amyloid precursor protein (APP) — known as sAPP-α and associated with Alzheimer’s disease — appears to regulate its own production.  The finding may lead to ways to prevent or treat Alzheimer’s disease by controlling the regulation of APP.

 


 

Their preclinical study is published online today in Nature Communications.

 

“The purpose of this study was to help better understand why, in most cases of Alzheimer’s disease, the processing of APP becomes deregulated, which leads to the formation of protein deposits and neuron loss,” said study senior author Dr. Jun Tan, professor of psychiatry and the Robert A. Silver Chair, Rashid Laboratory for Developmental Neurobiology at the USF Silver Child Development Center.   “The many risk factors for Alzheimer’s disease can change the way APP is processed, and these changes appear to promote plaque formation and neuron loss.”

 

 

Co-localization of amyloid precursor protein fragment and the APP-converting enzyme BACE

 

Microscopic image showing the merging of the amyloid precursor protein fragment, sAPP-α, and the APP-converting enzyme BACE1, in neuronal cells. This co-localization suggests that sAPP-α may serve as the body’s mechanism to inhibit BACE1 activity and thus lower production of the toxic amyloid beta characteristic of Alzheimer’s disease. (Credit: University of South Florida)


 

 

An estimated 30 million people worldwide and 5 million in the U.S. have Alzheimer’s.  With the aging of the “Baby Boom” generation, the prevalence of the debilitating disease is expected to increase dramatically in the U.S. in the coming years.  Currently, there are no disease-modifying treatments to prevent, reverse or halt the progression of Alzheimer’s disease, only medications that may improve symptoms for a short time.

 

“For the first time, we have direct evidence that a secreted portion of APP itself, the so-called ‘sAPP-α,’ acts as an essential stop-gap mechanism,” said the study’s lead author Dr. Demian Obregon, a resident specializing in research in the Department of Psychiatry & Behavioral Neurosciences at USF Health. “Risk factors associated with Alzheimer’s disease lead to a decline in sAPP-α levels, which results in excessive activity of a key enzyme in Aβ formation.”

 

In initial studies using cells, and in follow-up studies using mice genetically engineered to mimic Alzheimer’s disease, the investigators found that the neutralization of sAPP-α leads to enhanced Aβ formation.  This activity depended on  sAPP-α’s ability to associate with the APP-converting enzyme, BACE1.  When this interaction was blocked,  Aβ formation was restored.

 

The authors suggest that through monitoring and correcting low sAPP-α levels, or through enhancing its association with BACE, Alzheimer’s disease may be prevented or treated.

 

Source: University of South Florida Health
Published on 12th April 2012

 

 


 

 

 

Caloric moderation can reverse link between low birth weight and obesity, study finds

 

Babies who are born small have a tendency to put on weight during childhood and adolescence if allowed free access to calories. However, a new animal model study at UCLA found that when small babies were placed on a diet of moderately regulated calories during infancy, the propensity to become obese decreased.

 


 

Because this is an early study, UCLA researchers do not recommend that mothers of low-birth weight infants start restricting their child’s nutrition and suggest they consult with their child’s pediatrician regarding any feeding questions.

 

Previous studies have shown that growth restriction before birth may cause lasting changes of genes in certain insulin-sensitive organs like the pancreas, liver and skeletal muscle. Before birth, these changes may help the malnourished fetus use all available nutrients. However, after birth these changes may contribute to health problems such as obesity and diabetes.

 

“This study shows that if we match the level of caloric consumption after birth to the same level that the growth-restricted baby received in the womb, it results in a lean body type. However, if there is a mismatch where the baby is growth-restricted at birth but exposed to plenty of calories after birth, then that leads to obesity,” said the lead author, Dr. Sherin Devaskar, professor of pediatrics and executive chair of the department of pediatrics at Mattel Children’s Hospital UCLA. “While many trials that include exercise and various drug therapies have tried to reverse the tendency of low-birth-weight babies to become obese, we have shown that a dietary intervention during early life can have long-lasting effects into childhood, adolescence and adult life.”

 

The study appears in the June issue of the journal Diabetes and is currently available online.

 

About 10 percent of babies in the United States are born small, defined as less than the 10th percentile by weight for a given gestation period, said the study’s first author, Dr. Meena Garg, professor of pediatrics and a neonatologist and medical director of the neonatal intensive care unit at Mattel Children’s Hospital UCLA. She added that some organizations define low birth weight as less than 2,500 grams, or 5 pounds, 8 ounces, at term.
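As a sanity check on those units, the gram-to-imperial conversion behind the 2,500-gram cutoff can be sketched as follows (a generic conversion helper, not anything from the study itself):

```python
GRAMS_PER_POUND = 453.59237  # exact definition of the avoirdupois pound

def grams_to_lb_oz(grams):
    # Convert a mass in grams to whole pounds plus remaining ounces.
    pounds_total = grams / GRAMS_PER_POUND
    pounds = int(pounds_total)
    ounces = (pounds_total - pounds) * 16  # 16 ounces per pound
    return pounds, round(ounces, 1)

low_birth_weight = grams_to_lb_oz(2500)  # (5, 8.2), i.e. about 5 lb 8 oz
```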

 

Low birth weight can be caused by malnutrition due to a mother’s homelessness or hunger or her desire not to gain too much weight during pregnancy. Additional causes include illness or infection, a reduction in placental blood, smoking or use of alcohol or drugs during pregnancy.

 

To conduct the study, researchers used rodent animal models and simulated a reduced-calorie scenario during pregnancy. The results showed that low-birth-weight offspring given moderately tempered caloric intake during infancy and childhood grew into lean, physically active adults with high energy expenditure, whereas those allowed unrestricted caloric intake became inactive, obese adults with reduced energy expenditure. The authors concluded that early-life dietary interventions have far-reaching effects on the adult state.

 

Future studies will follow these animals as they age to see whether early regulation of caloric intake continues to protect against diabetes and obesity in later life.

 

“This is an early pre-clinical trial that first needs to be tested in clinical trials before any form of guidelines can be developed,” Devaskar said. “More importantly, we must make sure that control of caloric intake during infancy and childhood does not have any unintended side effects before taking on clinical trials. More research is required to ensure that these metabolic advantages will persist later in life.”

 

The study was funded by the National Institute of Child Health and Human Development.

 

In addition to Devaskar and Garg, the study was conducted by a team of UCLA researchers including Manikkavasagar Thamotharan, Yun Dai, Shanthie Thamotharan, Bo Chul Shin and David Stout.

 

 

Source: University of California, Los Angeles, Health Sciences

 

 

Published on 5th April 2012

 
