
This is your brain on sugar: UCLA study shows high-fructose diet sabotages learning, memory

 

A new UCLA rat study is the first to show how a diet steadily high in fructose slows the brain, hampering memory and learning — and how omega-3 fatty acids can counteract the disruption. The peer-reviewed Journal of Physiology publishes the findings in its May 15 edition.

 


 

“Our findings illustrate that what you eat affects how you think,” said Fernando Gomez-Pinilla, a professor of neurosurgery at the David Geffen School of Medicine at UCLA and a professor of integrative biology and physiology in the UCLA College of Letters and Science. “Eating a high-fructose diet over the long term alters your brain’s ability to learn and remember information. But adding omega-3 fatty acids to your meals can help minimize the damage.”

 

While earlier research has revealed how fructose harms the body through its role in diabetes, obesity and fatty liver, this study is the first to uncover how the sweetener influences the brain.

 

Sources of fructose in the Western diet include cane sugar (sucrose) and high-fructose corn syrup, an inexpensive liquid sweetener. The syrup is widely added to processed foods, including soft drinks, condiments, applesauce and baby food. The average American consumes roughly 47 pounds of cane sugar and 35 pounds of high-fructose corn syrup per year, according to the U.S. Department of Agriculture.
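
As a rough sense of scale, those USDA figures translate to tens of grams of fructose per day. The short Python sketch below works through the arithmetic; the fructose fractions are assumptions on our part (sucrose is half fructose by weight, and HFCS-55, the blend common in soft drinks, is roughly 55 percent fructose), so treat the result as an estimate rather than a figure from the study.

# Rough estimate of daily fructose intake from the USDA figures quoted above.
LB_TO_G = 453.592
DAYS_PER_YEAR = 365.0

cane_sugar_lb_per_year = 47.0   # sucrose (USDA figure quoted above)
hfcs_lb_per_year = 35.0         # high-fructose corn syrup (USDA figure)

SUCROSE_FRUCTOSE = 0.50   # assumption: sucrose is half fructose by weight
HFCS_FRUCTOSE = 0.55      # assumption: HFCS-55 blend

daily_fructose_g = (cane_sugar_lb_per_year * SUCROSE_FRUCTOSE
                    + hfcs_lb_per_year * HFCS_FRUCTOSE) * LB_TO_G / DAYS_PER_YEAR
print(f"Estimated fructose intake: {daily_fructose_g:.0f} g/day")  # ~53 g/day

Under these assumptions, the two sweeteners alone supply roughly 53 grams of fructose per person per day.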

 

“We’re less concerned about naturally occurring fructose in fruits, which also contain important antioxidants,” explained Gomez-Pinilla, who is also a member of UCLA’s Brain Research Institute and Brain Injury Research Center. “We’re more concerned about the fructose in high-fructose corn syrup, which is added to manufactured food products as a sweetener and preservative.”

 

Gomez-Pinilla and study co-author Rahul Agrawal, a UCLA visiting postdoctoral fellow from India, studied two groups of rats that each consumed a fructose solution as drinking water for six weeks. The second group also received omega-3 fatty acids in the form of flaxseed oil and docosahexaenoic acid (DHA), which protects against damage to the synapses — the chemical connections between brain cells that enable memory and learning.

 

“DHA is essential for synaptic function — brain cells’ ability to transmit signals to one another,” Gomez-Pinilla said. “This is the mechanism that makes learning and memory possible. Our bodies can’t produce enough DHA, so it must be supplemented through our diet.”

 

The animals were fed standard rat chow and trained on a maze twice daily for five days before starting the experimental diet. The UCLA team tested how well the rats were able to navigate the maze, which contained numerous holes but only one exit. The scientists placed visual landmarks in the maze to help the rats learn and remember the way.

 

Six weeks later, the researchers tested the rats’ ability to recall the route and escape the maze. What they saw surprised them.

 

“The second group of rats navigated the maze much faster than the rats that did not receive omega-3 fatty acids,” Gomez-Pinilla said. “The DHA-deprived animals were slower, and their brains showed a decline in synaptic activity. Their brain cells had trouble signaling each other, disrupting the rats’ ability to think clearly and recall the route they’d learned six weeks earlier.”

 

The DHA-deprived rats also developed signs of resistance to insulin, a hormone that controls blood sugar and regulates synaptic function in the brain. A closer look at the rats’ brain tissue suggested that insulin had lost much of its power to influence the brain cells.

 

“Because insulin can penetrate the blood–brain barrier, the hormone may signal neurons to trigger reactions that disrupt learning and cause memory loss,” Gomez-Pinilla said.

 

He suspects that fructose is the culprit behind the DHA-deficient rats’ brain dysfunction. Eating too much fructose could block insulin’s ability to regulate how cells use and store sugar for the energy required for processing thoughts and emotions.

 

“Insulin is important in the body for controlling blood sugar, but it may play a different role in the brain, where insulin appears to disturb memory and learning,” he said. “Our study shows that a high-fructose diet harms the brain as well as the body. This is something new.”

 

Gomez-Pinilla, a native of Chile and an exercise enthusiast who practices what he preaches, advises people to keep fructose intake to a minimum and swap sugary desserts for fresh berries and Greek yogurt, which he keeps within arm’s reach in a small refrigerator in his office. An occasional bar of dark chocolate that hasn’t been processed with a lot of extra sweetener is fine too, he said.

 

Still planning to throw caution to the wind and indulge in a hot-fudge sundae? Then also eat foods rich in omega-3 fatty acids, like salmon, walnuts and flaxseeds, or take a daily DHA capsule. Gomez-Pinilla recommends one gram of DHA per day.

 

“Our findings suggest that consuming DHA regularly protects the brain against fructose’s harmful effects,” said Gomez-Pinilla. “It’s like saving money in the bank. You want to build a reserve for your brain to tap when it requires extra fuel to fight off future diseases.”

 

 

Source: University of California, Los Angeles

 

Published on 18th May 2012

 

UCLA scientists unlock mystery of how ‘handedness’ arises

Achiral triangles form chiral super-structures

 


 

 

 

Colored patches represent parallelogram outlines around pairs of triangles that have formed chiral super-structures. Parallelograms having different “handedness” and orientations are color-coded and superimposed over each other. (Credit: Thomas G. Mason and Kun Zhao)

 

 

 

The overwhelming majority of proteins and other functional molecules in our bodies display a striking molecular characteristic: They can exist in two distinct forms that are mirror images of each other, like your right hand and left hand. Surprisingly, each of our bodies prefers only one of these molecular forms.

 

This mirror-image phenomenon — known as chirality or “handedness” — has captured the imagination of a UCLA research group led by Thomas G. Mason, a professor of chemistry and physics and a member of the California NanoSystems Institute at UCLA.

 

Mason has been exploring how and why chirality arises, and his newest findings on the physical origins of the phenomenon were published May 1 in the journal Nature Communications.

 

“Objects like our hands are chiral, while objects like regular triangles are achiral, meaning they don’t have a handedness to them,” said Mason, the senior author of the study. “Achiral objects can be easily superimposed on top of one another.”

 

Why many of the important functional molecules in our bodies almost always occur in just one chiral form when they could potentially exist in either is a mystery that has confounded researchers for years.

 

“Our bodies contain important molecules like proteins that overwhelmingly have one type of chirality,” Mason said. “The other chiral form is essentially not found. I find that fascinating. We asked, ‘Could this biological preference of a particular chirality possibly have a physical origin?’”

 

In addressing this question, Mason and his team sought to discover how chirality occurs in the first place. Their findings offer new insights into how the phenomenon can arise spontaneously, even with achiral building blocks.

 

Mason and his colleagues used a manufacturing technique called lithography, which is the basis for making computer chips, to make millions of microscale particles in the shape of achiral triangles. In the past, Mason has used this technique to “print” particles in a wide variety of shapes, and even in the form of letters of the alphabet.

 

Using optical microscopy, the researchers then studied very dense systems of these lithographic triangular particles. To their surprise, they discovered that the achiral triangles spontaneously arranged themselves to form two-triangle “super-structures,” with each super-structure exhibiting a particular chirality.
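
A simple way to make the notion of a chiral pair concrete: in two dimensions, a configuration's handedness can be labeled by the sign of a cross product, which flips under mirror reflection but not under rotation or translation. The Python sketch below assigns such a label to a two-triangle pair from three reference points; the choice of reference points and the sign convention are ours for illustration, not the analysis used in the paper.

import numpy as np

def handedness(center_a, center_b, vertex_a):
    # +1 or -1 label for a two-triangle pair, from the 2D cross product of
    # (center_b - center_a) and (vertex_a - center_a). Reflection flips the
    # sign; rotation and translation do not.
    u = np.asarray(center_b, float) - np.asarray(center_a, float)
    v = np.asarray(vertex_a, float) - np.asarray(center_a, float)
    return np.sign(u[0] * v[1] - u[1] * v[0])

pair = dict(center_a=(0.0, 0.0), center_b=(1.0, 0.3), vertex_a=(0.2, 0.5))
mirror = {k: (x, -y) for k, (x, y) in pair.items()}  # reflect across the x-axis
print(handedness(**pair), handedness(**mirror))      # 1.0 -1.0: opposite handedness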

 

In the image that accompanies this article, the colored outlines in the field of triangles indicate chiral super-structures having particular orientations.

 

So what is causing this phenomenon to occur? Entropy, says Mason. His group has shown for the first time that chiral structures can originate from physical entropic forces acting on uniform achiral particles.

 

“It’s quite bizarre,” Mason said. “You’re starting with achiral components — triangles — which undergo Brownian motion and you end up with the spontaneous formation of super-structures that have a handedness or chirality. I would never have anticipated that in a million years.”

 

Entropy is usually thought of as a disordering force, but that doesn’t capture its subtler aspects. In this case, when the triangular particles are diffusing and interacting at very high densities on a flat surface, each particle can actually maximize its “wiggle room” by becoming partially ordered into a liquid crystal (a phase of matter between a liquid and a solid) made out of chiral super-structures of triangles.

 

“We discovered that just two physical ingredients — entropy and particle shape — are enough to cause chirality to appear spontaneously in dense systems,” Mason said. “In my 25 years of doing research, I never thought that I would see chirality occur in a system of achiral objects driven by entropic forces.”

 

As for the future of this research, “We are very interested to see what happens with other shapes and if we can eventually control the chiral formations that we see occurring here spontaneously,” he said.

 

“To me, it’s intriguing, because I think about the chiral preference in biology,” Mason added. “How did this chiral preference happen? What are the minimum ingredients for that to occur? We’re learning some new physical rules, but the story in biology is far from complete. We have added another chapter to the story, and I’m amazed by these findings.”

 

A message board accompanies the publication on the Nature Communications website, serving as a forum for interactive discussion.

 

This research was funded by the University of California. Kun Zhao, a postdoctoral researcher in Mason’s laboratory, made many key contributions, including fabricating the triangle particles, creating the two-dimensional system of particles, performing the optical microscopy experiments, carrying out extensive particle-tracking analysis and interpreting the results.

 

Along with Mason, co-author Robijn Bruinsma, a UCLA professor of theoretical physics and a member of the California NanoSystems Institute at UCLA, contributed to the understanding of the chiral symmetry breaking and the liquid crystal phases.

 

 


Source: University of California, Los Angeles

 

Published on 12th May 2012

 

 

 

Plastic Trash Altering Ocean Habitats, Scripps Study Shows

 

A 100-fold upsurge in human-produced plastic garbage in the ocean is altering habitats in the marine environment, according to a new study led by a graduate student researcher at Scripps Institution of Oceanography at UC San Diego.

 


 

In 2009 an ambitious group of graduate students led the Scripps Environmental Accumulation of Plastic Expedition (SEAPLEX) to the North Pacific Ocean Subtropical Gyre aboard the Scripps research vessel New Horizon. During the voyage the researchers, who concentrated their studies a thousand miles west of California, documented an alarming amount of human-generated trash, mostly broken-down bits of plastic the size of a fingernail, floating across thousands of miles of open ocean.

 

At the time the researchers didn’t have a clear idea of how such trash might be affecting the ocean environment. But a new study published in the May 9 online issue of the journal Biology Letters reveals that plastic debris in the area popularly known as the “Great Pacific Garbage Patch” has increased 100-fold over the past 40 years, changing the natural habitat of animals such as the marine insect Halobates sericeus. These “sea skaters” or “water striders”—relatives of pond water skaters—inhabit the water surface and lay their eggs on flotsam (floating objects); natural egg-laying surfaces include seashells, seabird feathers, tar lumps and pumice. In the new study, researchers found that sea skaters have exploited the influx of plastic garbage as new surfaces for their eggs, leading to a rise in the insect’s egg densities in the North Pacific Subtropical Gyre.

 

Such an increase, documented for the first time in a marine invertebrate (animal without a backbone) in the open ocean, may have consequences for animals across the marine food web, such as crabs that prey on sea skaters and their eggs.

 

“This paper shows a dramatic increase in plastic over a relatively short time period and the effect it’s having on a common North Pacific Gyre invertebrate,” said Scripps graduate student Miriam Goldstein, lead author of the study and chief scientist of SEAPLEX, a UC Ship Funds-supported voyage. “We’re seeing changes in this marine insect that can be directly attributed to the plastic.”

 

The new study follows a report published last year by Scripps researchers in the journal Marine Ecology Progress Series showing that nine percent of the fish collected during SEAPLEX contained plastic waste in their stomachs. That study estimated that fish in the intermediate ocean depths of the North Pacific Ocean ingest plastic at a rate of roughly 12,000 to 24,000 tons per year.

 

The Goldstein et al. study compared changes in small plastic abundance between 1972–1987 and 1999–2010, using historical samples from the Scripps Pelagic Invertebrate Collection together with data from SEAPLEX, a 2010 NOAA Ship Okeanos Explorer cruise, information from the Algalita Marine Research Foundation and various published papers.

 

In April, researchers with the Instituto Oceanográfico in Brazil published a report that eggs of Halobates micans, another species of sea skater, were found on many plastic bits in the South Atlantic off Brazil.

 

“Plastic only became widespread in the late ’40s and early ’50s, but now everyone uses it, and over a 40-year range we’ve seen a dramatic increase in ocean plastic,” said Goldstein. “Historically we have not been very good at stopping plastic from getting into the ocean, so hopefully in the future we can do better.”

 

Coauthors of the study include Marci Rosenberg, a student at UCLA, and Scripps Research Biologist Emeritus Lanna Cheng.

 

Funding for SEAPLEX was provided by the University of California Ship Funds, an innovative program that allows a new generation of scientists to gain valuable scientific training at sea, Project Kaisei/Ocean Voyages Institute, the Association for Women in Science-San Diego and the National Science Foundation’s (NSF) Integrative Graduate Education and Research Traineeship program. The NOAA Okeanos Explorer Program (2010 Always Exploring expedition) and National Marine Fisheries Service provided support for the 2010 samples. Other study support was provided by Jim and Kris McMillan, Jeffrey and Marcy Krinsk, Lyn and Norman Lear, Ellis Wyer and an anonymous donor. Other support was provided by the California Current Ecosystem (CCE) program, part of NSF’s Long-Term Ecological Research (LTER) program.

 

 

 

Source: University of California, San Diego


Published on 12th May 2012

 

Groundwater pumping leads to sea level rise, cancels out effect of dams

 

As people pump groundwater for irrigation, drinking water, and industrial uses, the water doesn’t just seep back into the ground — it also evaporates into the atmosphere, or runs off into rivers and canals, eventually emptying into the world’s oceans. This water adds up, and a new study calculates that by 2050, groundwater pumping will cause a global sea level rise of about 0.8 millimeters per year.

 


 

“Other than ice on land, the excessive groundwater extractions are fast becoming the most important terrestrial water contribution to sea level rise,” said Yoshihide Wada, with Utrecht University in the Netherlands and lead author of the study. In the coming decades, he noted, groundwater contributions to sea level rise are expected to become as significant as those of melting glaciers and ice caps outside of Greenland and the Antarctic.

 

Between around 1970 and 1990, sea level rise caused by groundwater pumping was cancelled out as people built dams, trapping water in reservoirs so the water wouldn’t empty into the sea, Wada said. His research shows that starting in the 1990s, that changed as populations started pumping more groundwater and building fewer dams.

 

The researchers looked not only at the contribution of groundwater pumping, which they had investigated before, but also at other factors that influence the amount of terrestrial water entering the oceans, including marsh drainage, forest clearing, and new reservoirs. Wada and his colleagues calculate that by mid-century, the net effect of these additional factors is an additional 0.05 mm per year of annual sea level rise, on top of the contribution from groundwater pumping alone.

 

The research team’s article is being published today in Geophysical Research Letters, a journal of the American Geophysical Union.

 

The last report of the United Nations Intergovernmental Panel on Climate Change in 2007 addressed the effect on sea level rise of melting ice on land, including glaciers and ice caps, Wada said. But it didn’t quantify the future contribution from other terrestrial water sources, such as groundwater, reservoirs, wetlands and more, he said, because the report’s authors thought the estimates for those sources were too uncertain.

 

“They assumed that the positive and negative contribution from the groundwater and the reservoirs would cancel out,” Wada said. “We found that wasn’t the case. The contribution from the groundwater is going to increase further, and outweigh the negative contribution from reservoirs.”

 

In the current study, the researchers estimated the impact of groundwater depletion since 1900 using data from individual countries on groundwater pumping, model simulations of groundwater recharge, and reconstructions of how water demand has changed over the years. They also compared and corrected those estimates with observations from sources such as the GRACE satellite, which uses gravity measurements to determine variations in groundwater storage.

 

With these groundwater depletion rates, Wada and his colleagues estimate that in 2000, people pumped about 204 cubic kilometers (49 cubic miles) of groundwater, most of which was used for irrigation. Most of this, in turn, evaporates from plants, enters the atmosphere and rains back down. Taking into account the seepage of groundwater back into the aquifers, as well as evaporation and runoff, the researchers estimated that groundwater pumping resulted in sea level rise of about 0.57 mm in 2000 — much greater than the 1900 annual sea level rise of 0.035 mm.
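
The 2000 figure is easy to sanity-check with a back-of-envelope calculation: if the net pumped volume ends up in the ocean, the rise is simply volume divided by ocean surface area. The short Python check below assumes a standard ocean area of about 361 million square kilometers; the study's own bookkeeping of seepage, evaporation and runoff is of course more detailed.

OCEAN_AREA_KM2 = 3.61e8   # approximate global ocean surface area
pumped_km3 = 204.0        # groundwater pumped in 2000, per the study

rise_mm = pumped_km3 / OCEAN_AREA_KM2 * 1e6   # km -> mm
print(f"Equivalent sea level rise: {rise_mm:.2f} mm")  # ~0.57 mm, matching the study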

 

The researchers also projected groundwater depletion, reservoir storage, and other impacts for the rest of the century, using climate models and projected population growth and land use changes. The increase in groundwater depletion between 1900 and 2000 is due mostly to increased water demands, the researchers find. But the increase projected between 2000 and 2050 is mostly due to climate-related factors like decreased surface water availability and irrigated agricultural fields that dry out faster in a warmer climate.

 

If things continue as projected, Wada estimates that by 2050, the net, cumulative effect of these non-ice, land-based water sources and reservoirs — including groundwater pumping, marsh drainage, dams, and more — will have added 31 mm to sea level rise since 1900.

 

The new study assumes that, where there is groundwater, people will find a way to extract it, Wada said, but some of his colleagues are investigating the limits of groundwater extraction. One way to decrease groundwater’s contribution to sea level rise, he noted, is to improve water efficiency in agriculture — to grow more with less groundwater.

 

 

 

Source: American Geophysical Union

 

 

Published on 10th May 2012

 

Advanced genetic screening method may speed vaccine development

 

Infectious diseases—both old and new—continue to exact a devastating toll, causing some 13 million fatalities per year around the world.


Vaccines remain the best line of defense against deadly pathogens. Now Kathryn Sykes and Stephen Johnston, researchers at Arizona State University’s Biodesign Institute, along with co-author Michael McGuire from the University of Texas Southwestern Medical Center, are using clever functional screening methods to attempt to speed new vaccines into production that are both safer and more potent.

 

In a recent study appearing in the journal Proteome Science, the group used high-throughput methods to identify a modulator of immune activity that exists naturally in an unusual pathogen belonging to the Poxviridae family of viruses.

 

Parapoxvirus infection causes immune cell accumulation at the site of infection; direct screening in the host for this biological activity enabled the isolation of an immunomodulator—labeled B2.  Indeed, B2 by itself causes immune cell accumulation at the site of skin injection. When added to a traditional influenza vaccine, B2 improves the vaccine’s protective capacity. Furthermore, the immunomodulator also demonstrated the ability to shrink the size of cancerous tumors, even in the absence of any accompanying specific antigen.

 

In the past, the process of vaccine discovery involved the random selection of naturally attenuated strains of viruses and bacteria, which were found to provide protection in humans. Examples of this approach include the use of vaccinia to protect against smallpox and attenuated Mycobacterium bovis (BCG) to protect against tuberculosis.

 

In recent years, many vaccines have been developed using only selected portions of a given pathogen to confer immunity. These so-called subunit vaccines have several advantages over whole-pathogen vaccines. Genetic components that allow a given pathogen to elude immune detection, for example, may be screened out, as well as any factors causing unwanted vaccine side effects. Through careful screening, just those elements responsible for eliciting protective immune responses in the host can be extracted from the pathogen and reassembled into an effective, safer subunit vaccine.

 

In practice, the process of narrowing the field of promising subunit candidates from the whole genome of a pathogen has often been time-consuming, laborious and perplexing. In the current study, the group extends its earlier strategy, known as expression library immunization, into a scheme for finding the protein-encoding segments—known as open reading frames (ORFs)—of a pathogenic genome that have any biological function of interest.

 

This simple, yet powerful technique uses the host’s immune system itself to rapidly reduce any pathogenic genome (viral, fungal, bacterial or parasitic) to a handful of antigens capable of conferring protection in the host.

 

The advantage of this in vivo technique is that it offers a means of rapidly screening entire genomes, returning only those candidates that display the desired immunogenic traits. The mode of entry of vaccines designed in this way closely resembles the natural infection process of host cells—an improvement over live attenuated vaccines.

 

This promising approach has been used effectively to engineer a vaccine against hepatitis and may provide a new avenue for the development of protective agents against pathogens that have thus far eluded traditional vaccine efforts, including HIV and Ebola.

 

“We had developed a method for screening for protective subunits against a specific disease,” Sykes says. “However, this type of safer vaccine design is notoriously less potent than whole-pathogen designs. What we needed was a method to find generally useful vaccine components that would serve to enhance and control immunity.”

 

The group chose the pathogen Parapoxvirus ovis (known as the Orf virus) for the current set of experiments, in which expression library immunization techniques were used to screen for an immunogenic factor buried in the pathogen’s genome.

 

Parapoxvirus ovis causes a highly infectious disease known as Orf, which is prevalent in sheep and goats and may be transmitted cutaneously to humans handling these animals, causing pustular lesions and scabs.

 

Once the group had sequenced the full genome of parapoxvirus, PCR was used to amplify all the viral open reading frames, which code for all of the virus’s proteins. Each ORF was compiled into a unique high-throughput expression construct, together forming a library of genomic components, and these constructs were randomly distributed into sub-library pools. The pools were directly delivered into sets of mice for in vivo expression, and functional testing for the desired activity identified B2 as the immune cell accumulator.
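
The pooling step is what makes a genome-wide in vivo screen tractable: instead of injecting each ORF into its own set of mice, pools are tested and only positive pools are subdivided further. The Python sketch below simulates that logic as a simple group-testing search; the library size, pool sizes and single-hit assumption are illustrative choices of ours, not the design used in the study.

import random

def assay(pool, active_orf):
    # Stand-in for the in vivo readout (immune cell accumulation at the
    # injection site): a pool tests positive iff it contains the active ORF.
    return active_orf in pool

def pooled_screen(orfs, active_orf):
    # Repeatedly split the remaining candidates into pools and keep a
    # positive pool, until a single ORF is isolated.
    candidates, tests = list(orfs), 0
    while len(candidates) > 1:
        size = max(1, len(candidates) // 4)
        random.shuffle(candidates)
        for i in range(0, len(candidates), size):
            pool = candidates[i:i + size]
            tests += 1
            if assay(pool, active_orf):
                candidates = pool
                break
    return candidates[0], tests

orfs = [f"ORF{i:03d}" for i in range(1, 131)]   # illustrative 130-ORF library
hit, tests = pooled_screen(orfs, active_orf="ORF042")
print(hit, tests)   # isolates ORF042 in far fewer assays than 130 one-by-one tests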

 

In further experiments, the team co-delivered B2L as an additive or adjuvant for an influenza gene vaccine, to see if it could improve survival rates in mice challenged with the influenza virus. The co-immunized mice indeed displayed full protection against influenza compared with 50 percent protection of the control group, immunized with influenza vaccine alone.

 

In addition to infectious agents like Orf, non-infectious diseases including cancer may be amenable to vaccine defense. Thus far however, the discovery of tumor-specific antigens has been frustrating. One approach may lie in using non-specific immunogenic factors like B2.

 

In the current study, two forms of cancer were investigated in a mouse model, following administration of B2 alone, in the absence of a disease antigen. The experiments evaluated B2’s ability to enhance survival and shrink tumor size. In the case of an aggressive melanoma, tumor size was significantly reduced and survival rate improved. Administration of B2 to tumors induced by a breast cancer cell line also showed a modest but measurable reduction in tumor size.

 

With the growing popularity of subunit vaccines, the need arises for more effective adjuvants, which may be used to compensate for the reduced immunogenicity of such vaccines compared with their whole-pathogen counterparts. Techniques similar to those applied here to isolate and evaluate B2 could potentially permit the screening of virtually any genome for any gene-encoded activity testable in an organism.

 

The original article was written by Richard Harth.

 

Source: Arizona State University

 

Published on 10th May 2012

 

Scientists Gain New Understanding of Alzheimer’s Trigger

 

A highly toxic beta-amyloid – a protein that exists in the brains of Alzheimer’s disease victims – has been found to greatly increase the toxicity of other more common and less toxic beta-amyloids, serving as a possible “trigger” for the advent and development of Alzheimer’s, researchers at the University of Virginia and German biotech company Probiodrug have discovered.

 


 

The finding, reported in the May 2 online edition of the journal Nature, could lead to more effective treatments for Alzheimer’s. Already, Probiodrug AG, based in Halle, Germany, has completed phase 1 clinical trials in Europe with a small molecule that inhibits the enzyme glutaminyl cyclase, which catalyzes the formation of this hypertoxic version of beta-amyloid.

 

“This form of beta-amyloid, called pyroglutamylated (or pyroglu) beta-amyloid, is a real bad guy in Alzheimer’s disease,” said principal investigator George Bloom, a U.Va. professor of biology and cell biology in the College of Arts & Sciences and School of Medicine, who is collaborating on the study with scientists at Probiodrug. “We’ve confirmed that it converts more abundant beta-amyloids into a form that is up to 100 times more toxic, making this a very dangerous killer of brain cells and an attractive target for drug therapy.”

 

Bloom said the process is similar to various prion diseases, such as mad cow disease or chronic wasting disease, where a toxic protein can “infect” normal proteins that spread through the brain and ultimately destroy it.

 

In the case of Alzheimer’s, severe dementia occurs over the course of years prior to death.

 

“You might think of this pyroglu beta-amyloid as a seed that can further contaminate something that’s already bad into something much worse – it’s the trigger,” Bloom said. Just as importantly, the hypertoxic mixtures that are seeded by pyroglu beta-amyloid exist as small aggregates, called oligomers, rather than as much larger fibers found in the amyloid plaques that are a signature feature of the Alzheimer’s brain.

 

And the trigger fires a “bullet,” as Bloom puts it. The bullet is a protein called tau that is stimulated by beta-amyloid to form toxic “tangles” in the brain that play a major role in the onset and development of Alzheimer’s. Using mice bred to have no tau genes, the researchers found that without the interaction of toxic beta-amyloids with tau, the Alzheimer’s cascade cannot begin. The pathway by which pyroglu beta-amyloid induces the tau-dependent death of neurons is now the target of further investigation to understand this important step in the early development of Alzheimer’s disease.

 

“There are two matters of practical importance in our discovery,” Bloom said. “One is the new insight we have into how Alzheimer’s might actually progress – the mechanisms that are important to understand if we are to try to prevent it from happening; and second, it provides a lead into how to design drugs that might prevent this kind of beta-amyloid from building up in the first place.”

 

Said study co-author Hans-Ulrich Demuth, a biochemist and chief scientific officer at Probiodrug, “This publication further adds significant evidence to our hypothesis about the critical role pyroglu beta-amyloid plays in the initiation of Alzheimer’s Disease. For the first time we have found a clear link in the relationship between pyroglu beta-amyloid, oligomer formation and tau protein in neuronal toxicity.”

 

Bloom and his collaborators are now looking for other proteins that are needed for pyroglu beta-amyloid to become toxic. Any such proteins they discover are potential targets for the early diagnosis and/or treatment of Alzheimer’s disease.

 

 

 

Source: University of Virginia

 

Published on 3rd May 2012

 

 


 

 

Tiny ‘spherules’ reveal details about Earth’s asteroid impacts

 


 

This spherule sample was found in Western Australia and formed 2.63 billion years ago in the aftermath of a large impact. (Credit: Oberlin College photo/Bruce M. Simonson)

 

Researchers are learning details about asteroid impacts going back to the Earth’s early history by using a new method for extracting precise information from tiny “spherules” embedded in layers of rock.


The spherules were created when asteroids crashed into the Earth, vaporizing rock that expanded into space as a giant vapor plume. Small droplets of molten and vaporized rock in the plume condensed and solidified, falling back to Earth as a thin layer. The round or oblong particles were preserved in layers of rock, and now researchers have analyzed them to record precise information about asteroids impacting Earth from 3.5 billion to 35 million years ago.


“What we have done is provide the foundation for understanding how to interpret the layers in terms of the size and velocity of the asteroid that made them,” said Jay Melosh, an expert in impact cratering and a distinguished professor of earth and atmospheric sciences, physics and aerospace engineering at Purdue University.


The findings, which support a theory that the Earth endured an especially heavy period of asteroid bombardment early in its history, are detailed in a research paper appearing online in the journal Nature on Wednesday (April 25). The paper was written by Purdue physics graduate student Brandon Johnson and Melosh. The findings, based on geologic observations, support a theoretical study in a companion paper in Nature by researchers at the Southwest Research Institute in Boulder, Colo.


The period of heavy asteroid bombardment – from 4.2 to 3.5 billion years ago – is thought to have been influenced by changes in the early solar system that altered the trajectory of objects in an asteroid belt located between Mars and Jupiter, sending them on a collision course with Earth.


“That’s the postulate, and this is the first real solid evidence that it actually happened,” Melosh said. “Some of the asteroids that we infer were about 40 kilometers in diameter, much larger than the one that killed off the dinosaurs about 65 million years ago that was about 12-15 kilometers. But when we looked at the number of impactors as a function of size, we got a curve that showed a lot more small objects than large ones, a pattern that matches exactly the distribution of sizes in the asteroid belt. For the first time we have a direct connection between the crater size distribution on the ancient Earth and the sizes of asteroids out in space.”


Because craters are difficult to study directly, impact history must be inferred either by observations of asteroids that periodically pass near the Earth or by studying craters on the moon. Now, the new technique using spherules offers a far more accurate alternative to chronicle asteroid impacts on Earth, Melosh said.


“We can look at these spherules, see how thick the layer is, how big the spherules are, and we can infer the size and velocity of the asteroid,” Melosh said. “We can go back to the earliest era in the history of the Earth and infer the population of asteroids impacting the planet.”


For asteroids larger than about 10 kilometers in diameter, the spherules are deposited in a global layer.
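
The scaling behind that statement can be illustrated with a crude estimate: spread a condensed-rock volume comparable to the impactor's own volume evenly over the whole globe and see how thick the layer comes out. The Python sketch below does exactly that; it is our simplification for illustration only, since the published model also folds in impact velocity and the physics of the vapor plume.

import math

EARTH_SURFACE_M2 = 5.1e14   # Earth's total surface area

def global_layer_thickness_mm(impactor_diameter_km):
    # Crude estimate: condensed spherule volume ~ impactor volume,
    # spread uniformly over the globe (our simplifying assumption).
    r_m = impactor_diameter_km * 1e3 / 2.0
    volume_m3 = (4.0 / 3.0) * math.pi * r_m ** 3
    return volume_m3 / EARTH_SURFACE_M2 * 1e3   # metres -> millimetres

for d_km in (10, 40):
    print(f"{d_km} km impactor -> ~{global_layer_thickness_mm(d_km):.1f} mm layer")
# 10 km -> ~1.0 mm; 40 km -> ~65.7 mm: millimetre- to centimetre-scale layers,
# the same order of magnitude as the spherule layers found in the rock record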


“Some of these impacts were several times larger than the Chicxulub impact that killed off the dinosaurs 65 million years ago,” Johnson said. “The impacts may have played a large role in the evolutionary history of life. The large number of impacts may have helped simple life by introducing organics and other important materials at a time when life on Earth was just taking hold.”


A 40-kilometer asteroid would have wiped out everything on the Earth’s surface, whereas the one that struck 65 million years ago killed only land animals weighing more than around 20 kilograms.


“Impact craters are the most obvious indication of asteroid impacts, but craters on Earth are quickly obscured or destroyed by surface weathering and tectonic processes,” Johnson said. “However, the spherule layers, if preserved in the geologic record, provide information about an impact even when the source crater cannot be found.”


The Purdue researchers studied the spherules using computer models that harness mathematical equations developed originally to calculate the condensation of vapor.


“There have been some new wrinkles in vapor condensation modeling that motivated us to do this work, and we were the first to apply it to asteroid impacts,” Melosh said.


The spherules are about a millimeter in diameter.


The researchers also are studying a different type of artifact similar to spherules but found only near the original impact site. Whereas the globally distributed spherules come from the condensing vaporized rock, these “melt droplets” are from rock that’s been melted and not completely vaporized.


“Before this work, it was not possible to distinguish between these two types of formations,” Melosh said. “Nobody had established criteria for discriminating between them, and we’ve done that now.”


One of the authors of the Southwest Research Institute paper, David Minton, is now an assistant professor of earth and atmospheric sciences at Purdue.


Findings from the research may enable Melosh’s team to enhance an asteroid impact effects calculator he developed to estimate what would happen if asteroids of various sizes were to hit the Earth. The calculator, “Impact: Earth!” allows anyone to calculate potential comet or asteroid damage based on the object’s mass.


The research has been funded by NASA.

 

 

 

Source: Purdue University

 

Published on 3rd May 2012

 

24 new species discovered on Caribbean islands are close to extinction


 


 

 

An Anguilla Bank skink. Blair Hedges and his team have discovered and scientifically named 24 new species of lizards known as skinks. (Credit: Karl Questel)

 

 

 

In a single new scientific publication, 24 new species of lizards known as skinks, all from islands in the Caribbean, have been discovered and scientifically named. According to Blair Hedges, professor of biology at Penn State University and the leader of the research team, half of the newly added skink species already may be extinct or close to extinction, and all of the others on the Caribbean islands are threatened with extinction. The researchers found that the loss of many skink species can be attributed primarily to predation by the mongoose — an invasive predatory mammal that was introduced by farmers to control rats in sugarcane fields during the late 19th century. The research team reports on the newly discovered skinks in a 245-page article published today (April 30) in the journal Zootaxa.

About 130 species of reptiles from all over the world are added to the global species count each year in dozens of scientific articles. However, not since the 1800s have more than 20 reptile species been added at one time. Primarily through examination of museum specimens, the team identified a total of 39 species of skinks from the Caribbean islands, including six species currently recognized, and another nine named long ago but considered invalid until now. Hedges and his team also used DNA sequences, but most of the taxonomic information, such as counts and shapes of scales, came from examination of the animals themselves.

“Now, one of the smallest groups of lizards in this region of the world has become one of the largest groups,” Hedges said. “We were completely surprised to find what amounts to a new fauna, with co-occurring species and different ecological types.”

He said some of the new species are six times larger in body size than other species in the new fauna.

Hedges also explained that these New World skinks, which arrived in the Americas about 18 million years ago from Africa by floating on mats of vegetation, are unique among lizards in that they produce a human-like placenta, which is an organ that directly connects the growing offspring to the maternal tissues that provide nutrients.

“While there are other lizards that give live birth, only a fraction of the lizards known as skinks make a placenta and gestate offspring for up to one year,” Hedges said.

He also speculated that the lengthy gestational period may have given predators a competitive edge over skinks, since pregnant females are slower and more vulnerable.

“The mongoose is the predator we believe is responsible for many of the species’ close-to-extinction status in the Caribbean,” Hedges said. “Our data show that the mongoose, which was introduced from India in 1872 and spread around the islands over the next three decades, has nearly exterminated this entire reptile fauna, which had gone largely unnoticed by scientists and conservationists until now.”

According to Hedges, the “smoking gun” is a graph included in the scientific paper showing a sharp decline in skink populations soon after the introduction of the mongoose. Hedges explained that the mongoose originally was brought to the New World to control rats, which had become pests in the sugarcane fields of Cuba, Hispaniola, Puerto Rico, Jamaica and the Lesser Antilles. While this strategy did help to control infestations of some pests, such as the Norway rat, it also had the unintended consequence of reducing almost all skink populations.

“By 1900, less than 50 percent of those mongoose islands still had their skinks, and the loss has continued to this day,” Hedges said.

This newly discovered skink fauna will dramatically increase the number of reptiles categorized as “critically endangered” by the International Union for Conservation of Nature in its “Red List of Threatened Species,” which is recognized as the most comprehensive database evaluating the endangerment status of various plant and animal species.

“According to our research, all of the skink species found only on Caribbean islands are threatened,” Hedges said. “That is, they should be classified in the Red List as either vulnerable, endangered, or critically endangered. Finding that all species in a fauna are threatened is unusual, because only 24 percent of the 3,336 reptile species listed in the Red List have been classified as threatened with extinction. Most of the 9,596 named reptile species have yet to be classified in the Red List.”

Hedges explained that there are two reasons why such a large number of species went unnoticed for so many years, in a region frequented by scientists and tourists.

“First, Caribbean skinks already had nearly disappeared by the start of the 20th century, so people since that time rarely have encountered them and therefore have been less likely to study them,” he said. “Second, the key characteristics that distinguish this great diversity of species have been overlooked until now.”

Hedges also noted that many potential new species of animals around the world have been identified in recent years with DNA data. However, much more difficult is the task of following up DNA research with the work required to name new species and to formally recognize them as valid, as this team did with Caribbean skinks.

The other member of the research team, Caitlin Conn, now a researcher at the University of Georgia and formerly a biology major in Penn State’s Eberly College of Science and a student in Penn State’s Schreyer Honors College at the time of the research, added that researchers might be able to use the new data to plan conservation efforts, to study the geographic overlap of similar species, and to study in more detail the skinks’ adaptation to different ecological habitats or niches. The research team also stressed that, while the mongoose introduction by humans now has been linked to these reptile declines and extinctions, other types of human activity, especially the removal of forests, are to blame for the loss of other species in the Caribbean.

Funding for the research comes from the National Science Foundation.

 

 

 


Source: Pennsylvania State University

 

Published on 2nd May 2012

 

Compressed sensing allows super-resolution microscopy imaging of live cell structures

 


 


 

 

 

The green crosses mark the locations of single molecules identified with the super-resolution technique. (Credit: Lei Zhu and Bo Huang)


 

 

Researchers from the Georgia Institute of Technology and the University of California, San Francisco have advanced scientists’ ability to view a clear picture of a single cellular structure in motion. By identifying molecules using compressed sensing, the new method provides the needed spatial resolution plus faster temporal resolution than was previously possible.

 

Despite many achievements in spatial resolution in the field of super-resolution microscopy in the past few years, live-cell imaging has remained a challenge because of the need for high temporal resolution.

 

Now, Lei Zhu, assistant professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering, and Bo Huang, assistant professor in UCSF’s Department of Pharmaceutical Chemistry and Department of Biochemistry and Biophysics, have developed an advanced approach using super-resolution microscopy to resolve cellular features an order of magnitude smaller than what could be seen before. This allows the researchers to tap previously inaccessible information and answer new biological questions.

 

The research was published April 22, 2012, in the journal Nature Methods. It is funded by the National Institutes of Health, the UCSF Program for Breakthrough Biomedical Research, a Searle Scholarship and a Packard Fellowship for Science and Engineering.

 

The previous technology, the single-molecule-switching approach to super-resolution microscopy, depends on spreading single-molecule images sparsely across many camera frames, often thousands. It is extremely limited in its temporal resolution and does not provide the ability to follow dynamic processes in live cells.

 

“We can now use our discovery using super-resolution microscopy with seconds or even sub-second temporal resolution for a large field of view to follow many more dynamic cellular processes,” said Zhu. “Much of our knowledge of the life of a cell comes from our ability to see the small structures within it.”

 

Huang noted, “One application, for example, is to investigate how mitochondria, the power house of the cell, interact with other organelles and the cytoskeleton to reshape the structure during the life cycle of the cell.”

 

Light microscopy, especially in its modern form of fluorescence microscopy, is still used frequently by many biologists. However, the authors say, conventional light microscopy has one major limitation: it cannot resolve two objects closer than half the wavelength of light, because of the phenomenon called diffraction. With diffraction, images look blurry and overlapped no matter how high the magnification.

 

“The diffraction limit has long been regarded as one of the fundamental constraints for light microscopy until the recent inventions of super-resolution fluorescence microscopy techniques,” said Zhu. Super-resolution microscopy methods, such as stochastic optical reconstruction microscopy (STORM) or photoactivated localization microscopy (PALM), rely on the ability to record light emission from a single molecule in the sample.

 

Using probe molecules that can be switched between a visible and an invisible state, STORM/PALM determines the position of each molecule of interest. These positions ultimately define a structure.

 

The new finding is significant, said Zhu and Huang, because they have shown that the technology allows for following the dynamics of a microtubule cytoskeleton with a three-second time resolution, which would allow researchers to study the active transports of vesicles and other cargos inside the cell.

 

Using the same optical system and detector as in conventional light microscopy, super-resolution microscopy naturally requires longer acquisition time to obtain more spatial information, leading to a trade-off between its spatial and temporal resolution. In super-resolution microscopy methods based on STORM/PALM, each camera image samples a very sparse subset of probe molecules in the sample.

 

An alternative approach is to increase the density of activated fluorophores so that each camera frame samples more molecules. However, this high density of fluorescent spots causes them to overlap, invalidating the widely used single-molecule localization method.

 

The authors said that a number of methods have been reported recently that can efficiently retrieve single-molecule positions even when the single fluorophore signals overlap. These methods are based on fitting clusters of overlapped spots with a variable number of point-spread functions (PSFs) with either maximum likelihood estimation or Bayesian statistics. The Bayesian method has also been applied to the whole image set.

 

Zhu and Huang instead present a new approach based on global optimization using compressed sensing, which does not involve estimating or assuming the number of molecules in the image. They show that compressed sensing can work at much higher molecule densities than other technologies, and they demonstrate live-cell imaging of fluorescent protein-labeled microtubules with three-second temporal resolution.
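
To make the idea concrete, the Python sketch below sets up a one-dimensional caricature of the problem: candidate molecule positions live on a fine grid, the camera records their overlapping Gaussian point-spread functions, and an L1-regularized least-squares fit (solved here with plain ISTA) recovers a sparse set of positions without ever fixing the number of molecules in advance. The grid size, PSF width, regularization weight and solver are our illustrative choices, not the formulation or parameters used in the Nature Methods paper.

import numpy as np

rng = np.random.default_rng(0)

n = 200            # candidate positions on a fine grid
psf_sigma = 6.0    # Gaussian PSF width in fine-grid pixels (illustrative)

grid = np.arange(n)
A = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / psf_sigma) ** 2)
A /= A.sum(axis=0)          # column j = image of one molecule at grid point j

x_true = np.zeros(n)        # two molecules whose PSFs overlap heavily
x_true[[95, 105]] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy camera frame

# ISTA for min_x 0.5*||A x - y||^2 + lam*||x||_1 (x = molecule amplitudes)
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(n)
for _ in range(5000):
    x = x - step * (A.T @ (A @ x - y))                      # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0)  # soft threshold

print("recovered peaks near:", np.nonzero(x > 0.1)[0])   # indices close to 95 and 105

The sparsity penalty is what lets the fit pull apart overlapping spots that a single-molecule localization routine would reject, which is the essence of why compressed sensing tolerates much higher activation densities.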

 

The STORM experiment used by the authors, with immunostained microtubules in Drosophila melanogaster S2 cells, demonstrated that nearby microtubules can be resolved by compressed sensing using as few as 100 camera frames, whereas they were not discernible by the single-molecule fitting method. They have also performed live STORM on S2 cells stably expressing tubulin fused to mEos2.

 

At the commonly used camera frame rate of 56.4 hertz, a super-resolution movie was constructed with a time resolution of three seconds (169 frames) and a Nyquist resolution of 60 nanometers, much faster than previously reported, said Zhu and Huang. These results show that compressed sensing can enable STORM to monitor live cellular processes with second-scale time resolution, or even sub-second resolution if a faster camera is used.

 

 

 

Source: Georgia Institute of Technology

 

Published on 24th April 2012

 

 

Clinical Decline in Alzheimer’s Requires Plaque and Proteins

 

According to a new study, the neuron-killing pathology of Alzheimer’s disease (AD), which begins before clinical symptoms appear, requires the presence of both amyloid-beta (a-beta) plaque deposits and elevated levels of an altered protein called p-tau.

 


 

Without both, progressive clinical decline associated with AD in cognitively healthy older individuals is “not significantly different from zero,” reports a team of scientists at the University of California, San Diego School of Medicine in the April 23 online issue of the Archives of Neurology.

 

“I think this is the biggest contribution of our work,” said Rahul S. Desikan, MD, PhD, research fellow and resident radiologist in the UC San Diego Department of Radiology and first author of the study.  “A number of planned clinical trials – and the majority of Alzheimer’s studies – focus predominantly on a-beta. Our results highlight the importance of also looking at p-tau, particularly in trials investigating therapies to remove a-beta. Older, non-demented individuals who have elevated a-beta levels, but normal p-tau levels, may not progress to Alzheimer’s, while older individuals with elevated levels of both will likely develop the disease.”

 

The findings also underscore the importance of p-tau as a target for new approaches to treating patients with conditions ranging from mild cognitive impairment (MCI) to full-blown AD. An estimated 5.4 million Americans have AD. It’s believed that 10 to 20 percent of Americans age 65 and older have MCI, a risk factor for AD. Some current therapies appear to delay clinical AD onset, but the disease remains irreversible and incurable.

 

“It may be that a-beta initiates the Alzheimer’s cascade,” said Desikan. “But once started, the neurodegenerative mechanism may become independent of a-beta, with p-tau and other proteins playing a bigger role in the downstream degenerative cascade. If that’s the case, prevention with anti-a-beta compounds may prove efficacious against AD for older, non-demented individuals who have not yet developed tau pathology.  But novel, tau-targeting therapies may help the millions of individuals who already suffer from mild cognitive impairment or Alzheimer’s disease.”

 

The new study involved evaluations of healthy, non-demented elderly individuals participating in the ongoing, multi-site Alzheimer’s Disease Neuroimaging Initiative, or ADNI. Launched in 2003, ADNI is a longitudinal effort to measure the progression of mild cognitive impairment and early-stage AD.

 

The researchers studied samples of cerebrospinal fluid (CSF) taken from ADNI participants.

 

“In these older individuals, the presence of a-beta alone was not associated with clinical decline,” said Anders M. Dale, PhD, professor of radiology, neurosciences, and psychiatry at UC San Diego and senior author of the study. “However, when p-tau was present in combination with a-beta, we saw significant clinical decline over three years.”

 

A-beta proteins have several normal responsibilities, including activating enzymes and protecting cells from oxidative stress. It is not known why a-beta proteins form plaque deposits in the brain. Similarly, the origins of p-tau are not well understood. One hypothesis, according to Desikan, is that a-beta plaque deposits trigger hyperphosphorylation of nearby tau proteins, which normally help stabilize the structure of brain cells. Hyperphosphorylation occurs when phosphate groups attach to a protein in excess numbers, altering its normal function. Hyperphosphorylated tau – or p-tau – can then exacerbate the toxic effects of a-beta plaque upon neurons.

 

The discovery of p-tau’s heightened role in AD neurodegeneration suggests it could be a specific biomarker for the disease before clinical symptoms appear. While high levels of another tau protein – t-tau – in cerebrospinal fluid have been linked to neurologic disorders such as frontotemporal dementia and traumatic brain injury, high levels of p-tau correlate specifically with increased neurofibrillary tangles in brain cells, which are seen predominantly in AD.

 

“These results are in line with another ADNI study of healthy controls and MCI participants that found progressive atrophy in the entorhinal cortex – one of the areas of the brain first affected in AD – only in amyloid-positive individuals who also showed evidence of elevated p-tau levels,” said Linda McEvoy, PhD, assistant professor of radiology and study co-author.

 

“One of the exciting dimensions of this paper was the combined use of cerebrospinal fluid markers and clinical assessments to better elucidate the neurodegenerative process underlying Alzheimer’s disease in individuals who do not yet show clinical signs of dementia,” added co-author James Brewer, MD, PhD, an associate professor of radiology and neurosciences at UC San Diego School of Medicine.  “We do not have an animal model that works very well for studying this disease, so the ability to examine the dynamics of neurodegeneration in living humans is critical.”

 

Nonetheless, the scientists say more research is needed. They note that CSF biomarkers provide only an indirect assessment of amyloid and neurofibrillary pathology and may not fully reflect the underlying biological processes of AD.

 

“This study highlights the complex interaction of multiple pathologies that likely contribute to the clinical symptomatology of Alzheimer’s disease,” said co-author Reisa Sperling, MD, a neurologist at Massachusetts General Hospital and Brigham and Women’s Hospital. “It suggests we may be able to intervene in the preclinical stages of AD before there is significant neurodegeneration and perhaps prevent the onset of symptoms.”

 

Other co-authors are Wesley K. Thompson, Department of Psychiatry; and Dominic Holland and Paul S. Aisen, Department of Neuroscience, UC San Diego School of Medicine.

 

Funding for this research came, in part, from the National Institutes of Health and the Alzheimer’s Disease Neuroimaging Initiative.

 

 

 

Source: University of California, San Diego


Published on 24th April 2012

 
