
Study Links Gum Disease to HPV Status of Head and Neck Cancer

 

Human papillomavirus (HPV), once associated almost exclusively with cancer of the cervix, is now linked to head and neck cancer. According to a new University at Buffalo study just published in the Archives of Otolaryngology — Head & Neck Surgery, a JAMA publication, gum disease is associated with increased odds of tumors being HPV-positive.

 


 

Mine Tezal, DDS, PhD, assistant professor of oral biology in the UB School of Dental Medicine and principal investigator on the study, led a team of UB scientists that evaluated data from 124 patients diagnosed with primary head and neck squamous cell carcinoma (HNSCC) between 1999 and 2007.

 

“The aim of the study was to test the association between periodontitis, a persistent inflammatory process, and the HPV status of HNSCC,” she said.

 

Of the 124 tumor samples Tezal and her team studied, 50 were positive for HPV-16 DNA, and subjects with HPV-positive tumors had significantly more severe periodontitis than subjects with HPV-negative tumors.

 

According to the National Cancer Institute, there has been a steady increase in the prevalence of oropharyngeal cancers in the U.S. since 1973 despite the significant decline in tobacco use since 1965.

 

Tezal notes that this increase has mainly been attributed to oral HPV infection.

 

Understanding the natural history of oral HPV infection and targeting the factors associated with both its acquisition and its persistence, says Tezal, will lead to more effective strategies for prevention as well as treatment.

 

“While there is an effective vaccine for cervical HPV infection if given prior to exposure to the virus (females 9-26; males 9-21), oral HPV infection can be transmitted at or any time after birth, and the target population for a vaccine to prevent oral HPV infection has not yet been defined,” said Tezal.

 

Tezal pointed out that though many previous studies combined periodontitis and dental decay as indicators of poor oral health, dental decay was not significantly linked to tumor-HPV status in the present study.

 

“The fact that only periodontitis was associated with tumor HPV status points to the potential association of inflammation with tumor HPV status,” she said.

 

When Tezal and colleagues started their research about eight years ago, they were looking at the potential association between chronic inflammation and head and neck cancers because the importance of the local oral environment for malignant tumor growth was widely accepted. However, there was no research evaluating the role of local oral factors in the natural history of HNSCC, Tezal said.

 

“The next step in this research will be intervention studies to test whether treating the sources of inflammation, like gum disease, can reduce the acquisition and/or persistence of oral HPV infection and improve the prognosis of HPV-related diseases,” she said.

 

 

 

Source: University at Buffalo

 

Published on 22nd June 2012

 

 


 

Predators Have Outsized Influence Over Habitats

 

A grasshopper’s change in diet to high-energy carbohydrates while being hunted by spiders may affect the way soil releases carbon dioxide into the atmosphere, according to research results published this week in the journal Science.

 


 

Grasshoppers like to munch on nitrogen-rich grass because it stimulates their growth and reproduction.

 

But when spiders enter the picture, grasshoppers cope with the stress from fear of predation by shifting to carbohydrate-rich plants, setting in motion dynamic changes to the ecosystem they inhabit, scientists have found.

 

“Under stressful conditions they go to different parts of the ‘grocery store’ and choose different foods, changing the makeup of the plant community,” said Oswald Schmitz, a co-author of the paper and an ecologist at Yale University.

 

The high-energy, carbohydrate diet also tilts a grasshopper’s body chemistry toward carbon at the expense of nitrogen.

 

So when a grasshopper dies, its carcass breaks down more slowly, thus depriving the soil of high-quality fertilizer and slowing the decomposition of uneaten plants.

 

“This study casts a new light on the importance of predation in natural communities,” said Saran Twombly, program director in the National Science Foundation’s Division of Environmental Biology, which funded the research.

 

“A clever suite of experiments shows that the dark hand of predation extends all the way from altering what prey eat to the nutrients their decomposing bodies contribute to soil.”

 

Microbes in the soil require a lot of nitrogen to function and to produce the enzymes that break down organic matter.

 

“It only takes a slight change in the chemical composition of that animal biomass to fundamentally alter how much carbon dioxide the microbial pool is releasing to the atmosphere while it is decomposing plant organic matter,” said Schmitz.

 

“This shows that animals could potentially have huge effects on the global carbon balance because they’re changing the way microbes respire organic matter.”

 

The researchers found that the organic matter of leaves decomposed between 60 percent and 200 percent faster under stress-free conditions than under stressed conditions, a difference they consider “huge.”

 

“Climate and litter quality are considered the main controls on organic-matter decomposition, but we show that aboveground predators change how soil microbes break down organic matter,” said Mark Bradford, a co-author of the study and also an ecologist at Yale.

 

Schmitz added: “What it means is that we’re not paying enough attention to the control that animals have over what we view as a classically important process in ecosystem functioning.”

 

The researchers took soil from the field, put it in test tubes and ground up grasshopper carcasses obtained from environments either with or without grasshopper predators.

 

They then sprinkled the powder atop the soil, where the microbes digested it.

 

When the grasshopper carcasses were completely decomposed, the researchers added leaf litter and measured the rate of leaf-litter decomposition.

 

The experiment was then replicated in the field at the Yale Myers Forest in northeastern Connecticut.

 

“It was a two-stage process where the grasshoppers were used to prime the soil, then we measured the consequences of that priming,” said Schmitz.

 

The effect of animals on ecosystems is disproportionately larger than their biomass would suggest.

 

“Traditionally people thought that animals had no important role in recycling of organic matter, because their biomass is relatively small compared to the plant material that’s entering ecosystems,” Schmitz said.

 

“We need to pay more attention to the role of animals, however. In an era of biodiversity loss we’re losing many top predators and larger herbivores from ecosystems.”

 

Source: National Science Foundation

 

Published on 18th June 2012

 

Self-Assembling Nanocubes for Next Generation Antennas and Lenses


 

Researchers at the University of California, San Diego Jacobs School of Engineering have developed a technique that enables metallic nanocrystals to self-assemble into larger, complex materials for next-generation antennas and lenses. The metal nanocrystals are cube-shaped and, like bricks or Tetris blocks, spontaneously organize themselves into larger-scale structures with precise orientations relative to one another.  Their findings were published online June 10 in the journal Nature Nanotechnology.


 

 

This research is in the new field of nanoplasmonics, where researchers are developing materials that can manipulate light using structures smaller than the wavelength of light itself. The nanocubes used in this study were less than 0.1 micron across; by comparison, the width of a human hair is about 100 microns. Precise orientation is necessary so that the cubes can confine light (for a nanoscale antenna) or focus light (for a nanoscale lens) at different wavelengths.


“Our findings could have important implications in developing new optical chemical and biological sensors, where light interacts with molecules, and in optical circuitry, where light can be used to deliver information,” said Andrea Tao, a professor in the Department of NanoEngineering at the Jacobs School.  Tao collaborated with nanoengineering professor Gaurav Arya and post-doctoral researcher Bo Gao.


To construct objects like antennas and lenses, Tao’s team is using chemically synthesized metal nanocrystals. The nanocrystals can be synthesized into different shapes to build these structures; in this study, Tao’s team created tiny cubes composed of crystalline silver that can confine light when organized into multi-particle groupings. Confining light into ultra-small volumes could allow optical sensors that are extremely sensitive and that could allow researchers to monitor how a single molecule moves, reacts, and changes with time.


To control how the cubes organize, Tao and her colleagues developed a method to graft polymer chains to the silver cube surfaces that modify how the cubes interact with each other. Normally when objects like cubes stack, they pack side-by-side like Tetris blocks. Using simulations, Tao’s team predicted that placing short polymer chains on the cube surface would cause them to stack normally, while placing long polymer chains would cause the cubes to stack edge-to-edge. The approach is simple, robust, and versatile.


In demonstrating their technique, the researchers created macroscopic films of nanocubes with these two different orientations and showed that the films reflected and transmitted different wavelengths of light.


The research was supported by the National Science Foundation, the Hellman Foundation, and Jacobs School of Engineering at UC San Diego.

 

 

Source: University of California, San Diego


Published on 18th June 2012

 

 


 

 

Butterflies and Bats Reveal Clues About Spread of Infectious Disease

 

There’s a most unusual gym in ecologist Sonia Altizer’s lab at the University of Georgia in Athens. The athletes are monarch butterflies, and their workouts are carefully monitored to determine how parasites impact their flight performance.

 


 

With support from the National Science Foundation (NSF), Altizer and her team study how animal behavior, including long distance migration, affects the spread and evolution of infectious disease. In monarchs, the researchers study a protozoan parasite called Ophryocystis elektroscirrha, or “OE” for short.

 

In Altizer’s lab, the adult butterflies are tethered to a “flying treadmill” and the time and speed of each lap are recorded on a computer. They fly at between two and five miles per hour in this setting. Infected butterflies, on average, fly about 20 percent less well than healthy butterflies. “So, they actually fly shorter total distances, they’ve got slower flight speeds and they lose more weight per distance flown than healthy butterflies,” says Altizer.

 

Up to two billion monarchs migrate every year to central Mexico, where Altizer and her colleagues capture, sample and release hundreds of butterflies each day during the researchers’ field study. “The sound of the wings of the butterflies just whirring past your head is about as good as it gets for a terrestrial ecologist, I think!” says Altizer.

 

Altizer says even a tiny impact from infection on the monarchs’ migration ability could make the difference between survival and death. Her work is providing some details on the differences in how diseases spread in human and animal populations.

 

“General models for predicting the spread of infectious disease largely ignore behavioral changes,” says Alan Tessier, program director in the Division of Environmental Biology within the NSF Biological Sciences Directorate. “This research addresses a critical gap in understanding how infection changes the movement behavior of animals from the scale of individuals to the dynamics of populations spread across a landscape. Lessons learned from this work will be broadly relevant to disease spread in other species, including humans.”

 

“We know that for humans, travel and migration can help spread disease. With more and more air travel, a person can get on a plane and move a virus to the other side of the world in a matter of hours. But, many animal species have to undertake these really strenuous long-distance journeys on their own power. And, if these journeys are really costly, animals that are heavily infected are probably not going to make it,” explains Altizer. “So, we can think about it from our own perspective. If we had to run a marathon with the flu, we probably wouldn’t do very well. The animals that are the most heavily infected simply can’t make a long-distance journey.”

 

Take the migration away and what’s left are smaller remnant populations that don’t migrate. “We could actually see infections build up in those populations and that could possibly increase the risk of pathogens jumping over into people and their domesticated animals,” says Altizer.

 

Human activities, from logging and other habitat destruction to herbicide use, are disrupting longstanding migration patterns for monarchs and other animals, according to Altizer. Over a decade ago, around half of the monarch population that overwinters in Mexico originated from the corn belt of the United States, where their milkweed host plants commonly grew in agricultural fields and roadsides. Altizer says that today monarch populations in those same areas are declining, in large part due to transgenic crops that are tolerant of herbicides. This allows farmers to more effectively eliminate weeds, including milkweeds, thus removing a large fraction of the monarchs’ former habitat.

 

Another aspect of this research builds on the fascination many people have for these beautiful insects and their arduous migratory journey.

 

“Monarchs have this amazing annual lifecycle where they’ve got three or four short-lived generations that breed during the summer months, and then they’ve got one long-lived generation where butterflies that emerge at the end of the summer live for eight, nine, sometimes 10 months. It’s that generation that travels all the way to the overwintering sites and then re-migrates north in the spring to re-colonize the southern part of their breeding range. So, it’s the great, great, grand-progeny of that generation that will make the journey the following year,” explains Altizer.

 

Graduate student Dara Satterfield processes data sent in from volunteers who sample the butterflies in their backyards. She’s looking for OE infection.

 

“The Monarch Health Program is our citizen science project. People from the eastern part of the U.S. and Canada send us samples from monarchs in their own backyard,” says Satterfield. “The citizen scientists put a piece of clear tape on the monarch’s abdomen and that will pick up spores and scales from the monarch. Then they place those tape samples on index cards and mail them to us. We can tell from the samples whether or not the monarch in their yard has a parasite.”

 

“Monarchs, like a lot of other migratory species, face complex conservation challenges because they have very different habitats at different times of the year and they cross international boundaries. We need to identify the threats and protect them,” says Altizer.

 

“There’s also a need to study pathogen dynamics in other migratory species, as well as how human activities affect those dynamics,” she adds.

 

Vampire bats may not have the beauty factor that monarch butterflies do, but the bats are important in Altizer’s study of how the spread of infectious diseases by animals is affected by human activities.

 

In Peru, University of Georgia postdoctoral researcher Daniel Streicker focuses on these bats, whose populations have exploded in recent years as ranchers have introduced livestock into the Andes and the Amazon. More bloodthirsty bats might mean more rabies.

 

“One of the main goals we have is to try to understand what determines the frequency and intensity of rabies outbreaks and what we can do about it,” says Streicker.

 

That fieldwork involves capturing vampire bats to determine what’s on their menu.

 

“We’re catching bats in the Amazon jungle, particularly in areas where there aren’t a lot of livestock. These are usually areas where bats are reported to bite people, but we don’t know what else they’re feeding on,” continues Streicker. “So, we catch these bats after they’ve taken a blood meal and we extract the stomach contents in a way that’s fairly non-invasive, and then we can use genetic typing to figure out what species were actually being fed upon.”

 

While vampire bats have a hyped Hollywood reputation for danger, Streicker says there are things people can learn from them about rabies and other diseases. For example, some bats have antibodies against rabies, so they appear healthy even though they have been exposed. This runs counter to the common wisdom that rabies is universally fatal to all mammals. If antibodies protect these bats from future exposures, that could fundamentally change our view of how rabies persists in wild bat populations. Understanding how bats survive these exposures could also eventually help researchers develop a treatment. Vampire bat saliva has already been used to develop a medicine for treating stroke victims.

 

Streicker and Altizer say that the results of this study will improve rabies control efforts in Latin America, where vampire bats cause most human and livestock cases. More generally, because deforestation and livestock rearing are intensifying in much of the developing world, a better understanding of how wildlife-pathogen interactions will respond to such changes is urgently needed.

 

 

 

Source: National Science Foundation

 

Published on 12th June 2012

 

Complex world of microbes fine-tunes body weight

 

Microorganisms in the human gastrointestinal tract form an intricate, living fabric made up of some 500 to 1,000 distinct bacterial species (in addition to other microbes). Recently, researchers have begun to untangle the subtle role these diverse life forms play in maintaining health and regulating weight.

 


 

In a new study appearing in the journal Nutrition in Clinical Practice, researcher Rosa Krajmalnik-Brown and her colleagues at the Swette Center for Environmental Biotechnology at Arizona State University’s Biodesign Institute in collaboration with John DiBaise from the Division of Gastroenterology at the Mayo Clinic, review the role of gut microbes in nutrient absorption and energy regulation.

 

According to Krajmalnik-Brown, “Malnutrition may manifest as either obesity or undernutrition, problems of epidemic proportion worldwide.  Microorganisms have been shown to play an important role in nutrient and energy extraction and energy regulation although the specific roles that individual and groups/teams of gut microbes play remain uncertain.”

 

The study outlines the growth of varied microbial populations—from birth onwards—highlighting their role in extracting energy from the diet. The composition of microbial communities is shown to vary with age, body weight, and variety of food ingested, as well as in response to bariatric surgery for obesity, use of antibiotics and many other factors.

 

Based on current findings, the authors suggest that therapeutic modification of the gut microbiome may offer an attractive approach to future treatment of nutrition-related maladies, including obesity and a range of serious health consequences linked to under-nutrition.

 

Micromanagers

The microbes in the human gut belong to three broad domains, defined by their molecular phylogeny: Eukarya, Bacteria, and Archaea. Of these, bacteria reign supreme, with two dominant divisions—known as Bacteroidetes and Firmicutes—making up over 90 percent of the gut’s microbial population. In contrast, the Archaea in the gut consist mostly of methanogens (producers of methane), specifically Methanobrevibacter smithii, a hydrogen consumer.

 

Within the bacterial categories, however, enormous diversity exists. Each individual’s community of gut microbes is unique and profoundly sensitive to environmental conditions, beginning at birth. Indeed, the mode of delivery during the birthing process has been shown to affect an infant’s microbial profile.

 

Communities of vaginal microbes change during pregnancy in preparation for birth, delivering beneficial microbes to the newborn. At the time of delivery, the vagina is dominated by two bacterial genera, Lactobacillus and Prevotella. In contrast, infants delivered by caesarean section typically show microbial communities associated with the skin, including Staphylococcus, Corynebacterium, and Propionibacterium. While the full implications of these distinctions are still murky, evidence suggests they may affect an infant’s subsequent development and health, particularly in terms of susceptibility to pathogens.

 

Diet and destiny

After birth, diet becomes a critical determinant of microbial diversity within the gut. Recent research indicates that microbial populations vary geographically in a manner consistent with regional differences in diet. Children in rural areas of Burkina Faso, for example, showed much more abundant concentrations of Bacteroidetes than their counterparts in Italy, a finding consistent with the African children’s plant-rich diet.

 

While microbiomes appear to have adapted to local diets, changes in eating habits significantly alter the composition of gut microbes. Variations in macronutrient composition can modify the structure of gut microbiota within a few days—in some cases, a single day. Studies in mice show that switching from a low-fat, plant-polysaccharide diet to a Western diet high in sugar and fat rapidly and profoundly reconfigures the composition of microbes in the gut.

 

Another modifier of gut microbe composition is gastric bypass surgery, used in certain cases to alleviate conditions of serious obesity. In earlier work, the authors found that the post-surgical microbial composition of patients who underwent so-called Roux-en-Y gastric bypass was distinct from both obese and normal weight individuals.

 

“Obesity affects more than a third of adults in the U.S. and is associated with a raft of health conditions including heart disease, stroke, type 2 diabetes and certain forms of cancer,” says Dr. John DiBaise.  The authors further note that concentrations in the blood of lipopolysaccharides derived from gut bacteria increase in obese individuals, producing a condition known as metabolic endotoxemia. The disorder has been linked with chronic, systemic, low-level inflammation as well as insulin resistance.

 

Energy harvest

In the current review, the cycle of microbial energy extraction from food, involving hydrogen-producing and hydrogen-consuming reactions in the human intestine, is described in detail. Short-chain fatty acids (SCFAs) are a critical component of this system. During the digestive process, fermentation in the gut breaks down complex organic compounds, producing SCFAs and hydrogen. The hydrogen is either excreted in breath or consumed by three groups of microorganisms inhabiting the colon: methanogens, acetogens and sulfate reducers.

 

Research conducted by the authors and others has demonstrated that hydrogen-consuming methanogens appear in greater abundance in obese than in normal-weight individuals. Further, the Firmicutes—a form of acetogen—also seem to be linked with obesity. Following fermentation, SCFAs persist in the colon. Greater concentrations of SCFAs, especially propionate, were observed in fecal samples from obese children than from normal-weight children. (SCFAs also behave as signaling molecules, triggering the expression of leptin, which acts as an appetite suppressor.)

 

While it now seems clear that certain microbial populations help the body process otherwise indigestible carbohydrates and proteins, leading to greater energy extraction and associated weight gain, experimental results have shown some inconsistency. For example, while a number of studies have indicated a greater prevalence of Bacteroidetes in lean individuals and have linked the prevalence of Firmicutes with obesity, the authors stress that many questions remain.

 

Alterations in gut microbiota are also of crucial concern for the one billion people worldwide who suffer from undernutrition. Illnesses resulting from undernutrition contribute to over half of the global fatalities in children under age 5. Those who do survive undernutrition often experience a range of serious, long-term mental and physical effects. The role of gut microbial diversity among the undernourished has yet to receive the kind of concentrated research effort applied to obesity—a disease which has reached epidemic proportions in the developed world.

 

Exploiting microbes affecting energy extraction may prove a useful tool for non-surgically addressing obesity as well as treating undernutrition, though more research is needed for a full understanding of regulatory mechanisms governing the delicate interplay between intestinal microbes and their human hosts.

 

Dr. Krajmalnik-Brown and colleagues at the Biodesign Institute and Mayo Clinic are currently in the second year of an NIH-funded study to better understand the role of the gut microbiome in the success or failure of surgical procedures performed to treat obesity including the Roux-en-Y gastric bypass, adjustable gastric band and vertical sleeve gastrectomy.

 

 

 

The original article was written by Richard Harth

 

Source: Arizona State University

 

Published on 12th June 2012

 

Ecologists Call for Preservation of Planet’s Remaining Biological Diversity


 


 

 

The “Flume Room” at the University of Michigan is used to assess biodiversity in streams. (Credit: Brad Cardinale)

 

 

Twenty years after the Earth Summit in Rio de Janeiro, 17 ecologists are calling for renewed international efforts to curb the loss of Earth’s biological diversity.

 

The loss is compromising nature’s ability to provide goods and services essential for human well-being, the scientists say.

 

Over the past two decades, strong scientific evidence has emerged showing that decline of the world’s biological diversity reduces the productivity and sustainability of ecosystems, according to an international team led by the University of Michigan’s Bradley Cardinale.

 

It also decreases ecosystems’ ability to provide society with goods and services like food, wood, fodder, fertile soils and protection from pests and disease.

 

“Water purity, food production and air quality are easy to take for granted, but all are largely provided by communities of organisms,” said George Gilchrist, program director in the National Science Foundation’s Division of Environmental Biology, which funded the research.

 

“This paper demonstrates that it is not simply the quantity of living things, but their species, genetic and trait biodiversity, that influences the delivery of many essential ‘ecosystem services.’”

 

Human actions are dismantling ecosystems, resulting in species extinctions at rates several orders of magnitude faster than observed in the fossil record.

 

If the nations of the world make biodiversity an international priority, the scientists say, there’s still time to conserve much of the remaining variety of life—and possibly to restore much of what’s been lost.

 

The researchers present their findings in this week’s issue of the journal Nature.

 

The paper is a scientific consensus statement that summarizes evidence from more than 1,000 ecological studies over the last two decades.

 

“Much as consensus statements by doctors led to public warnings that tobacco use is harmful to your health, this is a consensus statement that loss of Earth’s wild species will be harmful to the world’s ecosystems and may harm society by reducing ecosystem services that are essential to human health and prosperity,” said Cardinale.

 

“We need to take biodiversity loss far more seriously—from individuals to international governing bodies—and take greater action to prevent further losses of species.”

 

An estimated nine million species of plants, animals, protists and fungi inhabit the Earth, sharing it with some seven billion people.

 

The call to action comes as international leaders prepare to gather in Rio de Janeiro on June 20-22 for the United Nations Conference on Sustainable Development, known as the Rio+20 Conference.

 

The upcoming conference marks the 20th anniversary of the 1992 Earth Summit in Rio, which resulted in 193 nations supporting the Convention on Biological Diversity’s goals of biodiversity conservation and the sustainable use of natural resources.

 

The 1992 Earth Summit caused an explosion of interest in understanding how biodiversity loss might affect the dynamics and functioning of ecosystems, as well as the supply of goods and services of value to society.

 

In the Nature paper, the scientists review published studies on the topic and list six consensus statements, four emerging trends, and four “balance of evidence” statements.

 

The balance of evidence shows, for example, that genetic diversity increases the yield of commercial crops, enhances the production of wood in tree plantations, improves the production of fodder in grasslands, and increases the stability of yields in fisheries.

 

Increased plant diversity results in greater resistance to invasion by exotic plants, inhibits plant pathogens such as fungal and viral infections, increases above-ground carbon sequestration through enhanced biomass, and increases nutrient remineralization and soil organic matter.

 

“No one can agree on what exactly will happen when an ecosystem loses a species, but most of us agree that it’s not going to be good,” said Shahid Naeem of Columbia University, a co-author of the paper. “And we agree that if ecosystems lose most of their species, it will be a disaster.”

 

“Twenty years and a thousand studies later, what the world thought was true in Rio in 1992 has finally been proven: biodiversity underpins our ability to achieve sustainable development,” Naeem said.

 

Despite far-reaching support for the Convention on Biological Diversity, biodiversity loss has continued over the last two decades, often at increasing rates.

 

In response, a new set of diversity-preservation goals for 2020, known as the Aichi targets, was recently formulated.

 

And a new international body called the Intergovernmental Platform on Biodiversity and Ecosystem Services was formed in April 2012 to guide a global response toward sustainable management of the world’s biodiversity and ecosystems.

 

Significant gaps in the science behind biological diversity remain and must be addressed if the Aichi targets are to be met, the scientists write in their paper.

 

“This paper is important both because of what it shows we know, and what it shows we don’t know,” said David Hooper of Western Washington University, one of the co-authors.

 

“Several of the key questions we outline help point the way for the next generation of research on how changing biodiversity affects human well-being.”

 

Without an understanding of the fundamental ecological processes that link biodiversity, ecosystem functions and services, attempts to forecast the societal consequences of diversity loss, and to meet policy objectives, are likely to fail, the ecologists write.

 

“But with that fundamental understanding in hand, we may yet bring the modern era of biodiversity loss to a safe end for humanity,” they conclude.

 

In addition to Cardinale, Naeem and Hooper, co-authors of the Nature paper are: J. Emmett Duffy of The College of William and Mary; Andrew Gonzalez of McGill University; Charles Perrings and Ann P. Kinzig of Arizona State University; Patrick Venail and Anita Narwani of the University of Michigan; Georgina M. Mace of Imperial College London; David Tilman of the University of Minnesota; David A. Wardle of the Swedish University of Agricultural Sciences; Gretchen C. Daily of Stanford University; Michel Loreau of the National Centre for Scientific Research in Moulis, France; James B. Grace of the U.S. Geological Survey; Anne Larigauderie of the National Museum of Natural History in Paris, France; and Diane Srivastava of the University of British Columbia.

 

Source: National Science Foundation

 

Published on 8th June 2012

Today’s Climate More Sensitive to Carbon Dioxide Than in Past 12 Million Years

Core samples were collected at the sites noted in the North Pacific Ocean. (Credit: Jonathan LaRiviere/Ocean Data View)

Until now, studies of Earth’s climate have documented a strong correlation between global climate and atmospheric carbon dioxide; that is, during warm periods, high concentrations of CO2 persist, while colder times correspond to relatively low levels.

 

However, in this week’s issue of the journal Nature, paleoclimate researchers reveal that between about 12 and 5 million years ago, climate was decoupled from atmospheric carbon dioxide concentrations. New evidence comes from deep-sea sediment cores dated to the late Miocene epoch of Earth’s history.

 

During that time, temperatures across a broad swath of the North Pacific were 9-14 degrees Fahrenheit warmer than today, while atmospheric carbon dioxide concentrations remained low, near values from before the Industrial Revolution.
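As an aside for readers more used to Celsius: a temperature difference converts with the 5/9 scale factor alone, since the 32-degree offset applies only to absolute readings, not to intervals. A quick sketch:

```python
def delta_f_to_c(delta_f):
    """Convert a temperature *difference* from Fahrenheit to Celsius.

    Intervals use only the 5/9 scale factor; the 32-degree offset
    applies to absolute temperatures, not to differences.
    """
    return delta_f * 5.0 / 9.0

# The 9-14 F late-Miocene warming reported above, in Celsius:
print(delta_f_to_c(9))   # 5.0
print(delta_f_to_c(14))  # ~7.8
```

So the reported warming corresponds to roughly 5-8 degrees Celsius.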

 

The research shows that, in the last five million years, changes in ocean circulation allowed Earth’s climate to become more closely coupled to changes in carbon dioxide concentrations in the atmosphere.

 

The findings also demonstrate that the climate of modern times more readily responds to changing carbon dioxide levels than it has during the past 12 million years.

 

“This work represents an important advance in understanding how Earth’s past climate may be used to predict future climate trends,” says Jamie Allan, program director in the National Science Foundation’s (NSF) Division of Ocean Sciences, which funded the research.

 

The research team, led by Jonathan LaRiviere and Christina Ravelo of the University of California at Santa Cruz (UCSC), generated the first continuous reconstructions of open-ocean Pacific temperatures during the late Miocene epoch.

 

It was a time of nearly ice-free conditions in the Northern Hemisphere and warmer-than-modern conditions across the continents.

 

The research relies on evidence of ancient climate preserved in microscopic plankton skeletons–called microfossils–that long ago sank to the seafloor and were ultimately buried in sediments.

 

Samples of those sediments were recently brought to the surface in cores drilled into the ocean bottom.  The cores were retrieved by marine scientists working aboard the drillship JOIDES Resolution.

 

The microfossils, the scientists discovered, contain clues to a time when the Earth’s climate system functioned much differently than it does today.

 

“It’s a surprising finding, given our understanding that climate and carbon dioxide are strongly coupled to each other,” LaRiviere says.

 

“In the late Miocene, there must have been some other way for the world to be warm. One possibility is that large-scale patterns in ocean circulation, determined by the very different shape of the ocean basins at the time, allowed warm temperatures to persist despite low levels of carbon dioxide.”

 

The Pacific Ocean in the late Miocene was very warm, and the thermocline, the boundary that separates warmer surface waters from cooler underlying waters, was much deeper than in the present.

 

The scientists suggest that this deep thermocline resulted in a distribution of atmospheric water vapor and clouds that could have maintained the warm global climate.

 

“The results explain the seeming paradox of the warm–but low greenhouse gas–world of the Miocene,” says Candace Major, program director in NSF’s Division of Ocean Sciences.

 

Several major differences in the world’s waterways could have contributed to the deep thermocline and the warm temperatures of the late Miocene.

 

For example, the Central American Seaway remained open, the Indonesian Seaway was much wider than it is now, and the Bering Strait was closed.

 

These differences in the boundaries of the world’s largest ocean, the Pacific, would have resulted in very different circulation patterns than those observed today.

 

By the onset of the Pliocene epoch, about five million years ago, the waterways and continents of the world had shifted into roughly the positions they occupy now.

 

That also coincides with a drop in average global temperatures, a shoaling of the thermocline, and the appearance of large ice sheets in the Northern Hemisphere–in short, the climate humans have known throughout recorded history.

 

“This study highlights the importance of ocean circulation in determining climate conditions,” says Ravelo. “It tells us that the Earth’s climate system has evolved, and that climate sensitivity is possibly at an all-time high.”

 

Other co-authors of the paper are Allison Crimmins of UCSC and the U.S. Environmental Protection Agency; Petra Dekens of UCSC and San Francisco State University; Heather Ford of UCSC; Mitch Lyle of Texas A&M University; and Michael Wara of UCSC and Stanford University.

 

 

 

Source: National Science Foundation

 

Published on 8th June 2012

 


All the Colors of a High-Energy Rainbow, in a Tightly Focused Beam

 

For the first time, researchers have produced a coherent, laser-like, directed beam of light that simultaneously streams ultraviolet light, X-rays and all wavelengths in between.

 


 

One of the few light sources to successfully produce a coherent beam that includes X-rays, this new technology is the first to do so using a setup that fits on a laboratory table.

 

An international team of researchers, led by engineers from the National Science Foundation’s Engineering Research Center (ERC) for EUV Science and Technology, reports its findings in the June 8, 2012, issue of Science.

 

By focusing intense pulses of infrared light–each just a few optical cycles in duration–into a high-pressure gas cell, the researchers converted part of the original laser energy into a coherent super-continuum of light that extends well into the X-ray region of the spectrum.

 

The X-ray burst that emerges has much shorter wavelengths than the original laser pulse, which will make it possible to follow the tiniest, fastest physical processes in nature, including the coupled dance of electrons and ions in molecules as they undergo chemical reactions, or the flow of charges and spins in materials.

 

“This is the broadest spectral, coherent-light source ever generated,” says engineering and physics professor Henry Kapteyn of JILA at the University of Colorado at Boulder, who led the study with fellow JILA professor Margaret Murnane and research scientist Tenio Popmintchev, in collaboration with researchers from the Vienna University of Technology, Cornell University and the University of Salamanca.

 

“It definitely opens up the possibility to probe the shortest space and time scales relevant to any process in our natural world other than nuclear or fundamental particle interactions,” Kapteyn adds. The breakthrough builds upon earlier discoveries from Murnane, Kapteyn and their colleagues to generate laser-like beams of light across a broad spectrum of wavelengths.

 

The researchers use a technique called high-harmonic generation (HHG). HHG was first discovered in the late 1980s, when researchers focused a powerful, ultra-short laser beam into a spray of gas. The researchers were surprised to find that the output beam contained a small amount of many different wavelengths in the ultraviolet region of the spectrum, as well as the original laser wavelength. The new ultraviolet wavelengths were created as the gas atoms were ionized by the laser.

 

“Just as a violin or guitar string will emit harmonics of its fundamental sound tone when plucked strongly, an atom can also emit harmonics of light when plucked violently by a laser pulse,” adds Murnane. “The laser pulse first plucks electrons from the atoms, before driving them back again where they can collide with the atoms from which they came. Any excess energy is emitted as high-energy ultraviolet photons.”

 

Like many phenomena, when HHG was first discovered, there was little science to explain it, and it was considered more a curious phenomenon than a potentially useful light source. After years of work, scientists eventually understood how very high harmonics were emitted. However, there was one major challenge that most researchers gave up on–for most wavelengths in the X-ray region, the output HHG beams were extremely weak.

 

Murnane, Kapteyn and their students realized that there might be a chance to overcome that challenge and turn HHG into a useful X-ray light source–the tabletop-scale X-ray laser that has been a goal for laser science since shortly after the laser was first demonstrated in 1960.

 

“This was not an easy task,” says Murnane. “Unlike a laser–which gets more intense as more energy is pumped into the system–in HHG, if the laser hits the atoms too hard, too many electrons are liberated from the gas atoms, and those electrons cause the laser light to speed up. If the speed of the laser and X-rays do not match, there is no way to combine the many X-ray waves together to create a bright output beam, since the X-ray waves from different gas atoms will interfere destructively.”

 

Popmintchev and JILA graduate student Ming-Chang Chen worked out conditions that enable X-ray waves from many atoms in the gas to interfere constructively. The key was to use a relatively long-wavelength, mid-infrared laser and a high pressure gas cell that also guides the laser light. The resulting bright, X-ray beams maintain the coherent, directed beam qualities of the laser that drives the process.

 

The HHG process is effective only when the atoms are hit “hard and fast” by the laser pulses, with durations nearing 10⁻¹⁴ seconds–a fundamental limit representing just a few oscillations of the electromagnetic fields. Murnane and Kapteyn pioneered the technology for generating such light pulses in the 1990s, and used those lasers to develop and utilize HHG-based light sources in the extreme-ultraviolet (EUV) region of the spectrum in the 2000s. However, while researchers were using those lasers and the HHG technique to measure ever-shorter duration light pulses, they were stymied in how to make coherent light at shorter wavelengths in the more penetrating X-ray region of the spectrum.
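The “few oscillations” figure is easy to check from first principles: one optical cycle lasts T = λ/c, so a pulse only a few cycles long is on the order of 10⁻¹⁴ seconds. A back-of-the-envelope check (the 800 nm value is illustrative, typical of the near-infrared lasers of that era; the 4-micron value matches the mid-infrared driver described later in the article):

```python
# Period of one optical cycle, T = wavelength / c, in femtoseconds.
C = 2.998e8  # speed of light, m/s

def optical_period_fs(wavelength_m):
    return wavelength_m / C * 1e15

# Near-infrared light (~800 nm): one cycle is ~2.7 fs, so a
# "few-cycle" pulse is on the order of 10 fs = 1e-14 s.
print(optical_period_fs(800e-9))  # ~2.67 fs
# The 4-micron mid-infrared driver: ~13.3 fs per cycle.
print(optical_period_fs(4e-6))
```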

 

The new paper in Science, under lead author and senior research associate Popmintchev, demonstrates that breakthrough, showing that the understanding of the HHG process the researchers developed is broadly valid.

 

“We would have never found this if we hadn’t sat down and thought about what happens overall during HHG, when we change the wavelength of the laser driving it, what parameters have to be changed to make it work,” added Kapteyn. “The amazing thing is that the physics seem to be panning out even over a very broad range of parameters. Usually in science you find a scaling rule that prevents you from making a dramatic jump, but in this case, we were able to generate 1.6 keV – each X-ray photon was generated from more than 5,000 infrared photons.”

 

When the researchers first started working with ultrafast, mid-infrared lasers just a few years ago, they actually took a step backward, generating bright extreme-ultraviolet light at longer wavelengths than they had previously achieved in the lab.

 

“However, we discovered a new regime that helped us to realize, just on paper, that we could make this giant step forward towards much shorter electromagnetic wavelengths and generate bright, laser-like, soft and hard X-rays,” adds Popmintchev. “What the experiments were suggesting back then looked too good to be true! It seemed that Mother Nature has combined together, in the most simple and beautiful way, all the microscopic and macroscopic physics. Now, we are already at X-ray wavelengths as short as roughly 7.7 angstroms, and we do not know the limit.”
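The figures quoted in this article (1.6 keV photons, wavelengths near 7.7 angstroms, and more than 5,000 infrared photons combining into each X-ray photon) are mutually consistent, as a quick check with the photon-energy relation E = hc/λ shows:

```python
HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV·nm

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV from wavelength in nm, E = hc / wavelength."""
    return HC_EV_NM / wavelength_nm

ir = photon_energy_ev(4000.0)  # 4-micron mid-IR driver: ~0.31 eV
xray = photon_energy_ev(0.77)  # 7.7-angstrom X-ray: ~1610 eV, i.e. ~1.6 keV
print(ir, xray)
print(xray / ir)  # ~5200 IR photons' worth of energy per X-ray photon
```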

 

To truly control the beam of photons, the researchers needed to understand the HHG process at the atomic level and how X-rays emitted from individual atoms combine to form a coherent beam of light.

 

That understanding combines microscopic and macroscopic models of the HHG process with the fact that those interactions occur at very high intensity in a dynamically changing medium. Such a conceptual understanding took the last decade to develop.

 

The result was the realization that there is no fundamental limit to the energy of the photons that can be generated using the HHG process. To obtain higher-energy photons, the system paradoxically begins with laser light using lower energy photons–specifically, mid-infrared lasers.

 

The JILA researchers demonstrated the validity of that principle in their labs in Colorado, but to achieve their breakthrough, the researchers traveled to Vienna with their beam-generating setup. There, they used a laser developed by co-author Andrius Baltuška and colleagues at the Vienna University of Technology–the world’s most-intense ultrashort-pulse laser operating in the mid-infrared, with a wavelength of four microns.

 

“Thirty years ago, people were saying we could make a coherent X-ray source, but it would have to be an X-ray laser, and we’d need an atomic bomb as the energy source to pump it,” said Deborah Jackson, the program officer who oversees the ERC’s grant. “Now, we have these guys who understand the science fundamentals well enough to introduce new tricks for efficiently extracting energetic photons, pulling them out at X-ray wavelengths … and it’s all done on a table-top!”

 

In addition to achieving the high energy, the increasingly broad spectrum opens a range of new applications.

 

“In an experiment using such a source, one energy region from the beam will correspond with one element, another with another element, and so on to simultaneously look at atoms across entire molecules, and that will allow us to see how charge moves from one part of a molecule to another as a chemical reaction is happening,” adds Kapteyn. “It’ll take us awhile to learn how to use this, but it’s very exciting.”

 

 

 

Source: National Science Foundation

 

Published on 8th June 2012

 


Pivotal Role for Proteins: From Helping Turn Carbs into Energy to Causing Devastating Neuromuscular Disease

 

Research into how carbohydrates are converted into energy has led to a surprising discovery with implications for the treatment of a perplexing and potentially fatal neuromuscular disorder and possibly even cancer and heart disease.

 


 

Until this study, the cause of this neuromuscular disorder was unknown. But after obtaining DNA from three families with members who have the disorder, a team led by University of Utah scientists Jared Rutter, Ph.D., associate professor of biochemistry and Carl Thummel, Ph.D., professor of human genetics, sequenced two genes and identified two mutations that cause this devastating disease.

 

“The ability to convert carbohydrates into energy is critical for people and other organisms to live. But when that process goes awry, potentially fatal health problems can occur,” Rutter says. “If we can figure out a way to correct the defects, we might be able to treat the disease.”

 

Rutter and Thummel are senior authors on a study published online in Science Express on Thursday, May 24, 2012.

 

The researchers studied two proteins, Mpc1 and Mpc2, which are among a dozen proteins they looked at in fruit flies, yeast, and humans. They discovered that the two proteins play a pivotal role in the cellular process that produces the majority of ATP, a molecule that is the main source of energy for cells and is essential for people and other animals to live. Rutter and his colleagues also discovered that when Mpc1 and Mpc2 are impaired, they cause the deadly and as-yet-unnamed neuromuscular disorder, which affects thousands of people worldwide.

 

To produce ATP, the body metabolizes carbohydrates and converts them into pyruvate, which then typically enters into the mitochondria in cells. Once inside the mitochondria—a self-contained unit often referred to as a cellular power plant—pyruvate is consumed in the production of ATP. Rutter and his fellow researchers discovered that Mpc1 and Mpc2 are critical for pyruvate entry into mitochondria. When Mpc1 and Mpc2 are eliminated or mutated, pyruvate cannot enter into mitochondria and ATP is not efficiently produced – and that’s when serious health problems can arise, including the neuromuscular disorder that in its most severe forms is deadly.

 

The ramifications of this study go beyond the production of ATP and birth defects seen in the neuromuscular disorder. The findings may be useful in understanding some of the metabolic defects associated with cancer and heart disease, according to Rutter.  Cancer cells typically don’t consume their pyruvate in the production of ATP at the same rate as normal cells.  Instead, they convert the pyruvate to lactate.  This property of cancer cells is called the Warburg Effect and is named after Nobel laureate and cancer researcher Otto Heinrich Warburg. Some forms of heart disease have a similar problem.
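The energetic stakes of this switch can be illustrated with standard textbook figures (approximate values that vary by source, and not numbers from the study itself): glycolysis alone nets about 2 ATP per glucose, while full oxidation of pyruvate in mitochondria yields roughly 30-36.

```python
# Approximate textbook ATP yields per glucose molecule; the oxidative
# figure varies by source (~30-36). Illustrative only.
GLYCOLYSIS_ATP = 2   # glycolysis alone, pyruvate diverted to lactate (Warburg-like)
OXIDATIVE_ATP = 32   # pyruvate enters mitochondria and is fully oxidized

print(OXIDATIVE_ATP / GLYCOLYSIS_ATP)  # ~16x more energy per glucose
```

This rough ratio is why a cell that cannot move pyruvate into its mitochondria, as happens when Mpc1 or Mpc2 is lost, faces such a severe energy shortfall.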

 

Further study based on the current research may provide important information regarding those diseases, according to Rutter. “That might be the most important outcome of our studies in the long run,” he says.

 

The study’s first author is Daniel K. Bricker, a doctoral student in human genetics at the University of Utah. Researchers from Harvard University, and the Laboratoire de Biochimie and the Institut de Genetique et de Biologie Moleculaire et Cellulaire, both in France, also contributed to the study.

 

 

Source: University of Utah Health Sciences

 

Published on 2nd June 2012

 

BioChip May Make Diagnosis of Leukemia and HIV Faster, Cheaper

An assembled flow cytometry chip, comparable in size to a U.S. quarter. (Credit: Tony Huang)

Inexpensive, portable devices that can rapidly screen cells for leukemia or HIV may soon be possible thanks to a chip that can produce three-dimensional focusing of a stream of cells, according to researchers.

“HIV is diagnosed based on counting CD4 cells,” said Tony Jun Huang, associate professor of engineering science and mechanics at Penn State. “Ninety percent of the diagnoses are done using flow cytometry.”

Huang and his colleagues designed a mass-producible device that focuses particles or cells into a single stream and performs three different optical assessments on each cell. They believe the device represents a major step toward low-cost flow cytometry chips for clinical diagnosis in hospitals, clinics and in the field.

“The full potential of flow cytometry as a clinical diagnostic tool has yet to be realized and is still in a process of continuous and rapid development,” the team said in a recent issue of Biomicrofluidics. “Its current high cost, bulky size, mechanical complexity and need for highly trained personnel have limited the utility of this technique.”

Flow cytometry typically assesses cells in three ways using optical sensors. A flow cytometer uses a tightly focused laser to illuminate the focused cells and produce three optical signals from each cell: fluorescence from antibodies bound to the cell, which reveals its biochemical characteristics; forward scattering, which provides the cell’s size and refractive index; and side scattering, which provides cellular granularity. Processing these signals allows diagnosticians to identify individual cells in a mixed cell population, detect fluorescent markers, count cells, and perform other analyses to diagnose and track the progression of HIV, cancer and other diseases.
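As a rough illustration of how those three signals are used (this is not the team’s actual software; the thresholds, units and values are entirely hypothetical), each cell event can be treated as a three-component signal vector and classified with simple threshold “gates”, the way a CD4 count sorts lymphocytes from other events:

```python
# Toy flow-cytometry "gating" sketch. Each event carries three optical
# signals: forward scatter (size), side scatter (granularity) and
# fluorescence (antibody label). All numbers are hypothetical.

def is_cd4_positive(cell):
    """Classify one cell event from its three optical signals."""
    forward, side, fluorescence = cell
    return (forward > 1.0            # large enough to be a lymphocyte
            and side < 0.8           # low granularity
            and fluorescence > 5.0)  # bound fluorescent anti-CD4 antibody

events = [
    (1.5, 0.3, 9.2),  # lymphocyte carrying the CD4 label
    (1.4, 0.4, 0.1),  # lymphocyte without the label
    (0.4, 0.2, 0.0),  # small debris
]
cd4_count = sum(is_cd4_positive(e) for e in events)
print(cd4_count)  # 1
```

Real instruments apply the same idea to thousands of events per second, with gates drawn on calibrated scatter and fluorescence axes rather than fixed constants.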

“Current machines are very expensive, costing $100,000,” said Huang. “Using our innovations, we can develop a small one that could cost about $1,000.”

One reason the current machines are so large and expensive is the method used to channel cells into single file and the necessary alignment of lasers and multiple sensors with the single-file cell stream. Currently, cells are guided into single file using a delicate three-dimensional flow cell that is difficult to manufacture. More problematic is that these current machines need multiple lenses and mirrors for optical alignment.

“Our approach needs only a simple one-layer, two-dimensional flow cell and no optical alignment is required,” said Huang.

Huang and his team used a proprietary technology named microfluidic drifting to create a focused stream of particles. Using a curved microchannel, the researchers took advantage of the same forces that try to move passengers in a car to the outside of a curve when driving. The microfluidic chip’s channel begins as a main channel that contains the flow of carrier liquid and a second channel that comes in perpendicularly that carries the particles or cells. Immediately after these two channels join, the channel curves 90 degrees, which moves all the cells into a horizontal line. After the curve, liquid comes into the channel on both sides, forcing the horizontal line of cells into single file. The cells then pass through a microlaser beam.

An advantage of this microfluidic flow cytometry chip is that it can be mass-produced by molding and standard lithographic processes. The optical fibers that deliver the laser beam and carry the optical signals back out already exist.

“The optical fibers are automatically aligned once inserted into the chip, therefore requiring no bulky lenses and mirrors for optical alignment,” said Huang. “Our machine is small enough it can be operated by battery, which makes it usable in Africa and other remote locations.”

The researchers tested the device using commercially available, cell-sized fluorescent beads. They are now testing the device with actual cells.

Working with Huang were Xiaole Mao, graduate student in bioengineering; Ahmad Ahsan Nawaz, Xz-Chin Steven Lin, Michael Ian Lapsley, Yanhui Zhao, graduate students in engineering science and mechanics, and Wafik S. el-Deiry, professor of medicine, Rose Dunlap Division Chair in Hematology/Oncology and associate director for translational research, Cancer Institute, all at Penn State, and J. Philip McCoy, National Heart, Lung, and Blood Institute at the National Institutes of Health.

Source: Pennsylvania State University

 

Published on 2nd June 2012

 
