Living systems owe their existence to a pair of information-carrying molecules: DNA and RNA. These fundamental chemical forms possess two features essential for life: heredity, the ability to encode and pass on genetic information, and the capacity to adapt over time through Darwinian evolution.
A long-debated question is whether heredity and evolution could be performed by molecules other than DNA and RNA.
John Chaput, a researcher at ASU’s Biodesign Institute who recently published an article in Nature Chemistry describing the evolution of threose nucleic acids, joined a multidisciplinary team of scientists from England, Belgium and Denmark to extend these properties to other so-called xeno-nucleic acids, or XNAs.
The group demonstrates for the first time that six of these unnatural nucleic acid polymers are capable of sharing information with DNA. One of these XNAs, a molecule referred to as anhydrohexitol nucleic acid, or HNA, was capable of undergoing directed evolution and folding into biologically useful forms.
Their results appear in the current issue of Science.
The work sheds new light on questions concerning the origins of life and provides a range of practical applications for molecular medicine that were not previously available.
Nucleic acid aptamers, which have been engineered through in vitro selection to bind with various molecules, act in a manner similar to antibodies—latching onto their targets with high affinity and specificity. “This could be great for building new types of diagnostics and new types of biosensors,” Chaput says, pointing out that XNAs are hardier molecules, not recognized by the natural enzymes that tend to degrade DNA and RNA. New therapeutics may also arise from experimental xenobiology.
Both RNA and DNA embed data in their sequences of four nucleotides—information vital for conferring hereditary traits and for supplying the coded recipe essential for building proteins from the 20 naturally occurring amino acids. Exactly how (and when) this system got its start, however, remains one of the most intriguing and hotly contested areas of biology.
According to one hypothesis, the simpler RNA molecule preceded DNA as the original informational conduit. The RNA world hypothesis proposes that the earliest examples of life were based on RNA and simple proteins. Because of RNA’s great versatility—it is not only capable of carrying genetic information but also of catalyzing chemical reactions like an enzyme—it is believed by many to have supported pre-cellular life.
Nevertheless, the spontaneous arrival of RNA through a sequence of purely random mixing events of primitive chemicals was, at the very least, an unlikely occurrence. “This is a big question,” Chaput says. “If the RNA world existed, how did it come into existence? Was it spontaneously produced, or was it the product of something that was even simpler than RNA?”
This pre-RNA world hypothesis has been gaining ground, largely through investigations into XNAs, which provide plausible alternatives to the current biological regime and could have acted as chemical stepping-stones to the eventual emergence of life. The current research strengthens the case that something like this may have taken place.
Threose nucleic acid or TNA for example, is one candidate for this critical intermediary role. “TNA does some interesting things,” Chaput says, noting the molecule’s capacity to bind with RNA through antiparallel Watson-Crick base pairing. “This property provides a model for how XNAs could have transferred information from the pre-RNA world to the RNA world.”
Nucleic acid molecules, including DNA and RNA, consist of three chemical components: a sugar group, a phosphate backbone and combinations of the four nucleotide bases. By tinkering with these structural elements, researchers can engineer XNA molecules with unique properties. However, in order for any of these exotic molecules to have acted as a precursor to RNA in the pre-biotic epoch, they would need to have been able to transfer and recover their information from RNA. To do this, specialized enzymes, known as polymerases, are required.
Nature has made DNA and RNA polymerases capable of reading, transcribing and reverse transcribing normal nucleic acid sequences. For XNA molecules, however, no naturally occurring polymerases exist. So the group, led by Phil Holliger at the MRC in England, painstakingly evolved synthetic polymerases that could copy DNA into XNA, and other polymerases that could copy XNA back into DNA. In the end, polymerases were discovered that transcribe and reverse-transcribe six different genetic systems: HNA, CeNA, LNA, ANA, FANA and TNA. The experiments demonstrated that DNA sequences could be rendered into the various XNAs when the polymerases were fed the appropriate XNA substrates.
Using these enzymes as tools for molecular evolution, the team evolved the first example of an HNA aptamer through iterative rounds of selection and amplification. Starting from a large pool of DNA sequences, a synthetic polymerase was used to copy the DNA library into HNA. The pool of HNA molecules was then incubated with an arbitrary target. The small fraction of molecules that bound the target were separated from the unbound pool, reverse transcribed back into DNA with a second synthetic enzyme and amplified by PCR. After many repeated rounds, HNAs were generated that bound the two molecules used as binding targets: HIV trans-activating response RNA (TAR) and hen egg lysozyme (HEL). “This is a synthetic Darwinian process,” Chaput says. “The same thing happens inside our cells, but this is done in vitro.”
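The select-and-amplify loop described above can be sketched as a toy simulation. Everything in this sketch is invented for illustration: the random sequences, the made-up affinity score and the mutation rate stand in for real HNA chemistry, binding assays and PCR.

```python
import random

random.seed(0)

ALPHABET = "ACGT"
TARGET = "GATTACA"  # hypothetical binding motif standing in for TAR or HEL

def affinity(seq):
    # Stand-in "binding score": best alignment of the motif within the sequence.
    return max(sum(a == b for a, b in zip(seq[i:], TARGET))
               for i in range(len(seq) - len(TARGET) + 1))

def random_seq(n=20):
    return "".join(random.choice(ALPHABET) for _ in range(n))

def mutate(seq, rate=0.05):
    # Amplification with occasional copying errors, mimicking error-prone PCR.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

initial_pool = [random_seq() for _ in range(500)]
pool = list(initial_pool)

for _ in range(10):                       # ten rounds of selection/amplification
    pool.sort(key=affinity, reverse=True)
    survivors = pool[:50]                 # "binders": keep the top 10 percent
    pool = [mutate(random.choice(survivors)) for _ in range(500)]

initial_scores = [affinity(s) for s in initial_pool]
final_scores = [affinity(s) for s in pool]
print(sum(initial_scores) / len(initial_scores),
      sum(final_scores) / len(final_scores))
```

Run repeatedly, the mean binding score of the pool climbs round after round, which is the essence of the synthetic Darwinian process: variation from imperfect copying, plus selection for binding.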
The method for producing XNA polymerases draws on the path-breaking work of Holliger, one of the lead authors of the current study. The elegant technique uses cell-like synthetic compartments of water/oil emulsion to conduct directed evolution of enzymes, particularly polymerases. By isolating self-replication reactions from each other, the process greatly improves the accuracy and efficiency of polymerase evolution and replication. “What nobody had really done before,” Chaput says, “is to take those technologies and apply them to unnatural nucleic acids.”
Chaput also underlines the importance of an international collaboration for carrying out this type of research, particularly for the laborious effort of assembling the triphosphate substrates needed for each of the six XNA systems used in the study:
“What happened here is that a community of scientists came together and organized around this idea that we could find polymerases that could be used to open up biology to unnatural polymers. It would have been a tour de force for any lab to try to synthesize all the triphosphates, as none of these reagents are commercially available.”
The study advances the case for a pre-RNA world, while revealing a new class of XNA aptamers capable of fulfilling myriad useful roles. Although many questions surrounding the origins of life persist, Chaput is optimistic that solutions are coming into view: “Further down the road, through research like this, I think we’ll have enough information to begin to put the pieces of the puzzle together.”
The research group consisted of investigators from the Medical Research Council (MRC) Laboratory of Molecular Biology, Cambridge, led by Philipp Holliger; the Rega Institute, Katholieke Universiteit Leuven, Belgium, led by Piet Herdewijn; the Nucleic Acid Center, Department of Physics and Chemistry, University of Southern Denmark, led by Jesper Wengel; and the Biodesign Institute at Arizona State University, led by John Chaput.
In addition to his appointment at the Biodesign Institute, John Chaput is an associate professor in the Department of Chemistry and Biochemistry, in the College of Liberal Arts & Sciences.
People with nonalcoholic fatty liver disease (NAFLD) who consume alcohol in modest amounts – no more than one or two servings per day – are half as likely to develop steatohepatitis as non-drinkers with the same condition, reports a national team of scientists led by researchers at the University of California, San Diego School of Medicine.
The findings are published in the April 19, 2012 online issue of The Journal of Hepatology.
NAFLD is the most common liver disease in the United States, affecting up to one third of American adults. It’s characterized by abnormal fat accumulation in the liver. The specific cause or causes are not known, though obesity and diabetes are risk factors. Most patients with NAFLD have few or no symptoms, but in its most progressive form, known as nonalcoholic steatohepatitis or NASH, there is a significantly heightened risk of cirrhosis, liver cancer and liver-related death.
NAFLD is also a known risk factor for cardiovascular disease (CVD). Patients with NAFLD are approximately two times more likely to die from coronary heart disease than from liver disease. The study’s authors wanted to know if the well-documented heart-healthy benefits of modest alcohol consumption outweighed alcohol’s negative effects.
“We know a 50-year-old patient with NAFLD has a higher risk of CVD,” said Jeffrey Schwimmer, MD, associate professor of clinical pediatrics at UC San Diego, director of the Fatty Liver Clinic at Rady Children’s Hospital-San Diego and senior author. “Data would suggest modest alcohol consumption would be beneficial (in reducing the patient’s CVD risk) if you don’t take liver disease into account. When you do take liver disease into account, however, the usual medical recommendation is no alcohol whatsoever.”
Schwimmer and colleagues discovered that the benefits of modest alcohol consumption were compelling, at least in terms of reducing the odds that patients with NAFLD would develop more severe forms of the disease. Patients with NASH are 10 times more likely to progress to cirrhosis, the final phase of chronic liver disease. Cirrhosis is the 12th leading cause of death in the U.S., killing an estimated 27,000 Americans annually.
“Our study showed that those people with modest alcohol intake – two drinks or less daily – had half the odds of developing NASH as people who drank no alcohol,” said Schwimmer. “The reasons aren’t entirely clear. It’s known that alcohol can have beneficial effects on lipid levels, that it increases ‘good’ cholesterol, which tends to be low in NAFLD patients. Alcohol may improve insulin sensitivity, which has a role in NAFLD. And depending upon the type of alcohol, it may have anti-inflammatory effects.”
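“Half the odds” is a statement about an odds ratio. The counts below are hypothetical, chosen only to make the arithmetic concrete; they are not the study’s actual data.

```python
# Odds ratio from a hypothetical 2x2 table: NASH status by drinking status.
# These counts are invented for illustration, not taken from the study.

def odds(events, non_events):
    return events / non_events

nash_nondrink, no_nash_nondrink = 120, 180  # non-drinkers: odds = 120/180
nash_modest, no_nash_modest = 50, 150       # modest drinkers: odds = 50/150

odds_ratio = odds(nash_modest, no_nash_modest) / odds(nash_nondrink, no_nash_nondrink)
print(odds_ratio)  # 0.5: modest drinkers have half the odds of NASH
```

Note that odds are not the same as probabilities: here the modest drinkers’ NASH probability is 25 percent versus 40 percent for non-drinkers, yet the odds ratio is exactly one half.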
The study also found that in patients with NAFLD, modest drinkers experienced less severe liver scarring than did lifelong non-drinkers.
The study did not evaluate the effects of different types of alcohol, such as beer or spirits. Schwimmer said to do so would require a much larger study. Also, the study’s findings do not apply to children. All of the participants in the study were age 21 and older.
The current paper is based on analyses of 600 liver biopsies of patients with NAFLD by a national panel of pathologists who had no identifying clinical information about the samples. The study excluded anyone who averaged more than two alcoholic drinks per day or who reported consuming five or more drinks in a day (binge-drinking) at least once a month.
Schwimmer said the findings indicate patients with liver disease should be treated individually, with nuance.
“For a patient with cirrhosis or viral hepatitis, the data says even small amounts of alcohol can be bad. But that may not be applicable to all forms of liver disease. Forty million Americans have NAFLD. Physicians need to look at their patient’s overall health, their CVD risk, their liver status, whether they’re already drinking modestly or not. They need to put all of these things into a framework to determine risk. I suspect modest alcohol consumption will be an appropriate recommendation for many patients, but clearly not all.”
Co-authors are Winston Dunn, departments of Pediatrics and Medicine, UC San Diego and Gastroenterology and Hepatology, Department of Medicine, University of Kansas Medical Center; Arun J. Sanyal, Division of Gastroenterology, Hepatology and Nutrition, Department of Internal Medicine, Virginia Commonwealth University Medical Center; Elizabeth M. Brunt, John Cochran VA Medical Center, Saint Louis and Division of Gastroenterology, Saint Louis University School of Medicine; Aynur Unalp-Arida, Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health; Michael Donohue, Division of Biostatistics and Bioinformatics, Department of Family and Preventive Medicine, UC San Diego; and Arthur J. McCullough, Department of Gastroenterology and Hepatology, Cleveland Clinic.
Funding for this research came, in part, from the National Institute of Diabetes and Digestive and Kidney Diseases, the National Institute of Child Health and Human Development and the National Cancer Institute.
Study finds people’s relative niceness may reside in their genes
It turns out that the milk of human kindness is evoked by something besides mom’s good example.
Research by psychologists at the University at Buffalo and the University of California, Irvine, has found that at least part of the reason some people are kind and generous is because their genes nudge them toward it.
Michel Poulin, PhD, assistant professor of psychology at UB, is the principal author of the study “The Neurogenics of Niceness,” published this month in Psychological Science, a journal of the Association for Psychological Science.
The study, co-authored by Anneke Buffone of UB and E. Alison Holman of the University of California, Irvine, looked at the behavior of study subjects who have versions of receptor genes for two hormones that, in laboratory and close relationship research, are associated with niceness. Previous laboratory studies have linked the hormones oxytocin and vasopressin to the way we treat one another, Poulin says.
In fact, they are known to make us nicer people, at least in close relationships. Oxytocin promotes maternal behavior, for example, and in the lab, subjects exposed to the hormone demonstrate greater sociability. An article in the usually staid Science magazine even used the terms “love drug” and “cuddle chemical” to describe oxytocin, Poulin points out.
Poulin says this study was an attempt to apply previous findings to social behaviors on a larger scale; to learn if these chemicals provoke in us other forms of pro-social behavior: the urge to give to charity, for instance, or to more readily participate in such civic endeavors as paying taxes, reporting crime, giving blood or sitting on juries.
He explains that hormones work by binding to our cells through receptors that come in different forms. There are several genes that control the function of oxytocin and vasopressin receptors.
Study subjects took part in an Internet survey probing their attitudes toward civic duty, other people and the world in general, as well as their charitable activities. Questions asked about civic duty, such as whether people have a duty to report a crime or pay taxes; how they feel about the world, such as whether people are basically good or whether the world is more good than bad; and their own charitable activities, like giving blood, working for charity or going to PTA meetings.
Of those surveyed, 711 subjects provided a sample of saliva for DNA analysis, which showed what form they had of the oxytocin and vasopressin receptors.
“The study found that these genes combined with people’s perceptions of the world as a more or less threatening place to predict generosity,” Poulin says.
“Specifically, study participants who found the world threatening were less likely to help others — unless they had versions of the receptor genes that are generally associated with niceness,” he says.
These “nicer” versions of the genes, says Poulin, “allow you to overcome feelings of the world being threatening and help other people in spite of those fears.
“The fact that the genes predicted behavior only in combination with people’s experiences and feelings about the world isn’t surprising,” Poulin says, “because most connections between DNA and social behavior are complex.
“So if one of your neighbors seems like a really generous, caring, civic-minded kind of person, while another seems more selfish, tight-fisted and not as interested in pitching in, their DNA may help explain why one of them is nicer than the other,” he says.
“We aren’t saying we’ve found the niceness gene,” he adds. “But we have found a gene that makes a contribution. What I find so interesting is the fact that it only makes a contribution in the presence of certain feelings people have about the world around them.”
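The gene-by-environment pattern Poulin describes, genes predicting behavior only in combination with feelings about the world, can be illustrated with a toy logistic model. The coefficients below are invented for illustration and are not estimates from the study.

```python
import math

# Toy sketch: the chance of helping depends on perceived threat, a receptor-
# gene variant, and (crucially) their interaction. All numbers are made up.

def p_help(threat, nice_variant):
    b0, b_threat, b_gene, b_interact = 0.5, -1.5, 0.2, 1.4
    x = (b0 + b_threat * threat + b_gene * nice_variant
         + b_interact * threat * nice_variant)
    return 1 / (1 + math.exp(-x))  # logistic link maps the score to [0, 1]

# threat: 0 = world feels safe, 1 = world feels threatening
# nice_variant: 0 = other version, 1 = "nicer" receptor version
for threat in (0, 1):
    for gene in (0, 1):
        print(threat, gene, round(p_help(threat, gene), 2))
```

With these made-up numbers, perceived threat sharply lowers the chance of helping only when the buffering gene variant is absent, mirroring the interaction the study reports: neither the gene nor the worldview alone tells the story.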
A research team led by the University of South Florida’s Department of Psychiatry & Behavioral Neurosciences has found that a fragment of the amyloid precursor protein (APP) — known as sAPP-α and associated with Alzheimer’s disease — appears to regulate its own production. The finding may lead to ways to prevent or treat Alzheimer’s disease by controlling the regulation of APP.
Their preclinical study is published online today in Nature Communications.
“The purpose of this study was to help better understand why, in most cases of Alzheimer’s disease, the processing of APP becomes deregulated, which leads to the formation of protein deposits and neuron loss,” said study senior author Dr. Jun Tan, professor of psychiatry and the Robert A. Silver Chair, Rashid Laboratory for Developmental Neurobiology at the USF Silver Child Development Center. “The many risk factors for Alzheimer’s disease can change the way APP is processed, and these changes appear to promote plaque formation and neuron loss.”
Microscopic image showing the merging of the amyloid precursor protein fragment, sAPP-α, and the APP-converting enzyme BACE 1, in neuronal cells. This co-localization suggests that sAPP-α may serve as the body’s mechanism to inhibit BACE1 activity and thus lower production of the toxic amyloid beta characteristic of Alzheimer’s disease. (Credit: University of South Florida)
An estimated 30 million people worldwide and 5 million in the U.S. have Alzheimer’s. With the aging of the “Baby Boom” generation, the prevalence of the debilitating disease is expected to increase dramatically in the U.S. in the coming years. Currently, there are no disease-modifying treatments to prevent, reverse or halt the progression of Alzheimer’s disease, only medications that may improve symptoms for a short time.
“For the first time, we have direct evidence that a secreted portion of APP itself, so-called ‘sAPP-α,’ acts as an essential stop-gap mechanism,” said the study’s lead author Dr. Demian Obregon, a resident specializing in research in the Department of Psychiatry & Behavioral Neurosciences at USF Health. “Risk factors associated with Alzheimer’s disease lead to a decline in sAPP-α levels, which results in excessive activity of a key enzyme in Aβ formation.”
In initial studies using cells, and in follow-up studies using mice genetically engineered to mimic Alzheimer’s disease, the investigators found that the neutralization of sAPP-α leads to enhanced Aβ formation. This activity depended on sAPP-α’s ability to associate with the APP-converting enzyme, BACE1. When this interaction was blocked, Aβ formation was restored.
The authors suggest that through monitoring and correcting low sAPP-α levels, or through enhancing its association with BACE1, Alzheimer’s disease may be prevented or treated.
Babies who are born small have a tendency to put on weight during childhood and adolescence if allowed free access to calories. However, a new animal model study at UCLA found that when small babies were placed on a diet of moderately regulated calories during infancy, the propensity to become obese decreased.
Because this is an early study, UCLA researchers do not recommend that mothers of low-birth weight infants start restricting their child’s nutrition and suggest they consult with their child’s pediatrician regarding any feeding questions.
Previous studies have shown that growth restriction before birth may cause lasting changes of genes in certain insulin-sensitive organs like the pancreas, liver and skeletal muscle. Before birth, these changes may help the malnourished fetus use all available nutrients. However, after birth these changes may contribute to health problems such as obesity and diabetes.
“This study shows that if we match the level of caloric consumption after birth to the same level that the growth-restricted baby received in the womb, it results in a lean body type. However, if there is a mismatch where the baby is growth-restricted at birth but exposed to plenty of calories after birth, then that leads to obesity,” said the lead author, Dr. Sherin Devaskar, professor of pediatrics and executive chair of the department of pediatrics at Mattel Children’s Hospital UCLA. “While many trials that include exercise and various drug therapies have tried to reverse the tendency of low birth weight babies becoming obese, we have shown that a dietary intervention during early life can have long lasting effects into childhood, adolescence and adult life.”
The study appears in the June issue of the journal Diabetes and is currently available online.
About 10 percent of babies in the United States are born small, defined as less than the 10th percentile by weight for a given gestation period, said the study’s first author, Dr. Meena Garg, professor of pediatrics and a neonatologist and medical director of the neonatal intensive care unit at Mattel Children’s Hospital UCLA. She added that some organizations define low birth weight as less than 2,500 grams, or 5 pounds, 8 ounces, at term.
Low birth weight can be caused by malnutrition due to a mother’s homelessness or hunger or her desire not to gain too much weight during pregnancy. Additional causes include illness or infection, a reduction in placental blood, smoking or use of alcohol or drugs during pregnancy.
To conduct the study, researchers used rodent models and simulated a reduced-calorie scenario during pregnancy. Low-birth-weight offspring given moderately tempered caloric intake during infancy and childhood grew into lean, physically active adults with high energy expenditure, whereas offspring given unrestricted calories became inactive, obese adults with reduced energy expenditure. The authors concluded that early-life dietary interventions have far-reaching effects on the adult state.
Future studies will follow these animals through the stages of aging to see whether early regulation of calorie intake protects against diabetes and obesity in later life.
“This is an early pre-clinical trial that first needs to be tested in clinical trials before any form of guidelines can be developed,” Devaskar said. “More importantly, we must make sure that control of caloric intake during infancy and childhood does not have any unintended side effects before taking on clinical trials. More research is required to ensure that these metabolic advantages will persist later in life.”
The study was funded by the National Institute of Child Health and Human Development.
In addition to Devaskar and Garg, the study was conducted by a team of UCLA researchers including Manikkavasagar Thamotharan, Yun Dai, Shanthie Thamotharan, Bo Chul Shin and David Stout.
Source: University of California – Los Angeles Health Sciences
There is little argument among experts that autism spectrum disorders (ASD), complex developmental disabilities that vary widely in their severity, are caused by both genetic and environmental factors. Advances in genome sequencing now permit scientists to uncover specific mutations in DNA that are associated with ASD at unprecedented resolution. Such data are vital to understanding the genetic basis of the disorder.
A new study co-authored by UCLA researchers has led to a better understanding of the genetic contribution to autism using this new approach. By comparing siblings with and without ASD, the researchers have discovered a single instance in the affected siblings in which two independent mutations disrupt a gene called SCN2A.
Reporting in the April 4 edition of the journal Nature, Dr. Daniel Geschwind, a UCLA professor of neurology and psychiatry, and colleagues from Yale University, Carnegie Mellon University and the University of Pittsburgh completed “whole-exome sequencing” of 238 parent-child quartets. A quartet is defined as two parents and one child without ASD and one child with ASD.
Instead of the time-consuming process of searching the entire genome of an individual, the researchers turned to the newer technology of whole-exome sequencing, which searches only the protein-coding regions of the genome to pinpoint the mutation that causes a particular disorder.
The researchers compared mutation rates between unaffected individuals and those with ASD within a family, then compared the ASD mutations across the entire cohort. They found multiple variations between the unaffected and affected groups. Specifically, among a total of 279 coding mutations, they identified two independent mutations — present in individual children with ASD, and not in their siblings — that disrupt the gene SCN2A. That the same gene, though not the same mutation, was disrupted in unrelated children with ASD points to its importance.
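A rough calculation suggests why two independent hits to one gene stand out. Assume, purely for illustration, that each of the 279 coding mutations lands uniformly at random on one of roughly 20,000 protein-coding genes; real analyses must also account for gene length and mutability, so this is only a back-of-the-envelope sketch.

```python
# How likely is it that one *particular* gene (say, SCN2A) is hit two or
# more times purely by chance, under a simplified uniform-hit assumption?

def prob_gene_hit_twice_or_more(n_mutations, n_genes):
    p = 1 / n_genes                          # chance one mutation hits this gene
    p0 = (1 - p) ** n_mutations              # gene hit zero times
    p1 = n_mutations * p * (1 - p) ** (n_mutations - 1)  # hit exactly once
    return 1 - p0 - p1                       # hit at least twice

p_scn2a = prob_gene_hit_twice_or_more(279, 20_000)
print(f"{p_scn2a:.1e}")  # on the order of 1e-4
```

For any single pre-specified gene, a chance double hit is very unlikely; across all 20,000 genes, a collision somewhere is far less surprising, which is why the study’s formal statistics, rather than this simplified sketch, carry the evidential weight.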
In addition, the researchers found many other genes with similar mutations occurring only once — these also make promising new candidates for autism susceptibility. Finally, they were able to estimate that there are likely about 1,000 or more genes that contribute to autism risk.
“This work demonstrates that autism, in most cases, has a contribution from several genes, as the average risk imparted by one mutation is typically not sufficient,” said Geschwind, who holds UCLA’s Gordon and Virginia MacDonald Distinguished Chair in Human Genetics and directs the UCLA Center for Autism Research and Treatment. “Overall, these results substantially clarify the genomic architecture of ASD, and this is an important step in attempting to better understand the genetic basis of these disorders.”
A complete list of contributing authors and institutions is available in the Nature paper. Funding was provided by the Simons Foundation.
Autism is a complex brain disorder that strikes in early childhood. The condition disrupts a child’s ability to communicate and develop social relationships and is often accompanied by acute behavioral challenges. Autism spectrum disorders are diagnosed in one in 110 children in the United States, affecting four times as many boys as girls. Diagnoses have expanded tenfold in the last decade.
A new book by Barbara Rolls, professor of nutritional sciences and Helen A. Guthrie Chair in Nutrition at Penn State, aims to help people control their hunger while also losing weight. “The Ultimate Volumetrics Diet” will be available in stores and online on April 10.
“There is no magic way to get around the fact that to lose weight you must reduce the calories you consume to below the number you burn,” Rolls said. “However, cutting calories doesn’t have to leave you feeling hungry. You can carefully choose the foods you eat so that you feel full and satisfied on fewer calories.”
Rolls’ new book is based on her decades of research on diet and nutrition, which shows that lowering the calorie density — or calories per bite — of food can help people feel full while eating fewer calories. For example, in one study, she and her colleagues found that by using Volumetrics principles to reduce calories per bite by 30 percent and serving size by 25 percent, participants ate 800 calories less per day and never missed them.
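The two reductions Rolls describes combine multiplicatively, which is easy to verify:

```python
# Combined effect of serving 25% less food at 30% fewer calories per bite.
density_factor = 1 - 0.30   # 70% of the original calories per bite
portion_factor = 1 - 0.25   # 75% of the original serving size

remaining = density_factor * portion_factor
print(f"{remaining:.3f}")      # 0.525 -> 52.5% of the original calories
print(f"{1 - remaining:.1%}")  # 47.5% fewer calories overall
```

On that arithmetic, the reported 800-calorie daily reduction would correspond to a baseline intake of roughly 1,700 calories per day, which is in a plausible range for the study’s participants.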
The new book contains a 12-week diet plan with chapters on “Building Your Meal Around Fruits and Vegetables,” “Managing Fat and Sugar,” “Eating Away From Home,” and “Maintaining Your Volumetrics Lifestyle.” For example, the chapter on “Building Your Meal Around Fruits and Vegetables” includes advice on how to boost vegetable intake by sneaking them into favorite foods. Rolls’ research has shown that preschool children consume nearly twice as many vegetables and 11 percent fewer calories over the course of a day when pureed vegetables are added to their favorite foods.
In addition to a 12-week diet plan, the book also contains over 100 nutritionally balanced recipes that she and her staff — and even family members — created. Recipes include Greek Frittata, Caribbean Bean and Squash Soup, Zesty Roast Beef and Veggie Pocket, Pasta with Exploding Tomatoes and Arugula, and Alex’s Three-Layer Carrot Cake. Full-color photographs illustrate many of the recipes.
The new book builds upon Rolls’ two previous books about Volumetrics principles, one of which topped the New York Times Paperback Advice Bestseller List in 2007.
Images showing that in vivo delivery of imipramine blue yields decreased invasion of the tumor into healthy tissue. On the left is an untreated tumor and on the right is a tumor treated with imipramine blue. To quantify cellular invasion beyond the tumor border (blue dotted line), researchers count the number of glioma cells (green) per area of healthy tissue (red). (Credit: Georgia Tech/Jennifer Munson)
Treating invasive brain tumors with a combination of chemotherapy and radiation has improved clinical outcomes, but few patients survive longer than two years after diagnosis. The effectiveness of the treatment is limited by the tumor’s aggressive invasion of healthy brain tissue, which restricts chemotherapy access to the cancer cells and complicates surgical removal of the tumor.
To address this challenge, researchers from the Georgia Institute of Technology and Emory University have designed a new treatment approach that appears to halt the spread of cancer cells into normal brain tissue in animal models. The researchers treated animals possessing an invasive tumor with a vesicle carrying a molecule called imipramine blue, followed by conventional doxorubicin chemotherapy. The tumors ceased their invasion of healthy tissue and the animals survived longer than animals treated with chemotherapy alone.
“Our results show that imipramine blue stops tumor invasion into healthy tissue and enhances the efficacy of chemotherapy, which suggests that chemotherapy may be more effective when the target is stationary,” said Ravi Bellamkonda, a professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. “These results reveal a new strategy for treating brain cancer that could improve clinical outcomes.”
The results of this work were published on March 28, 2012 in the journal Science Translational Medicine. The research was supported primarily by the Ian’s Friends Foundation and partially by the Georgia Cancer Coalition, the Wallace H. Coulter Foundation and a National Science Foundation graduate research fellowship.
In addition to Bellamkonda, collaborators on the project include Jack Arbiser, a professor in the Emory University Department of Dermatology; Daniel Brat, a professor in the Emory University Department of Pathology and Laboratory Medicine; and the paper’s lead author, Jennifer Munson, a former Fulbright Scholar who was a bioengineering graduate student in the Georgia Tech School of Chemical & Biomolecular Engineering when the research was conducted.
Arbiser designed the novel imipramine blue compound, which is an organic triphenylmethane dye. After in vitro experiments showed that imipramine blue effectively inhibited movement of several cancer cell lines, the researchers tested the compound in an animal model of aggressive cancer that exhibited attributes similar to a human brain tumor called glioblastoma.
“There were many reasons why we chose to use the RT2 astrocytoma rat model for these experiments,” said Brat. “The tumor exhibited properties of aggressive growth, invasiveness, angiogenesis and necrosis that are similar to human glioblastoma; the model utilized an intact immune system, which is seen in the human disease; and the model enabled increased visualization by MRI because it was a rat model, rather than a mouse.”
Because imipramine blue is hydrophobic and doxorubicin is cytotoxic, the researchers encapsulated each compound in an artificially prepared vesicle called a liposome so that the drugs would reach the brain. The liposomal drug delivery vehicle also ensured that the drugs would not be released into tissue until they passed through leaky blood vessel walls, which are present only where a tumor is growing.
Animals received one of the following four treatments: liposomes filled with saline, liposomes filled with imipramine blue, liposomes filled with doxorubicin chemotherapy, or liposomes filled with imipramine blue followed by liposomes filled with doxorubicin chemotherapy.
All of the animals that received the sequential treatment of imipramine blue followed by doxorubicin chemotherapy survived for 200 days (more than six months) with no observable tumor mass. Of the animals treated with doxorubicin chemotherapy alone, 33 percent were alive after 200 days, with a median survival time of 44 days. Animals that received liposomes filled with saline or imipramine blue, but no chemotherapy, did not survive more than 19 days.
“Our results show that the increased effectiveness of the chemotherapy treatment is not because of a synergistic toxicity between imipramine blue and doxorubicin. Imipramine blue is not making the doxorubicin more toxic, it’s simply stopping the movement of the cancer cells and containing the cancer so that the chemotherapy can do a better job,” explained Bellamkonda, who is also the Carol Ann and David D. Flanagan Chair in Biomedical Engineering and a Georgia Cancer Coalition Distinguished Cancer Scholar.
MRI results showed a reduction and compaction of the tumor in animals treated with imipramine blue followed by doxorubicin chemotherapy, while animals treated with chemotherapy alone presented with abnormal tissue and glioma cells. MRI also indicated that the blood-brain barrier breach often seen during tumor growth was present in the animals treated with chemotherapy alone, but not the group treated with chemotherapy and imipramine blue.
According to the researchers, imipramine blue appears to improve the outcome of brain cancer treatment by altering the regulation of actin, a protein found in all eukaryotic cells. Actin mediates a variety of essential biological functions, including the production of reactive oxygen species. Most cancer cells exhibit overproduction of reactive oxygen species, which are thought to stimulate cancer cells to invade healthy tissue. The dye’s reorganization of the actin cytoskeleton is thought to inhibit production of enzymes that produce reactive oxygen species.
“I formulated the imipramine blue compound as a triphenylmethane dye because I knew that another triphenylmethane dye, gentian violet, exhibited anti-cancer properties, and I decided to use imipramine — a drug used to treat depression — as the starting material because I knew it could get into the brain,” said Arbiser.
For future studies, the researchers are planning to test imipramine blue’s effect on animal models with invasive brain tumors, metastatic tumors, and other types of cancer such as prostate and breast.
“While we need to conduct future studies to determine if the effect of imipramine blue is the same for different types of cancer diagnosed at different stages, this initial study shows the possibility that imipramine blue may be useful as soon as any tumor is diagnosed, before anti-cancer treatment begins, to create a more treatable tumor and enhance clinical outcome,” noted Bellamkonda.
A new study led by Georgia Tech found that a lake’s ecological characteristics influence how quickly Daphnia dentifera evolve to survive epidemics of the virulent yeast parasite Metschnikowia bicuspidata. The Daphnia dentifera individuals at the top right and bottom middle of this image are uninfected; the other four Daphnia are infected with Metschnikowia. (Credit: Georgia Tech/Meghan Duffy)
When battling an epidemic of a deadly parasite, less resistance can sometimes be better than more, a new study suggests.
A freshwater zooplankton species known as Daphnia dentifera endures periodic epidemics of a virulent yeast parasite that can infect more than 60 percent of the Daphnia population. During these epidemics, the Daphnia population evolves quickly, balancing infection resistance and reproduction.
A new study led by Georgia Institute of Technology researchers reveals that the number of vertebrate predators in the water and the amount of food available for Daphnia to eat influence the size of the epidemics and how these “water fleas” evolve during epidemics to survive.
The study shows that lakes with high nutrient concentrations and lower predation levels exhibit large epidemics and Daphnia that become more resistant to infection by the yeast Metschnikowia bicuspidata. However, in lakes with fewer resources and high predation, epidemics remain small and Daphnia evolve increased susceptibility to the parasite.
“It’s counterintuitive to think that hosts would ever evolve greater susceptibility to virulent parasites during an epidemic, but we found that ecological factors determine whether it is better for them to evolve enhanced resistance or susceptibility to infection,” said the study’s lead author Meghan Duffy, an assistant professor in the School of Biology at Georgia Tech. “There is a trade-off between resistance and reproduction because any resources an animal devotes to defense are not available for reproduction. When ecological factors favor small epidemics, it is better for hosts to invest in reproduction rather than defense.”
This study was published in the March 30, 2012 issue of the journal Science. The research was supported by the National Science Foundation and the James S. McDonnell Foundation.
In addition to Duffy, also contributing to this study were Indiana University Department of Biology associate professor Spencer Hall and graduate student David Civitello; Christopher Klausmeier, an associate professor in the Department of Plant Biology and W.K. Kellogg Biological Station at Michigan State University; and Georgia Tech research technician Jessica Housley Ochs and graduate student Rachel Penczykowski.
For the study, the researchers monitored the levels of nutritional resources, predation and parasitic infection in seven Indiana lakes on a weekly basis for a period of four months. They calculated infection prevalence visually on live hosts using established survey methods, estimated resources by measuring the levels of phosphorus and nitrogen in the water, and assessed predation by measuring the size of uninfected adult Daphnia.
The researchers also conducted infection assays in the laboratory on Daphnia collected from each of the seven lake populations at two time points: in late July before epidemics began and in mid-November as epidemics waned. The assays measured the zooplankton’s uptake of Metschnikowia bicuspidata and infectivity of the yeast once consumed.
The infection assays showed a significant evolutionary response of Daphnia to epidemics in six of the seven lake populations. The Daphnia population became significantly more resistant to infection in three lakes and significantly more susceptible to infection in three other lakes. The hosts in the seventh lake did not show a significant change in susceptibility, but trended toward increased resistance. In the six lake populations that showed a significant evolutionary response, epidemics were larger when lakes had lower predation and higher levels of total nitrogen.
“Daphnia became more susceptible to the yeast in lakes with fewer resources and higher vertebrate predation, but evolved toward increased resistance in lakes with increased resources and lower predation,” noted Duffy.
The study’s combination of observations, experiments and mathematical modeling support the researchers’ theoretical prediction that when hosts face a resistance-reproduction tradeoff, they evolve increased resistance to infection during larger epidemics and increased susceptibility during smaller ones. Ultimately, ecological gradients, through their effects on epidemic size, influence evolutionary outcomes of hosts during epidemics.
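The predicted trade-off can be illustrated with a small toy calculation (our own construction for illustration, not the authors' actual mathematical model): a host allocates a fraction of its resources to resistance, leaving the rest for reproduction, and its payoff is reproduction that escapes infection. When infection pressure (epidemic size) is small, the best strategy is to invest almost nothing in defense; when pressure is large, the optimum shifts toward resistance.

```python
# Toy resistance-reproduction trade-off (illustrative only; the values
# and functional forms here are assumptions, not from the study).
def fitness(r, epidemic_size):
    """Payoff for investing fraction r of resources in resistance."""
    reproduction = 1.0 - r                        # resources left for offspring
    infection_risk = epidemic_size * (1.0 - r)    # resistance reduces risk
    return reproduction * (1.0 - infection_risk)  # births that escape infection

def best_allocation(epidemic_size, grid=101):
    """Search a grid of allocations for the fitness-maximizing one."""
    rs = [i / (grid - 1) for i in range(grid)]
    return max(rs, key=lambda r: fitness(r, epidemic_size))

# A small epidemic favors reproduction (low r); a large one favors defense.
print(best_allocation(0.2))
print(best_allocation(0.8))
```

In this sketch the optimum allocation to resistance rises with epidemic size, mirroring the paper's qualitative prediction that larger epidemics drive the evolution of resistance while smaller ones favor susceptibility.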
“While the occurrence and magnitude of disease outbreaks can strongly influence host evolution, this study suggests that altering predation pressure on hosts and productivity of ecosystems may also influence this evolution,” added Duffy.
The team plans to repeat the study this summer in the same Indiana lakes to examine whether the relationships between ecological factors, epidemic size and host evolution they found in this study can be corroborated.
The water flowing through Venice’s famous canals laps at buildings a little higher every year – and not only because of a rising sea level. Although previous studies had found that Venice has stabilized, new measurements indicate that the historic city continues to slowly sink, and even to tilt slightly to the east.
“Venice appears to be continuing to subside, at a rate of about 2 millimeters (0.08 inches) a year,” said Yehuda Bock, a research geodesist with Scripps Institution of Oceanography at UC San Diego, and the lead author of the new research paper on the city’s downward drift. “It’s a small effect, but it’s important,” he added. Given that sea level is rising in the Venetian lagoon, also at 2 millimeters per year, the slight subsidence doubles the rate at which the heights of surrounding waters are increasing relative to the elevation of the city, he noted. In the next 20 years, if Venice and its immediate surroundings subsided steadily at the current rate, researchers would expect the land to sink up to 80 millimeters (3.1 inches) in that period of time, relative to the sea.
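The arithmetic behind that 80-millimeter figure is straightforward: from the city's point of view, land subsidence and sea-level rise add together, and the combined rate accumulates over two decades. A minimal check of the numbers:

```python
# Relative sea-level change in Venice: subsidence and sea-level rise
# add from the city's point of view (rates as reported in the study).
subsidence_mm_per_yr = 2.0       # land sinking
sea_level_rise_mm_per_yr = 2.0   # water rising in the lagoon

relative_rate = subsidence_mm_per_yr + sea_level_rise_mm_per_yr  # 4 mm/yr
years = 20
total_mm = relative_rate * years

print(total_mm)  # 80.0 mm of relative rise over two decades
```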
Bock worked with colleagues from the University of Miami in Florida and Italy’s Tele-Rilevamento Europa, a company that measures ground deformation, to analyze data collected by GPS and space-borne radar (InSAR) instruments regarding Venice and its lagoon. The GPS measurements provide absolute elevations, while the InSAR data are used to calculate elevations relative to other points. By combining the two datasets from the decade between 2000 and 2010, Bock and his colleagues found that the city of Venice was subsiding at an average of 1 to 2 millimeters a year (0.04 to 0.08 inches per year). The patches of land in Venice’s lagoon (117 islands in all) are also sinking, they found, with northern sections of the lagoon dropping at a rate of 2 to 3 millimeters (0.08 to 0.12 inches) per year, and the southern lagoon subsiding at 3 to 4 millimeters (0.12 to 0.16 inches) per year.
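The core idea of combining the two datasets can be sketched simply (this is an illustrative toy, not the team's actual processing chain, and the station names and rates are hypothetical): InSAR yields rates of elevation change relative to a reference point, while GPS yields absolute rates at a few stations. Tying the InSAR field to the GPS frame amounts to estimating the offset that best aligns the two wherever they overlap.

```python
# Toy GPS + InSAR combination (hypothetical stations and rates).
gps_rates = {"station_A": -2.1, "station_B": -1.8}  # mm/yr, absolute
insar_rel = {"station_A": 0.0, "station_B": 0.3,    # mm/yr, relative to
             "pt_lagoon_N": -0.5}                   # an InSAR reference

# Best-fit offset: mean difference at points where both datasets exist.
common = gps_rates.keys() & insar_rel.keys()
offset = sum(gps_rates[k] - insar_rel[k] for k in common) / len(common)

# Apply the offset to place every InSAR point in the absolute GPS frame.
absolute = {k: v + offset for k, v in insar_rel.items()}

print(round(absolute["pt_lagoon_N"], 2))  # ≈ -2.6 mm/yr (absolute rate)
```

This is why neither technique alone suffices: GPS has sparse coverage, and InSAR alone cannot distinguish a uniform regional motion, such as the eastward tilt the study detected, from a shift of its reference.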
The findings will be published March 28 in Geochemistry, Geophysics, Geosystems, a journal of the American Geophysical Union.
“Our combined GPS and InSAR analysis clearly captured the movements over the last decade that neither GPS nor InSAR could sense alone,” said Shimon Wdowinski, associate research professor of Marine Geology and Geophysics at the University of Miami.
In the new study, using the GPS instruments, Bock and his colleagues were able to take absolute readings of the city and its surrounding lagoons. Not only did they find the sinking, but they found that the area was tilting slightly, by a millimeter or two eastward per year. That means the western part, where the city of Venice is, is higher than the eastern sections. Prior satellite analyses didn’t pick up on the tilt, Bock said, possibly because the scientists had been taking measurements using InSAR, which only provided the change in elevation relative to other sites.
The relative nature of InSAR measurements might also explain why the new study detected Venice’s subsidence, while other recent studies did not, Bock conjectured.
Venice’s subsidence was recognized as a major issue decades ago, he noted, when scientists realized that pumping groundwater from beneath the city, combined with the ground’s compaction from centuries of building, was causing the city to settle. But officials put a stop to the groundwater pumping, and subsequent studies in the 2000s indicated that the subsidence had stopped, he said.
“It’s possible that it was stable in that decade, and started subsiding since then, but this is unlikely,” Bock said. The current subsidence is due to natural causes, which probably have been affecting the area for a long time.
A significant part of those natural causes is plate tectonics. The Adriatic plate, which includes Venice, subducts beneath the Apennines Mountains and causes the city and its environs to drop slightly in elevation. And although the groundwater pumping has stopped, the compaction of the sediments beneath Venice remains a factor.
The frequency of floods in Venice is increasing, Bock said, and now about four or five times a year residents have to walk on wooden planks to stay above the floodwaters in large parts of the city. A multi-billion-dollar effort to install flood-protection walls that can be raised to block incoming tides is nearing completion, he said. The adjustable barriers were designed to protect the city from tides that are coming in higher as overall sea levels are rising in response to climate change.
To ensure that the gates can hold back sufficient tidal water in the long run, their builders “have to take into account not only the rising of sea level, but also the subsidence,” Wdowinski said. The land on which those gates are being built is descending and taking the barriers down with it relative to the rising seas, he said, effectively doubling the amount of elevation change in store for Venice.
The patchy land in the surrounding lagoon could need shoring up as well. Over the next 40 years, the natural barriers that protect the Venice lagoon and city could drop by 150-200 millimeters, Wdowinski said, so officials may need to reinforce those sinking sediments as well.