Babies who are born small tend to put on excess weight during childhood and adolescence if allowed free access to calories. However, a new animal-model study at UCLA found that when small newborns were fed a diet of moderately regulated calories during infancy, their propensity to become obese decreased.
Because this is an early study, UCLA researchers do not recommend that mothers of low-birth weight infants start restricting their child’s nutrition and suggest they consult with their child’s pediatrician regarding any feeding questions.
Previous studies have shown that growth restriction before birth may cause lasting changes of genes in certain insulin-sensitive organs like the pancreas, liver and skeletal muscle. Before birth, these changes may help the malnourished fetus use all available nutrients. However, after birth these changes may contribute to health problems such as obesity and diabetes.
“This study shows that if we match the level of caloric consumption after birth to the same level that the growth-restricted baby received in the womb, it results in a lean body type. However, if there is a mismatch where the baby is growth-restricted at birth but exposed to plenty of calories after birth, then that leads to obesity,” said the lead author, Dr. Sherin Devaskar, professor of pediatrics and executive chair of the department of pediatrics at Mattel Children’s Hospital UCLA. “While many trials that include exercise and various drug therapies have tried to reverse the tendency of low-birth-weight babies to become obese, we have shown that a dietary intervention during early life can have long-lasting effects into childhood, adolescence and adult life.”
The study appears in the June issue of the journal Diabetes and is currently available online.
About 10 percent of babies in the United States are born small, defined as less than the 10th percentile by weight for a given gestation period, said the study’s first author, Dr. Meena Garg, professor of pediatrics and a neonatologist and medical director of the neonatal intensive care unit at Mattel Children’s Hospital UCLA. She added that some organizations define low birth weight as less than 2,500 grams, or 5 pounds, 8 ounces, at term.
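The 2,500-gram threshold can be checked against its imperial equivalent with a quick unit conversion (an illustrative sketch; the helper function is ours, not from the study):

```python
# Convert the 2,500 g low-birth-weight threshold to pounds and ounces.
GRAMS_PER_POUND = 453.59237
OUNCES_PER_POUND = 16

def grams_to_lb_oz(grams):
    """Return (whole pounds, remaining ounces) for a mass in grams."""
    total_pounds = grams / GRAMS_PER_POUND
    pounds = int(total_pounds)
    ounces = (total_pounds - pounds) * OUNCES_PER_POUND
    return pounds, round(ounces, 1)

print(grams_to_lb_oz(2500))  # → (5, 8.2), i.e. about 5 pounds, 8 ounces
```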
Low birth weight can be caused by malnutrition due to a mother’s homelessness or hunger or her desire not to gain too much weight during pregnancy. Additional causes include illness or infection, a reduction in placental blood, smoking or use of alcohol or drugs during pregnancy.
To conduct the study, researchers used rodent animal models and simulated a reduced-calorie scenario during pregnancy. The results showed that low-birth-weight offspring given moderately tempered caloric intake during infancy and childhood became lean, physically active adults with high energy expenditure, whereas offspring given unrestricted calories became inactive, obese adults with reduced energy expenditure. The authors concluded that early-life dietary interventions have far-reaching effects on the adult state.
In future studies, the researchers will follow the animals as they age to see whether early regulation of calorie intake also reverses diabetes and obesity in later life.
“This is an early pre-clinical trial that first needs to be tested in clinical trials before any form of guidelines can be developed,” Devaskar said. “More importantly, we must make sure that control of caloric intake during infancy and childhood does not have any unintended side effects before taking on clinical trials. More research is required to ensure that these metabolic advantages will persist later in life.”
The study was funded by the National Institute of Child Health and Human Development.
In addition to Devaskar and Garg, the study was conducted by a team of UCLA researchers including Manikkavasagar Thamotharan, Yun Dai, Shanthie Thamotharan, Bo Chul Shin and David Stout.
Source: University of California – Los Angeles Health Sciences
A new book by Barbara Rolls, professor of nutritional sciences and Helen A. Guthrie Chair in Nutrition at Penn State, aims to help people control their hunger while also losing weight. “The Ultimate Volumetrics Diet” will be available in stores and online on April 10.
“There is no magic way to get around the fact that to lose weight you must reduce the calories you consume to below the number you burn,” Rolls said. “However, cutting calories doesn’t have to leave you feeling hungry. You can carefully choose the foods you eat so that you feel full and satisfied on fewer calories.”
Rolls’ new book is based on her decades of research on diet and nutrition, which shows that lowering the calorie density — or calories per bite — of food can help people feel full while eating fewer calories. For example, in one study, she and her colleagues found that by using Volumetrics principles to reduce calories per bite by 30 percent and serving size by 25 percent, participants ate 800 fewer calories per day and never missed them.
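The two reductions compound multiplicatively. A back-of-envelope calculation, assuming both cuts apply uniformly to everything eaten (illustrative arithmetic only, not the study's methodology):

```python
# Compounding a 30% cut in calories per bite with a 25% cut in serving size.
density_factor = 1 - 0.30   # fraction of calories per bite remaining
portion_factor = 1 - 0.25   # fraction of serving size remaining

combined = density_factor * portion_factor   # fraction of calories remaining
reduction = 1 - combined                     # overall fractional reduction

print(f"Calories remaining: {combined:.1%}")   # 52.5%
print(f"Overall reduction: {reduction:.1%}")   # 47.5%
```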
The new book contains a 12-week diet plan with chapters on “Building Your Meal Around Fruits and Vegetables,” “Managing Fat and Sugar,” “Eating Away From Home,” and “Maintaining Your Volumetrics Lifestyle.” For example, the chapter on “Building Your Meal Around Fruits and Vegetables” includes advice on how to boost vegetable intake by sneaking them into favorite foods. Rolls’ research has shown that preschool children consume nearly twice as many vegetables and 11 percent fewer calories over the course of a day when pureed vegetables are added to their favorite foods.
In addition to a 12-week diet plan, the book also contains over 100 nutritionally balanced recipes that she and her staff — and even family members — created. Recipes include Greek Frittata, Caribbean Bean and Squash Soup, Zesty Roast Beef and Veggie Pocket, Pasta with Exploding Tomatoes and Arugula, and Alex’s Three-Layer Carrot Cake. Full-color photographs illustrate many of the recipes.
The new book builds upon Rolls’ two previous books about Volumetrics principles, one of which topped the New York Times Paperback Advice Bestseller List in 2007.
Images showing that in vivo delivery of imipramine blue yields decreased invasion of the tumor into healthy tissue. On the left is an untreated tumor and on the right is a tumor treated with imipramine blue. To quantify cellular invasion beyond the tumor border (blue dotted line), researchers count the number of glioma cells (green) per area of healthy tissue (red). (Credit: Georgia Tech/Jennifer Munson)
Treating invasive brain tumors with a combination of chemotherapy and radiation has improved clinical outcomes, but few patients survive longer than two years after diagnosis. The effectiveness of the treatment is limited by the tumor’s aggressive invasion of healthy brain tissue, which restricts chemotherapy access to the cancer cells and complicates surgical removal of the tumor.
To address this challenge, researchers from the Georgia Institute of Technology and Emory University have designed a new treatment approach that appears to halt the spread of cancer cells into normal brain tissue in animal models. The researchers treated animals possessing an invasive tumor with a vesicle carrying a molecule called imipramine blue, followed by conventional doxorubicin chemotherapy. The tumors ceased their invasion of healthy tissue and the animals survived longer than animals treated with chemotherapy alone.
“Our results show that imipramine blue stops tumor invasion into healthy tissue and enhances the efficacy of chemotherapy, which suggests that chemotherapy may be more effective when the target is stationary,” said Ravi Bellamkonda, a professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University. “These results reveal a new strategy for treating brain cancer that could improve clinical outcomes.”
The results of this work were published on March 28, 2012 in the journal Science Translational Medicine. The research was supported primarily by the Ian’s Friends Foundation and partially by the Georgia Cancer Coalition, the Wallace H. Coulter Foundation and a National Science Foundation graduate research fellowship.
In addition to Bellamkonda, collaborators on the project include Jack Arbiser, a professor in the Emory University Department of Dermatology; Daniel Brat, a professor in the Emory University Department of Pathology and Laboratory Medicine; and the paper’s lead author, Jennifer Munson, a former Fulbright Scholar who was a bioengineering graduate student in the Georgia Tech School of Chemical & Biomolecular Engineering when the research was conducted.
Arbiser designed the novel imipramine blue compound, which is an organic triphenylmethane dye. After in vitro experiments showed that imipramine blue effectively inhibited movement of several cancer cell lines, the researchers tested the compound in an animal model of aggressive cancer that exhibited attributes similar to a human brain tumor called glioblastoma.
“There were many reasons why we chose to use the RT2 astrocytoma rat model for these experiments,” said Brat. “The tumor exhibited properties of aggressive growth, invasiveness, angiogenesis and necrosis that are similar to human glioblastoma; the model utilized an intact immune system, which is seen in the human disease; and the model enabled increased visualization by MRI because it was a rat model, rather than a mouse.”
Because imipramine blue is hydrophobic and doxorubicin is cytotoxic, the researchers encapsulated each compound in an artificially-prepared vesicle called a liposome so that the drugs would reach the brain. The liposomal drug delivery vehicle also ensured that the drugs would not be released into tissue until they passed through leaky blood vessel walls, which are only present where a tumor is growing.
Animals received one of the following four treatments: liposomes filled with saline, liposomes filled with imipramine blue, liposomes filled with doxorubicin chemotherapy, or liposomes filled with imipramine blue followed by liposomes filled with doxorubicin chemotherapy.
All of the animals that received the sequential treatment of imipramine blue followed by doxorubicin chemotherapy survived for 200 days — more than 6 months — with no observable tumor mass. Of the animals treated with doxorubicin chemotherapy alone, 33 percent were alive after 200 days, with a median survival time of 44 days. Animals that received liposomes filled with saline or imipramine blue, but no chemotherapy, did not survive more than 19 days.
“Our results show that the increased effectiveness of the chemotherapy treatment is not because of a synergistic toxicity between imipramine blue and doxorubicin. Imipramine blue is not making the doxorubicin more toxic, it’s simply stopping the movement of the cancer cells and containing the cancer so that the chemotherapy can do a better job,” explained Bellamkonda, who is also the Carol Ann and David D. Flanagan Chair in Biomedical Engineering and a Georgia Cancer Coalition Distinguished Cancer Scholar.
MRI results showed a reduction and compaction of the tumor in animals treated with imipramine blue followed by doxorubicin chemotherapy, while animals treated with chemotherapy alone presented with abnormal tissue and glioma cells. MRI also indicated that the blood-brain barrier breach often seen during tumor growth was present in the animals treated with chemotherapy alone, but not the group treated with chemotherapy and imipramine blue.
According to the researchers, imipramine blue appears to improve the outcome of brain cancer treatment by altering the regulation of actin, a protein found in all eukaryotic cells. Actin mediates a variety of essential biological functions, including the production of reactive oxygen species. Most cancer cells exhibit overproduction of reactive oxygen species, which are thought to stimulate cancer cells to invade healthy tissue. The dye’s reorganization of the actin cytoskeleton is thought to inhibit production of enzymes that produce reactive oxygen species.
“I formulated the imipramine blue compound as a triphenylmethane dye because I knew that another triphenylmethane dye, gentian violet, exhibited anti-cancer properties, and I decided to use imipramine — a drug used to treat depression — as the starting material because I knew it could get into the brain,” said Arbiser.
For future studies, the researchers are planning to test imipramine blue’s effect on animal models with invasive brain tumors, metastatic tumors, and other types of cancer such as prostate and breast.
“While we need to conduct future studies to determine if the effect of imipramine blue is the same for different types of cancer diagnosed at different stages, this initial study shows the possibility that imipramine blue may be useful as soon as any tumor is diagnosed, before anti-cancer treatment begins, to create a more treatable tumor and enhance clinical outcome,” noted Bellamkonda.
A new study led by Georgia Tech found that a lake’s ecological characteristics influence how quickly Daphnia dentifera evolve to survive epidemics of the virulent yeast parasite Metschnikowia bicuspidata. The Daphnia dentifera individuals at the top right and bottom middle of this image are uninfected; the other four Daphnia are infected with Metschnikowia. (Credit: Georgia Tech/Meghan Duffy)
When battling an epidemic of a deadly parasite, less resistance can sometimes be better than more, a new study suggests.
A freshwater zooplankton species known as Daphnia dentifera endures periodic epidemics of a virulent yeast parasite that can infect more than 60 percent of the Daphnia population. During these epidemics, the Daphnia population evolves quickly, balancing infection resistance and reproduction.
A new study led by Georgia Institute of Technology researchers reveals that the number of vertebrate predators in the water and the amount of food available for Daphnia to eat influence the size of the epidemics and how these “water fleas” evolve during epidemics to survive.
The study shows that lakes with high nutrient concentrations and lower predation levels exhibit large epidemics and Daphnia that become more resistant to infection by the yeast Metschnikowia bicuspidata. However, in lakes with fewer resources and high predation, epidemics remain small and Daphnia evolve increased susceptibility to the parasite.
“It’s counterintuitive to think that hosts would ever evolve greater susceptibility to virulent parasites during an epidemic, but we found that ecological factors determine whether it is better for them to evolve enhanced resistance or susceptibility to infection,” said the study’s lead author Meghan Duffy, an assistant professor in the School of Biology at Georgia Tech. “There is a trade-off between resistance and reproduction because any resources an animal devotes to defense are not available for reproduction. When ecological factors favor small epidemics, it is better for hosts to invest in reproduction rather than defense.”
This study was published in the March 30, 2012 issue of the journal Science. The research was supported by the National Science Foundation and the James S. McDonnell Foundation.
In addition to Duffy, also contributing to this study were Indiana University Department of Biology associate professor Spencer Hall and graduate student David Civitello; Christopher Klausmeier, an associate professor in the Department of Plant Biology and W.K. Kellogg Biological Station at Michigan State University; and Georgia Tech research technician Jessica Housley Ochs and graduate student Rachel Penczykowski.
For the study, the researchers monitored the levels of nutritional resources, predation and parasitic infection in seven Indiana lakes on a weekly basis for a period of four months. They calculated infection prevalence visually on live hosts using established survey methods, estimated resources by measuring the levels of phosphorus and nitrogen in the water, and assessed predation by measuring the size of uninfected adult Daphnia.
The researchers also conducted infection assays in the laboratory on Daphnia collected from each of the seven lake populations at two time points: in late July before epidemics began and in mid-November as epidemics waned. The assays measured the zooplankton’s uptake of Metschnikowia bicuspidata and infectivity of the yeast once consumed.
The infection assays showed a significant evolutionary response of Daphnia to epidemics in six of the seven lake populations. The Daphnia population became significantly more resistant to infection in three lakes and significantly more susceptible to infection in three other lakes. The hosts in the seventh lake did not show a significant change in susceptibility, but trended toward increased resistance. In the six lake populations that showed a significant evolutionary response, epidemics were larger when lakes had lower predation and higher levels of total nitrogen.
“Daphnia became more susceptible to the yeast in lakes with fewer resources and higher vertebrate predation, but evolved toward increased resistance in lakes with increased resources and lower predation,” noted Duffy.
The study’s combination of observations, experiments and mathematical modeling supports the researchers’ theoretical prediction that when hosts face a resistance-reproduction tradeoff, they evolve increased resistance to infection during larger epidemics and increased susceptibility during smaller ones. Ultimately, ecological gradients, through their effects on epidemic size, influence evolutionary outcomes of hosts during epidemics.
“While the occurrence and magnitude of disease outbreaks can strongly influence host evolution, this study suggests that altering predation pressure on hosts and productivity of ecosystems may also influence this evolution,” added Duffy.
The team plans to repeat the study this summer in the same Indiana lakes to examine whether the relationships between ecological factors, epidemic size and host evolution they found in this study can be corroborated.
Chemists at the University of California, San Diego have produced the first high resolution structure of a molecule that when attached to the genetic material of the hepatitis C virus prevents it from reproducing.
Hepatitis C is a chronic infectious disease that affects some 170 million people worldwide and causes chronic liver disease and liver cancer. According to the Centers for Disease Control and Prevention, hepatitis C now kills more Americans each year than HIV.
The structure of the molecule, which was published in a paper in this week’s early online edition of the journal Proceedings of the National Academy of Sciences, provides a detailed blueprint for the design of drugs that can inhibit the replication of the hepatitis C virus, which proliferates by hijacking the cellular machinery in humans to manufacture duplicate viral particles.
Finding a way to stop that process could effectively treat viral infections of hepatitis C, for which no vaccine is currently available. But until now scientists have identified few inhibiting compounds that directly act on the virus’s ribonucleic acid (RNA) genome—the organism’s full complement of genetic material.
“This lack of detailed information on how inhibitors lock onto the viral genome target has hampered the development of better drugs,” said Thomas Hermann, an associate professor of chemistry and biochemistry at UC San Diego who headed the research team, which also included scientists from San Diego State University. The team detailed the structure of a molecule that induces the viral RNA to open up a portion of its hinge-like structure and encapsulate the inhibitor like a perfectly fit glove, blocking the ability of the hepatitis C virus to replicate.
The molecule is from a class of compounds called benzimidazoles, known to stop the production of viral proteins in infected human cells. Its three-dimensional atomic structure was determined by X-ray crystallography, a method of mapping the arrangement of atoms within a crystal, in which a beam of X-rays strikes the crystal and diffracts into many specific directions. The angles and intensities of the diffracted beams allowed the scientists to calculate the structure of the viral RNA-inhibitor complex.
The molecule prompts the hepatitis C viral RNA to open up a portion of its hinge-like structure and encapsulate the inhibitor like a perfectly fit glove. (Credit: Image courtesy of University of California – San Diego)
“This structure will guide approaches to rationally design better drug candidates and improve the known benzimidazole inhibitors,” said Hermann. “Also, the crystal structure demonstrates that the binding pocket for the inhibitors in the hepatitis C virus RNA resembles drug-binding pockets in proteins. This is important to help overcome the notion that RNA targets are so unlike traditional protein targets that drug discovery approaches with small molecule inhibitors are difficult to achieve for RNA.”
The study was supported by the National Institutes of Health and National Science Foundation.
The water flowing through Venice’s famous canals laps at buildings a little higher every year – and not only because of a rising sea level. Although previous studies had found that Venice has stabilized, new measurements indicate that the historic city continues to slowly sink, and even to tilt slightly to the east.
“Venice appears to be continuing to subside, at a rate of about 2 millimeters (0.08 inches) a year,” said Yehuda Bock, a research geodesist with Scripps Institution of Oceanography at UC San Diego, and the lead author of the new research paper on the city’s downward drift. “It’s a small effect, but it’s important,” he added. Given that sea level is rising in the Venetian lagoon, also at 2 millimeters per year, the slight subsidence doubles the rate at which the heights of surrounding waters are increasing relative to the elevation of the city, he noted. In the next 20 years, if Venice and its immediate surroundings subsided steadily at the current rate, researchers would expect the land to sink up to 80 millimeters (3.2 inches) in that period of time, relative to the sea.
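The doubling Bock describes is simple additive arithmetic: the rate of relative sea-level rise is the land's subsidence rate plus the absolute sea-level rise. A minimal sketch using the rates quoted above:

```python
# Relative sea-level rise in Venice: land subsidence adds to sea-level rise.
subsidence_mm_per_yr = 2.0   # city sinking
sea_rise_mm_per_yr = 2.0     # sea-level rise in the lagoon

relative_rate = subsidence_mm_per_yr + sea_rise_mm_per_yr   # 4 mm/yr
years = 20
relative_change = relative_rate * years                     # 80 mm over 20 years

print(f"{relative_rate:.0f} mm/yr relative rise; "
      f"{relative_change:.0f} mm over {years} years")
```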
Bock worked with colleagues from the University of Miami in Florida and Italy’s Tele-Rilevamento Europa, a company that measures ground deformation, to analyze data collected by GPS and space-borne radar (InSAR) instruments regarding Venice and its lagoon. The GPS measurements provide absolute elevations, while the InSAR data are used to calculate elevations relative to other points. By combining the two datasets from the decade between 2000 and 2010, Bock and his colleagues found that the city of Venice was subsiding at an average of 1 to 2 millimeters a year (0.04 to 0.08 inches per year). The patches of land in Venice’s lagoon (117 islands in all) are also sinking, they found, with northern sections of the lagoon dropping at a rate of 2 to 3 millimeters (0.08 to 0.12 inches) per year, and the southern lagoon subsiding at 3 to 4 millimeters (0.12 to 0.16 inches) per year.
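One plausible way to combine the two datasets (a simplified sketch, not the authors' actual algorithm; the site names and rates are invented for illustration) is to compute, at a benchmark measured by both systems, the offset between the absolute GPS rate and the relative InSAR rate, then apply that offset to every InSAR point:

```python
# Hypothetical illustration: tying relative InSAR rates to an absolute
# GPS reference frame. All names and numbers below are invented.

gps_abs_rate_at_ref = -2.0    # mm/yr, absolute (GPS) rate at a shared benchmark
insar_rel_rates = {           # mm/yr, relative to an arbitrary InSAR reference
    "benchmark": 0.0,
    "north_lagoon": -0.6,
    "south_lagoon": -1.7,
}

# Offset that maps the InSAR reference frame onto the GPS absolute frame.
offset = gps_abs_rate_at_ref - insar_rel_rates["benchmark"]

abs_rates = {site: rate + offset for site, rate in insar_rel_rates.items()}
for site, rate in abs_rates.items():
    print(f"{site}: {rate:.1f} mm/yr (absolute)")
```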
The findings will be published March 28 in Geochemistry, Geophysics, Geosystems, a journal of the American Geophysical Union.
“Our combined GPS and InSAR analysis clearly captured the movements over the last decade that neither GPS nor InSAR could sense alone” said Shimon Wdowinski, associate research professor of Marine Geology and Geophysics at the University of Miami.
In the new study, using the GPS instruments, Bock and his colleagues were able to take absolute readings of the city and its surrounding lagoons. And not only did they find the sinking, but they found that the area was tilting a bit, about a millimeter or two eastward per year. That means the western part – where the city of Venice is – is higher than the eastern sections. Prior satellite analyses didn’t pick up on the tilt, Bock said, possibly because the scientists had been taking measurements using InSAR, which only provided the change in elevation relative to other sites.
The relative nature of InSAR measurements might also explain why the new study detected Venice’s subsidence, while other recent studies did not, Bock conjectured.
Venice’s subsidence was recognized as a major issue decades ago, he noted, when scientists realized that pumping groundwater from beneath the city, combined with the ground’s compaction from centuries of building, was causing the city to settle. But officials put a stop to the groundwater pumping, and subsequent studies in the 2000s indicated that the subsidence had stopped, he said.
“It’s possible that it was stable in that decade, and started subsiding since then, but this is unlikely,” Bock said. The current subsidence is due to natural causes, which probably have been affecting the area for a long time.
A significant part of those natural causes is plate tectonics. The Adriatic plate, which includes Venice, subducts beneath the Apennines Mountains and causes the city and its environs to drop slightly in elevation. And although the groundwater pumping has stopped, the compaction of the sediments beneath Venice remains a factor.
The frequency of floods in Venice is increasing, Bock said, and now about four or five times a year residents have to walk on wooden planks to stay above the floodwaters in large parts of the city. A multi-billion-dollar effort to install flood-protection walls that can be raised to block incoming tides is nearing completion, he said. The adjustable barriers were designed to protect the city from tides that are coming in higher as overall sea levels are rising in response to climate change.
To ensure that the gates can hold back sufficient tidal water in the long run, their builders “have to take into account not only the rising of sea level, but also the subsidence,” Wdowinski said. The land on which those gates are being built is descending and taking the barriers down with it relative to the rising seas, he said, effectively doubling the amount of elevation change in store for Venice.
The patchy land in the surrounding lagoon could need shoring up as well. Over the next 40 years, the natural barriers that protect the Venice lagoon and city could drop by 150-200 millimeters, Wdowinski said, so officials may need to reinforce those sinking sediments as well.
UT Southwestern Medical Center cardiologists have uncovered how a specific protein’s previously unsuspected role contributes to the deterioration of heart muscle in patients with diabetes. Investigators in the mouse study also have found a way to reverse the damage caused by this protein.
The new research, available online and published in the March 1 issue of the Journal of Clinical Investigation, was carried out in the laboratory of Dr. Joseph Hill, director of the Harry S. Moss Heart Center at UT Southwestern.
“If we can protect the heart of diabetic patients, it would be a significant breakthrough,” said Dr. Hill, the study’s senior author who also serves as chief of cardiology at the medical center. “These are fundamental research findings that can be applied to a patient’s bedside.”
Cardiovascular disease is the leading cause of illness and death in patients with diabetes, which affects more than 180 million people around the world, according to the American Heart Association. Diabetes puts additional stress on the heart – above and beyond that provoked by risk factors such as high blood pressure or coronary artery disease, Dr. Hill said.
“Elevated glucose and the insulin-resistant diabetic state are both toxic to the heart,” he said.
Dr. Hill and his colleagues in this study were able to maintain heart function in mice exposed to a high fat diet by inactivating a protein called FoxO1. Previous investigations from Dr. Hill’s laboratory demonstrated that FoxO proteins, a class of proteins that govern gene expression and regulate cell size, viability and metabolism, are tightly linked to the development of heart disease in mice with type 2 diabetes.
“If you eliminate FoxO1, the heart is protected from the stress of diabetes and continues to function normally,” Dr. Hill said. “If we can prevent FoxO1 from being overactive, then there is a chance that we can protect the hearts of patients with diabetes.”
Other UT Southwestern investigators participating in the study were lead author Dr. Pavan Battiprolu, and Drs. Zhao Wang and Myriam Iglewski, all postdoctoral researchers in internal medicine; Dr. Berdymammet Hojayev, postdoctoral researcher in pathology; Nan Jiang and John Shelton, senior research scientists in internal medicine; Dr. Xiang Luo, instructor in internal medicine; Dr. Robert Gerard, associate professor of internal medicine and molecular biology; Dr. Beverly Rothermel, assistant professor of internal medicine and molecular biology; Dr. Thomas Gillette, assistant professor of internal medicine; and Dr. Sergio Lavandero, visiting professor of internal medicine.
The research was supported by grants from the National Institutes of Health, the American Heart Association, the American Diabetes Association and the Jon Holden DeHaan Foundation.
UT Southwestern Medical Center investigators have identified a genetic manipulation that increases the development of neurons in the brain during aging and enhances the effect of antidepressant drugs.
The research finds that deleting the Nf1 gene in mice results in long-lasting improvements in neurogenesis, which in turn makes those in the test group more sensitive to the effects of antidepressants.
“The significant implication of this work is that enhancing neurogenesis sensitizes mice to antidepressants – meaning they needed lower doses of the drugs to affect ‘mood’ – and also appears to have anti-depressive and anti-anxiety effects of its own that continue over time,” said Dr. Luis Parada, director of the Kent Waldrep Center for Basic Research on Nerve Growth and Regeneration and senior author of the study published in the Journal of Neuroscience.
Just as in people, mice produce new neurons throughout adulthood, although the rate declines with age and stress, said Dr. Parada, chairman of developmental biology at UT Southwestern. Studies have shown that learning, exercise, electroconvulsive therapy and some antidepressants can increase neurogenesis. The steps in the process are well known but the cellular mechanisms behind those steps are not.
“In neurogenesis, stem cells in the brain’s hippocampus give rise to neuronal precursor cells that eventually become young neurons, which continue on to become full-fledged neurons that integrate into the brain’s synapses,” said Dr. Parada, an elected member of the prestigious National Academy of Sciences, its Institute of Medicine, and the American Academy of Arts and Sciences.
The researchers used a sophisticated process to delete the gene that codes for the Nf1 protein only in the brains of mice, while production in other tissues continued normally. After showing that mice lacking Nf1 protein in the brain had greater neurogenesis than controls, the researchers administered behavioral tests designed to mimic situations that would spark a subdued mood or anxiety, such as observing grooming behavior in response to a small splash of sugar water.
The researchers found that the test group mice formed more neurons over time compared to controls, and that young mice lacking the Nf1 protein required much lower amounts of antidepressants to counteract the effects of stress. Behavioral differences between the groups persisted at three months, six months and nine months. “Older mice lacking the protein responded as if they had been taking antidepressants all their lives,” said Dr. Parada.
“In summary, this work suggests that activating neural precursor cells could directly improve depression- and anxiety-like behaviors, and it provides a proof-of-principle regarding the feasibility of regulating behavior via direct manipulation of adult neurogenesis,” Dr. Parada said.
Dr. Parada’s laboratory has published a series of studies that link the Nf1 gene – best known for mutations that cause tumors to grow around nerves – to wide-ranging effects in several major tissues. For instance, in one study researchers identified ways that the body’s immune system promotes the growth of tumors, and in another study, they described how loss of the Nf1 protein in the circulatory system leads to hypertension and congenital heart disease.
The current study’s lead author is former graduate student Dr. Yun Li, now a postdoctoral researcher at the Massachusetts Institute of Technology. Other co-authors are Yanjiao Li, a research associate in developmental biology, and Dr. Renée McKay, an assistant professor of developmental biology, both of UT Southwestern, and Dr. Dieter Riethmacher of the University of Southampton in the United Kingdom.
The study was supported by the National Institutes of Health’s National Institute of Neurological Disorders and Stroke and National Institute of Mental Health. Dr. Parada is an American Cancer Society Research Professor.
Astronomers at The Ohio State University have calculated the odds that, sometime during the next 50 years, a supernova occurring in our home galaxy will be visible from Earth.
The good news: the odds are nearly 100 percent that such a supernova would be visible to telescopes in the form of infrared radiation.
The bad news: the odds are much lower—dipping to 20 percent or less—that the shining stellar spectacle would be visible to the naked eye in the nighttime sky.
Yet, all this is great news to astronomers, who, unlike the rest of us, have high-powered infrared cameras to point at the sky at a moment’s notice. For them, this study suggests that they have a solid chance of doing something that’s never been done before: detect a supernova fast enough to witness what happens at the very beginning of a star’s demise. A massive star “goes supernova” at the moment when it’s used up all its nuclear fuel and its core collapses, just before it explodes violently and throws off most of its mass into space.
“We see all these stars go supernova in other galaxies, and we don’t fully understand how it happens. We think we know, we say we know, but that’s not actually 100 percent true,” said Christopher Kochanek, professor of astronomy at Ohio State and the Ohio Eminent Scholar in Observational Cosmology. “Today, technologies have advanced to the point that we can learn enormously more about supernovae if we can catch the next one in our galaxy and study it with all our available tools.”
The results will appear in an upcoming issue of The Astrophysical Journal.
First through calculations and then through computer models, generations of astronomers have worked out the physics of supernovae based on all available data, and today’s best models appear to match what they see in the skies. But actually witnessing a supernova—that is, for instance, actually measuring the changes in infrared radiation from start to finish while one was happening—could prove or disprove those ideas.
Kochanek explained how technology is making the study of Milky Way supernovae possible. Astronomers now have sensitive detectors for neutrinos (particles emitted from the core of a collapsing star) and for gravitational waves (created by the vibrations of the star’s core), either of which could reveal any supernova occurring in our galaxy. The question is whether we can actually see light from the supernova, because we live in a galaxy filled with dust: soot particles, which Kochanek likened to those in diesel truck exhaust, that absorb the light and might hide a supernova from our view.
“Every few days, we have the chance to observe supernovae happening outside of our galaxy,” said doctoral student Scott Adams. “But there’s only so much you can learn from those, whereas a galactic supernova would show us so much more. Our neutrino detectors and gravitational wave detectors are only sensitive enough to take measurements inside our galaxy, where we believe that a supernova happens only once or twice a century.”
Adams continued: “Despite the ease with which astronomers find supernovae occurring outside our galaxy, it wasn’t obvious before that it would be possible to get complete observations of a supernova occurring within our galaxy. Soot dims the optical light from stars near the center of the galaxy by a factor of nearly a trillion by the time it gets to us. Fortunately, infrared light is not affected by this soot as much and is only dimmed by a factor of 20.”
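For intuition, the dimming factors quoted above can be expressed on the astronomer’s magnitude scale, where extinction in magnitudes is 2.5 times the base-10 logarithm of the flux ratio. A quick sketch using only the round numbers from the article:

```python
import math

def dimming_to_magnitudes(factor):
    """Convert a flux-dimming factor into magnitudes of extinction:
    A = 2.5 * log10(factor)."""
    return 2.5 * math.log10(factor)

# Figures quoted for a star near the galactic center:
optical = dimming_to_magnitudes(1e12)  # optical light dimmed ~a trillion-fold
infrared = dimming_to_magnitudes(20)   # infrared dimmed only ~20-fold

print(f"optical extinction:  {optical:.1f} mag")   # 30.0 mag
print(f"infrared extinction: {infrared:.1f} mag")  # 3.3 mag
```

The roughly 30-magnitude optical extinction versus about 3 magnitudes in the infrared is why an infrared detection is nearly certain while a naked-eye sighting is not.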
By balancing all these factors, the astronomers determined that they have nearly a 100 percent chance of catching a prized Milky Way supernova during the next 50 years. Adams summarized the findings in an online video.
The astronomers’ plan takes advantage of the fact that supernovae issue neutrinos immediately after the explosion starts, but don’t brighten in infrared or visible light until minutes, hours, or even days later.
So, in the ideal scenario, neutrino detectors such as Super-Kamiokande (Super-K) in Japan would sound the alert the moment they detect neutrinos, and indicate the direction the particles were coming from. Then infrared detectors could target the location almost immediately, thus catching the supernova before the brightening begins. Gravitational wave observatories would do the same.
But given that not all neutrinos come from supernovae—some come from nuclear reactors, Earth’s atmosphere or the sun—how could a detector know the difference? A supernova would cause short bursts of neutrinos to be detected within a few seconds of each other. But rare glitches in the electronics can do the same thing, explained John Beacom, professor of physics and astronomy and director of the Center for Cosmology and Astro-Particle Physics at Ohio State.
“We need some way to tell immediately that a burst is due to a supernova,” Beacom said.
He and colleague Mark Vagins, an American neutrino expert working at Super-K, pointed out a decade ago how this could be done. Now Vagins and others have built a scale model of a special kind of neutrino detector in a new underground cave in Japan.
As coauthors on the Astrophysical Journal paper, Vagins and Beacom described the new detector, which they call EGADS for “Evaluating Gadolinium’s Action on Detector Systems.” At 200 tons, EGADS is much smaller than the 50,000-ton Super-K, but both consist of a tank of ultra-pure water.
In the case of EGADS, the water is spiked with a tiny amount of the element gadolinium, which helps register supernova neutrinos in a special way. When a neutrino from a Milky Way supernova enters the tank, it can collide with the water molecules and release energy, along with some neutrons. Gadolinium has a great affinity for neutrons, and will absorb them and then re-emit energy of its own. The result would be one detection signal followed by another a tiny fraction of a second later—a “heartbeat” signal inside the tank for each detected neutrino.
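The “heartbeat” described above amounts to a coincidence search: flag any prompt signal that is followed by a second, delayed signal within a short capture window. A minimal sketch of that pairing logic, where the window length, event format, and algorithm are illustrative assumptions rather than Super-K’s actual trigger:

```python
def find_heartbeats(hit_times, window=200e-6):
    """Pair each prompt signal with a delayed neutron-capture signal
    arriving within `window` seconds (the window value is illustrative)."""
    hits = sorted(hit_times)
    pairs = []
    i = 0
    while i < len(hits) - 1:
        if hits[i + 1] - hits[i] <= window:
            pairs.append((hits[i], hits[i + 1]))
            i += 2  # both hits belong to one heartbeat
        else:
            i += 1  # a lone hit: likely background, not a supernova neutrino
    return pairs

# Three close pairs plus one isolated background hit:
times = [0.0, 0.0001, 0.5, 0.50008, 1.0, 2.0, 2.00015]
print(len(find_heartbeats(times)))  # 3
```

A lone background hit produces no pair, which is the property that makes the two-part signature distinctive.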
Vagins and Beacom hope that EGADS’ unmistakable heartbeat signal will enable neutrino detector teams to make more timely and confident announcements about supernova neutrino detections.
Vagins said that the experiment is going well so far, and he and the rest of the Super-K scientists may decide to add gadolinium to the tank as early as 2016. Because of its larger size, Super-K would also be able to measure the direction of the neutrinos. So the possibility of using Super-K to pinpoint a Milky Way supernova is on the rise.
For those of us who would hope to see a Milky Way supernova with our own eyes, however, the chances are lower and depend on our latitude on Earth. The last time it happened was in 1604, when Johannes Kepler spotted one some 20,000 light years away in the constellation Ophiuchus. He was in northern Italy at the time.
Could such a sighting happen again in the next half-century?
Adams did the math: the probability of a galactic supernova being visible with the unaided eye from somewhere on Earth within the next 50 years is approximately 20-50 percent, with people in the southern hemisphere getting the best of those odds, since they can see more of our galaxy in the night sky. The odds worsen as you go north; in Columbus, Ohio, for example, the chance could dip as low as 10 percent.
And Adams placed the odds that Ohioans would spy a truly dazzling supernova—like the one in 1604 that outshone all other stars in the sky—at only around 5 percent.
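The general shape of such an estimate can be sketched with a Poisson model: the chance of at least one galactic supernova in a given span, scaled by the fraction that would be naked-eye visible from a particular location. The rate and visibility fraction below are illustrative placeholders, not the paper’s actual inputs:

```python
import math

def p_naked_eye(rate_per_century, years, frac_visible):
    """P(at least one visible supernova) under a Poisson model:
    1 - exp(-expected count), scaled by the visible fraction."""
    expected = rate_per_century * years / 100.0
    return (1.0 - math.exp(-expected)) * frac_visible

# e.g. two core-collapse supernovae per century, a 50-year watch, and
# (hypothetically) half of them bright and well-placed enough to see:
print(round(p_naked_eye(2, 50, 0.5), 2))  # 0.32
```

Raising or lowering the visible fraction with latitude is what drives the spread from roughly 50 percent in the southern hemisphere down to 10 percent in Columbus.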
“The odds of seeing a spectacular display aren’t in our favor, but it is still an exciting possibility!” he said.
“With only one or two happening a century, the chance of a Milky Way supernova is small, but it would be a tragedy to miss it, and this work is designed to improve the chances of being ready for the scientific event of a lifetime,” Beacom concluded.
Schematic: HIV TAT (blue) permeates the membrane and interacts with the cytoskeleton (green). (Credit: Image courtesy of University of California – Los Angeles)
Cell-penetrating peptides, or CPPs, can facilitate the cellular transfer of various molecular cargoes, from small chemical molecules to nano-sized particles and large fragments of DNA. Because of this ability, CPPs hold great potential as in vitro and in vivo delivery vehicles for use in research and for the targeted delivery of therapeutics to individual cells.
But exactly how cell-penetrating peptides — and particularly the HIV TAT peptide — accomplish these tasks has so far been a mystery.
“The HIV TAT peptide is special. People discovered that one can attach almost anything to this peptide and it could drag it across the cell,” said Gerard Wong, a professor of bioengineering and of chemistry and biochemistry at the UCLA Henry Samueli School of Engineering and Applied Science and the California NanoSystems Institute at UCLA. “So there are obvious beneficial drug-delivery and biotechnology applications.”
In a new study published in Proceedings of the National Academy of Sciences, UCLA Engineering researchers, including Wong and bioengineering professors Timothy Deming and Daniel Kamei, identify how HIV TAT peptides can interact with the cell membrane, the actin cytoskeleton and specific cell-surface receptors to produce multiple pathways of translocation under different conditions.
Moreover, because the researchers now understand how cell-penetrating peptides work, they say it is possible to formulate a general recipe for reprograming normal peptides into CPPs.
“Prior to this, people didn’t really know how it all worked, but we found that the HIV TAT peptide is really kind of like a Swiss Army Knife molecule, in that it can interact very strongly with membranes, as well as with the cytoskeletons of cells,” said Wong, the study’s lead author. “The second part wasn’t well appreciated by the field.”
In addition to this membrane activity, the researchers discovered that the HIV TAT peptide also creates its own binding site out of the membrane. This means the peptide can actually pass through the membrane and directly induce the cytoskeleton to drive an endocytotic event.
“We found that there are two channels of activity,” Wong said. “Because of the peculiar sequence of HIV TAT, it’s very good at being able to interact with membranes. Further, with the high-density packing of charged amino acids in the peptide, it can also interact very strongly with the cell’s cytoskeleton, as well as its receptors.”
In addition, the researchers noticed that small cargoes can be transferred directly, while cargoes larger than a few nanometers need to be anchored to the membrane by the TAT peptide.
Deming, who specializes in synthetic methods, prepared the polypeptide samples for use in the experiments. Kamei, an expert in cellular trafficking, performed cell-based endocytosis experiments using inhibitor drugs and confocal microscopy to identify dominant mechanisms of endocytosis.
“This research is exciting because cell-penetrating peptides have been used in the area of drug delivery for some time,” Kamei said. “Gaining any additional understanding of these delivery agents will help in future drug-carrier designs.”
It is the group’s hope that the new understanding gained from their study will be used to engineer new molecules that are more effective in delivering therapeutic agents.
“This collaboration was important because it combined expertise in the areas of synthesis, characterization and cellular trafficking to address a very relevant problem,” Kamei said. “I definitely see more opportunity for combining these areas to tackle other problems in the growing field of biomaterials.”
The study was funded by the National Science Foundation and the National Institutes of Health.