
Breakthrough technique images breast tumors in 3-D with great clarity, reduced radiation

 

Breast tumor in 3-D: the red area represents the tumor. (Credit: ESRF-LMU/Emmanuel Brun)

 

 

Like cleaning the lenses of a foggy pair of glasses, scientists are now able to use a technique developed by UCLA researchers and their European colleagues to produce three-dimensional images of breast tissue that are two to three times sharper than those made using current CT scanners at hospitals. The technique also uses a lower dose of X-ray radiation than a mammogram.

 

These higher-quality images could allow breast tumors to be detected earlier and with much greater accuracy. One in eight women in the United States will be diagnosed with breast cancer during her lifetime.

 

The research is published the week of Oct. 22 in the early edition of the journal Proceedings of the National Academy of Sciences.

 

The most common breast cancer screening method used today is called dual-view digital mammography, but it isn’t always successful in identifying tumors, said Jianwei (John) Miao, a UCLA professor of physics and astronomy and researcher with the California NanoSystems Institute at UCLA.

 

“While commonly used, the limitation is that it provides only two images of the breast tissue, which can explain why 10 to 20 percent of breast tumors are not detectable on mammograms,” Miao said. “A three-dimensional view of the breast can be generated by a CT scan, but this is not frequently used clinically, as it requires a larger dose of radiation than a mammogram. It is very important to keep the dose low to prevent damage to this sensitive tissue during screening.”

 

Recognizing these limitations, the scientists went in a new direction. In collaboration with the European Synchrotron Radiation Facility in France and Germany’s Ludwig Maximilians University, Miao’s international colleagues used a special detection method known as phase contrast tomography to X-ray a human breast from multiple angles.

 

They then applied equally sloped tomography, or EST — a breakthrough computing algorithm developed by Miao’s UCLA team that enables high-quality image reconstruction — to 512 of these images to produce 3-D images of the breast at a higher resolution than ever before. The process required less radiation than a mammogram.

 

 

In a blind evaluation, five independent radiologists from Ludwig Maximilians University ranked these images as having a higher sharpness, contrast and overall image quality than 3-D images of breast tissue created using other standard methods.

 

“Even small details of the breast tumor can be seen using this technique,” said Maximilian Reiser, director of the radiology department at Ludwig Maximilians University, who contributed his medical expertise to the research.

 

The technology commonly used today for mammograms or for imaging a patient’s bones measures the difference in an X-ray’s intensity before and after it passes through the body. The phase contrast X-ray tomography used in this study instead measures how an X-ray’s oscillation changes as it passes through tissue, which differs between normal tissue and slightly denser tissue such as a tumor or bone. While a very small breast tumor might not absorb many X-rays, the way it changes the oscillation of an X-ray can be quite large, Miao said. Phase contrast tomography captures this difference in oscillation, and each image made using this technique contributes to the overall 3-D picture.
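For readers who want the underlying physics (our gloss, not part of the UCLA release): X-ray phase contrast is conventionally described through the complex refractive index of tissue,

n = 1 - \delta + i\beta, \qquad \varphi = \frac{2\pi}{\lambda} \int \delta(z)\, dz, \qquad I = I_0\, e^{-\frac{4\pi}{\lambda} \int \beta(z)\, dz}

where \varphi is the accumulated phase shift (the change in the X-ray’s oscillation) and the exponential term is conventional absorption. For soft tissue at hard X-ray energies, \delta is roughly a thousand times larger than \beta, which is why a tumor that barely changes an X-ray’s intensity can still change its phase measurably.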

 

The computational algorithm EST developed by Miao’s UCLA team is a primary driver of this advance. Three-dimensional reconstructions, like the ones created in this research, are produced using sophisticated software and a powerful computer to combine many images into one 3-D image, much like various slices of an orange can be combined to form the whole. By rethinking the mathematical equations of the software in use today, Miao’s group developed a more powerful algorithm that requires fewer “slices” to get a clearer overall 3-D picture.
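EST itself reconstructs images with a pseudopolar fast Fourier transform and iterative constraints, and the release includes no code. Purely as a loose illustration of the general idea of tomographic reconstruction — combining projections taken at many angles into one image — here is a minimal unfiltered back-projection sketch in Python (the function name and sinogram format are our own, not the authors’ method):

import numpy as np
from scipy.ndimage import rotate

def back_project(sinogram, angles_deg):
    """Crude unfiltered back-projection.

    sinogram: (n_angles, n_pixels) array, one 1-D projection per row.
    Smears each projection back across the plane at its angle and sums.
    """
    n = sinogram.shape[1]
    recon = np.zeros((n, n))
    for proj, angle in zip(sinogram, angles_deg):
        smear = np.tile(proj, (n, 1))              # replicate the 1-D profile
        recon += rotate(smear, angle, reshape=False, order=1)
    return recon / len(angles_deg)

# The fewer the angles, the blurrier this naive reconstruction gets;
# algorithms such as EST aim to preserve image quality with fewer
# projections, which is what allows a lower radiation dose.

In the study, 512 projections were combined; EST’s advantage is producing a clearer 3-D picture from fewer such “slices.”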

 

“The technology used in mammogram screenings has been around for more than 100 years,” said Paola Coan, a professor of X-ray imaging at Ludwig Maximilians University. “We want to see the difference between healthy tissue and the cancer using X-rays, and that difference can be very difficult to see, particularly in the breast, using standard techniques. The idea we used here was to combine phase contrast tomography with EST, and this combination is what gave us much higher quality 3-D images than ever before.”

 

While this new technology is like a key in a lock, the door will only swing open — bringing high-resolution 3-D imaging from the synchrotron facility to the clinic — with further technological advances, said Alberto Bravin, managing physicist of the biomedical research laboratory at the European Synchrotron Radiation Facility. He added that the technology is still in the research phase and will not be available to patients for some time.

 

“A high-quality X-ray source is an absolute requirement for this technique,” Bravin said. “While we can demonstrate the power of our technology, the X-ray source must come from a small enough device for it to become commonly used for breast cancer screening. Many research groups are actively working to develop this smaller X-ray source. Once this hurdle is cleared, our research is poised to make a big impact on society.”

 

These results represent the collaborative efforts of senior authors Miao, Bravin and Coan. Significant contributions were provided by co-first authors Yunzhe Zhao, a recent UCLA doctoral graduate in Miao’s laboratory, and Emmanuel Brun, a scientist working with Bravin and Coan. Other co-authors included Zhifeng Huang of UCLA and Aniko Sztrókay, Paul Claude Diemoz, Susanne Liebhardt, Alberto Mittone and Sergei Gasilov of Ludwig Maximilians University.

 

The research was funded by UC Discovery/Tomosoft Technologies; the National Institute of General Medical Sciences, a division of the National Institutes of Health; and the Deutsche Forschungsgemeinschaft–Cluster of Excellence Munich–Centre for Advanced Photonics.

 

 

 

Source: University of California, Los Angeles

 

Published on 24th October 2012

 

 

 


 

 

Researchers Recover Recorder Containing Critical Acidification Baseline From Antarctic Waters

 

A research team supported by the National Science Foundation (NSF) has retrieved data from a sensor in Antarctic waters that will provide a critical baseline on the changing chemistry, or acidification, of those remote seas.

 


 

The all-female team, led by Gretchen Hofmann, a professor of ecology, evolution and marine biology at the University of California, Santa Barbara (UCSB), retrieved the sensor intact earlier this month after the harsh polar winter near McMurdo Station, NSF’s logistics hub in Antarctica.

 

Hofmann’s team includes Pauline Yu, an NSF-funded postdoctoral research fellow at UCSB; Amanda Kelley, a graduate student at Portland State University; as well as Lydia Kapsenberg, a graduate student, and Olivia Turnross, an undergraduate, both at UCSB.

 

Deployed by divers under the sea ice and left in place at the end of the 2011-2012 Antarctic research season, the sensor gathered data through the month of June, which is the height of winter in the Southern Hemisphere. Data gathering ended when the instrument’s battery failed in the frigid waters.

 

The successful recovery will provide the first data of its kind about the relative acidity–expressed as pH–of the waters in McMurdo Sound.

 

While ship-borne sensors in Antarctic waters have made some pH measurements, data logged by the instrument is the first continuous record of pH in a coastal region under sea ice in the winter.

 

When combined with data collected in the summer, it will provide a fuller picture of seasonal variations in pH.

 

Global acidification of the oceans is a concern to scientists, as increasing amounts of atmospheric carbon find their way into the seas, changing the water chemistry. Estimates are that 30-40 percent of carbon dioxide released into the atmosphere dissolves into the world’s oceans, rivers and lakes, changing the chemical balance of the water.
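The chemistry behind that change (standard textbook equilibria, not spelled out in the release): dissolved carbon dioxide forms carbonic acid, which releases hydrogen ions and lowers pH,

\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}, \qquad \mathrm{pH} = -\log_{10}[\mathrm{H^+}]

so more atmospheric CO2 entering the sea means more hydrogen ions and a lower (more acidic) pH reading on instruments like the one recovered.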

This change in ocean chemistry will have effects on marine life, making it difficult for some creatures to make protective shells and to reproduce and grow, for example. Impacts on individual species may then cascade to alter food chains and change how species interact with one another, potentially altering entire ecosystems.

 

Gathering data on global trends in ocean acidification is an NSF research priority. The agency’s Ocean Acidification program recently awarded new grants worth $12 million to a cohort of 16 institutions across the nation to address concerns for acidifying marine ecosystems.

 

The program is part of NSF’s Science, Engineering and Education for Sustainability portfolio. NSF’s Directorates for Geosciences and Biological Sciences, and Office of Polar Programs support the awards, the second round in this program.

 

While Hofmann’s Antarctic work was funded by NSF’s Office of Polar Programs independently of the agency’s Ocean Acidification program, numerous individual research projects led the scientific community to recognize the need to measure the scope and effect of the acidification phenomenon globally, and provided the impetus for the agency’s broader support for acidification research.

 

Acquiring the data on Antarctic pH is crucial to understanding the current state of the ecosystem in order to place future measurements of pH in the region’s oceans in context, Hofmann noted.

 

Having a pH baseline will provide an important benchmark for scientists to begin to test whether certain species have the physiological and genetic characteristics to adapt to projected change.

 

Globally, “one of our central research challenges is to forecast whether species will be able to adapt to a rapidly changing environment,” Hofmann said. “It is critical to obtain current measurements of pH to help understand the environment that organisms will face in the future.”

 

The waters surrounding Antarctica are biologically prolific and form a complex ecosystem. Species previously unknown to science continue to be found there, and some species have adapted to the extremely cold conditions and survive nowhere else. But only relatively recently, since the late 1950s, have scientists explored the Southern Ocean in any systematic way.

Making pH measurements in the Antarctic has been difficult for a number of reasons, including the lack of a durable sensor that is able to withstand the stresses put on equipment by Southern Ocean sea ice cover.

 

The team was successful in gathering the data during the winter by deploying an ocean pH sensor called a SeaFET, which was developed through an NSF award from the Division of Ocean Sciences to Todd Martz, of the Scripps Institution of Oceanography at the University of California, San Diego. SeaFET development also was supported by funding from the David and Lucile Packard Foundation to Kenneth Johnson, of the Monterey Bay Aquarium Research Institute.

 

“Returning the first pH time series from such a remote and harsh environment is a true victory for all of the scientists involved; it represents a great example of technology developed through one NSF staff area, [the Directorate for Geosciences], enabling the research of another staff area, [the Office of Polar Programs],” said Martz.

 

Source: National Science Foundation

 

Published on 20th October 2012

 

Starvation hormone markedly extends mouse life span, researchers report


Drs. David Mangelsdorf, Yuan Zhang, and Steven Kliewer (l-r) have demonstrated that a starvation hormone markedly extends life span in mice without the need for calorie restriction. (Credit: image courtesy of UT Southwestern Medical Center)
A study by UT Southwestern Medical Center researchers finds that a starvation hormone markedly extends life span in mice without the need for calorie restriction.

“Restricting food intake has been shown to extend lifespan in several different kinds of animals. In our study, we found transgenic mice that produced more of the hormone fibroblast growth factor-21 (FGF21) got the benefits of dieting without having to limit their food intake. Male mice that overproduced the hormone had about a 30 percent increase in average life span and female mice had about a 40 percent increase in average life span,” said senior author Dr. Steven Kliewer, professor of molecular biology and pharmacology.

The study, published online in eLife – a new peer-reviewed, open-access journal – defined average life span as the point at which half the members of a given test group remained alive. A study to determine differences in maximum life span is ongoing: while none of the untreated mice lived longer than about 3 years, some of the female mice that overproduced FGF21 were still alive at nearly 4 years, the researchers report.
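That definition of average life span is the median survival time. As a trivial sketch of the computation (hypothetical numbers, not the study’s data):

import numpy as np

# Hypothetical lifespans in months for one test group (illustrative only)
lifespans = np.array([26, 30, 33, 35, 38, 41, 44, 47])
print(np.median(lifespans))  # 36.5 -> age at which half the group remains alive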

FGF21 seems to provide its health benefits by increasing insulin sensitivity and blocking the growth hormone/insulin-like growth factor-1 signaling pathway.  When too abundant, growth hormone can contribute to insulin resistance, cancer, and other diseases, the researchers said.

FGF21 is a hormone secreted by the liver during fasting that helps the body adapt to starvation. It is one of three growth factors that are considered atypical because they behave like hormones, which are substances created by one part of the body that have effects in other parts, the researchers said.

“Prolonged overproduction of the hormone FGF21 causes mice to live extraordinarily long lives without requiring a decrease in food intake. It mimics the health benefits of dieting without having to diet,” said co-author Dr. David Mangelsdorf, chairman of pharmacology and a Howard Hughes Medical Institute (HHMI) investigator at UT Southwestern.

“Aging and aging-related diseases represent an increasing burden on modern society. Drugs that slow the aging process would be very desirable. These findings raise the possibility of a hormone therapy to extend life span,” said Dr. Mangelsdorf, who runs a research laboratory with Dr. Kliewer. They first identified FGF21’s starvation-fighting effects in a 2007 study.

Lead author Dr. Yuan Zhang, an instructor of pharmacology, said the study was considered risky because all involved understood it would be at least two years – an average mouse life span – before there would be any evidence of whether elevated production of FGF21 would affect longevity.

Previous research has found that FGF21 can reduce weight in obese mice. The mice that overproduced FGF21 in this latest study were lean throughout their lives and remained lean even while eating slightly more than the wild-type mice, the researchers said.

The hormone does have some downsides: FGF21 overproducers tended to be smaller than wild-type mice and the female mice were infertile. While FGF21 overproducers had significantly lower bone density than wild-type mice, the FGF21-abundant mice exhibited no ill effects from the reduced bone density and remained active into old age without any broken bones, the researchers said.

“FGF21 is not affecting their mobility. These guys are spry. They live nice, long lives,” Dr. Kliewer said. “But the decreased bone density and female infertility will require additional research to determine if it is possible to separate out the hormone’s life span-extending effects from its effect on bone,” he added.

The study was supported by the National Institutes of Health, the Robert A. Welch Foundation, the Leona M. and Harry B. Helmsley Charitable Trust, and the HHMI.

UT Southwestern co-authors are Dr. Yang Xie, assistant professor of clinical sciences; Dr. Eric Berglund, postdoctoral researcher in the Division of Hypothalamic Research; Dr. Katie Colbert Coate, postdoctoral researcher in pharmacology; Dr. Tian Teng He, senior research associate in the Advanced Imaging Research Center; Dr. Takeshi Katafuchi, instructor of pharmacology; Dr. Guanghua Xiao, assistant professor of clinical sciences; Drs. Matthew Potthoff and Wei Wei, both postdoctoral researchers in pharmacology; and Dr. Yihong Wan, assistant professor of pharmacology. Drs. Ruth Yu and Ronald Evans of the Salk Institute in San Diego also participated in the research.

Source: UT Southwestern Medical Center
Published on 20th October 2012

Restricting nuclear power has little effect on the cost of climate policies

 

“Questions have been raised about whether restricting nuclear energy – an option considered by some countries after the accident in Fukushima, Japan – combined with climate policies might get extremely expensive. Our study is a first assessment of the consequences of a broad range of combinations of climate and nuclear policies,” lead author Nico Bauer says. Restrictions on nuclear power could be political decisions, but also regulations imposed by safety authorities. Power generation capacities would have to be replaced, but fossil fuels would become costly due to a price on CO2 emissions; this, in sum, is the main concern.

 


 

“However, in case of restricted use of nuclear power, the flexibility of allocating a long-term carbon budget over time enables higher near-term emissions due to increased power generation from natural gas,” Bauer says. Along with demand reductions and efficiency improvements, these provisions could help fill the electricity gap. The price of natural gas is projected to decrease due to demand reductions, according to the study. Decommissioning existing plants would also avoid the refurbishment costs of extending the lifetimes of old nuclear power plants.

 

As a result, early retirement of nuclear power plants would lead to cumulative global gross domestic product (GDP) losses that amount to about 10 percent of climate policy costs. If no new nuclear capacities are allowed, the costs would amount to 20 percent.

 

For their study, the scientists looked into different nuclear power policies. These cover a range of scenarios from “Renaissance”, with a full utilization of existing power plants, a possible refurbishment for a lifetime expansion and investments in new nuclear power capacities, to “Full exit”, with a decommissioning of existing power plants and no new investments. They contrasted each scenario with climate policies implemented via an inter-temporal global carbon budget which puts a price on carbon emissions. For the budget, the cumulative CO2 emissions from the global energy sector were limited to 300 gigatons of carbon from 2005 until the end of the century. This represents a climate mitigation policy consistent with the target of limiting global warming to 2 degrees Celsius.

 

“A surprising result of our study is the rather small difference between a ‘Renaissance’ or a ‘Full exit’ of nuclear power in combination with a carbon budget when it comes to GDP losses,” Bauer says. While the ‘no policy case’ with a nuclear phase-out and no carbon budget has only a negligible effect on global GDP, the imposition of a carbon budget with no restrictions on nuclear policy implies a reduction of GDP that reaches 2.1 percent in 2050. The additional phase-out of nuclear power increases this loss by only about 0.2 percent in 2050, and hence has little additional impact on the economy, because the contribution of nuclear power to electricity generation can be substituted relatively easily by alternative technology options, including the earlier deployment of renewables.

 

Source: Potsdam Institute for Climate Impact Research (PIK)

 

 

Published on 13th October 2012

 

Chaperone protein subverts removal of glaucoma-causing protein, USF-led study finds

 

The chaperone protein Grp94 can interfere with the clearance of another protein known to cause glaucoma when mutated, a new study led by researchers at the University of South Florida has found. Using a cell model, the researchers also demonstrated that a new specific inhibitor of Grp94 facilitates clearance of the genetically defective protein, called myocilin, from cells.

 


 

Reported online this month in JBC (The Journal of Biological Chemistry), the discoveries could lead to a new treatment for some hereditary cases of glaucoma, an eye disease that is a leading cause of blindness, said principal investigator Chad Dickey, PhD, associate professor of molecular medicine at the USF Health Byrd Alzheimer’s Institute.

 

“When mutated, the glaucoma-causing protein becomes toxic to a network of cells known as the trabecular meshwork, which regulates pressure within the eye,” Dickey said. “Once these cells die, the ocular pressure increases, causing glaucoma.”

 

Genetic defects of myocilin account for approximately 8 to 36 percent of hereditary juvenile-onset glaucoma and 5 to 10 percent of adult-onset hereditary glaucoma.

 

The researchers suggest that mutant myocilin, triggered by an interaction with the chaperone Grp94, is highly resistant to degradation, thus clogging the protein quality control pathway and subverting efficient removal of the glaucoma-causing protein.  So, the development of targeted therapies to inhibit Grp94 may be beneficial for patients suffering from myocilin glaucoma.

 

Source: University of South Florida (USF Health)

 

Published on 13th October 2012

 

Home-based assessment tool for dementia screening

The ClockMe System. (Credit: Georgia Institute of Technology)

 

With baby boomers approaching the age of 65 and new cases of Alzheimer’s disease expected to increase by 50 percent by the year 2030, Georgia Tech researchers have created a tool that allows adults to screen themselves for early signs of dementia. The home-based computer software is patterned after the paper-and-pencil Clock Drawing Test, one of health care’s most commonly used screening exams for cognitive impairment.

 

“Technology allows us to check our weight, blood-sugar levels and blood pressure, but not our own cognitive abilities,” said project leader Ellen Yi-Luen Do. “Our ClockMe System helps older adults identify early signs of impairment, while allowing clinicians to quickly analyze the test results and gain valuable insight into the patient’s thought processes.”

 

Georgia Tech’s ClockMe system eliminates the paper trail and computerizes the test into two main components: the ClockReader Application and the ClockAnalyzer Application.

 

ClockReader is the actual test and is taken with a stylus and computer or tablet. The participant is given a specific time and instructed to draw a clock with numbers and the correct minute and hour hands. Once completed, the sketch is emailed to a clinician, who uses the ClockAnalyzer Application to score the test. The software checks for 13 traits. They include correct placement of numbers and hands without extra markings. People with cognitive impairment frequently draw clocks with missing or extra numbers. Digits are sometimes drawn outside of the clock. The time is often incorrect.

 

In addition to scoring automatically and consistently, ClockAnalyzer records the duration of the test and the time between each stroke. The software also replays the drawing in real-time, allowing a clinician to watch the drawing being created to observe any behavior abnormality.
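Georgia Tech has not published ClockAnalyzer’s scoring code, and the release names only a few of the 13 traits, so the following Python sketch is hypothetical (the stroke format, trait names and helper are invented) and meant only to make this kind of automated check concrete:

from dataclasses import dataclass

@dataclass
class Stroke:
    label: str      # hypothetical recognizer output, e.g. "digit:3", "hand:minute"
    t_start: float  # seconds since the test began
    t_end: float

def score_clock(strokes):
    """Toy scorer: a few ClockMe-style trait checks plus timing features."""
    labels = [s.label for s in strokes]
    digits = {l.split(":")[1] for l in labels if l.startswith("digit:")}
    traits = {
        "all_12_numbers_present": digits == {str(n) for n in range(1, 13)},
        "both_hands_drawn": "hand:minute" in labels and "hand:hour" in labels,
        "no_extra_markings": all(l.startswith(("digit:", "hand:", "contour")) for l in labels),
    }
    duration = strokes[-1].t_end - strokes[0].t_start                     # total test time
    pauses = [b.t_start - a.t_end for a, b in zip(strokes, strokes[1:])]  # hesitations
    return traits, duration, pauses

Recording per-stroke timestamps is what allows the replay and hesitation analysis described above, not just a score for the finished drawing.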

 

“The traditional paper-and-pencil test is usually overseen by a technician and later scored by a clinician, who scores the test based only on the finished drawing,” said Do, a professor in Georgia Tech’s Colleges of Computing and Architecture. “By looking at the sketch, the scorer is not able to decipher whether the person struggled to remember certain numbers while drawing the clock. The ClockMe system’s timing software highlights those delays.”

 

And, because they’re saved electronically, the drawings can easily be used to compare a person’s cognitive progress or decline over time. Do’s research found that traditional tests are often filed in a folder and rarely used for future comparison.

 

The ClockMe system was initially tested at the Emory Alzheimer’s Disease Research Center in Atlanta, where it’s currently being used in addition to the traditional paper-and-pencil test. Despite a lack of computer literacy, all of the elderly patients who used the software during the study said they had no problems with the pen-based computer technology.

 

“For this reason, as well as the ability to send the drawings directly to clinicians for convenient scoring, we envision ClockMe as a viable tool for home-based screening,” said Do. “America’s health care costs are expected to soar as baby boomers become senior citizens. If a screening tool can be used at home, unnecessary trips to clinics can be eliminated and medical expenses can be saved.”

 

Do and her colleagues hope to commercialize the project in the future. Their research was published in the September issue of the Journal of Ambient Intelligence and Smart Environments.

 

This project is supported by the National Science Foundation (NSF) (Award Number SHB-1117665).

 

 

 

Source: Georgia Institute of Technology

 

Published on 7th October 2012

 

 

 


 

ASU scientists bring the heat to refine renewable biofuel production

 

Roy Curtiss and Xinyao Liu have been genetically optimizing cyanobacteria for biofuel production. (Credit: Image courtesy of Arizona State University)

 

 

Perhaps inspired by Arizona’s blazing summers, Arizona State University scientists have developed a new method that relies on heat to improve the yield and lower the costs of high-energy biofuels production, making renewable energy production more of an everyday reality.

 

ASU has been at the forefront of algal research for renewable energy production. Since 2007, with support from federal, state and industry funding, ASU has spearheaded several projects that utilize photosynthetic microbes, called cyanobacteria, as a potential new source of renewable, carbon-neutral fuels. Efforts have focused on developing cyanobacteria as a feedstock for biodiesel production, as well as benchtop and large-scale photobioreactors to optimize growth and production.

 

ASU Biodesign Institute researcher Roy Curtiss, a microbiologist who uses genetic engineering of bacteria to develop new vaccines, has adapted a similar approach to make better biofuel-producing cyanobacteria.

 

“We keep trying to reach ever deeper into our genetic bag of tricks and optimize bacterial metabolic engineering to develop an economically viable, truly green route for biofuel production,” said Roy Curtiss, director of the Biodesign Institute’s Centers for Infectious Diseases and Vaccinology and Microbial Genetic Engineering as well as professor in the School of Life Sciences.

 

Like plants, cyanobacteria depend on renewable ingredients: sunlight, carbon dioxide and water. Through genetic engineering, they can be altered to favor biodiesel production. Cyanobacteria offer attractive advantages over plants like corn or switchgrass, producing many times the energy yield with energy input from the sun and without the need to take arable cropland out of production.

 

Colleague Xinyao Liu and Curtiss have spent the last few years modifying these microbes. Their goal is to bypass costly processing steps, such as cell disruption and filtration, for optimal cyanobacterial biofuel production.

 

“We wanted to develop strains of cyanobacteria that basically can process themselves,” said Curtiss. “A couple of years ago, we developed a Green Recovery process that is triggered by removing carbon dioxide to control the synthesis of enzymes, called lipases, that degrade the cell membranes and release the microbes’ precious cargo of free fatty acids that can be converted to biofuels.”

 

However, when growth of cyanobacteria is scaled up to meet industrial needs, the cultures become dense, and the self-shading that occurs in concentrated cultures does not let in enough light to produce enough of the lipases to efficiently drive the process. Thus the original Green Recovery was light-dependent and maximally efficient at sub-optimal culture densities.

 

Curtiss’ team looked again to nature to improve the Green Recovery method. The new process uses naturally occurring enzymes called thermostable lipases, synthesized by thermophilic organisms that grow at high temperatures, such as in hot springs. These thermostable lipases break down fats and membrane lipids into the fatty acid biodiesel precursors, but only at high temperatures. The team’s new process, called thermorecovery, uses a heat-triggered, self-destruct system: shifting a culture to a high temperature calls the lipases into action. The process works with concentrated cultures in the dark, under conditions that would be very favorable for an industrial process.

 

They tested a total of seven different lipases from microbes that thrive in hot springs at very high temperatures, a scorching 60-70 C (140-158 F). The research team swapped each lipase gene into a cyanobacteria strain that grows normally at 30 C (86 F) and tested the new strains.

 

They found that the Fnl lipase from Fervidobacterium nodosum, an extremophile found in the hot springs of New Zealand, released the most fatty acids. The highest yield occurred when carbon dioxide was removed from the cells for one day (to turn on the genes making the lipases) and the cells were then treated at 46 C (114 F) for two days (for maximum lipase activity).

 

The yield was 15 percent higher than with the Green Recovery method, and the process used fewer reagents, less time (one day for thermorecovery vs. one week for Green Recovery) and less space. Thermorecovery resulted in an estimated 80 percent cost savings.

 

Furthermore, in a continuous semi-batch production experiment, the team showed that daily harvested cultures could release a high level of fatty acids and that productivity could last for at least 20 days. Finally, the water critical to growing the cultures could be recycled to maintain the growth of the original culture.

 

“Our latest results are encouraging and we are confident of making further improvements to achieve enhanced productivity in strains currently under construction and development,” said Curtiss. “In addition, optimizing growth conditions associated with scale-up will also improve productivity.”

 

Source: Arizona State University

 

Published on 7th October 2012

 

 


 

Patent Issued for Technology that Improves Eyesight Dramatically

 

A U.S. patent has been issued to the University of Rochester for technology that has boosted the eyesight of tens of thousands of people around the world to unprecedented levels and reduced the need for patients to undergo repeat surgeries.

 


 

The patent was issued this week for work done by Scott MacRae, M.D., director of the Refractive Surgery Center at the Flaum Eye Institute, and Manoj Venkiteshwar, Ph.D., formerly a post-doctoral researcher at the University’s Center for Visual Science.

 

The pair invented the Rochester Nomogram, a complex formula that helps physicians determine how refractive surgery, such as LASIK, will affect a person’s eyesight. The Nomogram adjusts the way a laser interacts with a person’s eye tissue, vastly reducing the chances that the patient’s eyes will be near-sighted or far-sighted after the procedure.

 

Thanks to the Nomogram, MacRae’s team has been able to slash by two-thirds the number of patients who need additional procedures to achieve the best vision possible. Currently, a remarkable 99.3 percent of MacRae’s patients see 20/20 or better after refractive surgery. He presented the data earlier this month at a meeting of the European Society of Cataract and Refractive Surgeons in Italy.

 

“Eyesight is crucial to everyone’s quality of life,” said MacRae. “As a physician, I am required to do everything in my power to make sure each of my patients has the very best vision possible. There’s nothing like the feeling of having a patient sit up after surgery, look at the clock on the wall, and exclaim that it’s the first time in decades they’ve been able to tell the time without wearing glasses.”

 

The technology has been licensed to Technolas Perfect Vision, a cataract and refractive laser company that is a product of a joint venture between Bausch + Lomb and 20/10 Perfect Vision AG. As a result, tens of thousands of people around the world have had vision procedures in which the Nomogram has played a role.

 

“It’s also gratifying that our work is benefitting not only our own patients but also others around the world,” added MacRae.

 

The patent is the latest development in a 20-year effort by University scientists and physicians to study and improve human vision.

 

In the early 1990s, scientist David Williams, Ph.D., director of the Center for Visual Science, began a series of experiments to look into the eye in unprecedented detail, not only to see the organ’s fine structures but also to understand how light moves around inside the eye.

 

His pioneering work opened the door, for the first time in history, to the possibility of fixing not only the three major flaws in the eye that reading glasses and contact lenses have corrected for decades, but also approximately 60 additional imperfections that were never known before. Nearly everyone has these flaws in their eyes to some extent; while most people don’t notice them, they hurt our quality of vision in subtle ways.

 

MacRae, an internationally recognized refractive surgeon, moved to Rochester in 2000 from Portland, Ore., to help bring the developments to the bedsides of patients and give them a quality of eyesight that was not possible before Williams’ work. Through a series of clinical trials and work in the laboratory, the Rochester team did just that.

 

The team helped to create a field known as customized ablation, a form of LASIK that corrects subtle imperfections, bringing about a super-crisp quality of eyesight. Beyond making vision on the order of 20/15 or 20/16 possible or even commonplace in some groups of patients, the technology also increases the eye’s ability to see in situations where there is low light or little contrast.

 

Physicians like MacRae provide a crucial link in Rochester’s thriving vision research, helping scientists working in the laboratory understand the challenges that practicing physicians and their patients face. Together, teams at the Flaum Eye Institute and the Center for Visual Science tackle eye problems from many directions: MacRae and colleagues from the vantage point of clinical experience, and researchers from dozens of laboratories working on an array of experiments that are difficult for non-scientists to grasp. Sometimes, as with the Nomogram, the teams meet in a place that improves the quality of life for people around the globe.

 

 

Source: University of Rochester Medical Center

 

Published on 30th September 2012

 

Nanoparticles Glow Through Thick Layer of Tissue

 

An international research team has created unique photoluminescent nanoparticles that shine clearly through more than 3 centimeters of biological tissue — a depth that makes them a promising tool for deep-tissue optical bioimaging.


 

 

Though optical imaging is a robust and inexpensive technique commonly used in biomedical applications, current technologies lack the ability to look deep into tissue, the researchers said.


This creates a demand for the development of new approaches that provide high-resolution, high-contrast optical bioimaging that doctors and scientists could use to identify tumors or other anomalies deep beneath the skin.


The newly created nanoparticles consist of a nanocrystalline core containing thulium, sodium, ytterbium and fluorine, all encased inside a square, calcium-fluoride shell.


The particles are special for several reasons. First, they absorb and emit near-infrared light, with the emitted light having a much shorter wavelength than the absorbed light. This differs from how molecules in biological tissues absorb and emit light, which means that scientists can use the particles to obtain deeper, higher-contrast imaging than is possible with traditional fluorescence-based techniques.


Second, the material for the nanoparticles’ shell, calcium fluoride, is a substance found in bone and tooth mineral. This makes the particles compatible with human biology, reducing the risk of adverse effects. The shell was also found to significantly increase the photoluminescence efficiency.


To emit light, the particles employ a process called near-infrared-to-near-infrared up-conversion, or “NIR-to-NIR.” Through this process, the particles absorb pairs of photons and combine these into single, higher-energy photons that are then emitted.
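In energy terms (our illustration; the specific wavelengths below are typical of ytterbium/thulium upconverters, not figures from the paper): a photon’s energy is E = hc/\lambda, so combining two absorbed photons yields an emitted photon with at most the summed energy,

E_{\mathrm{em}} \le E_1 + E_2 \quad\Longrightarrow\quad \frac{1}{\lambda_{\mathrm{em}}} \le \frac{1}{\lambda_1} + \frac{1}{\lambda_2}

For example, two photons absorbed near 975 nm can yield one photon emitted near 800 nm, with the leftover energy lost non-radiatively; both wavelengths sit inside the near-infrared transparency window described below.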


One reason NIR-to-NIR is ideal for optical imaging is that the particles absorb and emit light in the near-infrared region of the electromagnetic spectrum, which helps reduce background interference. This region of the spectrum is known as the “window of optical transparency” for biological tissue, since the biological tissue absorbs and scatters light the least in this range.


The scientists tested the particles in experiments that included imaging them injected in mice, and imaging a capsule full of the particles through a slice of pork more than 3 centimeters thick. In each case, the researchers were able to obtain vibrant, high-contrast images of the particles shining through tissue.


The results of the study appeared online on Aug. 28 in the ACS Nano journal. The international collaboration included researchers from the University at Buffalo and other institutions in the U.S., China, South Korea and Sweden. It was co-led by Paras N. Prasad, a SUNY Distinguished Professor and executive director of UB’s Institute for Lasers, Photonics and Biophotonics (ILPB), and Gang Han, an assistant professor at University of Massachusetts Medical School.


“We expect that the unprecedented properties of the core/shell nanocrystals we designed will bridge numerous disconnections between in vitro and in vivo studies, and eventually lead to new discoveries in the fields of biology and medicine,” said Han, expressing his excitement about the research findings.


Study co-author Tymish Y. Ohulchanskyy, a deputy director of ILPB, believes the 3-centimeter optical imaging depth is unprecedented for nanoparticles that provide such high-contrast visualization.


“Medical imaging is an emerging area, and optical imaging is an important technique in this area,” said Ohulchanskyy. “Developing this new nanoplatform is a real step forward for deeper tissue optical bioimaging.”


The paper’s first authors were Guanying Chen, a research assistant professor at ILPB and a scientist at China’s Harbin Institute of Technology and Sweden’s Royal Institute of Technology, and Jie Shen of the University of Massachusetts Medical School. Other contributing institutions included Roswell Park Cancer Institute, the University of North Carolina at Chapel Hill and Korea University in Seoul.


The next step in the research is to explore ways of targeting the nanoparticles to cancer cells and other biological targets that could be imaged. Chen, Shen and Ohulchanskyy said the hope is for the nanoparticles to become a platform for multimodal bioimaging.


Source: University at Buffalo

Published on 30th September 2012

 

 

 


 

Sleep apnea in obese pregnant women linked to poor maternal and neonatal outcomes

 

The newborns of obese pregnant women suffering from obstructive sleep apnea are more likely to be admitted to the neonatal intensive care unit than those born to obese mothers without the sleep disorder, reports a study published online today in the journal Obstetrics & Gynecology.

 


 

Sleep apnea, which causes repeated awakenings and pauses in breathing during the night, was also associated with higher rates of preeclampsia in the severely overweight pregnant women, the researchers found.

 

“Our findings show that obstructive sleep apnea can contribute to poor outcomes for both obese mothers and their babies,” said the study’s lead author Dr. Judette Louis, assistant professor of obstetrics and gynecology at the University of South Florida. “Its role as a risk factor for adverse pregnancy outcomes independent of obesity should be examined more closely.”

 

Dr. Louis, who holds a joint appointment in the USF College of Public Health’s Department of Community and Family Health, conducted the study while a faculty member at Case Western Reserve University’s School of Medicine. A specialist in maternal-fetal medicine, she worked with researchers from Case Western Reserve, the USF Health Morsani College of Medicine’s Center for Evidence-Based Medicine, and Harvard Medical School. She joined USF in April.

 

The researchers analyzed data for 175 obese pregnant women enrolled in a prospective observational study, which screened prenatal patients at Cleveland’s MetroHealth Medical Center for sleep-related breathing disorders.  The women were tested for obstructive sleep apnea using an in-home portable device at bedtime.

 

Perinatal and newborn outcomes for 158 live births, including indications for NICU admissions such as respiratory complications, prematurity and congenital defects, were also reviewed.

 

Among the study findings:

 

–  The prevalence of sleep apnea among study participants was 15.4 percent.

 

–  Compared to the women with no sleep apnea (control group), the group with sleep apnea was heavier and experienced more chronic high blood pressure. This finding was consistent with studies in the general population that have associated sleep-disordered breathing with high blood pressure and weight gain.

 

–  The women with sleep apnea were more likely than the control group to undergo a cesarean delivery and to develop preeclampsia, a medical condition in which high blood pressure in pregnancy is associated with loss of protein in the urine.   Preeclampsia remains one of the most common dangerous medical conditions for both moms and babies.

 

–  Despite having similar rates of preterm births, the women with sleep apnea delivered offspring more likely to be admitted to the NICU than did their counterparts without sleep apnea. Many of these admissions were due to respiratory distress.  The researchers suggest the higher NICU admission rates may be explained in part by the higher C-section rates among the women with sleep apnea, but more study is needed.

 

Approximately one in five women are obese when they become pregnant, meaning they have a body mass index of at least 30, according to research from the federal Centers for Disease Control and Prevention.  While numerous studies have examined complications associated with obesity in pregnancy – including high blood pressure, gestational diabetes and cesarean deliveries — sleep apnea has been underdiagnosed and understudied in this population of women.

 

The study authors suggest the best way to decrease obesity-related conditions that lead to poor pregnancy outcomes, including sleep apnea, would be to treat obesity before a woman becomes pregnant, but acknowledge that “losing weight is often difficult.”

 

Dr. Louis said the study also points to the need for better ways to screen and treat this common form of sleep-disordered breathing during pregnancy.

 

The study was supported by the Robert Wood Johnson Foundation Physician Faculty Scholars program, the Case Western Reserve/Cleveland Clinic CTSA, and the National Center for Research Resources, NIH.

 

Source: University of South Florida (USF Health)

 

Published on 25th September 2012