
Tuesday, May 5, 2015

SAFEGUARDING CAPE FEAR RIVER BASIN WATER QUALITY (NSF VIDEO INCLUDED)


FROM:  NATIONAL SCIENCE FOUNDATION
Testing the waters: 1,4-Dioxane in North Carolina's Cape Fear River Basin
Environmental engineers play vital role in determining next steps to safeguard drinking water

It was an email from a colleague that tipped off environmental engineer Detlef Knappe of possible 1,4-dioxane contamination in the Cape Fear River Basin, North Carolina's largest watershed and a source of drinking water for communities across the state.

The Environmental Protection Agency (EPA) has classified 1,4-dioxane as a probable human carcinogen. It is an industrial solvent used in the textile and paper industries and a by-product of manufacturing processes involving ethylene oxide, such as the production of certain plastics and surfactants used in laundry detergents and shampoos.

With support from a National Science Foundation (NSF) RAPID grant, Knappe and his team at North Carolina State University (NCSU) have begun to identify 1,4-dioxane sources and how 1,4-dioxane affects drinking water quality. (RAPID is short for Grants for Rapid Response Research.) Another research goal is to determine whether home filtration devices effectively remove 1,4-dioxane from tap water and how long those filters last.

Knappe is also working with water treatment plant managers and state policymakers in North Carolina to improve testing and treatment standards for 1,4-dioxane. So far, Knappe says, the research team has identified three North Carolina communities as key sources of 1,4-dioxane; those communities are now working with the state and NCSU to pinpoint which facilities are responsible for the contamination.

He says the team is also evaluating point-of-use devices, such as pitcher and refrigerator filters, and has identified a new adsorbent that is quite effective for 1,4-dioxane removal.

The research in this episode was supported by NSF award #1449768, RAPID; GOALI: Sources of 1,4-Dioxane in the Cape Fear River Watershed of North Carolina and Treatment Options for 1,4-Dioxane Control. GOALI is NSF's Grant Opportunities for Academic Liaison with Industry program.

Miles O'Brien, Science Nation Correspondent
Ann Kellan, Science Nation Producer

Sunday, May 3, 2015

CROWDSOURCED SCIENCE TRACKS "SUDDEN OAK DEATH"

FROM:  NATIONAL SCIENCE FOUNDATION

Trees turned to snags: "Sudden Oak Death" fells California oaks in their prime
Citizen scientists assist with research on infectious plant disease
Sudden oak death, beware.

Crowdsourced science is helping to predict the path of the deadly plant disease, demonstrating the contributions trained citizen scientists can make in large-scale geographic tracking projects.

That's the conclusion of a study of sudden oak death monitoring in California. The results are published in this month's issue of the journal Frontiers in Ecology and the Environment.

Ebola of the plant world

"Sudden oak death is the Ebola of the plant world, the most serious threat to non-agricultural plants," says lead paper author Ross Meentemeyer, director of the Center for Geospatial Analytics at North Carolina State University.

The disease, which has killed millions of oak and tanoak trees in California and Oregon, can infect up to 60 landscape plant species and spread from nursery stock to residential landscapes.

Starting in 2008, University of California, Berkeley, researchers dramatically expanded their sudden oak death monitoring efforts, thanks to observations from 1,600 trained volunteers who collected leaf samples from trees in metropolitan and urban-wildland areas.

Citizen scientists often needed in research

"To answer many science questions, we need the efforts of a large number of people--and the general public can help," says Sam Scheiner, National Science Foundation (NSF) director for the NSF-NIH-USDA Ecology and Evolution of Infectious Diseases Program, which funded the research.

"This study shows that asking local residents to report on the locations of outbreaks of sudden oak death can provide critical information. The result is a better understanding of the spread of this serious plant disease."

Adds Meentemeyer, "We were able to get data from backyards in the San Francisco Bay area, along with other locations.

"Those data were used to develop accurate computer models for the disease's spread, showing that properly trained and educated citizen scientists can collect data that's just as reliable as that of professionals."

Predictions allow for targeted treatments

Accurate predictions about sudden oak death's spread allow scientists to target treatments to the most vulnerable areas, says paper co-author and forest pathologist Matteo Garbelotto of UC Berkeley.

The annual Sudden Oak Death Blitz, which includes extensive publicity during peak periods for the disease, involves high school students, homeowners, tree specialists, firefighters, teachers and others.

Follow-up evaluation showed that trained citizen scientists were as effective as experts in identifying and collecting diseased tree leaves, whether or not they reported having a professional background in science.

Additional authors of the paper are Monica Dorning and John Vogler of NC State and Douglas Schmidt of UC Berkeley.

-- Cheryl Dybas, NSF
-- D'Lyn Ford, NCSU
Investigators
David Rizzo
Ross Meentemeyer
Related Institutions/Organizations
University of California-Davis
University of North Carolina at Charlotte

Friday, May 1, 2015

EXAMINING ENVIRONMENTAL STEWARDSHIP OF BAJA CALIFORNIA SUR REGION

FROM:  NATIONAL SCIENCE FOUNDATION
Scientists map sustainability challenges in Baja California Sur fisheries

Team conducts review of ecological and social data for 12 regions around Baja
The waters surrounding Baja California Sur are teeming with fish. They're also favorite locales for fishers in pursuit of a living.

But like cross-currents in the ocean, the needs of humans and fish meet--and often clash--in this deep blue sea.

In results published this week in the journal Proceedings of the National Academy of Sciences (PNAS), scientists assessed marine sustainability in the Baja California Sur region by applying a framework that accounts for both ecological and human dimensions of environmental stewardship.

The resulting maps, the researchers say, are mosaics of regional sustainability outlooks with the potential to guide environmental policy.

The maps, according to lead PNAS paper author Heather Leslie of Brown University in Providence, R.I., show regional strengths and weaknesses that can help guide fishers, conservationists and other decision-makers as they consider steps to preserve the peninsula's coastal marine ecosystems.

"This framework enabled us to integrate many different types of data related to coupled human-environment systems," says Leslie.

"We were able to be more comprehensive in our analysis than we would have otherwise--for example, if we had focused solely on the data closest to our individual disciplines."

Dozen regions near Baja California Sur profiled

Leslie and the team--which included anthropologists, economists, ecologists, fisheries scientists, geographers and other social scientists--gathered data on 13 ecological and social variables for each of 12 regions around Baja California Sur.

They developed a profile of sustainability potential for each region in four "dimensions": social governance systems and actors, and ecological resource units and resource systems.

What became clear, says Leslie, is that each region's profile is different, suggesting that the most effective policies to achieve sustainability will be those tailored to shore up each area's weaknesses without undermining its strengths.
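The four-dimension profiles described above can be sketched computationally. The following is a minimal, hypothetical illustration of the general idea -- min-max-normalizing indicators across regions and averaging them within each dimension -- not the study's actual variables, data, or aggregation method:

```python
def dimension_scores(regions, groups):
    """regions: {region: {indicator: raw value}}
       groups:  {dimension: [indicator, ...]}
       Returns {region: {dimension: mean of min-max-normalized indicators}}."""
    names = {i for inds in groups.values() for i in inds}
    lo = {i: min(r[i] for r in regions.values()) for i in names}
    hi = {i: max(r[i] for r in regions.values()) for i in names}

    def norm(i, v):
        # Scale each indicator to [0, 1] across all regions.
        return 0.5 if hi[i] == lo[i] else (v - lo[i]) / (hi[i] - lo[i])

    return {
        region: {
            dim: sum(norm(i, vals[i]) for i in inds) / len(inds)
            for dim, inds in groups.items()
        }
        for region, vals in regions.items()
    }

# Toy, made-up numbers echoing the contrast the article draws between
# two regions (these are NOT the study's data).
regions = {
    "Magdalena Bay": {"fish_diversity": 0.9, "catch": 0.8,
                      "rule_compliance": 0.2, "cooperation": 0.3},
    "Todos Santos":  {"fish_diversity": 0.4, "catch": 0.3,
                      "rule_compliance": 0.9, "cooperation": 0.8},
}
groups = {
    "resource systems":   ["fish_diversity", "catch"],
    "governance systems": ["rule_compliance", "cooperation"],
}
profiles = dimension_scores(regions, groups)
```

On this toy data, Magdalena Bay scores high on the ecological dimension and low on governance, while Todos Santos shows the opposite profile, mirroring the mismatch the researchers describe.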

"More than one billion people worldwide rely on fish and other seafood as their primary source of nutrition, yet coastal resources are often poorly understood and at risk of long-term decline," says Sarah Ruth, a program director for the National Science Foundation's (NSF) Dynamics of Coupled Natural and Human Systems (CNH) Program, which funded the research.

"Fisheries involve people and coastal ecosystems interacting with one another in myriad ways," says Ruth. "This work increases our understanding of these complex systems, and ultimately will help us manage marine resources in a more sustainable way."

Magdalena Bay and Todos Santos: A tale of two regions

Take, for example, the stories the researchers' maps tell about the Pacific coast regions of Magdalena Bay and Todos Santos. If Baja California Sur is a leg, then Magdalena Bay would be the knee and Todos Santos the front of the ankle.

In Magdalena Bay, fisheries are diverse and productive. But the area is also crowded with fishers who use different kinds of gear and don't necessarily all follow the same locally developed rules, Leslie says.

While the ecological foundation there for sustainability looks good, the social dimensions are considerably less promising, the scientists found.

"Depending on which type of data one musters regarding the potential for sustainable fisheries, Magdalena Bay could be scored as either well-endowed or quite weak," write the paper's authors, who also include Xavier Basurto of Duke University and Octavio Aburto-Oropeza of the University of California, San Diego.

In Todos Santos, local-scale fisheries institutions are strong. There is cooperation and compliance among fishers, based on the social science surveys the researchers conducted.

But the nearshore ocean is less productive and yields fewer fish species than Magdalena Bay, just 150 miles to the north.

While not far geographically, the regions are opposites in their social and ecological profiles, says Leslie.

In Magdalena Bay, the most productive strategy might be to cultivate stronger local institutions, Leslie says. In Todos Santos, maintaining existing local institutional strength might make more sense.

Are there fundamental trade-offs?

The diversity of regional strengths and weaknesses shown in the maps is borne out statistically.

The researchers looked for correlations among the four dimensions across Baja California Sur, but found few.

"Does something like this represent fundamental trade-offs?" Leslie asks. "What we found suggests that it does. There's nowhere that does well in every dimension."

The David and Lucile Packard Foundation, Brown University's Environmental Change Initiative and Voss Environmental Fellowship Program, and other organizations also funded the research.

-- Cheryl Dybas, NSF
-- David Orenstein, Brown University
Investigators
Heather Leslie
Sriniketh Nagavarapu
Related Institutions/Organizations
Brown University

Thursday, April 30, 2015

CDC REPORTS ON DOG-TO-HUMAN PNEUMONIC PLAGUE OUTBREAK IN COLORADO

FROM:   CENTERS FOR DISEASE CONTROL AND PREVENTION
Outbreak of Human Pneumonic Plague with Dog-to-Human and Possible Human-to-Human Transmission — Colorado, June–July 2014

This outbreak highlights 1) the need to consider plague in the differential diagnosis of sick domestic animals, including dogs, from plague-endemic areas, 2) the limitations of automated diagnostic systems for identifying rare bacteria such as Yersinia pestis, and 3) the potential for milder forms of illness in patients taking antimicrobial agents. Hospital laboratories in plague-endemic areas should be aware of the limitations of current diagnostic methodologies in diagnosing rare diseases such as plague.

In July 2014, the Colorado Department of Public Health and Environment Laboratory identified Yersinia pestis in a blood sample collected from a middle-aged man hospitalized with pneumonia. An investigation led by Tri-County Health Department revealed that the man’s dog had been ill and was euthanized. The dog later tested positive for Y. pestis. Three additional persons who had contact with the dog and/or the patient became ill and tested positive for Y. pestis. One of the cases may have resulted from person-to-person transmission from the index patient, potentially the first such event in North America since 1924.

Human illness due to plague remains an ongoing risk in endemic areas. Early recognition of plague, especially the pneumonic form, is critical to clinical management and a timely public health response.

Monday, April 27, 2015

POLIO ERADICATION CAN BE STRENGTHENED BY IMPROVING QUALITY OF SURVEILLANCE FOR POLIOVIRUSES

FROM:  CENTERS FOR DISEASE CONTROL AND PREVENTION
Tracking Progress Toward Polio Eradication — Worldwide, 2013–2014
CDC Media Relations

Improvement in the quality of surveillance for polioviruses is needed to help strengthen global polio eradication efforts. There are only three countries where poliovirus circulation has never been interrupted: Afghanistan, Nigeria, and Pakistan. Outbreaks occurred during 2013 and 2014 as a result of spread from these countries. Monitoring the progress of the Global Polio Eradication Initiative requires sensitive and timely polio surveillance. This report presents 2013 and 2014 poliovirus surveillance data, focusing on reports during 2010-2014 from 29 countries with at least one case of wild or circulating vaccine-derived poliovirus. In 2013, 25 of the 29 countries met the two primary surveillance quality indicators: sensitivity and timeliness; in 2014, the number decreased to 21. To complete and certify polio eradication, gaps in surveillance must be identified and surveillance activities, including supervision, monitoring, and proper specimen collection, must be further strengthened.

Sunday, April 26, 2015

THEORETICAL PHYSICIST LISA RANDALL

FROM:  NATIONAL SCIENCE FOUNDATION
After the lecture: Extra dimensions, interacting dark matter, and the power of uncertainty
A conversation with theoretical physicist Lisa Randall

In her most recent book, physicist Lisa Randall--Harvard professor, libretto composer, Lego figurine, star in the world of theoretical physics--writes that the universe repeatedly reveals itself to be cleverer than we are. This is not a submission to the mysteries of the universe; rather, it's a recognition that the more we discover about the fundamental nuts and bolts of this world, the more questions we have.

Randall works to uncover those fundamental nuts and bolts. She studies theoretical particle physics and cosmology, and her research has advanced our understanding of supersymmetry, models of extra dimensions, dark matter and more. She's made a career out of sharing these discoveries--what they are, how we know them and why they matter--with the public.

Randall is the author of three books and has appeared in dozens of media outlets--from Charlie Rose and The New York Times to The Colbert Report and Vogue. We sat down with Randall after her lecture "New ideas about dark matter" as part of the National Science Foundation's Distinguished Lecture Series in Math and Physical Sciences.

I liked doing math. And I liked understanding how things work. I took a physics class in high school, and I didn't really know for sure that I would be doing it [long term], but I kept going. I enjoyed it. I like that you got answers. I kind of liked that it was challenging.

I think it's important to explain these theories are evolving and what it means for the world. Uncertainty in science isn't actually a bad thing. It actually drives you forward. You can have a lot of certainty even with uncertainty at the edges.

Sometimes it's a question not just of saying 'I'm going to figure this out,' but just with being smart enough to recognize something interesting when it happens. When we found this warped geometry we hadn't been looking for it, it just was a solution. Then we realized what kind of implications it could have. Both in terms of solving the hierarchy problem and explaining particle masses, but also in terms of having an infinite extra dimension.

There's usually a moment when you realize it. Then there are a lot of moments when you think you're wrong and you go back.

I think there's just a lot of ideas about creativity that people don't fully appreciate for scientists. I think there's a lot of ideas about right and wrong that people don't fully appreciate, and how science advances.

I'd just written a book where you try so hard to do everything in a linear order. I'd just written Warped Passages, and it was kind of nice, the idea of just introducing ideas without having to explain them. And just have different voices. You sort of realize the richness of operas and just expressing ideas and just getting people familiar with something. You have music, you have art, you have words. It's very exciting.

I don't think anyone should just set themselves up to be a role model. I think every person is different, and certainly there are few enough women that we're all different. But it is true that one of the small advantages you have as a woman is that you are doing something important beyond your work, which is just establishing that women can be out there doing these things. And it is definitely true that when I wrote my book I thought it's good to have someone out there in the public eye, so that people know there are women physicists. And in terms of the response, I can say that--both negative and positive--people do not realize there are women out there sometimes. So it was really important. But it also means you have to put up with a lot of distracting comments and questions sometimes that you wouldn't otherwise.

-- Jessica Arriens
Investigators
Lisa Randall
Related Institutions/Organizations
Harvard University
Massachusetts Institute of Technology

Friday, April 24, 2015

DOD SECRETARY CARTER'S SEARCH FOR TECH PARTNERS

FROM:  U.S. DEFENSE DEPARTMENT

Photo caption: Defense Secretary Ash Carter holds up a copy of his doctoral thesis as he delivers a lecture titled "Rewiring the Pentagon: Charting a New Path on Innovation and Cybersecurity," at Stanford University in Stanford, California, April 23, 2015. The lecture highlighted the Pentagon's new cyber strategy and innovation initiatives.

Carter Seeks Tech-sector Partnerships for Innovation
By Cheryl Pellerin
DoD News, Defense Media Activity

WASHINGTON, April 23, 2015 – Defense Secretary Ash Carter announced new partnership initiatives today on the first day of a two-day visit to California’s Silicon Valley to learn from experts who run some of the highest-tech companies in one of the nation’s innovation hotspots.

At Stanford University, where he recently served as a distinguished visiting fellow at the Hoover Institution and a lecturer at the Freeman Spogli Institute for International Studies, Carter delivered the annual Drell Lecture, titled “Rewiring the Pentagon: Charting a New Path on Innovation and Cybersecurity.”

The lecture, sponsored by the Stanford Center for International Security and Cooperation, is named for Dr. Sidney Drell, a theoretical physicist and arms-control expert who cofounded the center.

When Carter became defense secretary, he told the audience, one of his top commitments was to the future -- to stay ahead of a changing world, to remain competitive, to attract new generations to the mission of serving the country, and to stay abreast of technology.

Commercially Driven Technology

To begin leveraging commercially driven technology, he said, the Defense Department wants “to partner with businesses on everything from autonomy to robotics to biomedical engineering and 3-D printing; power, energy, propulsion to distributed systems, data science [and] the Internet of things.”

Over the years, Carter said, products developed in Silicon Valley and across the tech community have enabled transformation, progress, opportunity and prosperity across all economic and social sectors, including national defense.

“It’s made many things easier, cheaper and safer,” he added.

“But in recent years it’s become clear that these same advances and technologies also present a degree of risk to the businesses, governments, militaries and individual[s] who rely on them every day … making it easier, cheaper and safer to threaten them,” the secretary said.

The same technologies DoD uses to target cruise missiles and jam enemy air defenses can be used against U.S. and allied forces, and they’re available to the highest bidders, he noted, asking, “How do we mitigate the risk that comes with technology while simultaneously unleashing its promise and potential?”

The answer, he said, is partnership.

Investments by DoD and government agencies have historically played a role in helping to spur ground-up technological innovation in Silicon Valley and on the Stanford campus, Carter said. Vint Cerf, father of the Internet, did that work and more while he was a Stanford assistant professor and a researcher at the Defense Advanced Research Projects Agency, the secretary said.

DoD-funded Innovation

The Global Positioning System began as a defense-driven project, work on Google’s search algorithm was funded by a National Science Foundation grant, and most technologies used throughout Silicon Valley can be traced back to government or DoD research and expenditures, Carter said. “Developers of multitouch [interaction] worked together through a fellowship funded by the National Science Foundation and the CIA,” he added.

“iOS’s Siri grew out of … decades of DARPA-driven research on artificial intelligence and voice recognition, [and] a specific DARPA project funded through [SRI International, formerly the Stanford Research Institute] to help develop a virtual assistant for military personnel,” the secretary said. “And Google’s self-driving cars grew out of a DARPA Grand Challenge.”

DoD, other federal agencies and tech companies helped to ignite the spark, Carter said, but Silicon Valley companies nurtured the flame and created unimaginable applications.

But the Defense Department still makes up half of federal research and development -- about $72 billion this year, he said. And $12 billion in R&D funds support breakthrough science and technology research at universities, companies and DoD labs across the tech community.

For example, he said, several Stanford scientists have worked with DARPA, and over the past three years, DARPA has partnered with nearly 50 different public- and private-sector research entities in Silicon Valley.

“These relationships are really valuable to us,” Carter added, “and I intend to continue to nurture them.”

Disaster-response Robots

In June, the results of such partnerships will come together during the DARPA Robotics Challenge Finals in Pomona, California.

At the competition, work on smaller sensors, pattern-recognition technology, big-data analysis and autonomous systems with human decision support will fuel a competition of 25 robots from around the world. Each human-robot team will try to navigate a simulated disaster area so that during future disasters such technology may be ready to help without putting people at risk.

But to stay competitive and stay ahead of threats, DoD must do even more, Carter said, “and that starts with our people, who are our most important asset both in Silicon Valley and in the military.”

Who they are and where they are affects the department’s ability to innovate, the secretary said, and that’s the rationale behind some initial steps he’s taking starting today to help the department attract new people with talent and expertise “and who want to contribute to the Force of the Future, even if only for a time or a project.”

In one such effort, the department is establishing a DoD branch of the U.S. Digital Service, an outgrowth of the tech team that helped to rescue healthcare.gov, the secretary said.

The team will help to solve some of DoD’s most intractable IT and data problems, Carter said, noting that “we have our very first team … already in the Pentagon working on transferring electronic health records from DoD to the [Department of Veterans Affairs].”

U.S. Digital Service

Another initiative Carter announced today takes advantage of the elements that make Silicon Valley “a nexus for innovation” -- an experimental Silicon Valley partnership called the Defense Innovation Unit-X, or DIUX. The unit will scout emerging and breakthrough technologies and build direct relationships to DoD.

This is "first-of-a-kind [partnership] for us, staffed by some of our best active-duty and military personnel, plus key people from the reserves who live here, who are some of our best technical talent," Carter said.

Building New Relationships

The team will strengthen existing relationships and build new ones while functioning as a local interface node for the rest of the department, the secretary said. Down the road, he added, “they can help startups find new work to do with DoD.”

Next, Carter said, the department will open a door in the other direction, from our best government technologies to industry and then back.

An existing program called Secretary of Defense Corporate Fellows sends about 15 DoD people a year out to commercial companies such as Oracle, Cisco, FedEx and others, he said.

“Right now we don’t effectively harness what they’ve learned when they come back, … so we’re going to try expanding that fellows program into a two-year gig -– one year in a company and one year in a part of DoD with comparable business practices,” the secretary said. “That way, we have a better chance to bring the private sector’s best practices back into the department.”

To invest in the most promising emerging technologies, Carter said, the department needs the creativity and innovation that comes from startups and small businesses.

“This is particularly important, because startups are the leading edge of commercial innovation,” he said, “and right now, DoD researchers don’t have enough promising ways to transition technologies that they come up with to application. I want to fix that too.”

Borrowing on the success of an intelligence community partnership with the independent nonprofit startup-backer In-Q-Tel, Carter said the department has proposed and In-Q-Tel has accepted a pilot project to provide innovative solutions to DoD’s most challenging problems.

The department will make a small investment with In-Q-Tel to leverage the nonprofit’s proven relationships and apply its approach to DoD, he added.

The Best Partners

“As secretary of defense, my mission is to make sure our military can defend our country … and we’re at our best when we have the best partners,” Carter said. “Knowing how we’ve worked together in the past and how critical your work is to our country, strengthening this partnership is very important to me.

“We have a unique opportunity to build bridges and rebuild bridges [in the commercial tech sector] and renew trust,” he continued. “That’s why I’m visiting some other companies here this afternoon and meeting with a group of tech leaders tomorrow. I want to learn how in the years to come a new level of partnership can lead to great things. That’s what’s possible through partnership.”

Thursday, April 23, 2015

EX-IM NAMES DROPLET MEASUREMENT TECHNOLOGIES AS SMALL BUSINESS EXPORTER OF THE YEAR

FROM:  U.S. EXPORT-IMPORT BANK
Export-Import Bank Names Droplet Measurement Technologies as Small Business Exporter of the Year
 Boulder, CO Company Employs a Team of 45 and Exports to Over 47 Countries

Washington, D.C. – Today, the Export-Import Bank of the United States (Ex-Im Bank) announced that Droplet Measurement Technologies (DMT), a Boulder, Colorado, small business that manufactures cutting-edge cloud and aerosol measurement devices for customers around the globe, has been named its Small Business Exporter of the Year. The award will be presented to the company at the Bank's Annual Conference in Washington, D.C., on April 23rd.

“Equipping small businesses like Droplet Measurement Technologies to grow and successfully compete on the global stage is at the core of the Bank’s mission of reducing risk and unleashing opportunity,” said Ex-Im Bank Chairman and President Fred P. Hochberg. “When innovative American small businesses like DMT have a level playing field, they can enter new markets, sell their made-in-America goods and create jobs here at home.”

“Being named Small Business Exporter of the Year is an honor in that our small technical instruments company makes the tools scientists and researchers use to study the environment,” said Droplet Measurement Technologies CEO Robert McAllister. “This award is recognition of the global nature of the science we support. It supports our mission of providing scientists worldwide with the quality instruments they need to do the research we all rely on.”

Founded in 1987 by Dr. Darrel Baumgardner and Mr. Bill Dawson, DMT works with scientists and researchers around the world to expand research and development into new products that allow countries to measure the atmospheric changes taking place related to global warming, atmospheric ozone, and other areas of particle research. DMT produces the cloud probes that are utilized for climate and weather research, aircraft icing studies, and other atmospheric research.

As a leader in scientific cloud and aerosol measurement for more than 27 years, DMT began using Ex-Im’s insurance policy over four years ago to safeguard its international accounts receivable. The company’s export sales have since risen 17.5 percent and its workforce has grown by 20 percent. With 60 to 70 percent of its annual sales now exports, DMT relies heavily on the insurance provided by Ex-Im.

Ex-Im Bank's 2015 Annual Conference will feature remarks and panel discussions with some of the world’s leading voices in business and trade, including IMF Managing Director Christine Lagarde; National Security Advisor Susan Rice; US Secretary of Commerce Penny Pritzker; Doug Oberhelman, chairman and CEO of Caterpillar Inc.; W. James McNerney, Jr., chairman and CEO of The Boeing Company; Steven Rattner, chairman of Willett Advisors LLC; Joe Kaeser, president and CEO of Siemens AG; Stephen S. Poloz, governor of the Bank of Canada; Jacqueline Hinman, chairman and CEO of CH2M Hill; Ambassador of the United Arab Emirates to the United States, Yousef Al Otaiba; and Dr. Mo Ibrahim, founder and chairman of the Mo Ibrahim Foundation.

Wednesday, April 22, 2015

NEW PARTICLE ACCELERATORS COULD EXPAND USE TO SCIENCE AND INDUSTRY

FROM:  NATIONAL SCIENCE FOUNDATION
Smaller and cheaper particle accelerators?

Scientists developing technology that could expand use for medicine, national security, materials science, industry and high energy physics research
Traditionally, particle accelerators have relied on electric fields generated by radio waves to drive electrons and other particles close to the speed of light. But radio-frequency machines have an upper limit on the electric field: beyond it, the accelerator walls "break down," degrading performance and damaging equipment.

In recent years, however, scientists experimenting with so-called "plasma wakefields" have found that accelerating electrons on waves of plasma, or ionized gas, is not only more efficient, but also allows for the use of an electric field a thousand or more times higher than those of a conventional accelerator.

And most importantly, the technique, where electrons gain energy by "surfing" on a wave of electrons within the ionized gas, raises the potential for a new generation of smaller and less expensive particle accelerators.
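For background (this estimate does not appear in the article), the field scale available to a plasma accelerator is often quoted via the cold, nonrelativistic wave-breaking limit, which depends only on the plasma density \(n_0\):

```latex
E_0 = \frac{m_e \, c \, \omega_p}{e} \;\approx\; 96 \, \sqrt{n_0\,[\mathrm{cm}^{-3}]}\ \mathrm{V/m}
```

At a typical plasma density of \(n_0 = 10^{17}\,\mathrm{cm}^{-3}\), this gives roughly 30 GV/m, versus the tens of MV/m at which conventional radio-frequency cavities break down -- consistent with the article's "a thousand or more times higher" figure.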

"The big picture application is a future high energy physics collider," says Warren Mori, a professor of physics, astronomy and electrical engineering at University of California, Los Angeles (UCLA), who has been working on this project. "Typically, these cost tens of billions of dollars to build. The motivation is to try to develop a technology that would reduce the size and the cost of the next collider."

The National Science Foundation (NSF)-funded scientist and his collaborators believe the next generation of smaller and cheaper accelerators could enhance their value, expanding their use in medicine, national security, materials science, industry and high-energy physics research.

"Accelerators are also used for sources of radiation. When a high energy particle wiggles up and down, it generates X-rays, so you could also use smaller accelerators to make smaller radiation sources to probe a container to see whether there is nuclear material inside, or to probe biological samples," Mori says. "Short bursts of X-rays are currently being used to watch chemical bonds form and to study the inner structure of proteins, and viruses."

Just as important, albeit on a more abstract level, "the goal of the future of high-energy physics is to understand the fundamental particles of matter," he says. "To have the field continue, we need these expensive, large, and complex tools for discovery."

NSF has supported basic research in a series of grants in recent years totaling $4 million, including computational resources. The Department of Energy (DOE) has provided the bulk of the funding for experimental facilities and experiments, and has contributed to theory and simulations support.

"Mori's work is the perfect example of an innovative approach to advancing the science and technology frontiers that can come about when the deep understanding of fundamental laws of nature, of the collective behavior of charged particles that we call a plasma, is combined with state-of-the-art numerical modeling algorithms and simulation tools," says Vyacheslav (Slava) Lukin, program director in NSF's physics division.

Using DOE's SLAC National Accelerator Laboratory, the scientists from SLAC and UCLA accelerated bunches of electrons to energies 400 to 500 times higher than they could reach traveling the same distance in a conventional accelerator. Equally important, the energy transfer was much more efficient than in earlier experiments--the first demonstration of this combination of energy and efficiency using "plasma wakefields."

In the experiments, the scientists sent pairs of electron bunches containing 5 billion to 6 billion electrons each into a laser-generated column of plasma inside an oven of hot lithium gas. The first bunch in each pair was the "drive" bunch; it blasted all the free electrons away from the lithium atoms, leaving the positively charged lithium nuclei behind, a configuration known as the "blowout regime." The blasted electrons then fell back in behind the second bunch of electrons, known as the "trailing" bunch, to form a "plasma wake" that thrust the trailing electrons to higher energy.

While earlier experiments had demonstrated high-field acceleration in plasma wakes, the SLAC/UCLA team was the first to demonstrate simultaneously high efficiency and high accelerating fields using a drive and trailing bunch combination in the strong "blowout" regime. Furthermore, the accelerated electrons ended up with a relatively small energy spread.

"Because it's a plasma, there is no breakdown field limit," Mori says. "The medium itself is fully ionized, so you don't have to worry about breakdown. Therefore, the electric field in a plasma device can be pushed to a thousand or more times higher amplitude than that in a conventional accelerator."
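The scale of that advantage follows from a standard textbook estimate: the cold, nonrelativistic wave-breaking field of a plasma is E0 = m_e·c·ω_p/e, which reduces to roughly 96·√(n0 [cm⁻³]) volts per meter. A minimal sketch of the comparison (the plasma density and the ~100 MV/m conventional breakdown scale are illustrative assumptions, not figures from this article):

```python
import math

def wave_breaking_field(n0_cm3: float) -> float:
    """Cold, nonrelativistic wave-breaking field E0 = m_e * c * w_p / e,
    via the rule of thumb E0 [V/m] ~ 96 * sqrt(n0 [cm^-3])."""
    return 96.0 * math.sqrt(n0_cm3)

# Typical plasma density for wakefield experiments (assumed value)
n0 = 1e17                       # electrons per cubic centimeter
E0 = wave_breaking_field(n0)    # ~3e10 V/m, i.e. tens of GV/m
rf_limit = 1e8                  # ~100 MV/m, a typical RF breakdown scale
print(f"{E0 / 1e9:.0f} GV/m, ~{E0 / rf_limit:.0f}x the RF scale")
```

At this assumed density the plasma sustains fields several hundred times the conventional limit, consistent with the "thousand or more times higher" quoted for the technique generally.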

Chandrashekhar Joshi, UCLA professor of electrical engineering, led the team that developed the plasma source used in the experiment. Joshi, director of the Neptune Facility for Advanced Accelerator Research at UCLA, is the UCLA principal investigator for this research and a long-time collaborator with the SLAC group. The team also includes SLAC accelerator physicists Mike Litos and Mark Hogan; Mori leads the group that developed the computer simulations used in the experiments. Their findings appeared last fall in the journal Nature.

"The near term goal of this research is to produce compact accelerators for use in universities and industry, while a longer term goal remains developing a high energy collider operating at the energy frontier of particle physics," Mori says.

-- Marlene Cimons, National Science Foundation
Investigators
Warren Mori
Frank Tsung
Viktor Decyk
Russel Caflisch
Michail Tzoufras
Philip Pritchett
Related Institutions/Organizations
University of California-Los Angeles

Tuesday, April 21, 2015

CYBERLEARNING SCIENCE

FROM:  NATIONAL SCIENCE FOUNDATION
Classroom as virtual phenomenon
University of Illinois professor uses cyberlearning to bring scientific processes to life

In the schools where Tom Moher works, classrooms are imbued with science through simulated earthquakes, virtual bugs in the walls and digital portholes to the solar system.

"I want to immerse students in the physical space and time of scientific phenomena," said Moher, an associate professor of computer science, learning sciences, and education at the University of Illinois at Chicago. "Sometimes I use the term 'marinating' the kids. Time affords me the opportunity for surprise. Nature happens when it happens, not just because it happens to be science period."

In his talk at the National Science Foundation (NSF) last spring as part of the "Designing Disruptive Learning Technologies" series, Moher showcased projects that use "embedded phenomena" to bring scientific processes into the classroom and to foster learning from those experiences.

One of the projects he described (supported by an award from NSF) was RoomQuake, an earthquake simulation system where the classroom itself becomes an active seismic area.

At unpredictable times throughout the unit, rumbles emanating from speakers attached to simulated seismographs signal to the class that an earthquake is occurring.

Students rush to terminals around the classroom, read the data from seismograms and use that information to determine the magnitude of the event, its distance from the recording stations and, eventually, the epicenter of the earthquake. Over the course of six weeks and dozens of earthquakes, students discover a "fault line" emerging.
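The epicenter-finding step the students perform is classic trilateration: find the point whose distances to the recording stations best match the seismogram readings. A minimal sketch, with made-up station coordinates and distances in arbitrary classroom units (none of these numbers come from the project):

```python
import math

# Hypothetical seismograph station positions and measured distances
stations = [(0.0, 0.0), (8.0, 0.0), (0.0, 6.0)]
distances = [5.0, 5.0, 5.0]  # what each station reads off its seismogram

def locate_epicenter(stations, distances):
    """Trilateration by grid search: pick the point whose distances to
    the stations best match the measurements (least-squares error)."""
    best, best_err = None, float("inf")
    steps = 200
    for i in range(steps + 1):
        for j in range(steps + 1):
            x, y = 10.0 * i / steps, 10.0 * j / steps
            err = sum((math.hypot(x - sx, y - sy) - d) ** 2
                      for (sx, sy), d in zip(stations, distances))
            if err < best_err:
                best, best_err = (x, y), err
    return best

print(locate_epicenter(stations, distances))  # -> (4.0, 3.0)
```

With these three readings the search converges on (4, 3), the one point five units from all three stations; in the classroom, students do the equivalent with string arcs or compasses.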

Moher's immersive learning experiences bring technological richness and narrative drama to the classroom. This is true not only of RoomQuake, but also HelioRoom, where students are asked to imagine that the sun is located in the center of their classroom, and Wallcology, where tablets adjacent to the walls of classrooms serve as viewports into an imaginary space inside the walls filled with virtual fauna.

The projects also highlight the role of computing and data analysis in domains from seismology to astronomy.

In Moher's most recent project, HungerGames, students learn about animal foraging behaviors using stuffed animals with embedded RFID tags that act as tangible avatars to represent their foraging among patches of food (with camouflaged RFID readers) distributed around a classroom.
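The mechanics of the unit can be pictured as a simple event log: each concealed reader reports the tag IDs it detects, and the system records which avatar visited which food patch and when. A minimal sketch with hypothetical reader and tag names (the identifiers are invented, not drawn from the project):

```python
# Hypothetical mapping from camouflaged RFID readers to food patches
PATCHES = {"reader-1": "berry patch", "reader-2": "grass patch"}

def log_visit(log, reader_id, tag_id, timestamp):
    """Record that a tagged avatar (stuffed animal) was detected
    at the food patch where this reader is hidden."""
    log.append({"patch": PATCHES[reader_id], "avatar": tag_id, "t": timestamp})

visits = []
log_visit(visits, "reader-1", "rabbit-03", 0.0)   # avatar forages at berry patch
log_visit(visits, "reader-2", "rabbit-03", 42.5)  # later moves to grass patch
```

A log like this is what would let researchers reconstruct each avatar's foraging path across patches over a class period.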

During a two-period pilot enactment of the unit, Moher and his team demonstrated the feasibility of the design for classroom use, finding evidence of emotional relationships between learners and avatars, and the emergence of unanticipated behaviors that promoted new questions about the science phenomena. The results provisionally support the effectiveness of the activity as a science learning environment.

Moher's team presented their results at the 8th International Conference on Tangible, Embedded and Embodied Interaction in 2014.

Whether it is students peering into a secret insect habitat or rushing to locate the epicenter of an earthquake, "the kids and teachers are our willing accomplices," Moher said.

"It's their imagination, along with a little bit of technology, that brings the room alive."

-- Aaron Dubrow, NSF
Investigators
Thomas Moher
Related Institutions/Organizations
University of Illinois at Chicago

Sunday, April 19, 2015

NSF REPORTS SCIENCE AND ENGINEERING GRAD SCHOOL ENROLLMENT UP

FROM:  NATIONAL SCIENCE FOUNDATION
Science and engineering graduate school enrollment increases
Rise largely fueled by influx of foreign students

After remaining essentially flat for the past two years, the number of full-time graduate students enrolled in science and engineering programs rose by 2.4 percent in 2013, to nearly 425,000 students, according to a new InfoBrief from the National Science Foundation's (NSF) National Center for Science and Engineering Statistics (NCSES).

NCSES found the increase was largely due to a 7.9 percent rise in full-time enrollment of foreign graduate students on temporary visas. Foreign enrollment hit an all-time high of 168,297 students in 2013, or 39.6 percent of the full-time science and engineering graduate student population--up from 35.9 percent in 2008.

In contrast, full-time enrollment for U.S. science and engineering graduate students fell for the third year in a row. But while overall enrollment by U.S. citizens and permanent residents declined, the number of U.S. students of Hispanic or Latino descent has climbed steadily since 2008, growing 25.8 percent over that period.

NCSES found that among U.S. graduate students, enrollment continued to become more diverse. Of the total students enrolled in science and engineering graduate programs:

8.9 percent were Asian and Native Hawaiian or Other Pacific Islanders.
8.6 percent were Hispanic or Latino.
8.1 percent were Black or African American.
2.1 percent reported they were more than one race.
0.6 percent were American Indian or Alaska Native.
Those groups made up 28 percent of total graduate enrollments in science and engineering, including U.S. and foreign students. In 2008, they accounted for less than a quarter of students who were U.S. citizens and permanent residents.
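The figures above hang together arithmetically, which is easy to verify (425,000 is the article's rounded total):

```python
# Figures quoted in the InfoBrief summary above
total_full_time = 425_000          # "nearly 425,000" full-time S&E grad students
foreign_on_temp_visas = 168_297    # all-time high reported for 2013

foreign_share = foreign_on_temp_visas / total_full_time * 100
# ~39.6 percent, matching the share quoted above

group_shares = [8.9, 8.6, 8.1, 2.1, 0.6]  # the five groups listed above
combined = sum(group_shares)              # ~28 percent, as the text states
```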

The study also found that a decade-long decline continued in postdocs conducting research in the sciences. Between 2010 and 2013, the number of postdocs in science fields dropped by 2.8 percent, with the largest decreases in the two biggest science fields: biological sciences and physical sciences. Over the same period, the number of postdocs in engineering fields rose by 2 percent, with the largest increases in chemical engineering, biomedical engineering and electrical engineering.

Saturday, April 18, 2015

ADVANCING AEROSERVOELASTIC TECHNOLOGY

FROM NASA  

The X-56A Multi-Utility Technology Testbed (MUTT) is greeted on an Edwards Air Force Base runway by a U.S. Air Force Research Laboratory (AFRL) team member. NASA’s Armstrong Flight Research Center and the AFRL, along with participants from Langley Research Center and Glenn Research Center, and support from Lockheed Martin, are using the second X-56A (dubbed “Buckeye”) to check out aircraft systems, evaluate handling qualities, characterize and expand the airplane’s performance envelope, and verify pre-flight predictions regarding aircraft behavior. The 20-minute flight marked the beginning of a research effort designed to yield significant advances in aeroservoelastic technology using a low-cost, modular, remotely piloted aerial vehicle. Image Credit: NASA-Ken Ulbrich.



Friday, April 17, 2015

FTC BARS COMPANY FROM MAKING FALSE CLAIMS ABOUT APP THAT CAN DIAGNOSE MELANOMA RISK

FROM:  U.S. FEDERAL TRADE COMMISSION
FTC Approves Final Order Barring Misleading Claims about App’s Ability to Diagnose or Assess the Risk of Melanoma

Following a public comment period, the Federal Trade Commission has approved a final consent order barring Health Discovery Corporation from making deceptive or unsupported claims that its app, MelApp, could help diagnose or assess consumers’ melanoma risk.

According to the FTC’s February 2015 complaint, MelApp instructed users to photograph a mole with a smartphone camera and input other information. It would then supposedly calculate the mole’s melanoma risk as low, medium, or high. The FTC charged that Health Discovery Corporation deceptively claimed the app accurately analyzed melanoma risk and could assess such risk in early stages, and that its accuracy was scientifically proven.

The final order settling the action bars the company from claiming that any device detects or diagnoses melanoma or its risk factors, or increases users’ chances of early detection, unless the representation is not misleading and is supported by competent and reliable scientific evidence. It also prohibits Health Discovery Corporation from making any other deceptive claims about a device’s health benefits or efficacy, or about the scientific support for any product or service, and requires the company to pay $17,963.

The Commission vote approving the final consent order was 4-1, with Commissioner Maureen Ohlhausen voting no.

Friday, April 10, 2015

DOCTORS TRAIN WITH HUMAN PATIENT SIMULATOR

FROM:  NATIONAL SCIENCE FOUNDATION
How robots can help build better doctors
Research seeks to make better 'human patient simulators'

A young doctor leans over a patient who has been in a serious car accident and invariably must be experiencing pain. The doctor's trauma team examines the patient's pelvis and rolls her onto her side to check her spine. They scan the patient's abdomen with a rapid ultrasound machine, finding fluid. They insert a tube in her nose. Throughout the procedure, the patient's face remains rigid, showing no signs of pain.

The patient's facial demeanor isn't a result of stoicism--it's a robot, not a person. The trauma team is training on a "human patient simulator" (HPS), a training tool that enables clinicians to practice their skills before treating real patients. HPS systems have evolved over the past several decades from mannequins into machines that can breathe, bleed and expel fluids. Some models have pupils that contract when hit by light. Others have entire physiologies that can change. They come in life-sized forms that resemble both children and adults.

But they could be better, said Laurel D. Riek, a computer science and engineering professor at the University of Notre Dame. As remarkable as modern patient simulators are, they have two major limitations.

"Their faces don't actually move, and they are unable to sense or respond to the environment," she said.

Riek, a roboticist, is designing the next generation of HPS systems. Her NSF-supported research explores new means for the robots to exhibit realistic, clinically relevant facial expressions and respond automatically to clinicians in real time.

"This work will enable hundreds of thousands of doctors, nurses, EMTs, firefighters and combat medics to practice their treatment and diagnostic skills extensively and safely on robots before treating real patients," she said.

One novel aspect of Riek's research is the development of new algorithms that use data from real patients to generate simulated facial characteristics. For example, Riek and her students have recently completed a pain simulation project and are the first research group to synthesize pain using patient data. This work won them best overall paper and best student paper at the International Meeting on Simulation in Healthcare, the top medical simulation conference.

Riek's team is now working on an interactive stroke simulator that can automatically sense and respond to learners as they work through a case. Stroke is the fifth leading cause of death in the United States, yet many of these deaths could be prevented through faster diagnosis and treatment.

"With current technology, clinicians are sometimes not learning the right skills. They are not able to read diagnostic clues from the face," she said.

Yet learning to read those clues could yield lifesaving results. Preventable medical errors in hospitals are the third-leading cause of death in the United States.

"What's really striking about this is that these deaths are completely preventable," Riek said.

One factor contributing to those accidents is clinicians missing clues and going down incorrect diagnostic paths, using incorrect treatments or wasting time. Reading facial expressions, Riek said, can help doctors improve those diagnoses. It is important that their training reflects this.

In addition to modeling and synthesizing patient facial expressions, Riek and her team are building a new, fully-expressive robot head. By employing 3-D printing, they are working to produce a robot that is low-cost and will be one day available to both researchers and hobbyists in addition to clinicians.

The team has engineered the robot to have interchangeable skins, so that the robot's age, race and gender can be easily changed. This will enable researchers to explore social factors or "cultural competency" in new ways.

"Clinicians can create different patient histories and backgrounds and can look at subtle differences in how healthcare workers treat different kinds of patients," Riek said.

Riek's work has the potential to help address the patient safety problem, enabling clinicians to take part in simulations otherwise impossible with existing technology.

-- Rob Margetta, NSF
Investigators
Laurel Riek
Related Institutions/Organizations
University of Notre Dame

Thursday, April 9, 2015

USING 3-D DIGITAL LASER MICROSCOPY TO RECOVER DATA FROM DAMAGED OPTICAL MEDIA

FROM:  NATIONAL SCIENCE FOUNDATION
Restoring lost data

Researchers developing 3-D digital laser microscopy to create visual roadmap
It can be disheartening to learn that something precious, such as a one-of-a-kind family photo, has disappeared from a scratched or broken CD or DVD. It also can become serious, dangerous and potentially costly if it happens to a disc containing criminal forensic evidence, corporate records or scientific data.

But there may be a way in the future to bring the material back.

Optical media, that is, CDs and DVDs, have been in widespread use for the past two decades, and burners are in many homes and elsewhere, making it possible for consumers to create any number of their own personal discs. But the products have not turned out to be as durable as originally believed, a situation that can turn tragic if a disc containing the only available copy of important material has become damaged.

Although the technique is still a work in progress, researchers have developed a process with the potential to restore much of what was thought to be lost.

"While we haven't solved all of the challenges necessary for efficient data recovery from damaged optical media, we have moved forward in terms of refining what is possible," says Greg Gogolin, a professor of information security and intelligence at Michigan's Ferris State University, stressing that the work of his team at this point was aimed solely at demonstrating "proof of concept."

Equally important is the development of new security techniques to ensure that data is unrecoverable, similar to advances that now prevent the retrieval of information from destroyed paper documents.

"A common way to destroy a paper document used to be to burn it," Gogolin says. "Forensic techniques, however, were developed that allowed for the recovery of information that was on a burned document. Document destruction techniques were then improved."

The National Science Foundation (NSF)-funded researcher, in collaboration with colleagues James Jones, associate professor in the computer forensics program at George Mason University; Charles Bacon, a Ferris professor of physical science; Tracey Boncher, a Ferris associate professor of pharmacy; and Derek Brower, a Ferris graduate student at the time of the research, theorized that using three-dimensional digital laser microscopy to capture a 3-D image of the disc could provide a visual roadmap of the data. Combined with a special computer algorithm capable of recognizing its patterns, that roadmap could then aid in recovering the vast majority of it.

"If a disc is broken in half, you've still got 99 percent of the data there," Gogolin says. "The media is quite elastic and the data is pretty much intact up to the cut line. There is, of course, a region that is destroyed near where the disc has been cut. But for the most part, you didn't destroy the data, you just made it unreadable because you can't spin the disc."

The researchers, funded by a $356,318 grant awarded in 2011 from NSF's Division of Computer and Network Systems, tested their idea by breaking a disc, putting it together and taking a picture of it using the high-powered 3-D digital laser microscope.

"It was like sticking it into a kind of copy machine," Gogolin says. "There are patterns to represent the data, that is, the different letters." After determining the data patterns, "we then read them with a computer program to determine what data was on the disc."

At this early stage of development, the computer program the scientists wrote contains and recognizes only simple alphabetic encoding.

But, "there are many different types of data that could be on there, including letters, numbers and special characters," he says. "There is a huge range of possible data elements, and we don't have recognition set up for all of them, only for a subset, part of the alphabet."
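The recognition step can be pictured as a lookup from measured mark patterns to characters. This is purely illustrative: real discs use eight-to-fourteen modulation plus error-correction layers, and the pattern table below is invented, not the team's actual encoding.

```python
# Hypothetical pattern table -- NOT a real CD encoding, just a stand-in
# for the idea that "there are patterns to represent the data."
PATTERNS = {
    (3, 3, 4): "A",
    (4, 3, 3): "B",
    (3, 4, 3): "C",
}

def decode(run_lengths):
    """Map measured pit/land run-length triples to characters,
    flagging anything the recognizer has not been trained on."""
    out = []
    for triple in run_lengths:
        out.append(PATTERNS.get(tuple(triple), "?"))  # "?" = unrecognized
    return "".join(out)

print(decode([[3, 3, 4], [3, 4, 3], [9, 9, 9]]))  # -> "AC?"
```

The "?" cases correspond to the team's current limitation: only a subset of possible data elements has recognition set up, so unrecognized patterns simply cannot be decoded yet.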

The ultimate goal is "to expand the capabilities of the recovery program to be able to recognize all the different types of data and encoding that could be present on an optical disc," he adds, stressing that big hurdles remain. "It's a huge deal in that there are many different combinations and variations of data that make it a significant challenge to be able to recognize everything that would be on an optical disc. You can have different types of discs and Blu Ray discs. Different manufacturers use different inks. You could have encryption. There are a lot of variables."

Also, the larger the file, the more difficult the recovery, he says.

"If it's a small file, the chances of recovering it are much better than if it's a large file, because the chances of the file running into that ‘destruction zone' are greater," he says. "If you need a complete file to effect the recovery, and it's a large file, it becomes a bigger issue."

The researchers now are trying to decide if they want to test their ideas on other types of memory, such as flash drives "like that in your phone," he says, or solid state drives, rather than hard drives. "That's where everything is going," he says. "Would time be better spent trying to perfect a way to recover material from a flash or finishing what is needed for optical?"

The team is a long way from making the process widely available. Nevertheless, "we wanted to prove the concept that it could be done, so that every time you see a broken disc, you won't necessarily think, 'oh, it's lost forever,'" Gogolin says.

-- Marlene Cimons, National Science Foundation
Investigators
Greg Gogolin
James Jones
Charles Bacon
Tracey Boncher
Barbara Ciaramitaro
Related Institutions/Organizations
Ferris State University

Tuesday, April 7, 2015

HIGH-SPEED NETWORKING IN FLINT, MICHIGAN

FROM:  NATIONAL SCIENCE FOUNDATION
Igniting change in Vehicle City
Kettering University leads effort to improve city services in Flint, Mich., through high-speed networking
March 24, 2015

Flint, Mich., the former home of General Motors, is on the rebound these days. Leaders there believe they have hit on a winning formula--connecting the city's institutions to high-speed networks that support new, game-changing capabilities.

Through grants from the National Science Foundation (NSF) and the U.S. Department of Justice (DoJ), Flint is beginning to lay the groundwork for an information technology-driven transformation.

In June 2012, Flint was one of 16 initial cities that were part of US Ignite, a public-private partnership designed to capitalize on the possibilities of ultra-fast broadband networks and "ignite" the development of next-generation Internet applications and services with societal benefits.

Kettering University--formerly General Motors Institute--was designated the lead research institution for the city.

High-speed networking wasn't really on the radar of John Geske, a professor of computer science at Kettering University, before Flint joined US Ignite.

"I was busy running a computer science department and the last thing on my mind was networking applications," Geske said.

But because of the US Ignite award, he started attending application summits and other meetings and realized the possibilities that were available at his doorstep.

"The community that you start to create and the contacts you make are just invaluable," he said.

Connected schools

Phase One of making the city a gigabit hotbed involved taking a step back and uncovering what was already available in the community.

Geske learned that the entire city school system, as well as schools in 21 school districts in neighboring Genesee County, had formed the GenNET consortium in 1995 and were already connected by a high-speed, fiber-optic network. Moreover, the schools were connected to the city's four higher education institutions via the Flint Area Network for Educational Telecommunications.

With these capabilities in place, students in the school district experienced unique learning opportunities. For instance, students were able to remotely control an exploratory submarine in real time near the Barrier Reef and communicate with astronauts on the space shuttle. Genesee County students were even able to dissect a sheep's brain via a telemedicine class remotely led by a doctor at Northern Michigan University.

"The GenNET fiber-optic network allows us to reduce the cost of technology services while providing a powerful platform for delivering virtual learning," said Luke Wittum, executive director of Technology and Media Services in the Genesee Intermediate School District.

US Ignite extended this already capable base and provided dedicated 10 gigabit-per-second network connections to the universities and libraries in Flint and to other gigabit cities around the nation, on unique, programmable hardware.

With ultra-high-speed, high-capability information technology in place, technology leaders hope to leverage the existing fiber-optic networks to provide immersive virtual reality learning to all students in Flint and Genesee County.

"What if a student could step inside of a human cell, stand at the bottom of the Grand Canyon, or visit a historical place?" Wittum asked. "This opportunity could make learning more engaging and also provide some students who may have never been outside of the county or state to visit another part of the world."

Safer, smarter neighborhoods

Flint area schools aren't the only target for transformation through high-speed networks. Early meetings with city officials and university representatives determined that public safety could be a focus of the US Ignite networking projects, too.

Together with stakeholders in the Mayor's office and the university, they envisioned a university corridor where improved surveillance, responsive lighting and targeted policing could lower crime and encourage development.

"US Ignite provides the city of Flint with opportunities to make the community safer by automating utilities--turning on all lights in an area where a crime has been reported, for example, or by giving law enforcement access to high-speed, real-time, high-definition video on demand," said Kettering University President Robert K. McMahan.

"We may not be able to have a full smart city yet, but a smart neighborhood is entirely possible," noted Geske.

These forms of "smart policing" rely on networks of sensors, cameras and analytical tools that require fast networking and access to powerful computing. The Kettering project has these in the form of GENI hardware.

GENI is an NSF-funded experimental, ultra-high-speed, programmable networking testbed that allows researchers to test new networking ideas at-scale.

There are more than 180 GENI sites around the world--and Flint is one of a few cities already leveraging its GENI connections to advance application concepts and prototypes for public benefit and in support of the US Ignite initiative.

In part because of the strength of the existing resources and the community of stakeholders they had developed, Kettering University was awarded a $1 million grant in 2014 from the U.S. Department of Justice as part of DoJ's Byrne Criminal Justice Innovation Program. The grant helps Flint develop and implement sustainable crime prevention strategies in the University Avenue Corridor in order to convert the neighborhood into a vibrant region. One strategy is data-driven policing.

"There are lots of pockets of information and it's hard for a researcher to gather it all together to find out if there are certain patterns," said Geske. "Once you pinpoint that, you can look at the area, figure out what's going on and decide what to do."

In an early collaboration with Flint police, Kettering researchers identified a particular pattern of criminal activity along the corridor. In this case, using data analytics, the university identified a property as a magnet for robberies--and purchased and revamped it to reduce crime in the area.

Geske hopes to enable this kind of smart policing citywide by building a cloud computing platform that enables the city to amass crime statistics and provides public access to the data.

In the future, officials imagine the avenue wired with air quality sensors, smart lighting and even autonomous vehicles or drones connected to the high-speed network. The GENI equipment will be used as a testbed to explore some of these possibilities.

Networked care

A third focus area for Flint is medicine, where Kettering is spearheading a partnership with the University of Michigan-Flint and Mott Community College, as well as with three major medical centers near the city.

Through this partnership, students, faculty, clinicians and researchers in the Flint area will be able to collaborate with instructors from around the country and have direct access to new tools to provide exceptional patient care. Officials even hope to use high-speed networking technologies to bring specialists together in a virtual office to make diagnoses.

With such technology in place, President McMahan says "individual patients seeking medical care at our partners in Flint will always have access to the latest advancements in healthcare no matter where in our country they originate or reside."

With the city as a testbed for creative technological solutions to civic problems, it will be interesting to see how advanced IT can impact education, policing and health care in the city.

Said Erwin Gianchandani, deputy division director for computer and network systems at NSF, "Pilot projects like those in Flint and other cities across the country are demonstrating the value of ultra-high-speed, programmable networks in our communities and helping the nation envision the possibilities of a faster, safer, smarter future Internet."

-- Aaron Dubrow, NSF

Sunday, April 5, 2015

DROUGHT AND THE BABOONS

FROM:  NATIONAL SCIENCE FOUNDATION
Born during a drought: Bad news for baboons
Findings have implications for human health

The saying "what doesn't kill you makes you stronger" may not hold up to scientific scrutiny.

After the plains of southern Kenya experienced a severe drought in 2009 that took a terrible toll on wildlife, researchers looked at how 50 wild baboons coped with the drought, and whether the conditions they faced in infancy played a role.

The semi-arid savanna of southern Kenya usually receives an average of 14 inches of rain a year--akin to much of Nebraska or Kansas--but in 2009 it fell to five inches, less than the Mojave Desert.

The year before wasn't much better: rainfall in 2008 dropped to half normal levels.

Grasslands withered

The grasslands the animals depend on for food dried up and watering holes disappeared, leaving many animals starving or weak from hunger.

"We lost 98 percent of the wildebeest population, 75 percent of the zebra population and 30 percent of the elephant population," said Susan Alberts, a biologist at Duke University. "It was impossible to go anywhere without smelling death."

Most baboons made it, but the drought left them underweight and many females stopped ovulating.

In a forthcoming paper in the journal American Naturalist, the researchers compared two groups of females--one group born during low rainfall years, the other born during normal rainfall years.

Born in a drought

All females in the study were adults by the time of the 2009 drought, but those born in lean times fared worse than those born in times of plenty, the researchers found.

"This study demonstrates lifetime fertility reductions for baboons born during stressful conditions or to low-ranking mothers," said George Gilchrist, program director in the National Science Foundation's (NSF) Division of Environmental Biology, which funded the research along with NSF's Divisions of Integrative Organismal Systems and Behavioral and Cognitive Sciences.

"These 'disadvantaged' early life experiences are linked with less resilience to stressful conditions experienced as adults."

During the 2009 drought, baboons born during low rainfall years were 60 percent less likely to become pregnant, whereas pregnancy rates dipped by only 10 percent for females born during normal rainfall years.

Drought babies born to higher-status mothers were less affected by the 2009 event.

"It might be that baboons born to higher-ranked moms have better access to food, or suffer lower levels of social stress," Alberts said.

Implications for human health

The findings also help explain why people who are malnourished in early childhood go on to have higher rates of obesity, diabetes and heart disease as adults.

Some researchers argue that human babies conceived or born in lean times are programmed for food shortages later in life.

They develop a "thrifty metabolism," aimed at storing fat and conserving energy in order to survive starvation.

Things go awry, the thinking goes, only when the environments they experienced as infants and as adults don't match, such as when a child conceived in famine grows up and eats an excess of cheeseburgers, said paper co-author Amanda Lea, a biologist at Duke.

But the baboon fertility study lends support to another idea, namely that kids who don't get enough to eat during their first year of life are simply less resilient as adults than their well-nourished counterparts.

"The data suggest that early adversity carries lifelong costs," said co-author Jenny Tung, a biologist at Duke.

"It's bad to be born in bad times, but with the right social or economic environment, that can be mitigated," Alberts added.

Jeanne Altmann of Princeton University is also a co-author of the paper.

In addition to NSF, the National Institute on Aging in Bethesda, Md.; Duke University; Princeton University; and the Max Planck Institute for Demographic Research supported the research.

-- Cheryl Dybas, NSF
-- Robin Ann Smith, Duke University
Investigators
Jenny Tung
Susan Alberts
Related Institutions/Organizations
Duke University
