Thursday, March 19, 2015

LIFE BENEATH THE SEAFLOOR

FROM:  NATIONAL SCIENCE FOUNDATION
No limit to life in deep sediment of ocean's "deadest" region

Marine scientists find microbes from seafloor to igneous basement below
"Who in his wildest dreams could have imagined that, beneath the crust of our Earth, there could exist a real ocean...a sea that has given shelter to species unknown?"

So wrote Jules Verne almost 150 years ago in A Journey to the Center of the Earth.

He was correct: Ocean deeps are anything but dead.

Now, scientists have found oxygen and oxygen-breathing microbes all the way through the sediment from the seafloor to the igneous basement at seven sites in the South Pacific gyre, considered the "deadest" location in the ocean.

Findings contrast with previous studies

Their findings contrast with previous discoveries that oxygen was absent from all but the top few millimeters to decimeters of sediment in biologically productive regions of the ocean.

The results are published today in a paper in the journal Nature Geoscience.

"Our objective was to understand the microbial community and microbial habitability of sediment in the deadest part of the ocean," said scientist Steven D'Hondt of the University of Rhode Island Graduate School of Oceanography, lead author of the paper.

"Our results overturn a 60-year-old conclusion that the depth limit to life is in the sediment just meters below the seafloor in such regions.

"We found that there is no limit to life in this sediment. Oxygen and aerobic microbes hang in there all the way to the igneous basement, to at least 75 meters below the seafloor."

Under the seafloor, life all the way down

Based on the researchers' predictive model and core samples they collected in 2010 from the research drillship JOIDES Resolution, they believe that oxygen and aerobic microbes occur throughout the sediment in up to 37 percent of the world's oceans and 44 percent of the Pacific Ocean.

They found that the best indicators of oxygen penetration to the igneous basement are a low sediment accumulation rate and a relatively thin sediment layer.

Sediment accumulates at just a few decimeters to meters per million years in the regions where the core samples were collected.
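
To put those numbers in perspective, here is a rough, illustrative calculation (the 75-meter depth is from the article; a rate of 1 meter per million years is simply one value within the quoted range):

\[ t \approx \frac{75\ \text{m}}{1\ \text{m per million years}} = 75\ \text{million years} \]

In other words, the oxygenated sediment column at these sites may represent tens of millions of years of extremely slow accumulation.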

In the remaining 63 percent of the ocean, most of the sediment beneath the seafloor is expected to lack dissolved oxygen and to contain anaerobic communities.

While the researchers found evidence of life throughout the sediment, they did not detect a great deal of it.

Life in the slow lane

The team found extremely slow rates of respiration and approximately 1,000 cells per cubic centimeter of subseafloor sediment in the South Pacific gyre--rates and quantities so low that, until recently, they were nearly undetectable.

"It's really hard to find life when it's not very active and is in extremely low concentrations," said D'Hondt.

According to D'Hondt and co-author Fumio Inagaki of the Japan Agency for Marine-Earth Science and Technology, the discovery of oxygen throughout the sediment may have significant implications for Earth's chemical evolution.

The oxidized sediment is likely carried into the mantle at subduction zones, regions of the seafloor where tectonic plates collide, forcing one plate beneath the other.

"Subduction of these big regions where oxygen penetrates through the sediment and into the igneous basement introduces oxidized minerals to the mantle, which may affect the chemistry of the upper mantle and the long-term evolution of Earth's surface oxidation," D'Hondt said.

Holistic approach to study of subseafloor biosphere

The principal research funders were the U.S. National Science Foundation (NSF) and Japan's Ministry of Education, Culture, Sports, Science and Technology.

"We take a holistic approach to the subseafloor biosphere," said Rick Murray, co-author of the paper. Murray is on leave from Boston University, currently serving as director of the NSF Division of Ocean Sciences.

"Our team includes microbiologists, geochemists, sedimentologists, physical properties specialists and others--a hallmark of interdisciplinary research."

The research involved 35 scientists from 12 countries.

The project is part of the NSF-funded Center for Dark Energy Biosphere Investigations (C-DEBI), which explores life beneath the seafloor.

The research is also part of the Deep Carbon Observatory, a decade-long international science initiative to investigate the 90 percent of Earth's carbon located deep inside the planet.

The Nature Geoscience paper is available online.

-NSF-


Wednesday, March 11, 2015

SCIENTISTS LOOKING FOR PLANETS

FROM:  NATIONAL SCIENCE FOUNDATION
Techniques to prove or disprove existence of other planets
Astronomers developed technology while studying Gliese 581

Astronomers long have sought to find planets that can sustain life as humans know it. Four years ago, they thought they had found one, possibly even two: signals suggested that at least one rocky planet located in the "habitable zone" was orbiting Gliese 581, a faint dwarf star located 20 light-years from Earth.

Recently, however, National Science Foundation (NSF)-funded scientists, while developing technology they believe will better detect exoplanets, as they are known, determined that the suspected planets, known as Gliese 581g and 581d, did not exist.

Some of the signals, initially thought to be coming from two planets orbiting the star at a distance where liquid water could exist, actually were coming from the star itself, not from the "Goldilocks" planets, so-named because conditions on them are just right for supporting life.

The habitable zone of a star is the range of distances at which liquid water can survive on a planet's surface--a benchmark chosen because life exists virtually wherever there is liquid water on Earth. Too far from a star, and a world is too cold, freezing all its water; too close to a star, and a world is too hot, boiling off all of its water.
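
As a rough, first-order illustration of how that zone scales with a star's brightness (a standard back-of-the-envelope relation, not a calculation from this study), the distance at which a planet receives the same stellar flux as Earth grows with the square root of the star's luminosity:

\[ d \approx \sqrt{\frac{L_\ast}{L_\odot}}\ \text{AU} \]

For a faint red dwarf like Gliese 581, whose luminosity is only about one percent of the Sun's, that distance works out to roughly 0.1 astronomical units, which is why its habitable zone lies so close to the star.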

Astronomers have found more than 1,000 planets orbiting other stars, many discovered indirectly by the gravitational tug that an orbiting planet's mass exerts on its star; most were found in close-in orbits and are unlikely to support life. But many scientists believe there are a large number of planets, probably rocky like Earth, capable of doing so.

In recent years, scientists had detected as many as six planets around Gliese 581 (one was later withdrawn by the team that first announced it), but only two were thought to lie in the habitable zone.

To be sure, it was disappointing to disprove the habitable zone planets in the Gliese 581 system; nevertheless, their research opens the way to valuable new methods for identifying such planets in the future.

"Bittersweet describes it pretty well," says Suvrath Mahadevan, an assistant professor of astronomy and astrophysics at Pennsylvania State University, describing how he felt about their findings. Still, "these discoveries occur in incremental steps," he adds. "With more powerful instruments and surveys coming on line, we will be finding low mass planets at the right distance to stars in the habitable zone. This is where the field is going."

But Doppler shifts of a star's "absorption lines," which are dark bands where atoms or molecules absorb light, also can result from magnetic events like sunspots within the star itself, and can mimic the signals of planets that do not exist.

"It's possible for things like magnetic activity on the star itself to create Doppler shifts that can be mistaken as planets," Robertson says. "This is a problem we are very concerned about. As we push toward detection of smaller and smaller signals, like those produced by Earth, it becomes more likely that the star will be creating signals that either can hide planets we are looking for, or create false positive planet signals."

The less massive the planet, the smaller this stellar motion, and the more difficult the measurements become. Thus, detecting planets as small as Earth requires spectrographs and spectral calibration of extreme precision.
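
A standard way to quantify that difficulty (a textbook radial-velocity relation, not a formula from the article) is the semi-amplitude K of the star's reflex motion, which shrinks as the planet's mass decreases and its orbital period lengthens:

\[ K = \left(\frac{2\pi G}{P}\right)^{1/3} \frac{m_p \sin i}{\left(M_\ast + m_p\right)^{2/3}} \frac{1}{\sqrt{1-e^2}} \]

For an Earth-mass planet in a one-year orbit around a Sun-like star, K is only about 9 centimeters per second, which is why spectrographs and their calibration must reach such extreme precision.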

This is what the two researchers are working on: a new near-infrared spectrograph called the "Habitable Zone Planet Finder," or HPF. Also, in collaboration with colleagues at the National Institute of Standards and Technology, they are developing a frequency stabilized laser comb calibration system that will enable scientists to detect terrestrial-mass exoplanets by improving the ability to precisely measure velocities.

These will be deployed in 2016 on the 10-meter Hobby-Eberly Telescope at the McDonald Observatory in West Texas, operated by the University of Texas at Austin.

The researchers, concerned about the impact of stellar activity on finding planets, consider Gliese 581 "a great test case," Robertson says. "It has this network of low mass planets, including the possibility of planets in the habitable zone, and I was curious as to whether a really good stellar activity analysis might shed some light one way or the other on planet detections around that star."

NSF's Division of Astronomical Sciences supports their work.

The researchers analyzed Doppler shifts in existing spectroscopic observations of the star Gliese 581 obtained with two spectrographs: the ESO HARPS (High Accuracy Radial velocity Planet Searcher) at the European Southern Observatory's La Silla site in Chile, and the Keck HIRES (High Resolution Echelle Spectrometer) at the W. M. Keck Observatory on Mauna Kea, Hawaii.

They focused on the absorption lines most sensitive to magnetic activity, looking specifically at the "hydrogen alpha absorption line," which exists in all stars and is known to track stellar magnetic activity--its strength increases or decreases as a star's magnetic activity changes.
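
A minimal sketch of that kind of analysis (illustrative only; the variable names, made-up numbers and the simple linear correction below are assumptions, not the authors' actual pipeline) is to fit the correlation between the measured radial velocities and an H-alpha activity index, then subtract it before searching for periodic planet signals:

import numpy as np

def remove_activity_signal(rv, halpha):
    """Fit and subtract the component of the radial velocities (rv)
    that correlates linearly with the H-alpha activity index."""
    slope, offset = np.polyfit(halpha, rv, 1)   # least-squares line: rv ~ slope*halpha + offset
    corrected = rv - (slope * halpha + offset)
    return corrected, slope

# Hypothetical measurements: radial velocities (m/s) and H-alpha index values
rv = np.array([2.1, -1.4, 3.0, 0.5, -2.2, 1.8])
halpha = np.array([0.031, 0.024, 0.035, 0.028, 0.022, 0.030])

corrected_rv, slope = remove_activity_signal(rv, halpha)
print("Activity correlation slope:", slope)
print("Activity-corrected radial velocities:", corrected_rv)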

Correcting for that activity boosted the signals of the three innermost planets around the star, but the signals attributed to the two likely candidate planets disappeared, becoming indistinguishable from measurement noise. The researchers concluded that the earlier signals were produced by the star itself, through its activity and rotation, and did not result from the presence of the two suspected planets. The analysis did confirm the existence of the three inner planets, although none is located in the habitable zone.

"It was disappointing to find out that these potentially exciting planets were not real," Robertson says. Still, "with so much dispute about the system, we were very satisfied to have a definite answer. There is not a lot of confusion left about the origin of these signals, which is a silver lining. The improved signal strength of the real planets is the positive from this work, and will motivate studies in the future, including our own."

Mahadevan agrees. "We are all curious about how many worlds are out there that can support life, and where the closest ones are," he says, adding: "We realize that the results of our work here will be at first disappointing, because we disproved two planets initially thought to be in the habitable zone. But the techniques we have developed will help us find new candidates for planets in the habitable zone, and we likely will use it more to prove, rather than disprove, that these planets exist."

-- Marlene Cimons, National Science Foundation
Investigators
Jason Wright
Michael Endl
James Kasting
Lawrence Ramsey
Suvrath Mahadevan
Related Institutions/Organizations
Pennsylvania State University, University Park

Monday, March 9, 2015

NSF VIDEO: THINKING CAPS?

REMARKS BY FRANK A. ROSE ON U.S.-INDIA SPACE SECURITY COOPERATION

FROM:  U.S. STATE DEPARTMENT
03/06/2015 10:49 AM EST
U.S.-India Space Security Cooperation: A Partnership for the 21st Century
Remarks
Frank A. Rose
Assistant Secretary, Bureau of Arms Control, Verification and Compliance
Observer Research Foundation
New Delhi, India
March 5, 2015
Thank you very much.

Again, my name is Frank Rose. It’s an honor to return to India in my new role as U.S. Assistant Secretary of State for Arms Control, Verification and Compliance.

I’d like to thank the Observer Research Foundation and my good friend Raji for inviting me to speak with you today.

A Renewed U.S.-India Partnership

At the State Department, my colleagues and I are focused on the tools needed to promote global security as well as stable, strategic relationships with friends and partners around the world.

As the world's two largest democracies, the United States and India share a partnership that is indispensable to global peace, prosperity, and stability.

Prime Minister Modi’s visit to Washington in September and President Obama’s visit to India this January were critical steps towards strengthening and expanding the U.S.-India strategic partnership.

We’ve seen tremendous movement and progress made in all areas of our relationship—infrastructure and investment, civil nuclear cooperation, climate change, defense cooperation and defense trade, health, and global issues like women’s rights and nonproliferation.

But it’s also important to remember that our partnership has deep roots.

As our leaders wrote in their joint op-ed in the Washington Post, “As nations committed to democracy, liberty, diversity, and enterprise, India and the United States are bound by common values and mutual interests. We have each shaped the positive trajectory of human history, and through our joint efforts, our natural and unique partnership can help shape international security and peace for years to come.”

Space Security Cooperation

As we deepen our strategic relationship, we share an interest in addressing the emerging security challenges of the 21st century.

Ensuring the long-term sustainability and security of the outer space environment is one of those challenges, and one that the United States and India are uniquely situated to address together.

Between ISRO and NASA, our two nations have done tremendous work in our exploration of outer space.

I would like to congratulate India on being one of just four space agencies to have reached Mars’ orbit and for being the first Asian nation to do so. It was a pleasant coincidence that NASA’s MAVEN spacecraft and ISRO’s Mars Orbiter Mission entered the orbit of Mars within a couple of days of each other.

We’re also pleased that ISRO and NASA have established a Mars Working Group to explore how our separate Mars missions can work together and coordinate their efforts. This is just one area of the nearly 15 years of strong civil space cooperation between India and the United States. We look forward to the continued growth across all areas of our space cooperation, potentially including India’s participation in research aboard the International Space Station.

U.S.-India civil cooperation in space has not led to extensive cooperation on space security, at least to date.

But I believe that just as this is a time of transformation and progress for our strategic partnership, so too is it a time of growth for our space security relationship.

Our governments recognize the importance of space security; in September our President and Prime Minister called for the establishment of a dialogue to address this important issue. I’m proud to chair that dialogue here next week.

Bilateral Space Security Cooperation

In September of last year, our leaders committed to a new mantra for our relationship, “Chalein saath saath; forward together we go.” I believe this is true for our space security relationship as well.

As we begin bilateral cooperation on space security, it is important we have an open dialogue where we share information, discuss areas in which we disagree as well as those where we agree, and identify areas for cooperation.

I am excited to start that conversation here in New Delhi.

We also need to identify areas of concrete collaboration.

Collaboration in space situational awareness and collision avoidance, as identified by the U.S.-India Joint Statement of September 2014, is one such potential area.

As we all know, space situational awareness, or SSA, is a foundational capability for spaceflight safety and preventing collisions in space. International cooperation on SSA is greatly beneficial, as international partnerships bring the resources, capabilities, and geographical advantages to enhance SSA upon which we increasingly depend.

The Department of State works closely with the Department of Defense on SSA information sharing agreements with foreign partners.

Establishing an arrangement to share information between the United States and India would be one possible way to begin bilateral collaboration.

Another area of potential bilateral collaboration could be on the utilization of space assets for maritime domain awareness.

Maritime domain awareness is greatly enhanced when data from ground- and sea-based sensors and local human observations are combined with data from space-based sensors, whether those data are from Automatic Identification Systems or Earth-observation satellites.

As both of our countries have a strong interest in promoting maritime security, and have developed robust and multi-layered maritime domain awareness architectures which utilize satellite information, I believe it would be worthwhile to explore cooperation and information exchanges in this area.

Multilateral Space Security Cooperation

There is much that our nations can do together in the multilateral arena as well.

Today, India, the United States, and the world all rely on satellites for communications, for disaster management and relief, for treaty monitoring, and for sustainable development, among many other things.

But there are risks and dangers to operating in space. As the United States Director of National Intelligence noted in January 2014, threats to space services are increasing as potential adversaries pursue disruptive and destructive counter-space capabilities. For example, Chinese military writings highlight the need to interfere with, damage, and destroy reconnaissance, navigation, and communication satellites. China has satellite jamming capabilities and is pursuing antisatellite systems.

The United States and India are both strong believers in transparency and rules based on international law and customs. Our Declaration of Friendship released during the President’s visit in January specifically mentions our mutual respect for “an open, just, sustainable, and inclusive rule-based global order.”

Given the threats and risks, and our national principles and laws, I believe that one of the most obvious and most beneficial areas of cooperation between our countries is in the establishment of rules of the road for outer space activities.

As established space-faring nations, India and the United States should work together to clearly and publicly define what behavior the international community should find both acceptable and unacceptable.

Transparency and confidence-building measures, or TCBMs, such as the proposed International Code of Conduct for Outer Space Activities, can contribute to everyone’s awareness of the space environment.

Among the Code's commitments is a pledge by signatories to refrain from any action that brings about, directly or indirectly, the damage or destruction of space objects, and to minimize, to the greatest extent possible, the creation of space debris, in particular long-lived space debris.

Political commitments such as the International Code of Conduct are complemented by work on guidelines on space operations and collaborative space situational awareness in multilateral fora such as the United Nations Committee on the Peaceful Uses of Outer Space, or COPUOS.

The Working Group on the Long-Term Sustainability of Outer Space Activities, a part of COPUOS’ Scientific and Technical Subcommittee, which just concluded its meeting in Vienna last month, is doing important work to move forward in the development of new international long-term sustainability guidelines.

Initiatives like the establishment of TCBMs, the Code of Conduct, and the work of UNCOPUOS cannot be successful without the support and active participation of India.

But Indian support for these or other rules of the road initiatives only gets us half-way there. I firmly believe that with U.S.-India collaboration in establishing norms of responsible behavior and Indian leadership in multilateral fora, we can make these and future initiatives even better.

Conclusion

There is much we can do as global partners to ensure the long-term sustainability and security of the outer space environment. Cooperation on space is just one piece of a strategic U.S.-India relationship in the 21st century. As President Obama said in this very city a little more than one month ago, “our nations will be more secure, and the world will be a safer and more just place when our two democracies stand together.”

Thank you again for hosting me here today, and I look forward to your questions and to our first official space security dialogue with India.

Thank you.

Sunday, March 8, 2015

NASA | MEASURING MARS' ANCIENT OCEAN

NSF VIDEO: FUTURISTIC DIGITAL DISPLAYS - CES 2015

"JUSTI IN TIME SUPPORT" FOR CS TEACERS

FROM:  NATIONAL SCIENCE FOUNDATION
Just in time: Tips for computer science teachers when they need it

NSF-supported professional development program provides online tools and resources for new and experienced computer science educators
Imagine yourself as a young (or not so young) high school teacher who is asked to teach a computer science (CS) class for the first time.

You may have taken a few CS classes in college, or maybe not--and even if you majored in it, you've never taught it to a class full of students before. Until this year, there wasn't even a CS class to teach in your school.

How do you prepare? How does the education system help prepare you?

Aman Yadav has a unique background that allows him to see the problem of training CS teachers from a number of vantage points.

"I was a programmer in the College of Education at Michigan State University, where I'm now a faculty in the Educational Psychology and Educational Technology program," Yadav said. "We were developing an online learning environment for teachers and as I was programming it, I started wondering: what are the benefits of this on preservice teachers?"

Years later, after earning a Ph.D. in education, he applied for a grant from the National Science Foundation (NSF) to study teacher professional development, along with colleagues Susanne Hambrusch, Tim Korb and James Lehman of Purdue University. It was almost as if circumstances had come full circle.

"Having the background in CS and then getting a doctorate in Education allowed me to connect those two areas," he said. "CS education is a perfect blend that gets at all these interests: educational research, professional development, plus that little bug in me that still loves programming."

Hambrusch, a CS professor, has worked with Yadav for the last six years and their collaboration highlights the importance of interdisciplinary research.

"It has changed how I teach and it has made me explore new questions on how learning happens in CS," Hambrusch said. "Computer science education research is an important emerging research area with tough challenges to solve, and collaborations between CS and Education researchers is a key to making lasting progress."

For the last two years, Yadav and his colleagues have been developing a set of new online professional development resources to provide "just-in-time" support for new CS teachers and for experienced CS teachers teaching new courses for the first time.

Instead of relying solely on day- or week-long summer workshops to prepare teachers, the program provides instruction to teachers the way they expect to get information these days: anytime and anywhere, via the web.

The project is part of the CS 10K effort, which aims to build a talent pool of future computer scientists by developing new high school curricula in computing and getting that curricula into 10,000 high schools, taught by 10,000 well-prepared teachers.

NSF, along with the many other organizations supporting the CS 10K effort, recognizes that for the United States to thrive, it needs to cultivate a diverse workforce equipped with the computational skills to contribute to a technologically-driven economy. In recent years, NSF has invested more than $110 million to expand access to and broaden the diversity of students participating in computer science courses.

But training 10,000 teachers is no small feat. In fact, it likely cannot be done in the way that professional development has traditionally been delivered, both because of the scale of the need and the unique circumstances of CS teachers.

Teachers usually receive pre-service training in a given subject, either as a student in an education program or in a workshop or summer setting. However, CS teachers are atypical in that they often do not have pre-service training as computer scientists. Instead, CS teachers have math, business or technology backgrounds and must transition to CS.

"We believe that in order to meet the goal of 10,000 CS teachers, we really need to not only target the pre-service teachers, but also the in-service teachers," Yadav said. "Those teachers are in the classrooms and not all of them have CS backgrounds. So when they want to teach a CS course, we need to develop both their content knowledge as well as their knowledge of how to teach that content."

To supplement teachers' typical training, Yadav and his team created a collection of materials for new or still-learning CS teachers. These materials include videos, written instructions, slideshows and links to activities or additional tools. Teachers are encouraged to become students themselves, using their skills to code projects before asking their students to do the same.

The materials cover both the subject matter and common misconceptions and how to avoid them. Yadav says that in many cases, the teachers end up adapting the professional development materials for classroom use or for student assignments.

"We're developing the materials for teachers, but teachers are also finding that the materials can be used with their students as well," he said.

After creating hundreds of hours of professional development content for teachers, the team partnered with Project Lead the Way (a leading non-profit organization that develops STEM curricula for schools) in 2013 to pilot the program.

The first group of CS teachers was trained using the just-in-time professional development materials last year. Among that first cohort of teachers was Tim Velegol, the engineering department coordinator and career and technical education department chair at Durham Public Schools in North Carolina.

"As a rookie CS teacher, I needed all the resources I could get my hands on to deliver meaningful instruction to a very diverse group of high school learners," Velegol explained.

Using tools like Scratch and Python strengthened and updated Velegol's grasp of core CS concepts that he had first encountered years ago in the Fortran and C++ programming languages.

"While there is a lot of information on the Internet, Dr. Yadav's group gave me great mini-activity ideas to help reinforce many of the fundamentals of programming--puzzlers that don't require a lot of code, but that force students to think about how to develop algorithms," he said.

In addition to the learning materials, teachers are able to ask questions of Yadav and his team, as well as other teachers-in-training, through an interactive Q&A platform, Piazza.

The researchers are going over the data from the pilot group, incorporating their feedback and preparing an article synthesizing their findings. They have expanded the program to include 30 new teachers, with whom they will not only gather impressions, but also test the efficacy of the professional development on the teachers and their students' performance.

At the same time, they have been conducting interviews with current CS teachers to get a sense of the common classroom challenges instructors face and the ways they have found to overcome them.

"We're collecting pilot data this year and will collect teacher outcomes as well as student outcomes," Yadav said. "Are the students interested in computing? Do they see the role of computing in their future careers? Once we have the data, we'll open it up to anyone who wants access to our materials."

[Yadav will lead a workshop on CS education research on March 6 at SIGCSE, the ACM technical symposium on computer science education.]

-- Aaron Dubrow,
Investigators
Aman Yadav
Susanne Hambrusch
John Korb
James Lehman
Brian French
Related Institutions/Organizations
Michigan State University
Purdue University
Project Lead the Way

Thursday, March 5, 2015

THE QUEST FOR ECO-FRIENDLY PLASTICS

FROM:  NATIONAL SCIENCE FOUNDATION
From tea bags to Miatas, bioplastics are on the rise
Chemists and other researchers are working up new formulas for greener plastic
March 3, 2015

It's no longer common to hear, "Paper or plastic?" at the supermarket. In many jurisdictions, the plastic option is curbed. Hundreds of local governments around the world--even entire countries, such as China and India--ban or tax lightweight, single-use plastic bags.

Every year in the United States, more governments enact such restrictions, which are part of a larger shift away from petroleum-based plastic. As people grow more concerned about throwaways destined for landfills (or worse, for the open ocean) and the problems associated with fossil fuels, businesses of all sizes are looking beyond "traditional," petroleum-based plastics to alternatives derived from plants, or even synthesized by microorganisms.

The bioplastic revolution

Bioplastics are made wholly or in part from renewable biomass sources such as sugarcane and corn, or are produced by microbes such as yeast. Some bioplastics are biodegradable or even compostable, under the right conditions.

These new, more eco-friendly plastics are cropping up in all sorts of places, from tea bags to 3D printing media to medical implants.

In Finland, for example, consumers can now buy milk in cartons, made by Tetra Pak, that are 100 percent plant-based. In the United States, a small company called Iris Industries used Kickstarter to get off the ground with "Denimite," a marbleized blue composite made of recycled denim and a thermoset resin binding agent that is partially bio-based. And NSF-funded Ecovative makes a packing material called "Myco Foam" that's designed to replace polystyrene packaging, that bane of environmentally aware consumers who nevertheless buy take-out meals.

Bio-based plastics are on the rise. The thriving European market for bioplastics is growing by more than 20 percent per year. Global demand is expected to rise by 19 percent annually through 2017, according to market research group Freedonia. Global production capacities are set to increase by 400 percent by 2018, with most bioplastics being produced in Asia, according to European Bioplastics (EUBP), an association that represents the interests of the industry in Europe.

Packaging has been, and still is, one of the most common uses for bioplastics, but there is growth in other areas, such as textiles and automotive applications.

"From functional sports garments with enhanced breathability to fuel lines, bioplastics are constantly spreading into new markets," said EUBP chairman François de Bie.

Even the sports-car market appreciates bioplastics. Mazda announced late last year that it would use a new bioplastic in the interior (and, eventually, exterior) of its MX-5 Miata. In a December 2014 press release, the company said the plant-based plastic it developed with Mitsubishi Chemical Corp. can be dyed and has a higher-quality, less-toxic finish than traditional painted surfaces.

Likewise, the Ford Motor Co. said last July that it will work with Heinz to make plastic out of leftover tomato skins, for use in car wiring brackets and storage bins.

How plastics are born

All of this activity is exciting, but most of today's plastic still comes from a nonrenewable resource: crude oil deposits in the earth. The oil is extracted and sent to a refinery to be distilled and yield an intermediate product called naphtha. Intense heat helps "crack" the naphtha into smaller hydrocarbon molecules such as ethylene and propylene. These chemicals are combined with a catalyst and polymerized to form chains of many linked molecules--the materials we know as plastics.

Different kinds of plastic will have varying polymer structures and distinct properties (toughness, stiffness, strength, transparency, etc.). Manufacturers then buy those bulk polymer pellets, granules or liquids for creating plastic in different shapes using processes such as extrusion or injection molding.

The push to use alternative, more renewable feed stocks rests on increasing concerns about the impact of petrochemicals on health and the environment, as well as the wariness people feel about relying on finite fossil-fuel resources. Many petroleum-based plastics don't break down for hundreds, or even thousands, of years--the carbon-carbon bonds that form the polymers are that durable. According to the U.S. Environmental Protection Agency (EPA), in 2012, the U.S. generated almost 32 million tons of plastic waste, but only 9 percent of that was recovered for recycling, leaving about 29 million tons. Much of the rest ends up in landfills, as ground litter or in the ocean.
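
That "about 29 million tons" follows directly from the EPA figures quoted above:

\[ 32\ \text{million tons} \times (1 - 0.09) \approx 29\ \text{million tons not recovered} \]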

In addition, petro-based plastics have been linked to health concerns such as endocrine disruption, and studies show some potentially harmful plastic chemicals accumulate in the human body.

Planting the next plastic crop

To spur solutions, some governments are promoting global and national bio-based economies or so-called bioeconomies. In 2012, the Obama administration released a National Bioeconomy Blueprint that calls for increased research and development, technology transfer, training and other steps to drive the nation's bioeconomy. Businesses are interested in following that lead--in fact, they may actually be ahead of consumers, some of whom aren't willing to pay a premium for greener plastics.

"The consumers want these materials, and they want to be more sustainable," said Marc Hillmyer, director of the University of Minnesota's Center for Sustainable Polymers (CSP). "But they're generally not going to do it at a cost. What we hear from industry is, 'Yes, we obviously have businesses that rely on petrochemical feed stocks, and we obviously want to be profitable in those businesses. But we want to be part of the future as well," Hillmyer added.

Nearly three dozen company affiliates support the CSP's work, including 3M, Ashland, BASF, Coca-Cola, General Mills, Henkel, Kimberly-Clark, Natureworks and Schlumberger, which make up the center's Industrial Advisory Board.

Coca-Cola has been one of the big-business leaders in bioplastics development, with a recyclable "PlantBottle" that is made partially from PET (polyethylene terephthalate) derived from sugarcane. PlantBottle packaging accounts for 30 percent of the company's packaging in North America and 7 percent globally, "making Coke the world's largest bioplastics end user," the company has said. The company has also said it wants its bottles to be 100 percent made of plant-based plastic by 2020.

Alternative plastics also show up in niche products. For example, last year, wine cork maker Nomacorc released a recyclable cork made of renewable plant-based polyethylene, and a Finnish company called Ahlstrom sells tea bags made of polylactic acid (PLA), which is derived from resources such as corn starch and sugarcane, and is one of the most commonly used bioplastics.

The cost of green

Researchers working with businesses are challenged to make a material that will not only be biodegradable and nontoxic, but also cost-effective.

"Many people, including us, are very good at making expensive polymers that help us advance basic science but that are not economically all that viable," Hillmyer said. "And so, what we're really trying to emphasize in the center, again with industrial input, is how do we do it economically?"

To date, Hillmyer and his colleagues have had several success stories:

The center developed a biodegradable adhesive, made from PLA and a menthol-based polymer, which could one day make sticky-note recycling more efficient and environmentally friendly. (Most sticky notes are petroleum-based and tend to gum up recycling equipment.)

The center has identified a way to use additives to improve the toughness of PLA by a factor of more than 10.

They've discovered a new high-performance bio-based elastomer (an elastic polymer resembling rubber) that could be an economic, drop-in replacement for current petroleum-based materials.

There are many other challenges in developing new materials and getting them from the lab to the market.

"Our undergraduates, graduate students and postdocs all regularly hear from industry about the challenges that [companies] face when trying to introduce a new material into the marketplace," said CSP Managing Director Laura Seifert. "Can it be scaled up to an industrial process in an economically viable way? Can the material be used in existing infrastructure, or do we have to build an entirely new plant in order to adopt this new technology? And at the end of life … is it going to cause more harm than good to introduce this into our recycling stream?"

"These are hard problems," said Hillmyer. "If it was easy, somebody would have done it."

While the polymer industry is not going to shift overnight, in the long run change is inevitable, he added. "The graduate students and postdoctoral researchers and undergraduates...in the center, they're driven by these principles. So we are not having a hard time convincing them that this is something they should do. They're growing up in this world [asking] 'How do we make our world more sustainable?'"

-- Jacqueline Conciatore, National Science Foundation jconciat@associates.nsf.gov
Investigators
Marc Hillmyer
Related Institutions/Organizations
University of Minnesota-Twin Cities

Tuesday, March 3, 2015

LANL: ADVANCED MODELING, SIMULATION TECH USED IN LIGHT-WATER REACTOR RESEARCH

FROM:  LOS ALAMOS NATIONAL LABORATORY
Los Alamos Boosts Light-Water Reactor Research with Advanced Modeling and Simulation Technology
Simulated nuclear reactor project benefits from funding extension

LOS ALAMOS, N.M., March 2, 2015—Hard on the heels of a five-year funding renewal, modeling and simulation (M&S) technology developed at Los Alamos National Laboratory as part of the Consortium for the Advanced Simulation of Light Water Reactors (CASL) will now be deployed to industry and academia under a new inter-institutional agreement for intellectual property.

“This agreement streamlines access to the reactor simulation research tools,” said Kathleen McDonald, software business development executive for the Laboratory, “and with a single contact through UT-Battelle, we have a more transparent release process, the culmination of a lengthy effort on the part of all the code authors,” she said.

CASL is a US Department of Energy “Energy Innovation Hub” established in 2010 to develop advanced M&S capabilities that serve as a virtual version of existing, operating nuclear reactors. As announced by DOE in January, the hub would receive up to $121.5 million over five years, subject to congressional appropriations. Over the next five years, CASL researchers will focus on extending the M&S technology built during its first phase to include additional nuclear reactor designs, including boiling water reactors and pressurized water reactor-based small modular reactors.

CASL’s Virtual Environment for Reactor Applications (VERA) – essentially a “virtual” reactor – is currently deployed for testing to CASL’s industrial partners. Created with CASL funding, VERA consists of the CASL physics codes and the software that couples them to create computer models that predict and simulate light water reactor (LWR) nuclear power plant operations. VERA is being validated with data from a variety of sources, including operating pressurized water reactors such as the Watts Bar Unit 1 Nuclear Plant in Tennessee, operated by the Tennessee Valley Authority (TVA).

As one of the original founding CASL partners, Los Alamos will continue to play an important role in Phase 2 of CASL.  Specifically, Los Alamos has leadership roles in three technical focus areas: Thermal Hydraulics Methods (THM), Fuel, Materials and Chemistry (FMC) and Validation and Modeling Applications (VMA).

Thermal-Hydraulics applications range from fluid-structure interaction to boiling multiphase flows. The Los Alamos-led THM team is targeting a number of industry-defined CASL “challenge problems” related to corrosion, fretting and departure from nucleate boiling.

The Fuel, Materials and Chemistry (FMC) Focus Area aims to develop improved materials performance models for fuel and cladding, and integrate those models via constitutive relations and behavioral models into VERA.  In particular, Los Alamos will bring to bear experience in structure-property relations, mechanical deformation and chemical kinetics to address several key aspects of nuclear fuel performance.

The Validation and Modeling Applications (VMA) Focus Area applies the products developed by CASL to address essential industry issues for achieving the CASL objectives of power uprates, lifetime extension, and fuel burn up limit increases, while ensuring the fuel performance and safety limits are met.

Los Alamos will continue to provide functions that are essential for achieving credible, science-based predictive modeling and simulation capabilities, including verification, validation, calibration through data assimilation, sensitivity analysis, discretization error analysis and control, and uncertainty quantification.

The new inter-institutional agreement makes one of the Los Alamos-developed software tools, MAMBA, available for research, subject to agreements through the consortium partners. In addition, the Hydra-TH application is provided under an open-source license in VERA for advanced, scalable single- and multiphase computational fluid dynamics simulations.

CASL, which is led by and headquartered at Oak Ridge National Laboratory (ORNL), has created hundreds of technical reports and publications and wide engagement with nuclear reactor technology vendors, utilities, and the advanced computing industry.

Doug Kothe, CASL Director at ORNL, notes that “CASL has benefitted tremendously from the innovative technical contributions and leadership provided by Los Alamos technical staff and is fortunate to have these contributions continuing as CASL moves into its second five years of execution.”

Sunday, March 1, 2015

DOD VIDEO: PROSTHETIC LIMBS WITH SENSATIONS


NSF ON THE 'ENERGY INTERNET'

FROM:  NATIONAL SCIENCE FOUNDATION
Creating the energy Internet
How leaders in research, industry and engineering education are working to create the energy network of the future

It only takes a power outage of a few minutes in the middle of a busy workday to drive home the hazards of relying on an energy infrastructure rooted in the Industrial Age. Without the electricity delivered over the nation's power grid, commerce would grind to a halt, communication networks would fail, transportation would stop and cities would go dark.

Simply put, nothing would work.

Plus, blackouts aren't easy to contain. Because the power grid is a vast interconnected network, the failure of one part can have a cascading effect, triggering successive outages down the line.

"The power grid is based on technology from the early 20th century," says Iqbal Husain, a professor of electrical and computer engineering at North Carolina State University. "That needs to change."

Husain is director of the FREEDM Systems Center, a collaboration of leaders in research, industry and engineering education working to envision and then create the energy network of the future. With funding from the National Science Foundation (NSF) leveraged by additional industry support, the Engineering Research Center has sparked the growth of dozens of clean energy businesses in Raleigh's Research Triangle, making the region an epicenter of smart grid development.

"We're trying to create a new electric grid infrastructure that we call the energy Internet," says Alex Huang, an NC State researcher and co-inventor of a newly patented soft-switch single-stage AC-DC converter. "We're looking at the whole distribution system. That's a huge engineering system. It's very, very complex."

According to the U.S. Department of Energy, the smart grid will be more efficient and capable of meeting increased consumer demand without adding infrastructure. It also will be more intelligent, sensing system overloads and rerouting power to prevent or to minimize a potential outage. It will accept energy from virtually any fuel source and--building on NSF-funded research--offer improved security and resiliency in case of a natural disaster or threat. It also will allow real-time communication between the consumer and utility, ushering in a new era of consumer choice.

Energy innovation

From its headquarters on NC State's Centennial Campus, FREEDM (short for Future Renewable Electric Energy Delivery and Management) is coming at the challenge on many fronts, from the creation of new devices that will allow energy to flow in more than one direction to the development of the software architecture that will give the smart grid its brainpower.

The facility boasts a 1-megawatt demonstration hub and real-time digital simulation lab, as well as labs specializing in computer science, power electronics, energy storage and motor drive technology. Under the FREEDM umbrella, researchers and students are tackling more than a dozen research projects in partnership with colleagues at Arizona State University, Florida State University, Florida A&M University and Missouri University of Science and Technology.

That's just this year. In seven years, the center has launched dozens of projects in fields ranging from systems theory to intelligent energy management.

The result is one innovation after another. Researchers have developed a technique that allows a common electronic component to handle voltages almost seven times higher than existing components; created an ultra-fast fault detection, isolation and restoration system; and invented a new solid-state transformer to replace the 100-year-old electromagnetic transformer.

These innovations hold promise for making the power grid more resilient, fostering sustainable energy technologies that play an important role in the nation's energy infrastructure, and driving economic growth.

Startups spawn new technologies

For example, the startup company General Capacitor is focused on developing energy storage products based on the "ultracapacitor" discoveries made by Jim Zheng, a professor at Florida A&M University and Florida State University who serves on FREEDM's leadership team.

Zheng's ultracapacitors open the door to a new generation of energy storage technologies that can be used to help stabilize the flow of energy from renewable sources--such as solar power--into the grid. This would have the effect of making renewable sources more viable, while also making the grid itself more resilient.

For the future power grid, incorporating these new technologies will be like plugging in a lamp. The smart grid will be able to collect and process thousands or even millions of bits of data and intelligently manage the flow of power across the network, ideally doing most of its work at the edge of the grid, close to the customer. This kind of system--called distributed generation--is potentially more efficient and environmentally sustainable than the existing system.

A mile from the NC State campus in Raleigh, a startup company called GridBridge is working to commercialize FREEDM technology in the form of a smart grid router that can integrate renewables and energy storage devices, including electric vehicles, into the grid. GridBridge was funded by the NSF Small Business Innovation Research program.

"We don't expect the utility companies to rip out their existing infrastructure," says CEO Chad Eckhardt. "But they need products that can help the infrastructure operate better and more efficiently."

Another FREEDM partner, energy giant ABB, is working to perfect the technology behind microgrids, which could significantly enhance grid security and reliability.

A microgrid essentially simulates the operations of the larger grid but, as the name suggests, provides power on a smaller scale, serving a town, military base or university, for example. Microgrids can operate independently of the main grid or run parallel to it. ABB's microgrid is designed to seamlessly integrate renewables, with their fluctuating energy profiles, and output reliable power. If the main grid goes down, its microgrid system isolates itself from the larger grid and continues to provide power to its customers. When the larger grid comes back online, the connection is re-established.
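
A minimal sketch of that islanding behavior (illustrative only; the states, names and reconnection logic below are assumptions for explanation, not ABB's control design) can be expressed as a small state machine:

GRID_CONNECTED, ISLANDED = "grid-connected", "islanded"

class MicrogridController:
    """Toy state machine: disconnect when the main grid fails,
    reconnect when it is healthy again."""
    def __init__(self):
        self.state = GRID_CONNECTED

    def update(self, main_grid_healthy: bool) -> str:
        if self.state == GRID_CONNECTED and not main_grid_healthy:
            self.state = ISLANDED        # open the point of interconnection and
                                         # keep serving local loads from on-site generation
        elif self.state == ISLANDED and main_grid_healthy:
            self.state = GRID_CONNECTED  # resynchronize and re-close the tie to the main grid
        return self.state

controller = MicrogridController()
for grid_ok in [True, False, False, True]:   # simulated main-grid status over time
    print(controller.update(grid_ok))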

"Anything that produces power could potentially be a microgrid," says Brad Luyster, vice president and general manager of ABB's Microgrid Regional Execution Center. "If the power goes off from the main grid, the microgrid has its own generation on site."

The global marketplace

GridBridge and ABB aren't the only companies in the region eyeing the opportunities for energy innovation.

A recent study identified 169 firms within the 13-county Triangle region, including 16 Fortune 500 companies, working to develop sustainable solutions to the world's energy needs. The sector, called cleantech by the industry, spans every county in the region.

Lee Anne Nance, executive director of the Research Triangle Regional Partnership, spearheads a collaborative network called the Research Triangle Cleantech Cluster that promotes the region's competitive edge in the global marketplace. Its members include some of the industry's biggest players, including Duke Energy, Siemens Energy, ABB Inc. and Schneider Electric, as well as major high-tech companies such as SAS, Cisco, Power Analytics, Sensus, Power Secure, RTI International and Field2Base.

Combined, they pack a powerful punch, employing thousands of high-skill workers and driving innovation in energy management, water, transportation, data analytics, information technology, renewable energy, electronics and engineering.

"This is a disruptive and transformational time in infrastructure delivery throughout the world, and our region is leading the way," Nance says. "We're right in the middle of the action and that's good for the economy, the people who work here and the people who live here."

-- David Hunt, North Carolina State University
Investigators
Jim Zheng
Alex Huang
Gerald Heydt
Iqbal Husain
Mariesa Crow
Steinar Dale
Chad Eckhardt
Christopher Edrington
Related Institutions/Organizations
GridBridge, Inc
North Carolina State University

Saturday, February 28, 2015

GENE EDITING AND REGULATION TO IMPROVE IMMUNE SYSTEM

FROM:  NATIONAL SCIENCE FOUNDATION
Rewriting genetic information to prevent disease

Breakthrough Prize winner harnesses CRISPR to improve immune system
For the last few years, scientists have been studying an ancient but only recently understood mechanism of bacterial immunity that has the potential to provide immeasurable benefits to plant and animal health.

The phenomenon known as CRISPR (for Clustered Regularly Interspaced Short Palindromic Repeats) is a natural immune system found in many bacteria with the ability to identify and destroy the genomes of invading viruses and plasmids.

Researchers are trying to harness this system for gene editing and regulation, a process that could transform "the genome of plants or animals in ways that will improve their health, or introduce genetic changes that will resist disease or climate change," says Jennifer Doudna, a Howard Hughes Medical Institute investigator and professor of biochemistry, biophysics and structural biology at the University of California, Berkeley. "The explosion of research using this technique has been amazing."

Doudna, collaborating with Emmanuelle Charpentier of the Helmholtz Centre for Infection Research in Germany and Umeå University in Sweden, identified how the system works and engineered it in new ways that broadened its scope. The two researchers, who described their work in a 2012 paper in the journal Science, developed a technique that enables the rewriting of genetic information and the correction of mutations that otherwise can cause disease, and also can knock out a cell's ability to make harmful proteins, she says.

"Many labs have shown in principle that this can be used to correct such mutations as those that occur in cystic fibrosis, or sickle cell disease," she says. "They are showing it in cell lines and lab animals. We're still some period of time away from using this in humans, but the pace in the field has been truly remarkable, and really exciting to see."

Many bacteria have this CRISPR-based immune system capable of identifying and destroying hostile invaders. Doudna and Charpentier showed that, in doing so, CRISPR produces the protein Cas9, a DNA-cutting enzyme guided by RNA, which relies on two short RNA guide sequences to find foreign DNA, then cleaves, or cuts, the target sequences, thereby muting the genes of the invaders.

Cas9 has evolved to provide protection against viruses that could infect the bacterium, and uses pieces of RNA derived from CRISPRs to direct its activity. The system is specific and efficient enough to stave off viral infections in bacteria.

Doudna and her colleagues programmed the process so that it can be directed by a single short RNA molecule; researchers who use it to edit genomes can customize the RNA so that it sends Cas9 to cleave, like "scissors," at their chosen location in the genome.

"When we figured out how it worked, we realized we could alter the design of RNA and program Cas9 to recognize any DNA sequence," she says. "One can therefore target Cas9 to any region of a genome simply by providing a short guide RNA that can pair with the region of interest. Once targeted, different versions of Cas9 can be used to activate or inhibit genes, as well as make target cuts within the genome. Depending on the experimental design, research can use these latter cuts to either disrupt genes or replace them with newly engineered versions."

Recently, Doudna and Charpentier and four other scientists received the Breakthrough Prize in life sciences, which honors transformative advances toward understanding living systems and extending human life. The prizes recognize pioneering work in physics, genetics, cosmology, neurology and mathematics, and carry a $3 million award for each researcher. The Breakthrough committee specifically cited Doudna and Charpentier for their advances in understanding the CRISPR mechanism.

Doudna has been the recipient of several National Science Foundation (NSF) grants to support her research in recent years totaling more than $1.5 million. In 2000, she received NSF's prestigious $500,000 Alan T. Waterman Award, which recognizes an outstanding young researcher in any field of science or engineering supported by NSF.

She also was a founder of the Innovative Genomics Initiative, established in 2014 at the Li Ka Shing Center for Genomic Engineering at UC Berkeley. Its goal is to promote and support genome editing research and technology in both academic and commercial research communities.

"We have a team of scientists working with various collaborative partners," she says. "We want to ensure that the technology gets into as many hands as possible, and explore ways to make it even better. We are trying to bring about fundamental change in biological and biomedical research by enabling scientists to read and write in genomes with equal ease. It's a bold new effort that embraces a new era in genomic engineering."

-- Marlene Cimons, National Science Foundation
Investigators
Jennifer Doudna
Related Institutions/Organizations
University of California-Berkeley

Thursday, February 26, 2015

DOD LOOKS TO 'LONG-RANGE RESEARCH AND DEVELOPMENT PLAN'

FROM:  U.S. DEFENSE DEPARTMENT

Right:  An Air Force KC-135 Stratotanker refueling aircraft from the 909th Air Refueling Squadron refuels an Air Force E-8C Surveillance Target Attack Radar System aircraft Jan. 29, 2013. U.S. Air Force photo by Airman 1st Class Tyler Prince.  

DoD Seeks Novel Ideas to Shape its Technological Future
By Cheryl Pellerin
DoD News, Defense Media Activity

WASHINGTON, Feb. 24, 2015 – The Defense Department is seeking novel ideas to shape its future, and officials are looking to industry, small business, academia, start-ups, the public -- anyone, really -- to boost its ability to prevail against adversaries whose access to technology grows daily.

The program, called the Long-Range Research and Development Plan, or LRRDP, began with an Oct. 29 memo by DoD acquisition chief Frank Kendall.

The memo said the LRRDP will identify high-payoff enabling technology investments that could help shape future U.S. materiel investments and the trajectory of future competition for technical superiority. The plan will focus on technology that can be moved into development programs within the next five years.

Full and Immediate Support

“This effort is of the highest priority and requires full and immediate support from across the department,” Kendall wrote.

On Jan. 28, the department published a request for information, seeking to identify current and emerging technologies or projections of technology-enabled concepts that “could provide significant military advantage to the United States and its partners and allies in the 2030 timeframe.”

During a recent media roundtable here, LRRDP program lead Stephen P. Welby, deputy assistant secretary of defense for systems engineering, said the RFI deadline has twice been extended, and that more than 300 responses have come in.

“We have gotten some very talented folks replying to the RFI,” Welby said. Ideas are coming from small businesses, from traditional defense sources, and “some from surprising places we hadn't thought might respond,” he said. “And that's exactly what we're hoping to get from this,” he added.

Defense Innovation Initiative

The LRRDP is part of the larger Defense Innovation Initiative, an effort to harness the brightest minds and cutting-edge technology to accelerate the way the department innovates and operates.

Deputy Defense Secretary Bob Work is managing and integrating the initiative’s five technology areas, one of which is the LRRDP. In a summer meeting, Welby said, Work “introduced and drew out a historical analogy to where we are today.”

In 1973, the nation was moving out of the Vietnam War, where the military had been focused on counterinsurgency. Budgets were declining. And the Soviets, among other things, gradually had begun to build up their strategic nuclear forces, Work said during a January speech.

In the summer of 1973, with the dangers of nuclear escalation growing, the Defense Advanced Research Projects Agency, or DARPA, launched the first LRRDP program to give the president and the joint force better tools for responding to a Warsaw Pact attack, the deputy secretary said.

The group recommended going after conventional weapons with near-zero miss capability -- “a very simple idea that had profound implications throughout the entire defense program,” he added.

In 1977, the DoD leadership directed DARPA to integrate all of the promising military technologies into a system of systems for deep attack. The program, Assault Breaker, called for aircraft with wide-area-sensor cueing and surface-to-surface ballistic missiles that could dispense a blanket of anti-armor submunitions.

Picking a Competitive Advantage

Assault Breaker demonstrated its capabilities in 1982 at the White Sands Missile Range in New Mexico, and Work said the Soviets were watching.

“The implications of that single demonstration … really caused them to pause,” he added.

Ultimately, Assault Breaker led to development of the Air Force’s 17 E-8 Joint Surveillance Target Attack Radar System, or JSTARS, aircraft; its air-to-ground BLU-108 sensor-fuzed weapon with terminally guided submunitions; and the long-range, surface-to-surface Army Tactical Missile System, called ATACMS.

“We had picked a competitive advantage that we knew our adversary, the Soviets, could not duplicate and therefore injected uncertainty in their minds, changing their war-fighting calculus,” Work explained.

The joint force took over Assault Breaker, the deputy secretary said, “and we continued to build [the advanced capability] even in an era of declining budgets, starting in 1985.”

Demonstrating the Capability

U.S. forces demonstrated the capability, including that of the E-8C JSTARS side-looking airborne radar system with moving target indication, to the rest of the world in 1990 and 1991. This was during Operation Desert Storm, Work said, “when the Iraqi heavy formations built on the Soviet model were virtually reduced to an array of targets.”

Forty-two years after the plan’s inception, the second iteration of LRRDP is still accepting idea submissions, Welby said, noting that the LRRDP program page at the department’s Innovation Marketplace website features a conspicuously placed green box that says, “Share your ideas.”

Submissions should focus on technology-enabled capabilities that could enter formal development in the next five to 10 years, the RFI says, offering military advantage during the 2025 to 2030 timeframe.

The LRRDP is looking for relatively mature technologies that can be applied in novel ways for a new kind of system capability, emerging technologies that can quickly be turned to new military capabilities, or technologies for nondefense applications that can offer new military capabilities.

Technology Priorities

The five technology priority areas are space, undersea technology, air dominance and strike, air and missile defense, and other technology-driven concepts.

When program officials find an idea interesting, one of five teams will be sent to speak with the submitting person or company, Welby said, adding that in mid-summer, the best ideas will be shared with Defense Secretary Ash Carter.

“The customer for this is the leadership of the department,” he said, “to help them think through the future and think differently about what the world's going to look like.”

Thursday, February 19, 2015

TWO VIEWS OF CERES

FROM:  NASA 



These two views of Ceres were acquired by NASA's Dawn spacecraft on Feb. 12, 2015, from a distance of about 52,000 miles (83,000 kilometers) as the dwarf planet rotated. The images have been magnified from their original size. The Dawn spacecraft is due to arrive at Ceres on March 6, 2015. Dawn's mission to Vesta and Ceres is managed by the Jet Propulsion Laboratory for NASA's Science Mission Directorate in Washington. Dawn is a project of the directorate's Discovery Program, managed by NASA's Marshall Space Flight Center in Huntsville, Alabama. UCLA is responsible for overall Dawn mission science. Orbital ATK, Inc., of Dulles, Virginia, designed and built the spacecraft. JPL is managed for NASA by the California Institute of Technology in Pasadena. The framing cameras were provided by the Max Planck Institute for Solar System Research, Göttingen, Germany, with significant contributions by the German Aerospace Center (DLR) Institute of Planetary Research, Berlin, and in coordination with the Institute of Computer and Communication Network Engineering, Braunschweig. The visible and infrared mapping spectrometer was provided by the Italian Space Agency and the Italian National Institute for Astrophysics, built by Selex ES, and is managed and operated by the Italian Institute for Space Astrophysics and Planetology, Rome. The gamma ray and neutron detector was built by Los Alamos National Laboratory, New Mexico, and is operated by the Planetary Science Institute, Tucson, Arizona. Image Credit: NASA/JPL-Caltech/UCLA/MPS/DLR/IDA.

Monday, February 16, 2015

NSF VIDEO: ENGINEERING THE SPARK THAT STARTS WILDFIRES

THE HOUSEHOLD BOT

FROM:  NATIONAL SCIENCE FOUNDATION
Human insights inspire solutions for household robots
New algorithms designed by Berkeley and UMass researchers allow autonomous systems to deal with uncertainty

People typically consider doing the laundry to be a boring chore. But laundry is far from boring for artificial intelligence (AI) researchers like Siddharth Srivastava, a scientist at the United Technologies Research Center, Berkeley.

To AI experts, programming a robot to do the laundry represents a challenging planning problem because current sensing and manipulation technology is not good enough to identify precisely how many clothing pieces are in a pile or how many are picked up with each grasp. People can easily cope with this type of uncertainty and come up with a simple plan. But roboticists have struggled for decades to design an autonomous system able to do what we do so casually--clean our clothes.

In work done at the University of California, Berkeley, and presented at the Association for the Advancement of Artificial Intelligence conference in Austin, Srivastava (working with Abhishek Gupta, Pieter Abbeel and Stuart Russell of UC Berkeley and Shlomo Zilberstein of the University of Massachusetts, Amherst) demonstrated a robot that is capable of doing laundry without any specific knowledge of what it has to wash.

Earlier work by Abbeel's group had demonstrated solutions for the sorting and folding of clothes. The laundry task serves as an example of a wide range of daily tasks that we do without thinking but that have, until now, proved difficult for automated tools assisting humans.

"The widely imagined helper robots of the future are expected to 'clear the table,' 'do laundry' or perform day-to-day tasks with ease," Srivastava said. "Currently however, computing the required behavior for such tasks is a challenging problem--particularly when there's uncertainty in resource or object quantities."

Humans, on the other hand, solve such problems with barely a conscious effort. In their work, the researchers showed how to compute correct solutions to problems by using some assumptions about the uncertainty.

"The main issue is how to develop what we call 'generalized plans,'" said Zilberstein, a professor of computer science and director of the Resource Bound Reasoning Lab at UMass Amherst. "These are plans that don't just work in a particular situation that is very well defined and gets you to a particular goal that is also well defined, but rather ones that work on a whole range of situations and you may not even know certain things about it."

The researchers' key insight was to use human behavior--the almost unconscious action of pulling, stuffing, folding and piling--as a template, adapting both the repetitive and thoughtful aspects of human problem-solving to handle uncertainty in their computed solutions.

By doing so, they enabled a PR2 robot to do the laundry without knowing how many and what type of clothes needed to be washed.

Of the 13 or so tasks involved in the laundry problem, the team's system completed more than half autonomously and nearly completed the rest--by far the most effective demonstration of an autonomous laundry system to date.

The framework that Srivastava and his team developed combines several planning paradigms developed in the past, which use complex control structures such as loops and branches, and optimizes them to run efficiently on modern hardware. It also incorporates an effective approach for computing plans by learning from examples, rather than through rigid instructions or programs.

"What's particularly exciting is that these methods provide a way forward in a problem that's well known to be computationally unsolvable in the worst case," Srivastava said. "We identified a simpler formulation that is solvable and also covers many useful scenarios."

"It is exciting to see how this breakthrough builds upon NSF-funded efforts tackling a variety of basic-research problems including planning, uncertainty, and task repetition," said Héctor Muñoz-Avila, program director at NSF's Robust Intelligence cluster.

Though laundry robots are an impressive, and potentially time-saving, application of AI, the framework that Srivastava and his team developed can be applied to a range of problems. From manufacturing to space exploration to search-and-rescue operations, any situation where artificially intelligent systems must act, despite some degree of uncertainty, can be addressed with their method.

"Using this approach, solutions to high-level planning can be generated automatically," Srivastava said. "There's more work to be done in this direction, but eventually we hope such methods will replace tedious and error-prone task-specific programming for robots."

-- Aaron Dubrow, NSF
-- Siddharth Srivastava, United Technologies Research Center
Investigators
Siddharth Srivastava
Shlomo Zilberstein
Related Institutions/Organizations
United Technologies Research Center
University of Massachusetts Amherst
Locations
Berkeley, California
Amherst, Massachusetts
Related Programs
Robust Intelligence
Related Awards
#0915071 RI: Small: Foundations and Applications of Generalized Planning
Years Research Conducted
2009 - 2015

Total Grants
$503,519

Friday, February 13, 2015

TERMITES vs. DESERTS

FROM:  THE NATIONAL SCIENCE FOUNDATION
Dirt mounds made by termites in Africa, South America and Asia could prevent spread of deserts
Termites create oases of moisture, plant life
February 5, 2015

Termites might not top the list of humanity's favorite insects, but new research suggests that their large dirt mounds are crucial to stopping deserts from spreading into semi-arid ecosystems.

The results indicate that termite mounds could make these areas more resilient to climate change.

The findings could also inspire a change in how scientists determine the possible effects of climate change on ecosystems.

In the parched grasslands and savannas, or drylands, of Africa, South America and Asia, termite mounds store nutrients and moisture and, via internal tunnels, allow water to better penetrate the soil.

As a result, vegetation flourishes on and near termite mounds in ecosystems that are otherwise vulnerable to desertification.

Researchers report in this week's issue of the journal Science that termites slow the spread of deserts into drylands by providing a moist refuge for vegetation on and around their mounds.

Drylands with termite mounds can survive on significantly less rain than those without termite mounds.

Not all termites are pests

"This study demonstrates that termite mounds create important refugia for plants and help to protect vast landscapes in Africa from the effects of drought," said Doug Levey, program director in the National Science Foundation's Division of Environmental Biology, which funded the research.

"Clearly," said Levey, "not all termites are pests."

The research was inspired by the fungus-growing termite genus Odontotermes, but the results apply to all types of termites that increase resource availability on or around their mounds.

Corresponding author Corina Tarnita, a Princeton University ecologist and evolutionary biologist, said that termite mounds also preserve seeds and plant life, which helps surrounding areas rebound faster once rainfall resumes.

"Because termites allow water to penetrate the soil better, plants grow on or near the mounds as if there were more rain," said Tarnita. "The vegetation on and around termite mounds persists longer and declines slower.

"Even when you get to harsh conditions where vegetation disappears from the mounds, re-vegetation is still easier. As long as the mounds are there the ecosystem has a better chance to recover."

The stages of desertification: Where termites fit in

In grasslands and savannas, five stages mark the transition to desert, each having a distinct pattern of plant growth.

The researchers found that these plant growth patterns exist on a much smaller scale than previously thought. Overlaying them is the pattern of termite mounds covered by dense vegetation.

The termite-mound pattern, however, looks deceptively similar to the last and most critical of the five stages that mark the transition of drylands to desert.

Vegetation patterns that might be interpreted as the onset of desertification could mean the opposite: that plants are persevering thanks to termite mounds.

Termite mounds help grassland plants persevere

Robert Pringle, an ecologist and evolutionary biologist at Princeton and co-author of the paper, said that the unexpected function of termites in savannas and grasslands suggests that ants, prairie dogs, gophers and other mound-building creatures could also have important roles in ecosystem health.

"This phenomenon and these patterned landscape features are common," Pringle said.

"Exactly what each type of animal does for vegetation is hard to know in advance. You'd have to get into a system and determine what is building the mounds and what the properties of the mounds are.

"I like to think of termites as linchpins of the ecosystem in more than one way. They increase the productivity of the system, but they also make it more stable and more resilient."

Termites: Linchpins of the ecosystem

A mathematical model developed for the work determines how these linchpins affect plant growth.

The scientists applied tools from physics and mathematical and numerical analysis to understand a biological phenomenon, said paper first author Juan Bonachela of Strathclyde University in Scotland.

The model allowed the researchers to apply small-scale data to understand how rainfall influences vegetation growth and persistence in the presence and absence of termites across an entire ecosystem.

"Similar studies would be extremely challenging to perform in the field and would require very long-term experiments," Bonachela said.

"Models such as this allow us to study the system with almost no constraint of time or space and explore a wide range of environmental conditions with a level of detail that can't be attained in the field."

Additional support for the research was provided by a Princeton Environmental Institute Grand Challenges grant, the National Geographic Society, the Andrew W. Mellon Foundation and a John Templeton Foundation Foundational Questions in Evolutionary Biology grant.

-NSF-
Media Contacts
Cheryl Dybas, NSF
