
Sunday, May 31, 2015

NASA VIDEO: ScienceCasts: Roundworms Have the Right Stuff

SCIENCE/MATH HAVE LIMITS PREDICTING NATURAL DISASTERS LIKE EARTHQUAKES

FROM:  NATIONAL SCIENCE FOUNDATION
Earthquakes expose limits of scientific predictions

But math and science are refining ways to predict, limit impact of disasters
In 2012, six Italian seismologists were sentenced to prison because they had failed to predict the 2009 L'Aquila 6.3-magnitude earthquake.

To some, that may seem absurd, but it points to the faith so many have come to place in science's ability to predict and prevent tragedies. Experts had for decades predicted that Nepal would experience a massive earthquake, but they were unable to provide a more precise warning about the recent 7.8-magnitude quake that devastated the country. The Italian seismologists had similarly estimated earthquake probabilities but could not give an exact date.

Science and mathematics have not reached a point where they can forecast with certainty the exact time and specific severity of these cataclysmic events--and may never do so.

"The best we can do is make an assessment of there being a heightened risk in a certain geographic area over a certain window of time," said William Newman, a theoretical physicist at the University of California, Los Angeles, who has received funding from the National Sceince Foundation (NSF) for his work aimed at improving natural hazard predictions. "We can determine a sense of what is likely to occur, but we will never know exactly."

Newman has spent much of his 35-year career working in computational and applied mathematics, and has applied that mathematics to natural hazards such as earthquakes and climate change.

These days, mathematicians seem able to model almost anything, but, as Newman points out, the devil is not only in the details but in creating models that can be used for accurate prediction. In the case of tectonic plates, the randomness of their interaction limits the certainty of predictions, and those predictions become less certain the further into the future they reach. In much the same way that a weather forecaster can be more certain predicting tomorrow's weather than next month's, Newman believes, earthquake prediction accuracy falls off as the forecast horizon lengthens.

"For mathematicians, three aspects come to mind," Newman said. "We like to think of the equations being well posed, well defined, and that we can run with them. In [Edward] Lorenz's case (whose model of turbulence celebrated its 50th anniversary recently), his equations about atmospheric behavior were, by and large, eminently reasonable. He supersimplified and saw that if he perturbed the initial conditions, after a certain amount of time, he could predict nothing."

Yes, you read that right: nothing.
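
Lorenz's result is easy to reproduce. The sketch below (an illustration of the general phenomenon, not code from Newman's or Lorenz's work) integrates the classic Lorenz-63 system twice, with starting points that differ by one part in a billion; the trajectories agree at first, then separate until the forecast carries no information.

```python
# Illustrative sketch of sensitive dependence on initial conditions,
# using the classic Lorenz-63 system.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 equations."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.linspace(0.0, 40.0, 4001)
run_a = solve_ivp(lorenz, (0.0, 40.0), [1.0, 1.0, 1.0],
                  t_eval=t_eval, rtol=1e-10, atol=1e-12)
run_b = solve_ivp(lorenz, (0.0, 40.0), [1.0 + 1e-9, 1.0, 1.0],  # tiny perturbation
                  t_eval=t_eval, rtol=1e-10, atol=1e-12)

# The separation grows roughly exponentially until it saturates at the
# size of the attractor, at which point the perturbed run predicts nothing.
for t in (0, 10, 20, 30, 40):
    i = int(np.argmin(np.abs(t_eval - t)))
    sep = np.linalg.norm(run_a.y[:, i] - run_b.y[:, i])
    print(f"t = {t:2d}: separation = {sep:.2e}")
```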

The problem for mathematicians is that forecasting accuracy can only weaken as more variables cloud the equations and models they build. In the case of earthquakes, Newman says the prospects for good predictions are even more dismal than for atmospheric ones. Chaotic dynamics and complexity prevail.

In Los Angeles, where Newman lives, mathematicians and geophysicists have worked together and determined that sometime in the next 30 years, the area is likely to see a substantial earthquake due to its proximity to the San Andreas Fault. And as each year passes, the risk within this window of time increases. The mathematicians can only put so many predetermined variables into their equations, including the patterns of tectonic plate changes and the environmental conditions that coincide with earthquake occurrences.
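
That "risk increases as each year passes" statement can be made concrete with a toy renewal model, sketched below with hypothetical parameters (not the actual UCLA or USGS numbers): if the time between major quakes follows a lognormal distribution, the conditional probability of a quake in the next 30 years climbs as quiet years accumulate.

```python
# Toy renewal-model hazard calculation with hypothetical parameters
# (illustrative only; not the actual San Andreas recurrence model).
import numpy as np
from scipy.stats import lognorm

MEAN_INTERVAL = 150.0  # hypothetical mean recurrence interval, years
COV = 0.5              # hypothetical coefficient of variation

# Parameterize a lognormal distribution with that mean and COV.
sigma = np.sqrt(np.log(1.0 + COV**2))
mu = np.log(MEAN_INTERVAL) - 0.5 * sigma**2
recurrence = lognorm(s=sigma, scale=np.exp(mu))

def prob_quake_in_window(elapsed, window=30.0):
    """P(quake within `window` years | none in the first `elapsed` years)."""
    return ((recurrence.cdf(elapsed + window) - recurrence.cdf(elapsed))
            / recurrence.sf(elapsed))

for quiet_years in (50, 100, 150, 200):
    p = prob_quake_in_window(quiet_years)
    print(f"after {quiet_years:3d} quiet years: P(quake in next 30 yr) = {p:.2f}")
```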

"We have to go into this realizing there are bounds," Newman said. "We are looking at complex systems that can produce patterns we just don't understand."

Additionally, while the news focuses on an earthquake and its aftershocks, there are also "foreshocks." But recognizing a foreshock is impossible without seeing the seismic event that follows, so trying to formulate day-to-day seismologic predictions after any earthquake event can also be confounding.

Why even try to predict earthquakes?

One could easily conclude at this point that we should walk away from the problem, shaking our heads. But the mathematicians, computer scientists, physicists, geologists, engineers and social scientists working together on it do provide value, each adding something that improves the scientific community's understanding of this obviously complex issue.

As instruments become increasingly refined and data proliferate around the world, scientists also gain a better understanding of the consequences of earthquakes.

"It is true that scientists know very little about earthquake predictions," said Junping Wang, program director in NSF's mathematics division. "But this is exactly why we need to support earthquake research. Researching is the only way we can ever hope to build models that help to improve earthquake prediction and build a resilient society."

As they conduct more research in seismology, scientists are able to gain more and better knowledge that can benefit local policymakers looking to enhance preparedness and emergency response to earthquakes and cascading disasters.

"There are still plenty of opportunities where scientific and mathematical research can improve our knowledge," Wang said. "Understanding why an earthquake happened and how it happened helps us build better models, even if they can't tell us a specific date and time. With increased knowledge comes better preparedness."

Earthquake advice from a mathematician

"We can only tell people that there is a certain risk in a certain window of time," Newman said. "Then it's a matter of preparedness."

He cites the example of the Northridge earthquake that rocked the UCLA Mathematical Sciences Building in 1994. Architects designed expansion joints in different sections of the building because they knew that, at some point, it would have to cope with the trauma of earthquakes. In that case, some of the offices went through an "unexpected expansion," but Newman notes that ultimately the repairs were "essentially cosmetic."

Newman, who carries the distinction of being a member of UCLA's mathematics, physics and geology departments, routinely takes students to the San Andreas Fault--and specifically Vazquez Rocks, a set of formations exposed by seismic activity--for their research. He emphasizes that to prevent the fallout of earthquakes like the recent one in Nepal, policymaking that establishes building codes and individual preparedness are essential.

"If you live here, you have to earthquake-proof your home and your business. You need to be able to take care of yourself," he said. "And then when an earthquake does occur, hopefully, it will just be an inconvenience."

-- Ivy F. Kupec, National Science Foundation
Investigators
William Newman
Vladimir Keilis-Borok
Related Institutions/Organizations
University of California-Los Angeles

Saturday, May 30, 2015

NSF VIDEO: Black holes and coffee - Scientists & Engineers on Sofas (and other furn...

NSF REPORTS SCHOLARS USING GEOGRAPHY TO STUDY HOLOCAUST

FROM:  NATIONAL SCIENCE FOUNDATION
Geography and mapping give new dimension to study of the Holocaust
Research addresses questions about the scale of the Holocaust, the meaning of place, and the significance of spatial patterns

Numerous scholars in recent years have made the horrors of the Holocaust real to the public through various media, including books and memoirs, films, art, photography and museum exhibitions. Anne Kelly Knowles and her collaborators are using a different approach to better understand the genocide: geography and mapping.

More specifically, the researchers are employing historical geographic information systems (GIS), computer programs that store, display, and analyze data about past geographies, to gain new insights into how the Nazis implemented the Holocaust, the patterns of events, and the impact of the Holocaust on different places.

"The key is to recognize that perpetrators and victims experienced the Holocaust at different scales, but that those scales registered – came together – in particular places at particular times," says Knowles, a professor of geography at Middlebury College, who is joining the faculty of the University of Maine in Orono as a professor of history in fall 2015.

"We wanted our geographical explorations and experiments to be deeply grounded in history,’’ she adds. "At the same time, we wanted to ask new questions about the scale of the Holocaust, the meaning of place, and the significance of spatial patterns. Mapping complex data, like the development of the SS concentration camps system, inevitably shows you things you would not know – unless you make a map."

GIS "allows you to layer many kinds of information in the same visual space, and to use animation to see change over time,’" Knowles says. For example, in compiling a data base of 1,300 concentration camps and their associated labor camps," you can ask the computer program a question about the data, and it provides the answer as a map,’’ she says. "I could ask: 'Which camps were established by Jan. 1, 1942?' and it would show me in a map. I could continue to ask questions, and see how the number of camps grew."

Michael De Groot, a former undergraduate history major at Stanford University involved in the research, developed a series of digital layers showing the political boundaries of Europe during WWII. Using animation, he showed how the boundaries changed over time in combination with the expansion of the camps. "When you watch this particular animation, you can see the growth of the Reich and the growth of the camps together," Knowles says. "It so clearly showed that the SS only established camps within territory they politically controlled. Some historians already knew this, but to actually see relationships like that hits you between the eyes."

Ultimately, GIS animations like this could become a valuable tool for teaching future students about the events of the Holocaust, Knowles says. "The ability to tell the story of the Holocaust visually makes it very exciting for teachers," she says. "It gets across fundamental information about the dynamics of the Holocaust that is crucial for students to understand."

Some material already is available at the Stanford University spatial history lab website, as well as the United States Holocaust Memorial Museum website, she says.

Knowles worked with nine collaborators, an interdisciplinary team of historians and geographers who call themselves the Holocaust Geographies Collaborative, with resource support provided by the United States Holocaust Memorial Museum. The National Science Foundation (NSF) funded the project with about $500,000. Last year, the group published a book summarizing the first phase of its research, The Geographies of the Holocaust.

Knowles plans to continue her studies with a recently awarded fellowship from the John Simon Guggenheim Memorial Foundation, which annually supports a diverse group of scholars, artists, and scientists chosen on the basis of prior achievement and exceptional promise.

Her Guggenheim project, Telling the Spatial Story of the Holocaust, will be a novella-length eBook that follows 10 people through the Holocaust (1933-1945), connecting their experiences to the spatial unfolding of the Holocaust as Nazi plans were implemented in one place after another. It will be equal parts research, new modes of geo-visual storytelling, and multi-dimensional narrative.

In the NSF-funded project, the Holocaust Geographies Collaborative focused on the different scales at which the Holocaust occurred across Europe, from the continental scale of the development of the SS camp system, to the regional scale, including attacks on civilian populations in Belarus and Lithuania, as well as the arrest and transport of Jews in Italy. The group also examined the urban scale of the Auschwitz camp and the Budapest ghetto, and the scale of individual experience in a study of Auschwitz survivor testimony from January 1945.

"The Holocaust happened on every scale possible, including the planners in Berlin working with maps and deciding where to put the next concentration camp, making it easier for them to dehumanize the places and people they were planning to capture and obliterate," she says. ``But there also are the personal experiences, the intimate meaning of home, community, synagogue. We think of geography as both remote, abstract planning and intensely personal experiences in time and space.’"

A geographic analysis of a database of places where Italian Jews were arrested, who arrested them, and where they were taken made it clear "that Italian troops and police were every bit as involved as the Germans were," she says, contradicting long-held assumptions that the Italians were not involved or not as culpable as the Germans were in the Italian Holocaust. "When you convert the information in a database into a map, you can see geographically what happens to these people and who arrested them -- both Germans and Italians were deeply involved. It gives you a more nuanced sense of what was happening on the ground."

"This is what I find compelling about historical geography," she adds. "Asking questions about what is happening to people in their homes and in the streets makes it more real."

Another piece of the study, examining the database related to the assaults on civilians in Lithuania, revealed a short window of time when the attacks changed fundamentally. Initially, the Einsatzgruppen (specialized attack squads of German soldiers) in Lithuania went after men and boys. "But at the end of August 1941, they started attacking all Jews -- all ages, women and children and the elderly," she says. "This is the moment of genocide, when they were trying to kill everyone.

"This is not brand new information, but by visualizing it, it makes this moment of genocide clear," she adds. "This is really important because there is no document in the historical record of anyone ever saying: 'Now is when we are going to start killing all the Jews.'"

The Holocaust Geographies Collaborative is now turning to new research on victims' experiences, applying GIS and other digital methods, such as corpus linguistics (the study of language as revealed in samples of "real-world" text), to analyze video and written testimony.

Knowles points out, most importantly, that the work she and her collaborators are doing is not just history or geography, but both, and that the two disciplines complement and enrich each other.

"We go back and forth," she says. "A map shows us something that raises historical questions. Then you go back and do historical research, which raises more geographical questions. This is the new way of doing Holocaust studies – and of doing history."

-- Marlene Cimons, National Science Foundation
Investigators
Anne Knowles
Related Institutions/Organizations
Middlebury College

Thursday, May 28, 2015

NASA PROVIDES HIGH RESOLUTION IMAGE OF EUROPA

FROM:  NASA



This 12-frame mosaic provides the highest resolution view ever obtained of the side of Jupiter's moon Europa that faces the giant planet. It was obtained on Nov. 25, 1999 by the camera onboard the Galileo spacecraft, a past NASA mission to Jupiter and its moons, which ended in 2003. NASA will announce today, Tuesday, May 26, the selection of science instruments for a mission to Europa, to investigate whether it could harbor conditions suitable for life. The Europa mission would conduct repeated close flybys of the small moon during a three-year period.

Numerous linear features in the center of this mosaic and toward the poles may have formed in response to tides strong enough to fracture Europa's icy surface. Some of these features extend for over 1,500 kilometers (900 miles). Darker regions near the equator on the eastern (right) and western (left) limb may be vast areas of chaotic terrain. Bright white spots near the western limb are the ejecta blankets of young impact craters.

North is to the top of the picture and the sun illuminates the surface from the left. The image, centered at 0 latitude and 10 longitude, covers an area approximately 2,500 by 3,000 kilometers (about 1,550 by 1,860 miles). The finest details that can be discerned in this picture are about 2 kilometers (1.2 miles) across. The images were taken by Galileo's camera when the spacecraft was 94,000 kilometers (58,000 miles) from Europa.

Image Credit: NASA/JPL/University of Arizona.


VERY PROMISING QUANTUM DOTS

FROM:  NATIONAL SCIENCE FOUNDATION
Many uses in researching quantum dots

These nanoparticles can achieve higher levels of energy when light stimulates them.

It's easier to dissolve a sugar cube in a glass of water by crushing the cube first, because the numerous tiny particles cover more surface area in the water than the cube itself. In a way, the same principle applies to the potential value of materials composed of nanoparticles.
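
The analogy can be made exact with elementary geometry (a worked equation, not from the NSF article): crushing a cube of side L into n-cubed fragments leaves the volume unchanged but multiplies the exposed surface area by n.

```latex
% Subdividing a cube of side L into n^3 smaller cubes preserves volume
% but multiplies the exposed surface area by n:
A_{\mathrm{cube}} = 6L^{2}, \qquad
A_{\mathrm{crushed}} = n^{3} \cdot 6\left(\frac{L}{n}\right)^{2} = n \cdot 6L^{2}.
```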

Because nanoparticles are so small, millions of times smaller than the width of a human hair, they have "tremendous surface area," raising the possibility of using them to design materials with more efficient solar-to-electricity and solar-to-chemical energy pathways, says Ari Chakraborty, an assistant professor of chemistry at Syracuse University.

"They are very promising materials," he says. "You can optimize the amount of energy you produce from a nanoparticle-based solar cell."

Chakraborty, an expert in physical and theoretical chemistry, quantum mechanics and nanomaterials, is seeking to understand how these nanoparticles interact with light when their shape and size are changed, insight that could ultimately lead to enhanced photovoltaic and light-harvesting properties. Changing their shape and size is possible "without changing their chemical composition," he says. "The same chemical compound in different sizes and shapes will interact differently with light."

Specifically, the National Science Foundation (NSF)-funded scientist is focusing on quantum dots, which are semiconductor crystals on a nanometer scale. Quantum dots are so tiny that the electrons within them exist only in states with specific energies. As such, quantum dots behave similarly to atoms, and, like atoms, can achieve higher levels of energy when light stimulates them.

Chakraborty works in theoretical and computational chemistry, meaning "we work with computers and computers only," he says. "The goal of computational chemistry is to use fundamental laws of physics to understand how matter interacts with each other, and, in my research, with light. We want to predict chemical processes before they actually happen in the lab, which tells us which direction to pursue."

These atoms and molecules follow natural laws of motion, "and we know what they are," he says. "Unfortunately, they are too complicated to be solved by hand or calculator when applied to chemical systems, which is why we use a computer."

The "electronically excited" states of the nanoparticles influence their optical properties, he says.

"We investigate these excited states by solving the Schrödinger equation for the nanoparticles," he says, referring to a partial differential equation that describes how the quantum state of some physical system changes with time. "The Schrödinger equation provides the quantum mechanical description of all the electrons in the nanoparticle.

"However, accurate solution of the Schrödinger equation is challenging because of large number of electrons in system," he adds. "For example, a 20 nanometer CdSe quantum dot contains over 6 million electrons. Currently, the primary focus of my research group is to develop new quantum chemical methods to address these challenges. The newly developed methods are implemented in open-source computational software, which will be distributed to the general public free of charge."

Solar voltaics "requires a substance that captures light, uses it, and transfers that energy into electrical energy," he says. With solar cell materials made of nanoparticles, "you can use different shapes and sizes, and capture more energy," he adds. "Also, you can have a large surface area for a small amount of materials, so you don't need a lot of them."

Nanoparticles also could be useful in converting solar energy to chemical energy, he says. "How do you store the energy when the sun is not out?" he says. "For example, leaves on a tree take energy and store it as glucose, then later use the glucose for food. One potential application is to develop artificial leaves for artificial photosynthesis. There is a huge area of ongoing research to make compounds that can store energy."

Medical imaging presents another useful potential application, he says.

"For example, nanoparticles have been coated with binding agents that bind to cancerous cells," he says. "Under certain chemical and physical conditions, the nanoparticles can be tuned to emit light, which allows us to take pictures of the nanoparticles. You could pinpoint the areas where there are cancerous cells in the body. The regions where the cancerous cells are located show up as bright spots in the photograph."

Chakraborty is conducting his research under an NSF Faculty Early Career Development (CAREER) award. The award supports junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organization. NSF is funding his work with $622,123 over five years.

As part of the grant's educational component, Chakraborty is hosting several students from a local high school--East Syracuse Minoa High School--in his lab. He also has organized two workshops for high school teachers on how to use computational tools in their classrooms "to make chemistry more interesting and intuitive to high school students," he says.

"The really good part about it is that the kids can really work with the molecules because they can see them on the screen and manipulate them in 3-D space," he adds. "They can explore their structure using computers. They can measure distances, angles, and energies associated with the molecules, which is not possible to do with a physical model. They can stretch it, and see it come back to its original structure. It's a real hands-on experience that the kids can have while learning chemistry."

-- Marlene Cimons, National Science Foundation
Investigators
Arindam Chakraborty
Related Institutions/Organizations
Syracuse University

Tuesday, May 26, 2015

CDC REPORTS ON CIGARETTE SMOKING AND SMOKELESS TOBACCO USE AMONG ADULTS

FROM:  CENTERS FOR DISEASE CONTROL AND PREVENTION
State-specific Prevalence of Current Cigarette Smoking and Smokeless Tobacco Use Among Adults Aged ≥18 years — United States, 2011–2013

These findings underscore the importance of effective population-based interventions focused on reducing the use of all tobacco products. From 2011 to 2013, current cigarette smoking prevalence declined in 26 states. During the same period, use of smokeless tobacco significantly increased in Louisiana, Montana, South Carolina, and West Virginia. Additionally, concurrent use of cigarettes and smokeless tobacco significantly increased in Delaware, Idaho, Nevada, New Mexico, and West Virginia. The use of more than one tobacco product is concerning because adults who use both cigarettes and smokeless tobacco have higher levels of nicotine dependence and are less likely to report planning to quit than those who exclusively smoke cigarettes. Evidence-based, statewide tobacco control programs that are comprehensive, sustained, and accountable have been shown to reduce smoking rates, as well as tobacco-related diseases and deaths.

Sunday, May 24, 2015

CDC SAYS SIGNIFICANT CORRELATION FOUND BETWEEN SEVERE VISION LOSS AND POVERTY

FROM:  CENTERS FOR DISEASE CONTROL AND PREVENTION
Geographic Disparity of Severe Vision Loss in the United States, 2009–2013

An analysis of U.S. county-level data found a significant correlation between severe vision loss and poverty. Southern states had the highest prevalence of both. Severe vision loss (SVL) often affects activities of daily living, leads to depression and social isolation, and increases the risk of falls and injuries. Limited data and research are available at the local level, where interventions and policy decisions to reduce the burden of vision loss and eliminate disparities are often developed and implemented. Examination of county-level data from the American Community Survey showed that SVL prevalence was strongly correlated with poverty: most counties in the top 25 percent for both SVL and poverty were in the southern United States.
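
A hedged sketch of the kind of calculation behind a county-level correlation claim; the numbers below are invented placeholders, not CDC data.

```python
# Illustrative county-level correlation; figures are invented, not CDC data.
import numpy as np

poverty_rate = np.array([10.2, 14.5, 18.3, 22.1, 25.7, 30.4])  # percent, per county
svl_prevalence = np.array([2.1, 2.9, 3.8, 4.6, 5.2, 6.0])      # percent, per county

# Pearson correlation coefficient; values near 1 indicate a strong
# positive association between poverty and severe vision loss.
r = np.corrcoef(poverty_rate, svl_prevalence)[0, 1]
print(f"Pearson r = {r:.2f}")
```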

Saturday, May 23, 2015

CDC ANNOUNCES DROWNING OUTCOMES RELATED TO DESIGNATED DANGEROUS UNDERWATER BREATH-HOLDING BEHAVIORS

FROM:  CENTERS FOR DISEASE CONTROL AND PREVENTION
Fatal and Nonfatal Drowning Outcomes Related to Dangerous Underwater Breath-Holding Behaviors — New York State, 1988–2011

Drowning is a preventable public health risk. Through education and policy interventions and increased awareness of these dangerous swimming behaviors, the aquatic health and safety community can play a significant role in decreasing fatalities among all recreational populations. This report identifies a class of swimming behaviors, designated dangerous underwater breath-holding behaviors (DUBBs), that can lead to fatal drowning. These behaviors could easily be prevented to decrease the risk of drowning among otherwise healthy swimmers. These findings expand the domain of active drowning surveillance to consider the epidemiology of contributing behaviors in fatal and non-fatal drowning incidents.

Thursday, May 21, 2015

NASA's VACUUM CHAMBER 5

FROM:  NASA 



When you need to test hardware designed to operate in the vast reaches of space, you start in a vacuum chamber.  NASA’s Glenn Research Center in Cleveland has many of them, but Vacuum Chamber 5 (VF-5) is special.  Supporting the testing of electric propulsion and power systems, VF-5 has the highest pumping speed of any electric propulsion test facility in the world, which is important in maintaining a continuous space-like environment.

The cryogenic panels at the top and back of the chamber house a helium-cooled panel that reaches temperatures near absolute zero (about -440 degrees Fahrenheit, or roughly 11 kelvins). The extreme cold of this panel freezes any air left in the chamber and quickly freezes the thruster exhaust, allowing the chamber to maintain a high vacuum environment. The outer chevrons are cooled with liquid nitrogen to shield the cryogenic panels from the room-temperature surfaces of the tank.

Most electric propulsion devices, such as Hall thrusters, use xenon as a propellant, which is very expensive. By capturing the used xenon as ice during testing, researchers can recover the propellant for reuse, saving NASA and test customers considerable costs.

The oil diffusion pumps along the bottom of the tank, capped by circular covers, use a low-vapor-pressure silicone oil to concentrate small amounts of gas to the point where it can be mechanically pumped from the chamber.

VF-5 will continue to provide a testing environment for Glenn’s advanced Solar Electric Propulsion technology needed for future astronaut expeditions into deep space, including to Mars.

Image Credit: NASA

Sunday, May 17, 2015

DOD LAB DAY AND NEW WARFIGHTER TECH

FROM:  U.S. DEFENSE DEPARTMENT

Right:  Air Force 1st Lt. Caroline Kurtz, a human factors engineer with the Air Force Research Laboratory, briefs Deputy Defense Secretary Bob Work on wearable technology systems as he tours exhibits during DoD Lab Day at the Pentagon, May 14, 2015. At center is behavioral scientist Air Force 2nd Lt. Anthony Eastin. DoD photo by Glenn Fawcett.  

First DoD Lab Day Shows Off Warfighter Technology
By Cheryl Pellerin
DoD News, Defense Media Activity

WASHINGTON, May 14, 2015 – Robots, medical advances, high-tech versions of warfighter tools and more were on display today at the Pentagon during the Defense Department’s first “Lab Day.”

Some of the department’s top officials also were there, including Deputy Defense Secretary Bob Work and Undersecretary of Defense for Acquisition, Technology and Logistics Frank Kendall, who hosted the event.

The event began this morning, with dozens of booths lining the roads and sidewalks in the Pentagon’s center courtyard and the south parking area. The booths held exhibits highlighting the research and development work of Army, Navy and Air Force laboratories and medical scientists.

Kendall said today’s Lab Day is the first in an ongoing outreach campaign that ties together science and technology efforts across the defense research and engineering enterprise.

Three Big Reasons

“We're gathered here today for three big reasons,” Kendall told an audience that included service members, DoD employees, members of Congress, local science and technology high-school students, media and special guests.

“We wanted to show the groundbreaking work going on at DoD labs, recognize the best of the best, and showcase the specific projects and demonstrations that we're working on in the world of science and technology,” he said.

Kendall said the department has thousands of scientists working at DoD labs in 22 states, producing things such as the Internet, the Global Positioning System, car and truck back-up sensors, Ebola virus disease containment, a mobile capability for destroying chemical material stockpiles, night-vision goggles, emergency-room best practices, and more.

“What we see today is innovation in the foreground,” Kendall said.

Depending on Technology

The department and its warfighters depend on so many different technology areas, and in those areas the United States must stay ahead of everyone else, he added.

During his tour of the exhibits, Kendall saw aerospace technology, safer helmets and other personal protection gear, and advances in autonomy such as the Navy's unmanned jet ski; the Marine Corps' semi-autonomous robots for reconnaissance, surveillance and target acquisition; and the Air Force's remotely piloted "Vigilant Spirit."

“All these things and many more allow our warfighters to have the cutting-edge capabilities they really need,” Kendall said, “and laboratory innovation is at the forefront of that.”

In his remarks, Work welcomed Lab Day participants on behalf of Defense Secretary Ash Carter, who is at Camp David, Maryland, today meeting with members of the Gulf Cooperation Council.

Special Welcome

“He wanted me to extend his personal thanks for all of you for coming out today, and his personal thanks for everything that you do,” Work said, adding a special welcome for the local high-school STEM -- science, technology, engineering and math -- students.

“You represent the next generation of scientists and engineers,” Work told them. “You’re going to help out the country no matter what, but we hope that after seeing some of the things here today you might at least contemplate coming into the Department of Defense and helping us solve many of the problems we're facing.”

Even more exciting than the advanced technology he saw today were the young people who demonstrated and explained the technology, Work said.

Warfighting Edge

“They represent the best that our country has to offer,” the deputy secretary added. “They help give our service members a warfighting edge and they help keep our country and all of our citizens safe.”

During his tour of the exhibits, Work said he saw advanced armor, new night-vision devices and lifesaving medical instruments, all demonstrating the leading edge of technology.

"You represent the best of about 38,000 scientists and engineers in more than 60 DoD labs across the country," he added, noting that the scientists "work hard every day to ensure that we retain our technological superiority, to prepare us for an uncertain future and accelerate capabilities that we need to get into the hands of our warfighters."

Work added, “I firmly believe as does Secretary Carter that we must continue to innovate to protect our country.”

Scalable Quantum Network

Work recognized a combined team of scientists and engineers from the Army, Air Force and Navy research labs, who won a $45 million award to the Joint U.S. Service Laboratories to develop the first U.S. prototype of a scalable quantum network with memory.

The three-year award is funded by the Office of the Assistant Secretary of Defense for Research and Engineering’s Applied Research for the Advancement of S&T Priorities, or ARAP, program.

Quantum-physics-based computing could increase computing capability a billionfold, accelerating the building blocks for game-changing capabilities in command, control, communications, computers, intelligence, surveillance and reconnaissance, called C4ISR, according to the award document.

The service research labs are collaborating to demonstrate the feasibility of the fully integrated quantum-memory technology.

Focus on Emerging Technologies

“One hundred years ago quantum mechanics was discovered and our understanding of it has developed over the last decade,” Work explained. “It’s the foundation of almost all of our modern technology, and this team is trying to figure out how to encrypt and then transmit information across long-range military networks for the warfighter in a provably secure and robust fashion.”

Kendall, who announced the winners of the award during his comments, said that under the ARAP program, the department focuses on emerging technologies that could impact future operational capabilities.

NASA VIDEO: CURIOSITY ROVER REPORT (May 8, 2015): ROVER ROAD TRIP

Saturday, May 16, 2015

NAVY DEVELOPS NEW LIGHTWEIGHT BODY ARMOR

FROM:  U.S. DEFENSE DEPARTMENT
Navy Research Lab Develops New Body Armor for 2016
By Terri Moon Cronk
DoD News, Defense Media Activity

WASHINGTON, May 15, 2015 – New lightweight, flexible and buoyant body armor developed at the Naval Research Laboratory could be in field use by 2016, the lead scientist who has overseen the armor’s two-year development said yesterday.

Research physicist Raymond M. Gamache of the lab's chemistry division was one of dozens of exhibitors in the Defense Department's first "Lab Day" at the Pentagon to display the latest innovations that will advance DoD's Force of the Future, DoD officials said.

Gamache said his new armor will replace the existing enhanced small-arms protective insert to mitigate the impact from bullets and fragmentation.

Armor Protects Torso, Spine

He displayed two forms of his flexible armor: a fabric for the torso that resembles dimpled foam rubber, and an insert of interlocking pieces that lock up into a solid piece upon impact.

Using both, he noted, would provide torso protection, while the insert could be used in a warfighter’s back to shield the spine from damage.

“It’s a great solution for [spinal injury],” Gamache said. And while the insert can’t stop blunt-force trauma, “you’ll still be alive,” he said.

While Gamache’s armor is intended for protection from bullet fire and fragmentation, it would only offer some degree of protection in improvised explosive device blasts, he said.

Armor is Like Fabric

“You hear stories about troops who won’t wear their armor because it’s both heavy and it is very restrictive,” Gamache said. “This is like wearing a fabric, [and] it’s loose,” he said of the material that resembled dimpled foam rubber, noting that the new material is about 2 pounds lighter than existing body armor.

“The beauty of it is no matter what your body contorts into, you’ll always have the same amount of protection,” Gamache said.

The proximity of the tiny spheres of boron carbide and silicon carbide is what protects service members from bullet impacts and fragments, he explained.

“You can twist and turn, but you’re always going to maintain the same protection against bullets,” he said.

The armor can be used all around the globe in any environment from the Middle East to the Asia-Pacific region, Gamache said.

There is no temperature variance with his armor, he added, although ventilation can be added for greater air flow in warm climates.

“With this technology, we’re trying to essentially make lighter, more compliant materials that people will be willing to wear [that] still gives equivalent protection,” Gamache said. “And that’s the bottom line.”

CDC SAYS PROGRESS MADE REDUCING SOME FOODBORNE INFECTIONS

FROM:  U.S. CENTERS FOR DISEASE CONTROL AND PREVENTION
CDC data show progress in reducing some foodborne infections in 2014

In 2014, rates of infection from a serious form of E. coli and one of the more common Salmonella serotypes decreased compared with the baseline period of 2006-2008. Meanwhile, some other less common types of Salmonella increased. Campylobacter and Vibrio rose again in 2014, continuing the increase observed during the past few years, according to data published today by the Centers for Disease Control and Prevention. Today's report summarizes the rates of infection per 100,000 population and tracks illness trends for key foodborne illnesses.

Infection with Shiga toxin-producing E. coli O157, which can sometimes lead to kidney failure, decreased 32 percent when compared with 2006-2008 and 19 percent when compared with the most recent three years. These infections are often linked to consumption of undercooked ground beef and raw leafy vegetables. Salmonella Typhimurium, which has been linked to poultry, beef, and other foods, was 27 percent lower than it was in 2006-2008, continuing a downward trend begun in the mid-1980s. Two other less common types of Salmonella, Javiana and Infantis, more than doubled for reasons that are unclear. Salmonella Javiana is concentrated in the southeastern United States, but has been spreading within the Southeast and to other areas of the country. However, when all Salmonella serotypes are combined, there was no change in 2014. Campylobacter increased 13 percent and Vibrio increased 52 percent compared with 2006-2008. Yersinia has declined enough to meet the Healthy People 2020 goal.

The data are from FoodNet, CDC's active surveillance system that tracks nine common foodborne pathogens in 10 states and monitors trends in foodborne illness in about 15 percent of the U.S. population. Today's report compares the 2014 frequency of infection with the frequency in the baseline period 2006-2008 and in the three most recent years. Overall in 2014, FoodNet logged just over 19,000 infections, about 4,400 hospitalizations, and 71 deaths from the nine foodborne germs it tracks. Salmonella and Campylobacter were by far the most common, accounting for about 14,000 of the 19,000 infections reported.
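
The "rate per 100,000 population" framing is straightforward arithmetic. A rough check using the round figures in this report (FoodNet covers about 48 million people; the exact denominators used by CDC differ slightly):

```python
# Back-of-the-envelope incidence rate from figures quoted in this report:
# about 19,000 infections across a surveillance population of 48 million.
cases = 19_000
population = 48_000_000

rate_per_100k = cases / population * 100_000
print(f"Overall 2014 FoodNet incidence: about {rate_per_100k:.0f} per 100,000")
# -> about 40 per 100,000 across the nine tracked pathogens
```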

“We’re cautiously optimistic that changes in food safety practice are having an impact in decreasing E. coli, and we know that without all the food safety work to fight Salmonella, more people would be getting sick with Salmonella than we are seeing now,” said Robert Tauxe, M.D., deputy director of CDC’s Division of Foodborne, Waterborne and Environmental Diseases. “The increasing use of whole genome sequencing to track foodborne illness cases will also help; however, much more needs to be done to protect people from foodborne illness.”

The recent decline in the incidence of Shiga toxin-producing E. coli (STEC) O157 follows several years of increasing scrutiny for beef products. Since 1994, the Food Safety and Inspection Service of the U.S. Department of Agriculture has taken STEC O157:H7 extremely seriously and made a number of changes in its regulatory oversight of the beef industry to protect public health.

"We are encouraged by the reduction of STEC O157:H7 illnesses, which reflects our science-based approach to beef inspection, and we look forward to seeing further reductions in Salmonella and Campylobacter infections as our improved standards for poultry take effect later this year, " said Al Almanza, Deputy Under Secretary for Food Safety at USDA. "Data sources like FoodNet allow us to be strategic in developing our food safety policies, and we will do everything within our power to keep reducing cases of foodborne illness from all meat and poultry products."

Under the provisions of the FDA Food Safety Modernization Act, the U.S. Food and Drug Administration is planning to publish major new regulations in 2015. The regulations are geared toward ensuring produce safety, implementing preventive controls on processed foods, and improving the safety of imported foods.

“Prevention of illness is the fundamental goal of our new rules under the FDA Food Safety Modernization Act,” said Michael Taylor, deputy commissioner for Foods and Veterinary Medicine at FDA.  “We have worked with a wide range of stakeholders to devise rules that will be effective for food safety and practical for the many diverse elements of our food system. Once the rules are fully implemented, FoodNet will help us evaluate their impact.”

The FoodNet report also includes results of culture-independent diagnostic tests (a new method for diagnosing intestinal illnesses without needing to grow the bacteria) done in the many hospital laboratories in the FoodNet sites. In 2014, the results of more than 1,500 such tests were reported. More than two-thirds of the tests were for Campylobacter. Other tests performed were for STEC, Salmonella, Shigella and Vibrio. Some of the tests had a positive result. However, the infections were not confirmed by culture, and so CDC experts did not include them in the overall FoodNet results for 2014.

For more information on avoiding illnesses from food, please visit www.foodsafety.gov.

About FoodNet

FoodNet collects information to track rates and determine trends in laboratory-confirmed illnesses caused by nine pathogens transmitted commonly by food: Campylobacter, Cryptosporidium, Cyclospora, Listeria, Salmonella, STEC O157 and non-O157, Shigella, Vibrio and Yersinia. CDC compares annual data with data from a baseline period (2006-2008) and a recent period (2010-2012) to measure progress. Since 2010, FoodNet has been tracking the increasing use of culture‐independent diagnostic tests used by clinical laboratories for diagnosis of bacterial enteric infection. Because these tests are replacing culture-based tests, their use is creating challenges to the ability to identify cases, monitor trends, detect outbreaks, and characterize pathogens.

FoodNet is a collaboration among CDC, ten state health departments, the USDA’s Food Safety and Inspection Service, and the FDA. FoodNet covers 48 million people, encompassing about 15 percent of the United States population. The sites are the states of Connecticut, Georgia, Maryland, Minnesota, New Mexico, Oregon, and Tennessee, and selected counties in California, Colorado, and New York.

Friday, May 15, 2015

CDC REPORT CENTERS ON SWIMMING RISK OF NOROVIRUS SICKNESS

FROM:  U.S. CENTERS FOR DISEASE CONTROL AND PREVENTION
Study Highlights Risk of Norovirus from Swimming
Simple tips can help swimmers stay safe in various swimming venues.

When most people think of norovirus, they think of people marooned on a cruise ship with raging stomach and intestinal illness, unable to leave their cabins. However, an outbreak at an Oregon lake underscores that swimming can also put the public at risk of catching the ugly bug. Fortunately, following a few easy and effective steps can help maximize the health benefits of swimming while minimizing the risk of getting sick.

In honor of Healthy and Safe Swimming week, experts from the Centers for Disease Control and Prevention and local and state health officials in Oregon report today on a summer 2014 outbreak that spread via swimming in a contaminated lake.

The norovirus outbreak in July 2014 linked to a lake near Portland, Oregon, sickened 70 people. Those who swam in the lake were 2.3 times more likely to develop vomiting or diarrhea than those who visited the park but didn't go in the water. More than half of those who got ill were children between 4 and 10 years old. Experts believe the outbreak began after a swimmer infected with norovirus had diarrhea or vomited in the water and other swimmers swallowed the contaminated water. To prevent other people from getting sick, park officials closed the lake to swimmers for 10 days.
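
The "2.3 times more likely" figure is a risk ratio: the attack rate among the exposed divided by the attack rate among the unexposed. A minimal sketch of the arithmetic, with invented counts rather than the actual Oregon case data:

```python
# Risk ratio (relative risk) from a cohort-style outbreak investigation.
# Counts are invented for illustration; only the method mirrors the report.
ill_swimmers, total_swimmers = 60, 300        # hypothetical exposed group
ill_nonswimmers, total_nonswimmers = 10, 115  # hypothetical unexposed group

attack_rate_swimmers = ill_swimmers / total_swimmers          # 0.20
attack_rate_nonswimmers = ill_nonswimmers / total_nonswimmers # 0.087

risk_ratio = attack_rate_swimmers / attack_rate_nonswimmers
print(f"Risk ratio = {risk_ratio:.1f}")  # -> 2.3 with these counts
```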

“Children are prime targets for norovirus and other germs that can live in lakes and swimming pools because they’re so much more likely to get the water in their mouths,” said Michael Beach, Ph.D, CDC’s associate director for healthy water. “Keeping germs out of the water in the first place is key to keeping everyone healthy and helping to keep the places we swim open all summer.”

Swimmers can help protect themselves, their families and friends by following a few easy and effective steps:

Keep the pee, poop, sweat, and dirt out of the water!

Don’t swim if you have diarrhea or have been vomiting
Shower before you get in the water
Don’t pee or poop in the water
Don’t swallow lake or pool water
Every hour—everyone out!

Take kids on bathroom breaks
Check diapers, and change them in a bathroom or diaper-changing area, to keep germs away from the water.

Norovirus was the second-leading cause of outbreaks in untreated recreational water, such as lakes, from 1978 to 2010. Norovirus can live in water for several months or possibly even years. Swimming venues that are not treated with chlorine can pose a particular risk, since there are no chemicals to kill the stomach virus.

May 18–24, 2015, marks the 11th annual Healthy and Safe Swimming Week (formerly known as Recreational Water Illness and Injury Prevention Week). This observance highlights ways in which swimmers, parents, pool owners and operators, beach managers, and public health can maximize the health benefits of water-based physical activity, while minimizing the risk of recreational water–associated illness and injury.

Tuesday, May 5, 2015

SAFEGUARDING CAPE FEAR RIVER BASIN WATER QUALITY (NSF VIDEO INCLUDED)


FROM:  NATIONAL SCIENCE FOUNDATION
Testing the waters: 1,4-Dioxane in North Carolina's Cape Fear River Basin
Environmental engineers play vital role in determining next steps to safeguard drinking water

It was an email from a colleague that tipped off environmental engineer Detlef Knappe to possible 1,4-dioxane contamination in the Cape Fear River Basin, North Carolina's largest watershed and a source of drinking water for communities across the state.

The Environmental Protection Agency (EPA) has classified 1,4-dioxane as a probable human carcinogen. It is an industrial solvent used in the textile and paper industries and a by-product of manufacturing processes involving ethylene oxide, such as the production of certain plastics and surfactants used in laundry detergents and shampoos.

With support from a National Science Foundation (NSF) RAPID grant, Knappe and his team at North Carolina State University (NCSU) have begun to identify 1,4-dioxane sources and how 1,4-dioxane impacts drinking water quality. (RAPID is short for Grants for Rapid Response Research.) Another research goal is to determine whether home filtration devices effectively remove 1,4-dioxane from tap water and how long those filters will last.

Knappe is also working with managers at water treatment plants and state policymakers in North Carolina to improve testing and treatment standards for 1,4-dioxane. Knappe says so far, the research team has identified three North Carolina communities as key sources of 1,4-dioxane and those communities are now working with the state and NCSU to identify which facilities are the source of the contamination.

He says the team is also evaluating point-of-use devices, such as pitcher and refrigerator filters, and has identified a new adsorbent that is quite effective for 1,4-dioxane removal.

The research in this episode was supported by NSF award #1449768, RAPID; GOALI: Sources of 1,4-Dioxane in the Cape Fear River Watershed of North Carolina and Treatment Options for 1,4-Dioxane Control. GOALI is NSF's Grant Opportunities for Academic Liaison with Industry program.

Miles O'Brien, Science Nation Correspondent
Ann Kellan, Science Nation Producer

Sunday, May 3, 2015

CROWDSOURCED SCIENCE TRACKS "SUDDEN OAK DEATH"

FROM:  NATIONAL SCIENCE FOUNDATION

Trees turned to snags: "Sudden Oak Death" fells California oaks in their prime
Citizen scientists assist with research on infectious plant disease
Sudden oak death, beware.

Crowdsourced science is helping to predict the path of the deadly plant disease, demonstrating the contributions trained citizen scientists can make in large-scale geographic tracking projects.

That's the conclusion of a study of sudden oak death monitoring in California. The results are published in this month's issue of the journal Frontiers in Ecology and the Environment.

Ebola of the plant world

"Sudden oak death is the Ebola of the plant world, the most serious threat to non-agricultural plants," says lead paper author Ross Meentemeyer, director of the Center for Geospatial Analytics at North Carolina State University.

The disease, which has killed millions of oak and tanoak trees in California and Oregon, can infect up to 60 landscape plant species and spread from nursery stock to residential landscapes.

Starting in 2008, University of California, Berkeley, researchers expanded their sudden oak death monitoring efforts exponentially, thanks to observations from 1,600 trained volunteers who collected leaf samples from trees in metropolitan and urban-wildland areas.

Citizen scientists often needed in research

"To answer many science questions, we need the efforts of a large number of people--and the general public can help," says Sam Scheiner, National Science Foundation (NSF) director for the NSF-NIH-USDA Ecology and Evolution of Infectious Diseases Program, which funded the research.

"This study shows that asking local residents to report on the locations of outbreaks of sudden oak death can provide critical information. The result is a better understanding of the spread of this serious plant disease."

Adds Meentemeyer, "We were able to get data from backyards in the San Francisco Bay area, along with other locations.

"Those data were used to develop accurate computer models for the disease's spread, showing that properly trained and educated citizen scientists can collect data that's just as reliable as that of professionals."

Predictions allow for targeted treatments

Accurate predictions about sudden oak death's spread allow scientists to target treatments to the most vulnerable areas, says paper co-author and forest pathologist Matteo Garbelotto of UC Berkeley.

The annual Sudden Oak Death Blitz, which includes extensive publicity during peak periods for the disease, involves high school students, homeowners, tree specialists, firefighters, teachers and others.

Follow-up evaluation showed that trained citizen scientists were as effective as experts in identifying and collecting diseased tree leaves, whether or not they reported having a professional background in science.

Additional authors of the paper are Monica Dorning and John Vogler of NC State and Douglas Schmidt of UC Berkeley.

-- Cheryl Dybas, NSF
-- D'Lyn Ford, NCSU
Investigators
David Rizzo
Ross Meentemeyer
Related Institutions/Organizations
University of California-Davis
University of North Carolina at Charlotte

Friday, May 1, 2015

EXAMINING ENVIRONMENTAL STEWARDSHIP OF BAJA CALIFORNIA SUR REGION

FROM:  NATIONAL SCIENCE FOUNDATION
Scientists map sustainability challenges in Baja California Sur fisheries

Conduct review of ecological and social data for 12 regions around Baja
The waters surrounding Baja California Sur are teeming with fish. They're also favorite locales for fishers in pursuit of a living.

But like cross-currents in the ocean, the needs of humans and fish meet--and often clash--in this deep blue sea.

In results published this week in the journal Proceedings of the National Academy of Sciences (PNAS), scientists assessed marine sustainability in the Baja California Sur region by applying a framework that accounts for both ecological and human dimensions of environmental stewardship.

The resulting maps, the researchers say, are mosaics of regional sustainability outlooks with the potential to guide environmental policy.

The maps, according to lead PNAS paper author Heather Leslie of Brown University in Providence, R.I., show regional strengths and weaknesses that can help guide fishers, conservationists and other decision-makers as they consider steps to preserve the peninsula's coastal marine ecosystems.

"This framework enabled us to integrate many different types of data related to coupled human-environment systems," says Leslie.

"We were able to be more comprehensive in our analysis than we would have otherwise--for example, if we had focused solely on the data closest to our individual disciplines."

Dozen regions near Baja California Sur profiled

Leslie and the team--which included anthropologists, economists, ecologists, fisheries scientists, geographers and other social scientists--gathered data on 13 ecological and social variables for each of 12 regions around Baja California Sur.

They developed a profile of sustainability potential for each region in four "dimensions": social governance systems and actors, and ecological resource units and resource systems.

What became clear, says Leslie, is that each region's profile is different, suggesting that the most effective policies to achieve sustainability will be those tailored to shore up each area's weaknesses without undermining its strengths.

"More than one billion people worldwide rely on fish and other seafood as their primary source of nutrition, yet coastal resources are often poorly understood and at risk of long-term decline," says Sarah Ruth, a program director for the National Science Foundation's (NSF) Dynamics of Coupled Natural and Human Systems (CNH) Program, which funded the research.

"Fisheries involve people and coastal ecosystems interacting with one another in myriad ways," says Ruth. "This work increases our understanding of these complex systems, and ultimately will help us manage marine resources in a more sustainable way."

Magdalena Bay and Todos Santos: A tale of two regions

Take, for example, the stories the researchers' maps tell about the Pacific coast regions of Magdalena Bay and Todos Santos. If Baja California Sur is a leg, then Magdalena Bay would be the knee and Todos Santos the front of the ankle.

In Magdalena Bay, fisheries are diverse and productive. But the area is also crowded with fishers who use different kinds of gear and don't necessarily all follow the same locally developed rules, Leslie says.

While the ecological foundation there for sustainability looks good, the social dimensions are considerably less promising, the scientists found.

"Depending on which type of data one musters regarding the potential for sustainable fisheries, Magdalena Bay could be scored as either well-endowed or quite weak," write the paper authors, which also include Xavier Basurto of Duke University and Octavio Aburto-Ooropeza of the University of California, San Diego.

In Todos Santos, local-scale fisheries institutions are strong. There is cooperation and compliance among fishers, based on the social science surveys the researchers conducted.

But the nearshore ocean is less productive and yields fewer fish species than Magdalena Bay, just 150 miles to the north.

While not far geographically, the regions are opposites in their social and ecological profiles, says Leslie.

In Magdalena Bay, the most productive strategy might be to cultivate stronger local institutions, Leslie says. In Todos Santos, maintaining existing local institutional strength might make more sense.

Are there fundamental trade-offs?

The diversity of regional strengths and weaknesses shown in the maps is borne out statistically.

The researchers looked for correlations among the four dimensions across Baja California Sur, but found few.

"Does something like this represent fundamental trade-offs?" Leslie asks. "What we found suggests that it does. There's nowhere that does well in every dimension."

The David and Lucile Packard Foundation, Brown University's Environmental Change Initiative and Voss Environmental Fellowship Program, and other organizations also funded the research.

-- Cheryl Dybas, NSF
-- David Orenstein, Brown University
Investigators
Heather Leslie
Sriniketh Nagavarapu
Related Institutions/Organizations
Brown University
