
Sunday, February 23, 2014

SPACE SMART SPHERES

FROM:  NASA 
Smart SPHERES Are About to Get A Whole Lot Smarter

Smart devices – such as tablets and phones – are increasingly an essential part of everyday life on Earth. The same can be said for life off-planet aboard the International Space Station. From astronaut tweets to Google+ Hangouts, our reliance on these mobile and social technologies means equipment and software upgrades are an everyday occurrence – like buying a new pair of shoes to replace a well-worn pair.

That’s why the Intelligent Robotics Group at NASA’s Ames Research Center in Moffett Field, Calif., with funding from the Technology Demonstration Missions Program in the Space Technology Mission Directorate, is working to upgrade the smartphones currently equipped on a trio of volleyball-sized free-flying satellites on the space station called Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES). In 2011 on the final flight of space shuttle Atlantis, NASA sent the first smartphone to the station and mounted it to SPHERES.

Each SPHERE satellite is self-contained with power, propulsion, computing and navigation equipment as well as expansion ports for additional sensors and appendages, such as cameras and wireless power transfer systems. This is where the SPHERES' smartphone upgrades are attached.

By connecting a smartphone, the SPHERES become Smart SPHERES. They are now more intelligent because they gain built-in cameras to take pictures and video, sensors to help conduct inspections, powerful computing units to make calculations, and Wi-Fi connections to transfer data in real time to the computers aboard the space station and at mission control.
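
For a rough sense of what that real-time Wi-Fi data link involves in software, here is a minimal Python sketch of a telemetry loop. The sensor call, packet format and transport below are assumptions invented for the example, not actual SPHERES flight code.

```python
# Illustrative sketch only – not SPHERES flight software. It shows the basic
# pattern described above: sample a sensor, timestamp the reading and ship
# it over the network as one JSON line per packet.
import json
import time

def read_co2_ppm() -> float:
    """Stand-in for an attached environmental sensor (invented for this example)."""
    return 412.0

def stream_telemetry(send, samples: int = 3, period_s: float = 1.0) -> None:
    """Sample, timestamp and transmit each reading."""
    for _ in range(samples):
        packet = {"t": time.time(), "co2_ppm": read_co2_ppm()}
        send((json.dumps(packet) + "\n").encode())
        time.sleep(period_s)

# In flight, `send` would write to a Wi-Fi socket connected to a computer on
# the station network; printing keeps the sketch self-contained.
stream_telemetry(send=lambda data: print(data.decode(), end=""))
```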

"With this latest upgrade, we believe the Smart SPHERES will be a step closer to becoming a ‘mobile assistant' for the astronauts,” said DW Wheeler, lead engineer with SGT Inc. in the Intelligent Robotics Group at Ames. "This ability for Smart SPHERES to independently perform inventory and environmental surveys on the space station can free up time for astronauts and mission control to perform science experiments and other work.”

Later this year, NASA will launch a Project Tango prototype Android smartphone developed by Google’s Advanced Technology and Projects division of Mountain View, Calif. The prototype phone includes an integrated custom 3-D sensor, which means the device is capable of tracking its own position and orientation in real time as well as generating a full 3-D model of the environment.

“The Project Tango prototype incorporates a particularly important feature for the Smart SPHERES – a 3-D sensor,” said Terry Fong, director of the Intelligent Robotics Group at Ames. “This allows the satellites to do a better job of flying around on the space station and understanding where exactly they are.”
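
To make “understanding where exactly they are” concrete, the sketch below shows the kind of check a navigation loop might run on each pose estimate – position plus orientation – coming from such a sensor. The Pose structure, station-frame coordinates and 5-centimeter tolerance are illustrative assumptions, not the Project Tango API.

```python
# Illustrative pose check – the Pose structure and thresholds are invented
# for this example, not the Project Tango API.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Position in meters (assumed station frame) plus an orientation quaternion."""
    x: float
    y: float
    z: float
    qw: float = 1.0
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

def distance_to_goal(pose: Pose, goal) -> float:
    """Straight-line distance from the current position to a goal point."""
    return math.dist((pose.x, pose.y, pose.z), goal)

# Hypothetical navigation check: hold position once within 5 cm of the goal.
pose = Pose(1.02, 0.01, -0.03)      # latest estimate from the 3-D sensor
if distance_to_goal(pose, (1.0, 0.0, 0.0)) < 0.05:
    print("within tolerance – station keeping")
```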

Later this month, Ames engineers will fly the prototype phone on several parabolic flights aboard an airplane that simulates brief periods of microgravity. The team has modified the motion-tracking and positioning code developed by Google – the software that tells the phone where it is – to work in the microgravity conditions of the space station. To verify that the modified code works, the team must take the phone out of the lab at Ames and test it in a true microgravity environment.
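
One concrete reason Earth-tuned tracking code needs modification: on the ground, an accelerometer senses a roughly 9.81 m/s² gravity bias that must be subtracted before integrating motion, while in free fall that bias is close to zero. The toy one-dimensional integrator below illustrates the parameter that has to change; it is a hedged sketch, not Google's actual code.

```python
# Hedged sketch of the gravity-bias change, not real tracking code.
GRAVITY_EARTH = 9.81   # m/s^2 bias sensed by an accelerometer on the ground
GRAVITY_ORBIT = 0.0    # in free fall the sensed gravity bias is ~zero

def integrate_velocity(v: float, accel_measured: float, g: float, dt: float) -> float:
    """One Euler step: remove the gravity bias, then integrate acceleration."""
    return v + (accel_measured - g) * dt

# A 9.91 m/s^2 reading on Earth means 0.1 m/s^2 of real motion; keeping the
# Earth bias on orbit would badly misread a gentle 0.1 m/s^2 push.
v_earth = integrate_velocity(0.0, 9.91, GRAVITY_EARTH, dt=0.01)
v_orbit = integrate_velocity(0.0, 0.10, GRAVITY_ORBIT, dt=0.01)
print(v_earth, v_orbit)   # both 0.001 m/s
```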

The SPHERES facility aboard the space station provides affordable opportunities to test a wide range of hardware and software. It acts as a free-flying platform that can accommodate various mounting features and mechanisms in order to test and examine the physical or mechanical properties of materials in microgravity.

SPHERES also provides a test bed for space applications including physical sciences investigations, free-flying spatial analyses, multi-body formation flying and various multi-spacecraft control algorithm verifications and analyses.
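
For a flavor of the simplest such control algorithm, here is a sketch of a one-dimensional proportional-derivative (PD) station-keeping loop of the kind researchers might prototype on SPHERES. The gains, time step and toy physics are assumptions for illustration, not a flight controller.

```python
# Toy 1-D station keeping: a follower holds a fixed offset from a leader
# using a PD control law. Gains, time step and dynamics are illustrative only.
KP, KD = 0.8, 1.2   # assumed proportional and derivative gains
DT = 0.1            # assumed control period, seconds

def pd_thrust(rel_pos: float, rel_vel: float, desired_offset: float) -> float:
    """Thrust command per unit mass to hold desired_offset from the leader."""
    error = desired_offset - rel_pos
    return KP * error - KD * rel_vel

# Follower starts 0.5 m from the leader; target separation is 1.0 m.
pos, vel = 0.5, 0.0
for _ in range(300):                # simulate 30 seconds
    accel = pd_thrust(pos, vel, desired_offset=1.0)
    vel += accel * DT
    pos += vel * DT
print(round(pos, 3))                # settles near 1.0
```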

SPHERES also is used for the annual Zero Robotics student software programming competition. Ames operates and maintains the SPHERES facility, which is funded by the Human Exploration and Operations Mission Directorate at NASA Headquarters in Washington.

To date, astronauts have conducted 77 investigations using SPHERES to test techniques to advance automated dockings, satellite servicing, spacecraft assembly and emergency repairs. Now researchers are preparing to control the SPHERES in real time from ground control stations on Earth and from space.
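
Real-time control from the ground has to cope with delayed, lost or duplicated packets. The sketch below shows one common pattern – sequence-numbered commands with stale-packet rejection – under the assumption of a simple JSON message format; it is not NASA's actual uplink protocol.

```python
# Assumed message format and handling pattern – not NASA's actual protocol.
import json

last_seq_seen = -1   # highest command sequence number executed so far

def make_command(seq: int, name: str, **params) -> bytes:
    """Ground side: serialize a sequenced command as one JSON packet."""
    return json.dumps({"seq": seq, "cmd": name, "params": params}).encode()

def handle(packet: bytes) -> None:
    """Vehicle side: execute new commands, drop stale or duplicated ones."""
    global last_seq_seen
    msg = json.loads(packet)
    if msg["seq"] <= last_seq_seen:
        return                      # old or repeated packet – ignore it
    last_seq_seen = msg["seq"]
    print("executing", msg["cmd"], msg["params"])

# Hypothetical move command sent from the ground and handled on orbit.
handle(make_command(42, "move_to", x=1.0, y=0.0, z=-0.3))
```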

In the long run, free-flying robots like SPHERES could also be used to inspect the exterior of the space station or future deep space vehicles. Robots like the smartphone-enhanced SPHERES and NASA's Robonaut 2 will provide some of the help of another crew member: SPHERES' cameras can act as another set of eyes, while Robonaut 2 literally adds another set of hands to assist with small and bulky items alike. An added bonus is that robots do not require any additional life support.

As with Robonaut 2, all tests to date have occurred in the safety of the space station's interior. However, in the future, upgraded SPHERES may venture outside the orbiting outpost.

“This is no ordinary upgrade – we’ve customized cutting-edge commercial technologies to help us answer questions like: How can robots help humans live and work in space? What will happen when humans explore other worlds with robots by their side? Can we make this happen sooner, rather than later?" said Fong. "Building on our experience in controlling robots on the space station, one day we'll be able to apply what we've learned and have humans and robots working together everywhere from Earth's orbit, to the moon, asteroids and Mars."
Rachel Hoover

Ames Research Center, Moffett Field, Calif.

Sunday, April 8, 2012

LUCAS THE ROBOT WITH THE HUMAN FACE

FROM DEPARTMENT OF DEFENSE ARMED WITH SCIENCE
Dr. Greg Trafton (left) and Lucas the Robot at the Laboratory for Autonomous Systems Research (LASR)


Admittedly, the initial idea of a robot with a face conjures up memories of every single SciFi robot movie I’ve ever seen. Usually involving humans fleeing in terror as the autonomous voice screams “kill, kill” while shooting rockets out of a gun-arm. Or overly negative and depressed, like Marvin the Paranoid Android. Frankly, I’d take my chances with the latter. He’d be a downer, but at least he has no plans for world domination.

Despite my preconceived notions of the robotic overlord race that is sure to enslave (or depress) us all, my experience at the Navy’s new robotics lab was a little less dramatic.  What I discovered was not a legion of soldier robots, but a team of highly trained scientists prepared to explain how they’re working toward a goal of integrating robotics into military life.

The brand-new Laboratory for Autonomous Systems Research (LASR), located at the Naval Research Laboratory (NRL) in Washington, D.C., is spearheading efforts to combine human interaction with robotic skill and capability. The goal is to take the best of both worlds and find a way to make missions easier and more effective for service members. This means everything from locating IEDs to fighting fires.

So how are they doing that? It all starts in the lab, of course.

This complicated and scientific process involves running experiments on autonomous systems in different situations and environments. Luckily, LASR is equipped with environmental rooms designed to provide just that. Scientists who work at the lab can step into the desert for a quick sandstorm, then walk across the hall to the rainforest to run experiments – all without having to set foot outside the Navy’s new robotics laboratory.

“It’s the first time that we have, under a single roof, a laboratory that captures all the domains in which our sailors, Marines and fellow DOD service members operate,” said Rear Adm. Matthew Klunder, chief of naval research. “Advancing robotics and autonomy are top priorities for the Office of Naval Research. We want to reduce the time it takes to deliver capability to our warfighters performing critical missions. This innovative facility bridges the gap between traditional laboratory research and in-the-field experimentation—saving us time and money.”

Several of the projects going on in this lab are working toward creating viable solutions for problems service members might actually face. One of these is Damage Control for the 21st Century – a program to develop firefighting robots for use aboard Navy ships.

Meet Lucas.

Lucas is a computerized cognitive-model robot. This means he’s designed to act the way a person does and react the way a person might. He’s built with a trifecta of skills: mobile, dexterous and social capabilities. This means he’s able to account for people thinking differently (e.g., not always coming to the same conclusions), and he understands human limitations.

This concept is known as “theory of mind,” as Dr. Greg Trafton explained. Trafton, a roboticist at the Navy Center for Applied Research in Artificial Intelligence in NRL’s Information Technology Division, said Lucas was created to appear more human than robot so he could solve human problems in a more practical manner. Basically, Trafton’s working to create robots that think.

Lucas “thinks” using computational theories to work out what a person might be thinking in certain situations. Lucas – and his female counterpart, Octavia – can see and understand words, expressions, even hand gestures.
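
A toy example helps make that idea concrete: the robot keeps its own world model plus a separate belief model per person, updated only by what that person could plausibly have observed. The sketch below is deliberately minimal – far simpler than the cognitive models Trafton's group builds – and every name in it is invented for illustration.

```python
# Minimal illustration of a theory-of-mind style representation: the robot's
# knowledge of the world is kept separate from what each person is modeled
# as believing. All names here are invented for the example.
world = {"fire_in_compartment_3": True}   # what the robot itself knows

class PersonModel:
    """Tracks what one person is assumed to believe."""
    def __init__(self):
        self.beliefs = {}

    def observe(self, fact, value):
        self.beliefs[fact] = value        # person saw it, so they believe it

    def believes(self, fact):
        return self.beliefs.get(fact)     # None = no belief either way

sailor = PersonModel()
# The sailor never observed compartment 3, so the robot should not assume
# the sailor knows about the fire – even though the robot does.
if world["fire_in_compartment_3"] and sailor.believes("fire_in_compartment_3") is None:
    print("alert the sailor: they may not know about the fire")
```

Even at this toy scale, the key property is the separation between what the robot knows and what it models each person as knowing.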
