National Robotics Week: Mars Exploration Rover II
Part II of William Clancey's Q & A:
What are some of the aspects (e.g., design, engineering, computing) that originated with the MER that are used presently on Earth?
The answer to this question probably lies more in our future than near-term.
First, we must remember that the greatest problem of remotely working on Mars is the delay caused by communicating over such a great distance. Even at the speed of light, the commands and data require 5 to 20 minutes each way, depending on the relative positions of the planets. It’s simply not practical to control a robotic system in real time (“joy-sticking”) because you have to move the rover too slowly and must keep stopping to see what you are doing. Instead, we send programs and allow the robot to operate untended throughout the martian day (and with MSL’s radioisotope thermoelectric power system it’s possible to operate the instruments at night).
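The delay quoted above is just light travel time over the Earth–Mars distance. A minimal sketch of that arithmetic, using approximate orbital extremes (the distance figures are illustrative assumptions, not values from the interview):

```python
# Sketch: one-way light-time delay between Earth and Mars.
# Distances are approximate orbital extremes; actual values vary
# continuously as the planets move.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_minutes(distance_km: float) -> float:
    """Light travel time over the given distance, in minutes."""
    return distance_km / C_KM_PER_S / 60.0

CLOSEST_KM = 54.6e6   # roughly, near a close opposition
FARTHEST_KM = 401e6   # roughly, near solar conjunction

print(f"closest:  {one_way_delay_minutes(CLOSEST_KM):.1f} min")   # ~3 min
print(f"farthest: {one_way_delay_minutes(FARTHEST_KM):.1f} min")  # ~22 min
```

The extremes bracket the 5-to-20-minute range quoted above; a command-and-response round trip doubles these figures, which is why interactive "joy-sticking" is impractical.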
On Earth we don’t have this time delay problem, and people can monitor and control robotic actions (navigation, cameras, shovels, etc.) directly like playing a video game. The robot becomes a direct extension of the human body, rather than a proxy like MER that operates without our immediate involvement. So all this means that some of the greatest operating challenges we’ve faced with MER are not necessarily relevant to undersea robots, surgical robotic systems, and so on.
Now this said, the fact is we don’t always want to be involved in controlling systems that might be automated and operate without our continuous monitoring. A good example is the recent progress in developing “auto-mobiles”—cars that can drive themselves. These robotic vehicles use methods that have been under development in Artificial Intelligence laboratories for nearly 50 years. Today, with a range of lightweight sensors and vastly more powerful, inexpensive computers, the control program that drives the vehicle is able to process sonar and visual images in real time, which means a robotic car can actually move at 60 mph down a highway, staying properly spaced in its lane and even taking an exit on a predetermined route. These robotic vehicles use methods related to what we have exploited on MER for long drives over relatively unobstructed terrain. However, most of the software and many of the sensors on the auto-mobiles of today have been developed since MER arrived on Mars.
One of the more interesting applications of the MER programmed, mobile laboratory concept may be in robotic mining for gold. Some mines in South Africa are so deep it takes too long for miners to get to work, and the environment is so extreme that they would need something like a spacesuit to stay cool and dry (instead, companies air-condition extensive, miles-deep tunnels at great cost). Most likely a mixture of programmed robots and robots controlled directly in real time will be employed for mining in coming decades.
Do you see robots dominating the future of space exploration, or do you think it will be a robotic-human collaboration?
I believe it’s inevitable that people will colonize Mars and eventually go beyond our solar system. Automated systems will surely be important for monitoring and controlling life support and other habitat systems, reconnaissance and surveys, scientific measurements, and so on. After a long sol working on the martian surface in a spacesuit, we might be glad to put the rover on autopilot to take us back to the habitat.
A robotic collaborator is something else altogether. For if a robot is capable of collaborating in the manner of a person, it would necessarily have its own projects and point of view, so we might have to entice it to work with us.
I believe what’s desirable instead, as scientists and engineers explain in my book, are automated systems that do more of what we find tedious and uninteresting, and so allow us to focus on the strategic, conceptual, and analytic aspects. Examples are the computer tools that enable planetary scientists to directly plan what a rover or orbiting spacecraft will do when, and other tools that assist the engineers in refining these plans into tested programs.
Nevertheless, the success of IBM’s Watson program suggests another liaison might emerge. As important as search tools are today for scientists, a search program with more capability to extract meaning from text, photographs, and videos will undoubtedly change how research is done and what will be possible. Such tools might be personalized into assistants that know what you’ve read and are trying to learn about, so it’s possible they will approach an early graduate student’s ability in coming decades.
So again, as I said about MER being different from our “horseless carriage” notion of a robot—something that replaces or is identical to what we already know—artificial intelligence may play a role in changing how we do our work and enable people to work alone and together in ways that are not now obvious.
Even today with robotic systems present on Mars, orbiting Saturn, on the way to Pluto, and so on—while astronauts have not left Earth orbit in 40 years—robots do not in any sense dominate space exploration because they are not capable of exploring at all. People are exploring Mars, virtually present on the surface through a combination of “virtual reality” planning and analysis computer tools, which let them see, manipulate, and understand the materials they encounter. Working on Mars is possible because the programmed, mobile laboratories of MER and MSL reliably carry out the operations the scientists prescribe. In effect, the robots don’t replace the scientists; rather, the integrated technology makes the scientists into cyborgs on Mars, as one scientist told me, “with two boots on the ground.”
The story of people and machines in space is going to be like that—the systems that relate our bodies to our actions and our thoughts, and ourselves to each other, will amplify and extend our physical, intellectual, and social reach in ways not yet imagined.