
Telepresence

Robonaut uses several novel techniques for establishing remote control of its subsystems and enabling the human operator to maintain situation awareness.

Telepresence requires that a human operator control the actions of a remotely operated robot. In the case of the Robonaut project, the human operator must control forty-three individual degrees of freedom, so the use of three-axis hand controllers would present a formidable task for the operator. Because Robonaut is anthropomorphic, the logical method of control is a master-slave relationship in which the operator's motions are essentially mimicked by the robot. The operator performs the arm, head and hand motions for the required tasks, and a master-slave control mechanism duplicates those motions in the robot. The goal of telepresence control is to provide an intuitive, unobtrusive, accurate and low-cost method for tracking operator motions and communicating them to the robotic system. Some of the component technologies used in Robonaut's telepresence system include Helmet Mounted Displays (HMDs), force and tactile feedback gloves and posture trackers.
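
As a rough illustration of this master-slave idea, the sketch below samples the operator's tracked joint angles at a fixed rate and commands the robot to the same configuration. The function names, update rate and stubbed sensor reading are hypothetical placeholders, not Robonaut's actual control software.

```python
# Minimal sketch of a master-slave teleoperation loop.
# Function names, update rate and stubbed values are illustrative assumptions.
import time

NUM_JOINTS = 43  # Robonaut's individual degrees of freedom


def read_operator_pose():
    """Return the operator's current joint angles (radians) from the
    tracking hardware. Stubbed here with a neutral pose."""
    return [0.0] * NUM_JOINTS


def send_robot_command(joint_angles):
    """Forward commanded joint angles to the robot's joint controllers
    (placeholder for the real communication layer)."""
    pass


def teleoperation_loop(cycles=1000, rate_hz=50):
    """Master-slave control: the robot mirrors whatever pose the
    operator is currently holding."""
    period = 1.0 / rate_hz
    for _ in range(cycles):
        send_robot_command(read_operator_pose())
        time.sleep(period)


if __name__ == "__main__":
    teleoperation_loop(cycles=10)
```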

[Image: Robonaut Unit B and teleoperator]

Telepresence uses virtual reality display technology to visually immerse the operator in the robot's workspace, so that the teleoperator feels as if he or she is in the robot's place. Visual feedback is provided by a stereo display helmet and includes live video from Robonaut's head cameras. The HMD provides a view into the robot's environment, facilitating intuitive operation and natural interaction with the work site. To be an effective tool for the Robonaut project, the HMD must take into account image registration (stereo or biocular view), field of view (FOV), graphical overlay capabilities and speech recognition capabilities.
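
The sketch below illustrates the basic idea of presenting live stereo video to the operator: two camera feeds are captured and shown side by side, as a stereo display expects. It uses the open-source OpenCV library rather than NASA's display software, and the camera indices are assumptions.

```python
# Minimal stereo-video sketch using OpenCV (not NASA's software).
# Camera indices 0 and 1 are assumed to be the left and right head cameras.
import cv2
import numpy as np

left_cam = cv2.VideoCapture(0)
right_cam = cv2.VideoCapture(1)

while True:
    ok_left, left = left_cam.read()
    ok_right, right = right_cam.read()
    if not (ok_left and ok_right):
        break
    # Present the two registered views side by side, as a stereo HMD expects.
    cv2.imshow("stereo view", np.hstack((left, right)))
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

left_cam.release()
right_cam.release()
cv2.destroyAllWindows()
```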

Controlling Robonaut's highly dexterous fingers and hands is made possible by mapping the motions of the teleoperator's fingers onto the hand and finger motions of Robonaut. Finger tracking is accomplished through glove based finger pose sensors. Bend sensitive materials are used to track the orientation of each of the fingers. That information is used to command the action of Robonaut's fingers. Complex manipulation tasks are then made as intuitive as performing the task with your own hands.
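
A simple way to picture this mapping is sketched below: each raw bend-sensor reading from the glove is scaled into a finger joint angle and collected into a command for the robot hand. The sensor range, joint limits and joint count are illustrative assumptions, not Robonaut's calibration values.

```python
# Minimal sketch of glove-based finger tracking: raw bend-sensor readings are
# mapped onto finger joint commands. All ranges are illustrative assumptions.

SENSOR_MIN, SENSOR_MAX = 0.0, 1023.0   # assumed raw range of one bend sensor
JOINT_MIN, JOINT_MAX = 0.0, 1.57       # assumed joint range in radians (0 to ~90 deg)


def bend_to_joint_angle(raw_reading):
    """Linearly map one bend-sensor reading to a finger joint angle."""
    fraction = (raw_reading - SENSOR_MIN) / (SENSOR_MAX - SENSOR_MIN)
    fraction = min(max(fraction, 0.0), 1.0)  # clamp to the valid range
    return JOINT_MIN + fraction * (JOINT_MAX - JOINT_MIN)


def glove_to_finger_commands(raw_readings):
    """Convert a full set of glove sensor readings into finger joint commands."""
    return [bend_to_joint_angle(r) for r in raw_readings]


# Example: a half-bent hand commands every tracked finger joint to ~45 degrees.
print(glove_to_finger_commands([512.0] * 12))  # joint count is illustrative
```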

Force sensors are built into Robonaut's hands. The forces imparted on Robonaut's fingers can be displayed to the teleoperator through a mechanical exoskeleton worn on the operator's hand. Figure 2 demonstrates how the finger forces measured by Robonaut's force sensors can be used to convey haptic information back to the teleoperator.
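
This force-reflection path can be sketched as a simple scale-and-clamp from the robot's measured finger forces to the feedback forces commanded on the operator's exoskeleton. The gain and safety limit below are illustrative assumptions.

```python
# Minimal sketch of force reflection: fingertip forces measured on the robot
# are scaled and bounded before being displayed on the operator's exoskeleton.
# The gain and limit are illustrative assumptions.

FEEDBACK_GAIN = 0.5       # assumed scale from robot force (N) to operator force (N)
MAX_OPERATOR_FORCE = 5.0  # assumed safety limit on the exoskeleton output (N)


def reflect_forces(robot_finger_forces):
    """Scale each measured finger force and clamp it before display
    on the operator's fingers."""
    commands = []
    for force in robot_finger_forces:
        feedback = FEEDBACK_GAIN * force
        feedback = min(max(feedback, 0.0), MAX_OPERATOR_FORCE)
        commands.append(feedback)
    return commands


# Example: a firm grasp on the robot side becomes a gentler, bounded cue
# on the operator's fingers.
print(reflect_forces([2.0, 8.0, 15.0]))  # -> [1.0, 4.0, 5.0]
```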

Arm, torso and head tracking is accomplished with magnetic position and orientation trackers. Mapping the motions of the operator's limbs and head to the motions of Robonaut's arms and head is accomplished in much the same way as finger tracking. The telepresence system generates robot position commands from the teleoperator's tracked pose. Future telepresence control work will address new methods and algorithms to significantly improve the safety and performance of teleoperated, human-scale dexterous robots during in-space operations.
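
Conceptually, this arm-tracking path resembles the sketch below: a wrist pose reported by the magnetic tracker in the operator's frame is offset into the robot's workspace frame and issued as an end-effector target. The frame offset and pose representation are illustrative assumptions.

```python
# Minimal sketch of pose-based arm teleoperation: a tracked wrist pose in the
# operator's frame is shifted into the robot's workspace frame and used as an
# end-effector target. The offset and quaternion handling are assumptions.

OPERATOR_TO_ROBOT_OFFSET = (0.2, 0.0, -0.1)  # assumed translation in meters


def map_wrist_pose(position, orientation):
    """Translate a tracked wrist position into the robot's frame; the
    orientation (here a quaternion) is passed through unchanged."""
    x, y, z = position
    dx, dy, dz = OPERATOR_TO_ROBOT_OFFSET
    robot_position = (x + dx, y + dy, z + dz)
    return robot_position, orientation


# Example: one pose sample from the magnetic tracker becomes an arm target.
tracked_position = (0.45, -0.10, 0.30)
tracked_orientation = (1.0, 0.0, 0.0, 0.0)  # identity quaternion
print(map_wrist_pose(tracked_position, tracked_orientation))
```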

Developing dedicated software tools for real-time, camera-based human posture tracking, along with text and graphical advising capabilities, will achieve this goal for robot operators. These features will allow natural and unencumbered control of anthropomorphic robots while minimizing training and maximizing robot performance. These new technologies have the potential to provide any telepresence interface with real-time operator tracking and audio-visual task feedback. Operators of dexterous space robots will take full advantage of the robots' high performance only if teleoperation is made easy and safe.
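
As a rough indication of what camera-based posture tracking looks like, the sketch below uses the open-source MediaPipe library (not the project's own tools) to extract body landmarks from a webcam feed; such landmarks could then drive a mapping like the ones sketched above.

```python
# Minimal camera-based posture-tracking sketch using MediaPipe (a third-party
# open-source library, not NASA software). The webcam index is an assumption.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)  # assumed webcam index

with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Each landmark carries normalized x, y, z coordinates that could
            # be mapped to robot arm and head commands.
            wrist = results.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_WRIST]
            print(f"right wrist: ({wrist.x:.2f}, {wrist.y:.2f}, {wrist.z:.2f})")

cap.release()
```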
