NASA - National Aeronautics and Space Administration

Head


Robonaut's head is still a work in progress, but the existing system includes an articulated neck that allows the teleoperator to point Robonaut's cameras, which serve as its eyes. The head's two small color cameras deliver stereo vision to the operator's helmet display, yielding a form of depth perception. The inter-ocular spacing of the cameras matches typical human eye spacing, with a fixed vergence at arm's reach. The neck drives are commanded using a 6-axis Polhemus sensor mounted on the teleoperator's helmet, and can track the velocities of typical human neck motions. Like the arms, the neck's endoskeleton is covered in a fabric skin, which is fitted into and under the helmet.
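The fixed vergence described above is simple trigonometry: each camera toes in toward a fixation point, and the total vergence angle follows from the camera spacing and the fixation distance. A minimal sketch, using an illustrative 65 mm spacing (typical human interocular distance) and 0.7 m fixation distance (roughly arm's reach) rather than published Robonaut dimensions:

```python
import math

def vergence_angle_deg(interocular=0.065, fixation_dist=0.7):
    """Total vergence angle (degrees) for two cameras converging
    on a point straight ahead.

    The default values (65 mm spacing, 0.7 m fixation distance)
    are illustrative assumptions, not Robonaut specifications.
    """
    # Each camera toes in by atan((baseline/2) / distance);
    # the total vergence angle is twice that.
    return math.degrees(2 * math.atan((interocular / 2) / fixation_dist))
```

With these assumed values the cameras converge by only about five degrees in total, which is why a fixed vergence is a workable compromise for tasks at arm's reach.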

The helmeted approach is unusual in the robotics world, where cameras are typically mounted in exposed locations on pan-tilt-verge units. Robonaut's requirements for a rugged design and for working with astronauts in cluttered environments drove the development toward better protection, such as the helmets that humans wear here on Earth. The helmet is made of an epoxy resin, "grown" on a stereolithography machine at the Johnson Space Center, and protects Robonaut from collisions.


The neck joint designs share substantial commonality with the arm joints, and are controlled with the same real-time control system. Their kinematics are based on a pan-tilt serial chain, with a first rotation about Robonaut's spine followed by a pitch motion about a lateral axis. The pitch axis does not pass through the cameras' CCDs, but instead lies below them, as with the atlas joint in the human neck. This offset (actually a D-H link length) allows the cameras to translate forward, letting Robonaut direct its vision downward over its chest.

A new set of articulating eyes has been built for Robonaut. The pointing system directs two pairs of eyes, verging each pair independently for tracking humans and objects. Each pair includes a large camera with computer-controlled zoom, focus and iris adjustments, as well as a smaller camera that provides peripheral vision. The system has been assembled and integrated with the brainstem for pointing control and calibration. The next step is integration with the visual cortex, followed by insertion of the system into the robot's helmet, replacing the old cameras.
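The effect of that D-H link length can be seen in a minimal forward-kinematics sketch of the pan-tilt chain: pan about the vertical spine axis, pitch about a lateral axis, with the cameras offset above the pitch axis. The 10 cm offset used here is an illustrative guess, not a Robonaut dimension.

```python
import math

def camera_position(pan, pitch, offset=0.10):
    """Camera position relative to the neck's pitch axis, for a
    pan-tilt serial chain whose cameras sit a link length above
    the pitch axis.

    offset (0.10 m) is an assumed value for illustration only.
    Angles are in radians; pitch > 0 tilts the head forward/down.
    """
    # Pitching about the lateral axis swings the offset link,
    # translating the cameras forward as the head looks down.
    forward = offset * math.sin(pitch)
    height = offset * math.cos(pitch)
    # Panning about the spine axis rotates that forward vector.
    return (forward * math.cos(pan),   # x: forward
            forward * math.sin(pan),   # y: lateral
            height)                    # z: above the pitch axis
```

At zero pitch the cameras sit directly above the axis; pitched fully forward, the same link length carries them out over the chest, which is what lets Robonaut look down past its own body.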
