Imagine a soldier wounded on the battlefield in a place so dangerous that even a squad's intrepid medic cannot offer assistance.
Up rolls "Robo Sally," with a strong robotic torso and 3-D capable cameras for eyes. Its dexterous arms and fingers carry out pre-set programming or remote commands to patch up a wounded soldier and carry him to safety.
This fantasy could become the reality for future warfighters through new technology developed by the Johns Hopkins University Applied Physics Laboratory.
Robo Sally, a mobile high-dexterity manipulator, combines several prototype technologies into a remote-controlled, four-wheeled robotic torso. A team from the university's lab put it on display at the Association for Unmanned Vehicle Systems International (AUVSI) convention Wednesday in Washington, D.C.
Its central components are the arms, designed for a project funded by the Defense Advanced Research Projects Agency (DARPA) as prosthetics for amputees and those with limited use of their limbs.
Together, the technologies within this four-wheeled robot could help explosive ordnance disposal technicians disarm bombs more efficiently, project engineers say, or aid space explorers who want to get their (robotic) hands dirty on other planets.
Program Manager Matthew Johannes shows one of the robot’s hands, containing 10 separate motors, which can be detached and interchanged.
Ten motors in each hand allow the robot to grip roughly 20 pounds, the engineers say, and the arms can curl 45 pounds.
Robo Sally uses the sensor from Microsoft's Xbox Kinect, mounted on its shoulder, to help determine what is in front of it, and can currently pick up solid objects such as a cylinder or a ball. Program Manager Matthew Johannes says his team is working on developing more dexterity, allowing it to perform tasks such as picking up a toothbrush and squeezing a toothpaste tube.
The robot presently relies on an operator wearing wrist and waist bands, special gloves and a headset that provides 3-D vision of what the robot sees. The operator has to adjust for a lag in the feed and limited visibility.
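The control scheme described above, where an operator's movements are mirrored onto the robot while the video feed arrives a beat late, can be sketched in miniature. This is a hypothetical illustration, not the lab's actual software; every name, joint limit and lag value below is an assumption chosen for the example.

```python
# Hypothetical teleoperation sketch: operator sensor readings (joint
# angles from the wrist/waist bands and gloves) are mirrored onto robot
# joint commands, and a short delay buffer models the video-feed lag
# the operator must compensate for. All numbers are illustrative.
from collections import deque

FEED_LAG_FRAMES = 3  # assumed round-trip delay, in control frames


def mirror_pose(operator_pose):
    """Map operator joint angles (degrees) onto robot joints,
    clamped to an assumed safe range of motion of +/-90 degrees."""
    return {joint: max(-90.0, min(90.0, angle))
            for joint, angle in operator_pose.items()}


def run_teleop(pose_stream, lag=FEED_LAG_FRAMES):
    """For each incoming operator pose, yield (command sent now,
    feedback frame the operator currently sees). The feedback trails
    the command by `lag` frames, modeled with a fixed-length deque."""
    feedback = deque([None] * lag, maxlen=lag)
    for pose in pose_stream:
        command = mirror_pose(pose)
        feedback.append(command)       # newest frame enters the pipe
        yield command, feedback[0]     # oldest frame reaches the headset


# Example: the operator sweeps a wrist from 0 to 120 degrees.
poses = [{"wrist": a, "elbow": a / 2} for a in (0, 30, 60, 120)]
for cmd, seen in run_teleop(poses):
    print(cmd, seen)
```

The out-of-range 120-degree reading is clamped to 90, and for the first `lag` frames the operator sees nothing at all, which is the gap a human pilot, or the planned automatic routines, would have to bridge.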
The next step involves programming automatic movements into the robot, Johannes says, allowing it to perform set tasks on its own.
This kind of “cutting edge research” lays the groundwork for a future where robotic limbs take over dangerous jobs or help those who can’t use their own arms, he says.
The operator wears a special headset and sensors to control Robo Sally’s arms, waist and hands. The screen at right shows the resulting 3-D image from its cameras.