DARPA Robotics Challenge
DARPA, a government agency known for innovative advances in technology, sponsored the Robotics Challenge to promote critical improvements in robotics technology for disaster relief operations—especially where severe hazards make human intervention too dangerous, or where time constraints are critical. For example, an accident at a nuclear power plant involving high levels of radiation might prove too dangerous for humans yet still require a rapid response.
While DARPA required that the emergency response robots “must be compatible with human operators, environments and tools,” it did not require humanoid form. But the IHMC focus on humanoid robots is rooted in a simple concept: Because the robots will be working in environments built for humans, a human-like robot is best-suited to the challenges involved.
A robot responding to a typical disaster scenario is likely to face rubble and other obstacles, unsettled ground, closed or stuck doors and windows, ladders and other challenges. Humanoid robots that can navigate human-scale environments and use human-type tools will have an advantage.
Obviously, sending robots into dangerous situations eliminates the risk to human responders.
Moreover, teaming them with human partners in a system that optimizes their complementary strengths can enhance their performance. For instance, a human at a remote monitoring post, surveying the scene through the robot's "eyes," might be called on to make rapid judgments a robot is poor at, such as quickly determining the best path through a debris field. The robot, equipped with precise scanning tools, can quickly provide accurate measurements of distance or size in situations where humans could only make rough estimates.
The challenges to be addressed are truly “DARPA Hard” – extremely high in technical risk, but equally high in operational payoff. As a result, this project will address several unsolved problems in collaborative design of human-machine interfaces, humanoid balance and walking algorithms, humanoid motion planning in complex 3D environments, and combined humanoid mobility and human-machine team manipulation.
But the IHMC team is confident it has an innovative approach to combine the cognitive abilities of humans, computational power, and the physical abilities of robots to handle novel situations in a new way.
The IHMC DRC Team:
- Jerry Pratt, Research Scientist, IHMC Team Leader
- Matt Johnson, Research Scientist, IHMC Team Leader
- John Carff, Research Scientist
- Twan Koolen, Research Scientist
- Jesper Smith, Research Scientist
- Gray Thomas, Research Scientist
- Luca Colasanto, Intern
- Daniel Duran, Intern
- Jeff van Egmond, Intern
- Khai-Long Ho Hoang, Intern
- Doug Stephen, Intern
- Eric Morphis, Intern
- Nathan Mertins, Intern
- Stephen McCrory, Intern
VRC Highlights - February 28, 2013
This video shows the latest progress of the IHMC DARPA Robotics Challenge team. It focuses heavily on walking capabilities, but also includes some other developments such as multi-contact balance and steering wheel control. In the walking work, we tried to address a variety of challenges to bipedal locomotion that might be encountered during a disaster response. These include: sloped terrain, uneven terrain, gaps, and stairs. We also demonstrate some body constraints during bipedal locomotion, such as squeezing sideways between obstacles and ducking under obstacles.
30 DoF Robot Climbing Steep Stairs
A 30 DoF simulated test robot climbing stairs with a rise of 35 cm and a tread of only 15 cm, at a rate of 7.9 seconds per ten steps. Stairs this steep prevent the robot from balancing on a step through center of pressure (CoP) placement within the full foot polygon, forcing us to focus on the use of angular momentum and improved weight-distribution algorithms for balance control. In addition, combined ground reaction force control and foot orientation control is necessary, with the orientation of the foot constrained by the line contact with the step edge. We are also able to climb stairs of variable height.
Stair climbing fits nicely into our momentum- and instantaneous-capture-point-based control framework. In addition, nearly the same high-level behavior is used for flat-ground walking and stair climbing, with only minor modifications (for example, to the center of mass height trajectory). We plan to make full use of the algorithms developed for variable-height stairs in the DRC, enabling dynamic locomotion over extremely rough terrain.
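The instantaneous capture point has a compact closed form under the linear inverted pendulum model: the point on the ground where the robot would need to step to come to a stop, given its current center of mass (CoM) state. As a minimal sketch (the function and the point-mass simplification are illustrative, not IHMC's actual controller code):

```python
import math

def capture_point(com_pos, com_vel, com_height, g=9.81):
    """Instantaneous capture point under the linear inverted pendulum
    model: xi = x + xdot / omega0, where omega0 = sqrt(g / z0).

    com_pos, com_vel: (x, y) ground-plane position and velocity of the CoM.
    com_height: constant CoM height z0 assumed by the model, in meters.
    """
    omega0 = math.sqrt(g / com_height)
    return tuple(p + v / omega0 for p, v in zip(com_pos, com_vel))

# CoM directly over the origin, moving forward at 0.5 m/s at 1.0 m height:
# the capture point lies roughly 0.16 m ahead of the CoM.
cp = capture_point((0.0, 0.0), (0.5, 0.0), 1.0)
```

The same computation applies on flat ground and on stairs, which is one reason a single high-level behavior can cover both cases.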
Simple Force Controlled Manipulation
Demo0 - the first internal demo for the IHMC DARPA Robotics Challenge effort.
Since most of the efforts of the IHMC Robotics lab have been in bipedal locomotion, we felt a need to show ourselves that we could do basic manipulation as well.
The goal of Demo0 was to demonstrate our ability to do simple force-controlled manipulation in the absence of locomotion. For this purpose, we created a simulated world containing three tables of different heights and two boxes. We then had our 30 degree-of-freedom simulated robot move the boxes between the tables. The operator can select a box, as well as where to put it, by clicking in the simulated world. When selecting a box, the operator can specify the faces of the box that the robot will use to pick it up; depending on the robot's orientation with respect to the box, the operator can easily choose the best two faces for a secure grip. Commands to pick up the selected box and to put it down can also be given through the GUI. Once a box has been selected for pick-up, the operator chooses the target destination by moving a semi-transparent blue box of the same size and shape to the desired location.
The robot performs position control on the box in Cartesian space, both with respect to the world and with respect to its own chest, while also controlling the compression forces exerted on the box by the hands. While performing these tasks, the robot also controls its posture. The robot can lean towards a box, rotate its pelvis and chest independently, as well as control its center of mass height from anywhere between a full crouch to an upright stance.
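The combination of Cartesian position control on the box with regulated compression forces can be sketched at a high level: the box's position error drives both hands together, while the force error adjusts how hard the palms squeeze. The following is a hypothetical simplification (the function, gains, and interface are our illustrative assumptions, not the actual controller):

```python
def dual_hand_command(box_pos, box_pos_des, grip_force_des,
                      measured_force, kp=50.0, kf=0.1):
    """Sketch of dual-hand box control (illustrative, not IHMC's code).

    box_pos, box_pos_des: current and desired Cartesian box position (x, y, z).
    grip_force_des, measured_force: desired and measured compression force, N.
    Returns (pos_correction, squeeze_offset):
      pos_correction - proportional command moving both hands together, and
      squeeze_offset - change to inter-palm distance (positive = squeeze
      harder when measured compression is below the setpoint).
    """
    pos_correction = [kp * (d - p) for p, d in zip(box_pos, box_pos_des)]
    squeeze_offset = kf * (grip_force_des - measured_force)
    return pos_correction, squeeze_offset
```

In a full controller these commands would be resolved alongside the posture objectives described above (pelvis and chest orientation, center of mass height), typically through a whole-body prioritized or optimization-based scheme.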
During Demo0, the robot's fingers were locked in fixed positions; the problem of grasp control was deferred to a later demo. The robot morphology was close to, but not identical to, the GFE robot.
DRC driving task outtakes.
The GFE robot sits on a box and is supposed to manipulate the steering wheel (which is comically large for testing purposes). There was a bug in the specification of the desired configurations of the hands with respect to the steering wheel, resulting in the behavior shown in the video.