Building and fine-tuning robotic systems takes a lot of time. That is especially true for robots designed to interact with and manipulate an ever-changing array of objects in Amazon facilities. Developing robotic systems in a virtual environment can accelerate the process, but it is harder than it looks.
Engineers have been accelerating new-product design using digital models and virtual simulations for decades. But these existing tools don’t meet Amazon’s need to develop and scale its fleet of complex robots.
To understand why, consider video games. Modern video games simulate worlds that look visually realistic at interactive rates.
“Take a race car game, for example. Everything looks physically plausible, but the forces behind the movements aren’t necessarily accurate,” says Andrew Marchese, an Amazon Robotics principal applied scientist who specializes in robotic manipulation. “They approximate some of the torques and forces that push and pull an object in the real world. So, a car’s acceleration may look realistic, even though the car’s engine is not big enough to generate the force needed to jump across the missing section of a bridge.”
Many industrial simulations also rely on approximations. Amazon, for example, uses visual simulators to plan its facilities and approximate how robots will move and interact safely with associates.
Comparing real and simulated Robin workcells
This side-by-side comparison shows the same perception and motion planning software driving both a real and simulated Robin robotic workcell.
“To develop complex robotic manipulation systems, we need both visual realism and accurate physics,” says Marchese. “There aren’t many simulators that can do both. Moreover, where we can, we need to preserve and exploit structure in the governing equations — this helps us analyze and control the robotic systems we build.”
The more complex the system, the more likely those small gaps between virtual and physical devices are to turn into chasms. Developers in the field call this the sim2real gap.
“This is why it is commonplace in robotics to write and test code against physical systems,” Marchese says. “But this approach is not scalable for the variety of types and configurations of robots Amazon is developing. Doing things this way, there is just not enough time or hardware for everyone on a project team to keep testing a system until they get it right.
“Our ambition is to develop robots in simulation first,” Marchese adds. “We want to write software against virtual robots, test it in realistic simulations, verify safety on a real robot, and deploy. And our team is making real progress in doing this.”
Modeling the underlying physics
To achieve this vision, Amazon must create models not only of complex robots but also of the objects they will interact with regularly.
A robotic arm, for example, might include a pneumatic gripper with multiple suction cups on the end. A model of that arm must evaluate the flow of air through the gripper’s tubes and valves, the contact forces of the rubber cups on a package, how the deformation of the cups during contact changes airflow, and what happens if only some cups make contact.
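The kind of coupling described above can be sketched in a few lines. The model below is purely illustrative, not Amazon's actual gripper model: it assumes a pump of fixed volumetric flow, treats each imperfectly sealed cup as an orifice leak, and derives the achievable vacuum (and thus lift force) from the balance between pump flow and leakage. All numbers and function names are hypothetical.

```python
# Illustrative sketch (not Amazon's actual model): a suction gripper in which
# each cup's seal quality gates airflow, and total lift force depends on the
# vacuum achieved across the cups that actually sealed against the package.

P_ATM = 101_325.0  # ambient pressure, Pa


def cup_leak_area(seal_quality: float, max_leak_area: float = 1e-5) -> float:
    """Leak area (m^2) of one cup; seal_quality in [0, 1], 1 = perfect seal."""
    return max_leak_area * (1.0 - seal_quality)


def vacuum_pressure(seal_qualities, pump_flow: float = 2e-3) -> float:
    """Steady-state pressure drop (Pa): a fixed-flow pump balanced against
    leakage through imperfectly sealed cups, using a simple orifice model."""
    total_leak = sum(cup_leak_area(s) for s in seal_qualities)
    if total_leak == 0.0:
        return 0.3 * P_ATM  # assumed pump limit: 30% of atmospheric
    rho = 1.2  # air density, kg/m^3
    # Orifice equation: Q = A * sqrt(2 * dP / rho)  =>  dP = (rho / 2) * (Q / A)^2
    dp = 0.5 * rho * (pump_flow / total_leak) ** 2
    return min(dp, 0.3 * P_ATM)


def lift_force(seal_qualities, cup_area: float = 1e-3) -> float:
    """Total lift force (N): vacuum pressure acting over sealed cups only."""
    dp = vacuum_pressure(seal_qualities)
    sealed_area = cup_area * sum(1 for s in seal_qualities if s > 0.5)
    return dp * sealed_area


# Four cups versus four cups with one failed seal: one bad seal both bleeds
# off vacuum and removes effective area, so lift drops sharply.
force_all = lift_force([1.0, 1.0, 1.0, 1.0])
force_partial = lift_force([1.0, 1.0, 1.0, 0.0])
```

Even this toy version shows why partial contact matters: a single failed seal degrades the vacuum for every other cup, which is exactly the kind of coupled behavior a high-fidelity model must capture.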
Understanding Robin vacuum gripper behavior
As shown in this video of Robin’s vacuum tool, Amazon’s workcell simulations model the robot’s end-of-arm tool. These high-fidelity pneumatic and multi-body models enable developers to test both nominal and anomalous behavior — like dropping packages.
This video demonstrates how Amazon’s models can mimic successful robot behavior as well. Amazon scientists and engineers use these types of experiments to calibrate and validate their models.
The simulation must also capture how the robot’s vision system identifies individual items in a pile of mixed packages, and how its arm calculates the approach angle and force needed to lift each one. That is a lot to do in a single simulation environment, especially at high fidelity.
“The complexity of Amazon’s facilities makes this an even greater challenge,” says Clay Flannigan, Amazon Robotics senior manager, advanced robotics.
“Simulating robots is hard because robots interact with the world and the world is complex,” Flannigan explains. “There are many simulators that understand the movement of rigid robots in free space. But we stock essentially millions of items, and we want our robots to be able to interact with millions of different items in our inventory. This is an enormously difficult robotics challenge.”
Consider, for example, the range of packages a robotic arm might encounter. They include rigid boxes that hold a single, immobile object encased in cardboard or foam. That box is straightforward to model. Other boxes look the same on the outside but contain products that may shift their weight when lifted. Harder still are bubble-wrap mailers that deform and shift their center of gravity when lifted.
Given the number of packages Amazon handles every day, creating one-off models based on empirical tests isn’t feasible. Instead, Flannigan says, the company wants to model the underlying physics of these interactions.
An accurate first-principles model requires highly detailed physics. In addition to airflow, a model of a pneumatic gripper must also capture contact forces, inertia, friction, and aerodynamics. While the physics are well understood, their application to individual components must be verified to ensure the models are accurate.
Building and verifying such models is a massive undertaking. Fortunately, though, MIT researchers have been working on a toolkit to model robotic components for years. It is called Drake.
Building a platform
Drake — the brainchild of Russ Tedrake, director of MIT’s Center for Robotics and vice president of the Toyota Research Institute — is an open-source toolbox for modeling and optimizing robots and their control systems.
The open-source part is critical to Amazon. Many modeling tools provide little or no insight into how their solvers produce their simulations. Drake, on the other hand, reveals its governing equations. “This lets us poke at the underlying physics and modify how they are applied,” Flannigan says. “If there is a bug, we can find it and fix it.”
Drake brings together several desirable elements for online simulation. The first is a robust multibody dynamics engine optimized for simulating robotic devices. The second is a systems framework that lets Amazon scientists write custom models and compose them into complex systems that represent actual robots. “At first the framework can seem a bit formal, but it is actually key to reusing and integrating components within large models,” Marchese says. The third is what he calls a “buffet of well-tested solvers” that resolve the numerical optimizations at the core of Amazon’s models, sometimes as often as every time step of the simulation.
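The systems-framework idea can be illustrated with a deliberately tiny toy, which is not Drake's actual API: components expose named input and output ports, get wired together into a diagram, and the diagram is evaluated as one coupled system. In Drake itself, classes such as `LeafSystem` and `DiagramBuilder` play these roles; everything below, including the pump and gripper numbers, is hypothetical.

```python
# Toy illustration of a port-based systems framework (not Drake's actual API).

class System:
    """A component with named outputs computed from named inputs."""
    def __init__(self, name, inputs, outputs, update):
        self.name, self.inputs, self.outputs = name, inputs, outputs
        self.update = update  # maps {input: value} -> {output: value}


class Diagram:
    """Wires systems together and evaluates them in dependency order."""
    def __init__(self):
        self.systems = []
        self.wires = {}  # (dst system, input port) -> (src system, output port)

    def add(self, system):
        self.systems.append(system)
        return system

    def connect(self, src, out_port, dst, in_port):
        self.wires[(dst.name, in_port)] = (src.name, out_port)

    def evaluate(self, externals):
        # externals: values for ports not driven by any wire.
        values = dict(externals)  # keyed by (system name, port name)
        for comp in self.systems:  # assumes systems were added in dependency order
            ins = {}
            for port in comp.inputs:
                src = self.wires.get((comp.name, port), (comp.name, port))
                ins[port] = values.get(src)
            for port, v in comp.update(ins).items():
                values[(comp.name, port)] = v
        return values


# Compose a (fictional) vacuum pump feeding a gripper model.
d = Diagram()
pump = d.add(System("pump", [], ["flow"], lambda _: {"flow": 2e-3}))
grip = d.add(System("gripper", ["flow"], ["force"],
                    lambda i: {"force": 5.0e4 * i["flow"]}))
d.connect(pump, "flow", grip, "flow")
state = d.evaluate({})
```

The payoff of this formality is exactly what Marchese describes: the gripper model knows nothing about the pump, so either component can be swapped out or reused in a different diagram without touching the other.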
Another key feature is its robust contact solver. It calculates the forces that occur when rigid-body items interact with one another in a simulation.
“Figuring out those forces is a really difficult problem,” Marchese says. “If you don’t have a good contact solver, you might use the wrong force to grip an object, and drop it.”
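One common family of contact models, compliant (or "penalty") contact, gives a feel for what a contact solver computes; the sketch below uses a Hunt-Crossley-style force law for a single point contact. It is a simplified illustration, not Drake's full solver, which handles many simultaneous contacts and friction; the parameter values are arbitrary.

```python
def contact_force(penetration: float, approach_velocity: float,
                  stiffness: float = 1e4, dissipation: float = 1.0) -> float:
    """Compliant ('penalty') normal force for a single point contact.

    penetration > 0 means the bodies overlap; approach_velocity > 0 means
    they are still moving toward each other. The Hunt-Crossley-style damping
    term scales with penetration, so the force rises continuously from zero
    at first touch instead of jumping discontinuously.
    """
    if penetration <= 0.0:
        return 0.0  # no overlap, no contact force
    force = stiffness * penetration * (1.0 + dissipation * approach_velocity)
    return max(force, 0.0)  # contact can push bodies apart, never pull
```

For example, a 1 mm overlap at rest produces 10 N with these made-up parameters, and the same overlap during approach produces more, because the dissipation term resists the incoming motion. Getting such forces wrong for every cup-package contact is precisely how a simulated grip can fail in ways the real one would not, or vice versa.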
Drake’s powerful features make it a critical platform for Amazon’s virtual robot development plans. In fact, Drake is now a strategic project for Amazon. This enables Amazon developers to work more closely with and make code contributions to Drake. In addition, last year, Amazon and MIT launched a Science Hub, a collaboration focused on areas of mutual interest, including robotics.
Changing robot development
While there will always be a sim2real gap, Amazon scientists and engineers are working to narrow it. One way they do that is by using real data to validate the fidelity of the simulator.
“We are always comparing the model with the hardware,” Flannigan says. “If we get the first principles right, the error in the model converges over time. There is always some uncertainty in our model, but once we quantify this, it is relatively easy to apply it again in similar applications.”
The bigger challenge remains in deformable objects — things that bend, flap, twist, and sag. The Amazon and Drake teams are both making progress on handling soft bodies with large deformations, like stuffed animals or squishy pet toys.
That is a challenge Vanessa Metcalf, an Amazon Robotics software development manager, is addressing. “Right now, we don’t have a practical way to empirically understand how a robot will pick up millions of different deformable items.
Watch the Robin robotic arm deftly handling packages
“Finding a model in simulation that we can apply to a broad category of products is a massive challenge, and we’re looking for ways to address it. For example, are there objects that have deformable parts but also rigid parts that are easier to model? We’re looking at what we can do first and build on that.”
Despite the challenges, Amazon simulations are already yielding results. One of the Amazon Robotics program teams came up with a new robotic manipulation concept they thought might improve fulfillment. They were able to use the simulator developed by Metcalf’s team to quickly validate the idea.
“It took about a month to test the concept in simulation,” Metcalf says. “It turned out to be a great idea that’s being implemented now. If we had to wait for the hardware to do the concept validation, it would have taken three times as long. That’s just one of many examples of how simulation can be incredibly impactful.”
As Amazon continues to chip away at simulation challenges, it is continuously improving its modeling infrastructure. And with good reason.
Solving these challenges and achieving high-fidelity simulation would enable scientists and engineers to test new ideas and novel configurations as quickly as they could type their thoughts on a keyboard. They could generate conditions that rarely occur in prototype physical experiments, but that happen regularly within an organization that has robots that help deliver millions of packages a day. Teams could collaborate on different parts of a project simultaneously. No one would have to wait their turn for someone to reconfigure a robot prototype to test a new idea.
“Our dream is that all of our robotics research and development starts in simulation,” Metcalf says. “When someone has an idea, their first reaction would not be to order parts, but to use the simulator. They could develop an entire robotic workcell in a virtual environment, with a final safety check occurring on hardware.”
This reality is on the horizon, suggest Metcalf, Marchese, and Flannigan. Although physics-based simulation still has open challenges, Amazon is making real progress, and the tools are already accelerating how Amazon develops new robots. Ultimately, this will result in more smiles from Amazon customers, and ever-improving safety in its facilities.