Novel hardware-based modeling approach for multi-robot tasks

The technological revolution means robots are no longer a thing of the future. The Governor of the Bank of England has predicted that up to half of the British workforce could face redundancy in the coming ‘second machine age’. Small wonder, then, that research on multi-robot systems generates serious buzz, both for its promising (if at times unnerving) results and for its prospects for real-world application.

Franka: A Robot Arm That’s Safe, Low Cost, and Can Replicate Itself

This factory robot can be trusted not to kill its human coworkers

Photo-illustration: Edmon de Haro


Sami Haddadin once attached a knife to a robot manipulator and programmed it to impale his arm. No, it wasn’t a daredevil stunt. He was demonstrating how a new force-sensing control scheme he had designed could detect the contact and instantly stop the robot, which it did.

Now Haddadin wants to make that same kind of safety feature, which has long been limited to highly sophisticated and expensive systems, affordable to anyone using robots around people. Sometime in 2017, his Munich-based startup, Franka Emika, will start shipping a rather remarkable robotic arm. It’s designed to be easy to set up and program, which is nice. But what makes it special is that, unlike typical factory robots, which are so dangerous they are often put inside cages, this arm can operate right next to people, assisting them with tasks without posing a risk.

And did I mention that it can build copies of itself?

The robot, also called Franka Emika—“It’s like first and last name,” Haddadin explains—is not the only one ever designed to operate alongside human workers. Indeed, this type of system, known as a collaborative robot, or cobot, is one of the fastest growing segments in the robotics market, with global sales expected to jump from US $100 million in 2016 to over $3.3 billion in just five years, according to one estimate.

All the big industrial robot makers are trying to develop their own cobots, but the most innovative designs have come from startups. Rethink Robotics introduced its Baxter dual-arm robot in 2012, and more recently it unveiled a single-arm robot called Sawyer. The cobot sector, however, is currently dominated by Danish company Universal Robots, which ships thousands of robots each year. Even so, such robots remain pretty rare. Expect that to change rapidly over the next few years as Haddadin’s company—which is financially backed by a group of investors that include German robot maker Kuka—and other firms enter the market.

Haddadin, who’s worked at one of Germany’s top robotics labs and had a brief stint at the celebrated robotics company Willow Garage in Silicon Valley, says one thing that will set Franka apart from the competition is its manipulation skills. While some of its specs [PDF]—seven axes of motion, 80-centimeter reach, 3-kilogram payload, and 0.1-millimeter accuracy—are comparable with those of other robots, Franka is designed to perform tasks that require direct physical contact in a carefully controlled manner. These include drilling, screwing, and buffing, as well as a variety of inspection and assembly tasks that electronics manufacturers in particular have long wanted to automate.

Franka has more dexterity than is typical for a robotic arm because it is what is known as a torque-controlled robot. It uses strain gauges to measure forces on all of its seven joints, allowing it to detect even the slightest collisions. In contrast, most industrial robots have no force-sensing capabilities at all—and that’s why they are dangerous: They’ll take you out and won’t even notice it.

One prerequisite for torque control is an extremely detailed model of your robot’s dynamics. You need to factor in even the smallest effects, such as elasticity, vibration, and friction in the components. That’s because torque control works by comparing actual force measurements on the robot to reference values computed from a model in real time. So if your model is off, your control will be off too.
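
To make that comparison concrete, here is a minimal sketch, in Python, of model-based contact detection; the threshold values and function names are illustrative assumptions, not Franka Emika's actual software.

import numpy as np

# Per-joint residual thresholds in newton-meters (assumed values for a seven-joint arm).
COLLISION_THRESHOLD = np.array([8.0, 8.0, 6.0, 6.0, 4.0, 3.0, 2.0])

def detect_contact(measured_tau, model_tau):
    """measured_tau: joint torques reported by the strain gauges.
    model_tau: torques the dynamics model predicts for the commanded motion,
    accounting for inertia, Coriolis forces, gravity, and friction. A large
    residual on any joint means the arm is pushing against something the
    model did not expect."""
    residual = np.abs(np.asarray(measured_tau) - np.asarray(model_tau))
    return bool(np.any(residual > COLLISION_THRESHOLD))

# Inside a high-rate control loop (pseudocode):
#   if detect_contact(tau_from_sensors, tau_from_dynamics_model):
#       stop_motion()  # halt immediately, as in the knife demonstration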

Haddadin saw that not as a hurdle but as an advantage. “The truth is, I model the hell out of everything I build,” he says. Gerd Hirzinger, a pioneer of torque-controlled robots and one of Haddadin’s mentors at the German Aerospace Center’s Institute of Robotics and Mechatronics, called Franka a “long-yearned-for breakthrough.”

Another factor that will make Franka stand out is cost. At the time of this writing, the robot was available for preorder at a yet-to-be-confirmed price of €9,900, or about $10,500. That’s a startlingly low figure for such a capable robotic arm. For comparison, Rethink’s Sawyer sells for $29,000, and Universal Robots’ best-selling UR5 costs even more, at $35,000.

Henrik Christensen, director of the Contextual Robotics Institute at the University of California, San Diego, says Franka is “an impressive piece of hardware.” But he adds that with cobots the main challenge is “not just the hardware but also the software to make it easily accessible to nonexperts.” Universal Robots, he says, is “beating the competition by having by far the best user interface.” So that’s an area where Franka will need to prove itself.

Haddadin says his company devoted just as much attention to software as it did to the design of the robot itself. Users can program Franka by moving it with their hands and tapping on a touch screen, with a variety of preprogrammed motions readily available. And once you’ve created a program for one Franka, you can just copy it over the cloud to one or more other Frankas.

But perhaps the most ambitious part of Haddadin’s plan is getting Franka to essentially clone itself. During initial production runs, the robot was performing about 80 percent of the work, but the goal is 100 percent, he insists. Looking further into the future, Haddadin envisions sending containers all around the world as mobile robot factories. “Inside there will be Frankas building Frankas,” he says.

Hordes of self-replicating robots popping up everywhere? For whatever it’s worth, it’s probably a good thing Haddadin is making them very human friendly—even when holding a knife.

This article appears in the January 2017 print issue as “Employee of the Month. Every Month.”


A Parallel Air Traffic Control System Will Let Delivery Drones Fly Safely

Engineers are figuring out how to let drones fly beyond visual range

Photo-illustration: Edmon de Haro


In 2013, shortly before Christmas, Amazon.com released a video depicting its plans to speed packages to their destinations using small drones. Some commentators said it was just a publicity stunt. But the notion began to seem less far-fetched when Google revealed its own drone-based delivery effort in 2014, something it calls Project Wing. And in the early months of 2016, DHL actually integrated drones into its logistics network, albeit in an extremely limited way—delivering packages to a single mountaintop in Germany that is difficult to access by car in winter.

“It started to get momentum after serious players came in,” says Parimal Kopardekar, NASA’s senior engineer for air transportation systems, who has been researching ways to work these buzzing little contraptions into an air traffic control system created for full-size aircraft. “We need to accommodate drones.”

This past August, the U.S. Federal Aviation Administration (FAA) introduced Part 107, also known as the Small UAS Rule, which allows companies to use small drones in the daytime (or during twilight) and within visual line of sight of the pilot, so long as they are not flown over people who aren’t participating in these operations.

This year promises to see the FAA’s drone rules loosen even more. At the InterDrone conference in Las Vegas this past September, FAA head Michael Huerta explained that his agency was drafting rules to allow drones to be flown over random bystanders (the FAA calls them “non-participants”) and that it plans to release proposed regulations to that effect by the end of 2016. “We’re also working on a proposal that would allow people to fly drones beyond visual line of sight,” he said. Such a move would open the door to the use of small drones to deliver packages, among other things.

Of course, when you start flying drones where you can’t see them, you need to put technology in place to be sure that they don’t hit anything or injure anybody. While the details of how exactly to do that remain to be hammered out, there is no shortage of ideas.

One of the companies working on this challenge is PrecisionHawk, based in Raleigh, N.C. It’s one of just two companies to have obtained a waiver from the FAA allowing it to fly small drones beyond the operator’s visual line of sight. For such flights, the FAA does, however, require that an observer be posted to look out for full-scale aircraft.

Still, the waiver increases the range of the company’s drone operations from how far away you can see a small aerial vehicle—typically a kilometer or less—to how far away you can see a full-size plane—6 to 7 kilometers. The waiver does not allow “a 200-mile straight-line flight from A to B,” notes Thomas Haun of PrecisionHawk. Nevertheless, he’s heartened by the “much broader area” the exception permits. PrecisionHawk obtained that waiver in part because it had created a system to help drone pilots safely operate a vehicle they can’t directly see.

Avoiding a collision with a full-size aircraft is job No. 1, of course. But the more typical danger is much more mundane—running into a tree or a wall. To avoid that, PrecisionHawk uses satellite imagery to create a detailed terrain model, one of sufficient resolution to capture how high each tree and building is. Its system continually updates that model as new satellite imagery becomes available. Flight-planning software or even the autopilot on the drone itself can then use this information to avoid obstacles.
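
As a rough illustration of that last step, the sketch below (a hedged Python example, not PrecisionHawk's code) checks a planned flight path against a gridded surface model; the clearance margin and data layout are assumptions made for the example.

import numpy as np

SAFE_CLEARANCE_M = 15.0  # assumed vertical margin above trees and buildings

def path_is_clear(waypoints, surface_model, cell_size_m, origin_xy):
    """waypoints: (x, y, altitude) tuples in meters, in the model's frame.
    surface_model: 2-D grid of surface heights (terrain plus trees and
    buildings) built from satellite imagery and updated as new imagery arrives."""
    for x, y, altitude in waypoints:
        col = int((x - origin_xy[0]) / cell_size_m)
        row = int((y - origin_xy[1]) / cell_size_m)
        if altitude < surface_model[row, col] + SAFE_CLEARANCE_M:
            return False  # this waypoint would pass too close to an obstacle
    return True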

PrecisionHawk has also worked out a mechanism for drone operators to get updates through Verizon’s cellular network on the location of full-size aircraft—the same sort of information that air traffic controllers have. And PrecisionHawk’s drones report their positions over that same wireless network, so air traffic controllers and pilots can, in principle, know where these machines are. “What we’re providing as a product is primarily the software and data,” says Tyler Collins, the creator of this system, which goes by the acronym LATAS (Low Altitude Traffic and Airspace Safety). “We want LATAS on every drone.”

PrecisionHawk’s system mimics the strategy that is increasingly being used to manage full-size aircraft, whereby those aircraft determine their positions using GPS or some other form of satellite navigation and broadcast that information by radio to everyone else. The equipment for this form of air traffic management, called ADS-B (for Automatic Dependent Surveillance-Broadcast), will be mandatory on most U.S. aircraft by 2020.

While it might seem sensible to include small drones in the upcoming ADS-B regime, doing so could easily overwhelm that system, given the huge and growing number of drones—they’re selling at a rate of about 2 million a year in the United States alone, according to the FAA [PDF]. With those numbers growing so fast, an independent scheme for drone-traffic management seems inevitable.

NASA, Google, and Amazon have all been contemplating what such a system should entail. While the concepts that have been outlined vary in many ways, they are all similar in that they would restrict drones to the first few hundred feet above the ground and to locations that are well separated from any airports—that is, to parts of the sky full-size aircraft rarely visit.

At an airport in Reno, Nev., this past October, NASA and various industry partners carried out trials meant to help establish detailed technical requirements for a drone traffic-management system, one that would allow deliveries like the one depicted in that 2013 Amazon video. So whether or not it was a publicity stunt, perhaps this indeed is what the future holds. Haun of PrecisionHawk says, “We actually don’t think the future is very far off.”

This article appears in the January 2017 print issue as “Air Traffic Control for Delivery Drones.”


After Mastering Singapore’s Streets, NuTonomy’s Robo-taxis Are Poised to Take on New Cities

An AI alternative to deep learning makes it easier to debug the startup’s self-driving cars

Photo-illustration: Edmon de Haro


Take a short walk through Singapore’s city center and you’ll cross a helical bridge modeled on the structure of DNA, pass a science museum shaped like a lotus flower, and end up in a towering grove of artificial Supertrees that pulse with light and sound. It’s no surprise, then, that this is the first city to host a fleet of autonomous taxis.

Since last April, robo-taxis have been exploring the 6 kilometers of roads that make up Singapore’s One-North technology business district, and people here have become used to hailing them through a ride-sharing app. Maybe that’s why I’m the only person who seems curious when one of the vehicles—a slightly modified Renault Zoe electric car—pulls up outside of a Starbucks. Seated inside the car are an engineer, a safety driver, and Doug Parker, chief operating officer of nuTonomy, the MIT spinout that’s behind the project.

The car comes equipped with the standard sensor suite for cars with pretensions to urban autonomy: lidars on the roof and around the front bumper, and radar and cameras just about everywhere else. Inside, the car looks normal, with the exception of three large buttons on the dashboard labeled Manual, Pause, and Autonomous, as well as a red emergency stop button. With an okay from the engineer, the safety driver pushes the Autonomous button, and the car sets off toward the R&D complex known as Fusionopolis.

By the end of this year, nuTonomy expects to expand its fleet in Singapore from six cars to dozens, as well as adding a handful of test cars on public roads in the Boston area, near its Cambridge headquarters, and one or two other places.

“We think Singapore is the best place to test autonomous vehicles in the world,” Parker tells me as the car deftly avoids hitting a double-parked taxi.

One-North offers a challenging but not impossible level of complexity, with lots of pedestrians, a steady but rarely crushing flow of vehicle traffic, and enough variability to give the autonomous cars what they need to learn and improve.

Riding in an autonomous car makes you acutely aware of just how many potentially dangerous behaviors we ignore when we’re behind the wheel. Human drivers know from experience what not to worry about, but nuTonomy’s car doesn’t yet, so it reacts to almost everything, braking frequently (and occasionally abruptly) to stay on the safe side. If the car has even a vague suspicion that a pedestrian might suddenly decide to cross the road in front of it, it will slow to a crawl.

This mistrust of pedestrians as well as other drivers was designed into the software. “Humans are by far our biggest challenge,” Parker says.

Over the course of 15 minutes, our car has to deal with people walking in the gutter, cars drifting across the centerline, workers repairing the road, taxis cutting across lanes, and buses releasing a swarm of small children. Even a human driver would have to concentrate, and it’s unsurprising that the safety driver sometimes has to take over and reassure the car that it’s safe to move.

To handle these complex situations, nuTonomy uses formal logic, which is based on a hierarchy of rules similar to Asimov’s famous Three Laws of Robotics. Priority is given to rules like “don’t hit pedestrians,” followed by “don’t hit other vehicles,” and “don’t hit objects.” Less weight is assigned to rules like “maintain speed when safe” and “don’t cross the centerline,” and less still to rules like “give a comfortable ride.”

The car tries to follow all of the rules all the time, but it breaks the less important ones first: If there’s a car idling at the side of the road and partially blocking the lane, nuTonomy’s car can break the centerline rule in order to maintain its speed, swerving around the stopped car just as any driver would. The car uses a planning algorithm called RRT*—pronounced “r-r-t-star”—to evaluate many potential paths based on data from the cameras and other sensors. (The algorithm is a variant of RRT, or rapidly exploring random tree.) A single piece of decision-making software evaluates each of those paths and selects the path that best conforms to the rule hierarchy.
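
Here is a minimal sketch of that selection step in Python, using a strict rule ordering instead of weights for simplicity; the rules, thresholds, and path features are illustrative and are not nuTonomy's software.

# Rules ordered from most to least important; each returns True if a path violates it.
RULES = [
    ("avoid_pedestrians", lambda p: p["min_pedestrian_gap_m"] < 1.5),
    ("avoid_vehicles",    lambda p: p["min_vehicle_gap_m"] < 0.5),
    ("avoid_objects",     lambda p: p["hits_static_obstacle"]),
    ("maintain_speed",    lambda p: p["speed_deficit_mps"] > 2.0),
    ("stay_in_lane",      lambda p: p["crosses_centerline"]),
    ("comfortable_ride",  lambda p: p["max_lateral_accel_mps2"] > 2.5),
]

def violation_signature(path):
    """Violation flags in priority order. Comparing these tuples lexicographically
    means a path never trades a high-priority rule (don't hit pedestrians)
    for any number of low-priority ones (ride comfort)."""
    return tuple(bool(check(path)) for _, check in RULES)

def select_path(candidate_paths):
    """Pick the candidate path (for example, one proposed by an RRT*-style
    sampler) whose violations are confined to the least important rules."""
    return min(candidate_paths, key=violation_signature)

In the double-parked-car example, a path that swerves across the centerline violates only a low-priority rule, so it scores better than a path that brakes to a crawl to stay in the lane.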

By contrast, most other autonomous car companies rely on some flavor of machine learning. The idea is that if you show a machine-learning algorithm enough driving scenarios—using either real or simulated data—it will be able to figure out the underlying rules of good driving, then apply those rules to scenarios that it hasn’t seen before. This approach has been generally successful for many self-driving cars, and in fact nuTonomy is using machine learning to help with the much different problem of interpreting sensor data—just not with decision making. That’s because it’s very hard to figure out why machine-learning systems make the choices they do.

“Machine learning is like a black box,” Parker says. “You’re never quite sure what’s going on.”

Formal logic, on the other hand, gives you provable guarantees that the car will obey the rules required to stay safe even in situations that it’s otherwise completely unprepared for, using code that a human can read and understand. “It’s a rigorous algorithmic process that’s translating specifications on how the car should behave into verifiable software,” explains nuTonomy CEO and cofounder Karl Iagnemma. “That’s something that’s really been lacking in the industry.”

Gill Pratt, CEO of the Toyota Research Institute, agrees that “the promise of formal methods is provable correctness,” while cautioning that it’s “more challenging to apply formal methods to a heterogeneous environment of human-driven and autonomous cars.”

nuTonomy is quickly gaining experience in these environments, but it recognizes that these things take time. “We’re strong believers that this is going to make roads much, much safer, but there are still going to be accidents,” says Parker. Indeed, one of nuTonomy’s test vehicles got into a minor accident in October. “What you want is to be able to go back and say, ‘Did our car do the right thing in that situation, and if it didn’t, why didn’t it make the right decision?’ With formal logic, it’s very easy.”

The ability to explain what’s happened will help significantly with regulators. So will the ability to show them just what fix you’ve made so that the same problem doesn’t happen again. Effective regulation is critical to the success of autonomous cars, and it’s a challenging obstacle in many of the larger auto markets. In the United States, for example, federal, state, and local governments have created a hodgepodge of regulations related to traffic, vehicles, and driving. And in many areas, technology is moving too fast for government to keep up.

A handful of other companies are testing autonomous taxis and delivery vehicles on public roads, including Uber in Pittsburgh. The motive is obvious: Replacing human drivers with robotic systems eliminates labor costs, which in most places far exceed what fleet operators will pay for their autonomous vehicles. The economic potential of autonomous vehicles may be clear. But what’s less clear is whether regulators will approve commercial operations anytime soon.

In Singapore, the city-state’s government is both more unified and more aggressive in its pursuit of a self-driving future. “We’re starting with a different philosophy,” explains Lee Chuan Teck, deputy secretary of Singapore’s Ministry of Transport. “We think that our regulations will have to be ready when the technology is ready.” Historically, Singapore has looked to the United States and Europe for guidance on regulations like these, but now it’s on its own. “When it came to autonomous vehicles, we found that no one was ready with the regulations, and no one really knows how to test and certify them,” says Tan Kong Hwee, the director for transport engineering of the Singapore Economic Development Board.

Singapore’s solution is to collaborate with local universities and research institutions, as well as the companies themselves, to move regulations forward in tandem with the technology. Parker says that these unusually close ties between government, academia, and industry are another reason nuTonomy is testing here.

Singapore has good reason to be proactive: Its 5.6 million people are packed into just over 700 square kilometers, resulting in the third most densely populated country in the world. Roads take up 12 percent of the land, nearly as much as is dedicated to housing, and as the population increases, building more roads is not an option. The government has decided to make better use of the infrastructure it has by shifting from private cars (now used for nearly 40 percent of trips) to public transit and car shares. Rather than spending 95 percent of their time parked, as the average car does today, autonomous cars could operate almost continuously, reducing the number of cars on Singapore’s roads by two-thirds. And that’s with each car just taking one person at a time: Shared trips could accommodate a lot more people.

Over the next three to five years, Singapore plans to run a range of trials of autonomous cars, autonomous buses, autonomous freight trucks, and even autonomous utility vehicles. The goal will be to understand how residents use autonomous vehicle technology in their daily lives. Beyond that, Lee says, Singapore is “about to embark on a real town that we’re developing in about 10 to 15 years’ time, and we’re working with the developers from scratch on how we can incorporate autonomous vehicle technology into their plans.” Building new communities from scratch, such as One-North, is a Singaporean specialty.

In this new town, most roads will be replaced with paths just big enough for small autonomous shuttles. For longer trips, on-demand autonomous cars and buses will travel mostly underground, waiting in depots outside the city center until they’re summoned. It’s a spacious, quiet vision, full of plazas, playgrounds, and parks—and practically no parking spaces.

To begin to meet this challenge, nuTonomy has partnered with Grab, an Asian ride-sharing company, making autonomous taxi services available to a small group of commuters (chosen from thousands of applicants) around One-North. Testing the taxis in a real application like this is important, but equally important is understanding how users interact with the cars once they stop being a novelty and start being a useful way to get around. “People very quickly start to trust the car,” says Parker. “It’s amazing how quickly it becomes normal.”

If all goes well, Parker adds, the company should be ready to offer commercial service through Grab—to all customers, not just preapproved ones—around the One-North area in 2018. At first, each taxi will have a safety driver, but nuTonomy is working on a way to allow a human to remotely supervise the otherwise autonomous car when necessary. Eventually, nuTonomy will transition to full autonomy with the option for teleoperation.

“The whole structure of cities is going to change,” Parker predicts. “I think it’s going to be the biggest thing since the beginning of the automobile age.”

This article appears in the January 2017 print issue as “Hail, Robo-taxi!”
