ProMat and Automate Day 3 News and Notes: R2-D2 and 3 Tons of Fun!

CHICAGO – Readers of these “news and notes” updates may notice the write-ups growing more casual the longer a trade show runs. ProMat and Automate are no different, as we continue to see more robots (the R2-D2 model squealing a few rows over from our video booth was fun!) and meet with companies in the automation space.

Speaking of R2-D2, I discovered during Day 3 that the Star Wars Celebration event is being held right after this show, with a day of overlap (April 11). The Star Wars folks will be in the West Hall of the McCormick Place convention center, while ProMat and Automate take up the North and South Halls. In addition to the R2-D2 robot, I saw a few people wandering around carrying lightsabers – I figure all of that is headed over to that event, which runs Friday through the weekend.
Waypoint goes big with MAV3K
At the Waypoint Robotics booth, visitors could check out MAV3K (pronounced “May-Vick”), the latest member of the company’s industrial-grade autonomous mobile robot family. MAV3K can carry items up to 3,000 pounds, with omnidirectional mobility for “smooth and nimble movement of your heaviest materials,” the company said.

Like its Vector robot, MAV3K includes support from Waypoint’s Dispatcher software, which lets companies set up the robot and have it operating autonomously in under 15 minutes. MAV3K’s batteries keep it moving throughout the workday, and it can recharge by connecting to the Waypoint EnZone wireless charging system. Dual safety-rated lidar sensors, a three-stage safety system, and autonomous navigation are designed to help MAV3K safely find its way around a manufacturing or warehouse floor.

“We are thrilled to offer the workforce a better tool to move large, heavy materials,” said Jason Walker, CEO of Waypoint Robotics. “We’ve architected our products from Day 1 with the philosophy of ‘Bobby first.’ Bobby is the worker who’s been there for years and knows the job better than anyone. We’ve designed MAV3K so Bobby and workers like him can send it on missions to move the heaviest materials in his factory.”
Inspekto aims to disrupt inspection process
I had a very quick but great meeting with the leaders of Inspekto, which was honored at the Automate show with a Gold Award in the vision systems category of the Vision Systems Design Innovators Awards. After spending a few minutes talking with them, I can understand why they were honored.

The Inspekto S70 system.

Launched in November 2018, the Inspekto S70 is an “autonomous vision system” that combines a camera, light, lens and mounting aimed at industrial inspection processes. “Capable of inspecting any product, on any line, using any handling method, the system is a major tool for profitability per line for industrial plants, regardless of industry or geography,” the company said. With the company’s Plug and Inspect technology, Inspekto says the system can be installed in 30 to 60 minutes, with a price tag of just over $11,000 (€10,000).

The company said it plans on launching a new suite of applications for the platform next month at the Control trade fair for quality assurance professionals, held in Stuttgart, Germany.

“Installing the INSPEKTO S70 means that valuable staff can be moved from monotonous QA tasks to more productive roles and traditional tedious solutions replaced by simple to use and very affordable systems,” said Harel Boren, CEO and co-founder of Inspekto. “Over time, a €10,000 investment in an off-the-shelf product will save a plant hundreds of thousands and drastically improve productivity.

Yonatan Hyatt (left) and Harel Boren (right) from Inspekto.

“In fact, one of our customers, a world leading automotive plant, recently reported direct savings of €468,336 per year from just one location using an INSPEKTO S70 system. When you think about installing multiple systems to achieve Total QA, the impact on customer profits is extraordinary.”

The company said the S70 system has already been deployed in manufacturing plants across several industries, less than six months after launching. It claims a commercial footprint of more than 2,500 industrial plants worldwide, and the company said it plans to expand into the U.S. market as well. For more details, head to the company’s website.
Final bits and pieces
I spent most of the day conducting some video interviews with robotics leaders, including Melonee Wise from Fetch Robotics, Daniel Theobald from Vecna Robotics, Matt Yearling from PINC, and Joel Reed from IAM Robotics, among others. We plan to have those videos up soon for readers to enjoy – thanks to everyone who helped us out on that project.

Stay tuned next week for even more updates, posts, and analysis from the show, and if you’re sticking around Chicago for the Star Wars Celebration, May the Force Be With You!

Additional ProMat / Automate coverage:

ProMat and Automate Day 2 News, Notes, and Forklifts
News and Notes from Day 1 at ProMat/Automate 2019
MiR Launches MiR1000 for Autonomous Transport of up to 1 Ton Loads
Robotiq Unveils New Vacuum Grippers, Sanding Kit
Epson Robots Launches New Robots, Intelligent Feeding System
IAM Robotics Redesigns, Expands Swift System for Mobile Fulfillment
Download: Mobile Robots Move Beyond Pilot Projects
ProMat and Automate Show Guide: Robot Company Showcase
Brain Corp Launches Autonomous Delivery Robot Concept
6 River to Launch Mobile Sort System at ProMat 2019

Recycling Robot Learns Through System of Touch

Every year, trash companies sift through an estimated 68 million tons of recycling, which is the weight equivalent of more than 30 million cars.

A key step in the process happens on fast-moving conveyor belts, where workers have to sort items into categories such as paper, plastic and glass. Such jobs are dull, dirty, and often unsafe, especially in facilities where workers also have to remove normal trash from the mix.

The RoCycle robotic system uses touch sensors to detect whether an item is paper, plastic, or metal. Source: Jason Dorfman, MIT CSAIL

With that in mind, a team led by researchers at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has developed a robotic system that can detect if an object is paper, metal or plastic.

The team’s “RoCycle” system includes a soft Teflon hand that uses tactile sensors on its fingertips to detect an object’s size and stiffness. Compatible with any robotic arm, RoCycle was found to be 85% accurate at detecting materials when stationary, and 63% accurate on a simulated conveyor belt. Its most common error was identifying paper-covered metal tins as paper, which the team says would be improved by adding more sensors along the contact surface.

“Our robot’s sensorized skin provides haptic feedback that allows it to differentiate between a wide range of objects, from the rigid to the squishy,” said MIT professor Daniela Rus, senior author on a new paper about RoCycle that will be presented later this month at the IEEE International Conference on Soft Robotics in Seoul, South Korea. “Computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance.”

A collaboration with Yale University, RoCycle directly demonstrates the limits of sight-based sorting: it can reliably distinguish between two identical-looking Starbucks cups made of paper and plastic that would give vision systems trouble.
Incentivizing recycling
Rus says that the project is part of her larger goal to reduce the back-end cost of recycling, in order to incentivize more cities and countries to create their own programs. Today’s recycling centers aren’t particularly automated: their main kinds of machinery include optical sorters that use light of different wavelengths to distinguish between plastics, magnetic sorters that separate out iron and steel products, and aluminum sorters that use eddy currents to remove non-magnetic metals.

Five different test objects. Source: MIT CSAIL

This is a problem for one very big reason: just last month China raised its standards for the cleanliness of recycled goods it accepts from the US, meaning that most of our single-stream recycling is now sent to landfills.

“If a system like RoCycle could be deployed on a wide scale, we’d potentially be able to have the convenience of single-stream recycling with the lower contamination rates of multi-stream recycling,” said Ph.D. student Lillian Chin, lead author on the new paper.

It’s surprisingly hard to develop machines that can distinguish between paper, plastic and metal, which shows how impressive a feat it is for humans. When we pick up an object, we can immediately recognize many of its qualities even with our eyes closed, like whether it’s large and stiff or small and soft. By feeling the object and understanding how that relates to the softness of our own fingertips, we are able to learn how to handle a wide range of objects without dropping or breaking them.

This kind of intuition is tough to program into robots. Traditional rigid robot hands have to know an object’s exact location and size to be able to calculate a precise motion path. Soft hands made of materials like rubber are much more flexible, but have a different problem: because they’re powered by fluidic forces, they have a balloon-like structure that can puncture quite easily.
How RoCycle works
Rus’ team used a motor-driven hand made of a relatively new material called “auxetics.” Most materials get narrower when pulled on, like a rubber band when you stretch it; auxetics, meanwhile, actually get wider. The MIT team took this concept and put a twist on it, quite literally: they created auxetics that, when cut, either twist to the left or right. Combining a “left-handed” and “right-handed” auxetic for each of the hand’s two large fingers makes them interlock and oppose each other’s rotation, enabling more dynamic movement. The team calls this “handed-shearing auxetics”, or HSA.

“In contrast to soft robots, whose fluid-driven approach requires air pumps and compressors, HSA combines twisting with extension, meaning that you’re able to use regular motors,” said Chin.

After determining the type of object, the robot can place the object in the correct bin. Source: Jason Dorfman, MIT CSAIL

The team’s gripper first uses its “strain sensor” to estimate an object’s size, and then uses its two pressure sensors to measure the force needed to grasp it. These metrics – along with calibration data on the sizes and stiffnesses of objects of different material types – are what give the gripper a sense of what material the object is made of. Since the tactile sensors are also conductive, they can detect metal by how much it changes the electrical signal.

“In other words, we estimate the size and measure the pressure difference between the current closed hand and what a normal open hand should look like,” said Chin. “We use this pressure difference and size to classify the specific object based on information about different objects that we’ve already measured.”
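The classification step Chin describes can be sketched as a simple nearest-neighbor lookup against calibration data, with a conductivity check short-circuiting the metal case. This is an illustrative assumption about how such a classifier might work, not the team’s actual implementation; the calibration values, units, and sensor readings below are entirely hypothetical.

```python
# Hypothetical sketch of RoCycle-style material classification:
# compare the measured (size, grasp pressure) against calibration
# samples per material, after first checking for conductivity.

# Calibration table: (estimated size in cm, grasp pressure in kPa)
# per material. These numbers are made up for illustration.
CALIBRATION = {
    "paper":   [(8.0, 12.0), (6.5, 10.0)],
    "plastic": [(8.0, 25.0), (7.0, 22.0)],
    "metal":   [(7.5, 40.0), (6.0, 38.0)],
}

def classify(size_cm, pressure_kpa, conductive):
    """Classify an object from strain/pressure sensor readings.

    conductive: True if the conductive fingertip sensors register a
    change in electrical signal, which indicates metal.
    """
    if conductive:
        return "metal"
    # Nearest calibration sample by squared distance in (size, pressure).
    best_material, best_dist = None, float("inf")
    for material, samples in CALIBRATION.items():
        for s, p in samples:
            dist = (size_cm - s) ** 2 + (pressure_kpa - p) ** 2
            if dist < best_dist:
                best_material, best_dist = material, dist
    return best_material

# A large object gripped with little pressure lands near the paper samples.
print(classify(7.8, 11.0, conductive=False))  # -> paper
```

In practice the paper reports that pressure alone can be ambiguous (e.g., paper-covered metal tins), which is why the conductivity signal and additional contact sensors matter.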

RoCycle builds on a set of sensors that the team developed for a different paper to estimate an object’s size and stiffness. Those sensors could detect the radius of an object to within 30 percent accuracy, and tell the difference between “hard” and “soft” objects with 78 percent accuracy. The team’s hand is also almost completely puncture-resistant: it withstood being scraped by a sharp lid and punctured by a needle more than twenty times, with minimal structural damage.

As a next step, the researchers plan to build out the system so that it can combine tactile data with actual video data from a robot’s cameras. This would allow them to further improve its accuracy and potentially allow for even more nuanced differentiation between different kinds of materials.

Chin and Rus co-wrote the RoCycle paper alongside MIT postdoctoral associate Jeffrey Lipton, as well as PhD student Michelle Yuen and professor Rebecca Kramer-Bottiglio of Yale University.

This project was supported in part by Amazon, JD, the Toyota Research Institute and the National Science Foundation. The complete paper is available here.