Robot Investments Weekly: Healthcare AI and Robotics Systems Shine

With all of the ProMat and Automate coverage over the past few weeks, we’ve been pretty busy around here, so we’ve slipped a bit on covering the transactions in the robotics, automation, and artificial intelligence space. Fortunately, we’re caught up enough to give you an overview of some of the more interesting transactions recently.

This week we’re highlighting 13 recent transactions covering the robotics, automation, and artificial intelligence space. If you’ve missed some transactions over the past few months, you can track them through the RBR Transactions Database. This regularly updated database lets you sort deals by company, industry, technology, or transaction type.
Robotics, AI in healthcare space
In recent weeks, a cluster of investments has been made in robotics and AI companies that use technology to better diagnose diseases and perform surgery.

HistoSonics, which develops a non-invasive robotic platform for its novel sonic beam therapy, closed a $54 million Series C financing round earlier this month. The company’s Robotically Assisted Sonic Therapy (RAST) combines robotics and imaging with proprietary sensing technology “to deliver personalized treatments with unparalleled precision and control,” the company said. The system uses histotripsy and focused sound energy “to generate pressures strong enough to liquefy and completely destroy targeted tissues at sub-cellular levels,” it continued.

 

Enlitic’s AI platform can help radiologists discover abnormalities in medical images. Source: Enlitic

Enlitic, which develops AI to streamline medical imaging workflows for radiologists, closed $15 million in Series B funding earlier this month. The company’s platform uses deep learning and other forms of AI to develop algorithms that identify and analyze suspicious findings in medical images. “Working closely with hospitals and radiology providers around the world, the company has developed a comprehensive platform enabling the development, validation, and seamless integration of clinical AI at scale,” the company said. It added that early applications of the technology were able to speed up radiologists’ interpretation by more than 20%, while also improving true positive rates and reducing false positive rates by more than 10%.

The company’s first product interprets chest x-rays, triaging normal from abnormal scans, and detecting and characterizing more than 40 distinct abnormalities, the company said. Enlitic said it is working with partners around the world for approvals to deploy the product in several countries.

Another company helping radiologists is Aidoc, which raised $27 million to expand its own AI solutions. The Israel-based company said it will use the funding to grow its technology and go-to-market teams to support demand for its products. The company also announced it had analyzed its 1 millionth CT scan in real time, “the largest number of images analyzed by an AI tool and a landmark in the radiology AI ecosystem.” The company’s solutions flag acute anomalies in real time for radiologists.

On the pathology side, Deep Lens announced closing a $14 million Series A financing round for its AI-driven digital pathology platform. The company said it plans to use the funding to expand product development and scale its services, sales, and marketing organizations. The company’s Virtual Imaging for Pathology Education and Research (VIPER) technology combines AI with advanced pathology workflows “while also facilitating peer-to-peer collaboration and patient identification for clinical trials.” The company said its goal is to provide users with fast and accurate information for better patient care and advanced clinical research.

Another company in the pathology space, Boston-based PathAI, announced raising $60 million in Series B funding. The company said it plans to use the new funds to “enhance offerings to existing partners, drive continuous improvement of its flagship pathology research platform, meet market demands, and fuel research and development into new tools and medical devices.” The company develops AI-powered research tools and services for pathology, helping to improve the accuracy of diagnosis and the efficacy of treatment for diseases like cancer, “leveraging modern approaches in machine and deep learning.”

Finally, startup Theator announced raising $3 million in seed funding to develop its AI-based surgical platform. The platform helps “surgeons enhance capabilities and reduce medical errors by leveraging machine learning and computer-vision to identify, optimize and scale dissemination of best practices,” the company said. While other companies focus on static images such as x-rays and CT scans for diagnostics, Theator said it is working to leverage video footage. The company’s Minutes platform provides intelligently edited versions of surgical procedures that cover steps and outcome-critical components. “Hours-long procedures can be reviewed in minutes, helping surgeons prepare and review procedures,” the company said. In addition, AI-powered algorithms and analytics can inform surgeons about their performance, with videos stored to prepare for upcoming procedures or to debrief after surgery.
Amazon buys Canvas, OnRobot buys Blue Workforce
A couple of interesting acquisitions of note:

Amazon announced it would acquire Canvas Technology, which develops autonomous carts that can move items around warehouses, for an undisclosed amount.
OnRobot, which develops end-of-arm tools and grippers for cobots, announced it would acquire the assets of Blue Workforce, which developed the robot called RAGNAR. Denmark-based Blue Workforce had recently filed for bankruptcy, and OnRobot said it would also hire 12 robot developers from the company.
While not an acquisition, FLIR Systems did announce a strategic investment in DroneBase, a drone operations company that provides businesses access to one of the largest unmanned aircraft systems (UAS) pilot networks. The investment makes FLIR the exclusive provider of thermal imaging products and thermal imaging training for DroneBase’s pilot network. Terms of the investment were not disclosed.

Automotive-related investments for AI, teleoperation
While not completely related to the self-driving car space, there were a couple of interesting announcements that could make us better drivers.

Affectiva, which develops an emotion AI and human perception platform, announced closing $26 million in funding. The company’s technology helps autonomous and conventional vehicles understand drivers’ and passengers’ states and moods, providing alerts when drivers are distracted, for example.
Phantom Auto, which develops remote teleoperation for autonomous vehicles, said it raised about $19 million in Series A financing. The company’s systems allow a remote teleoperator, who sits in a cockpit with a steering wheel watching images from cameras in the car, to take over control when the car faces “tricky situations.”


Wrapping up the rest
I’m getting anxious about a giant bunny coming to the house to deliver some candy to my kids this weekend, so I’m going to wrap up the rest of the recent transactions. Click the links to learn more:

Slingshot Aerospace raised $5 million for its AI-based orbital analytics platform.
KeyMe raised $50 million to expand its key duplication robotics platform, which can automatically make keys at a kiosk.

Happy Easter, everyone!

4 Reasons Why ‘Made in China’ Isn’t Cost-Effective

It’s no secret why many companies choose to outsource some or all of their manufacturing processes to China. The availability of cheap labor and lack of stringent regulations in the country mean U.S. companies can save significantly on fixed costs by outsourcing – at least in the short-term. In practice, however, offshore manufacturing tends to cost far more in the long run. Here are some of the main reasons why.
1) The current trade war with China
The U.S. is currently imposing tariffs on approximately $250 billion worth of Chinese imports. For the first time in a long time, many manufacturers are finding that domestically manufactured products are now more cost-effective than their outsourced counterparts. It’s unclear how this trade war will play out, but at least in the short term, offshore manufacturing is no longer necessarily less expensive from a fixed-cost perspective.
2) Quality assurance overseas is easier said than done
Many companies have discovered the hard way that contractual quality clauses are extremely difficult to enforce from overseas. In 2013, after a faulty airbag inflator led to a number of deaths and injuries in cars equipped with Takata airbags, 3.6 million cars were recalled. Takata was a successful Japan-based automotive parts company, but the faulty inflators that caused the airbag malfunctions were manufactured in Mexico. Ultimately, upwards of 42 million cars with Takata airbags were recalled by order of the National Highway Traffic Safety Administration, and the company went bankrupt in 2017.
3) Hidden costs add up

More on reshoring:

Webcast: Reshoring and the Supporting Key Technologies
Manufacturing Reshoring From Robotics Hasn’t Happened — Yet, Says Study
Reshoring, Robotics Rising Together, Reports OECD
Could Reshoring Restore American Jobs? A Quick Look at the Numbers
How a WWII-era Plan may Provide a Robotics, AI Blueprint for Manufacturing in the Heartland
4 Considerations for Western Companies That Want to Leave China

Not surprisingly, overseas shipping and transport (including air freight) can cost a great deal of time and money. With lag times of two to three months — to say nothing of unanticipated delays — the total landed costs can add up quickly. Many manufacturers recommend engaging legal counsel both in the U.S. and in the place of manufacture. This, too, can be costly, though probably less expensive than moving forward without legal counsel and needing it later. There can also be hidden costs associated with production quantity; in many cases, outsourcing is only cost-effective at high production volumes.
4) Automation is changing the manufacturing landscape in the U.S.
The next frontier of cost efficiency is happening right here in the U.S. Incorporating automation into manufacturing is helping companies save costs and increase efficiency across a wide range of industries. The flip side, of course, is that many people are concerned robots will end up replacing human workers, threatening to negate one of the many benefits of reshoring: providing reliable jobs to domestic workers.

This concern is not an unreasonable one. Various studies, including a 2017 report from the McKinsey Global Institute and a study from the University of Oxford, have suggested that anywhere from 20% to 50% of jobs are threatened by automation. The reality is more nuanced and, thankfully, much more encouraging.

As many companies are discovering, automation actually provides an opportunity for different kinds of employment (and in many cases, more employment in general). Ray Products, where I work, is an excellent example. When we introduced a fully robotic six-axis trimmer to our thermoforming workflow, we ultimately ended up increasing our workforce by 20%. This isn’t just anecdotal — a recent Brookings Institution report found that in Germany, where manufacturers are using three times more robots than their U.S. counterparts, they’re also employing more people.

While it’s impossible to predict the future, my money’s on a combination of robots and reshoring.

About the author: Jason Middleton is vice president of sales at Ray Products, an Ontario, Calif.-based plastics manufacturer.

How to Build a Robotics Career Without a Ph.D.

If you’re worried about the future job security of your current career, it could be time to switch into a more in-demand career, such as one offered by the robotics or artificial intelligence fields.

Fortunately, you don’t need a doctorate to make that kind of a job transition. By showing creativity, dedication and a willingness to learn in-demand skills, you could set yourself up for success despite not having an advanced degree. Here are six strategies that can help you get to a new career in robotics or AI.
1) Examine your educational options
Transitioning into a technology career will almost inevitably require you to increase your formal education, even without going as far as getting a Ph.D. With that in mind, start by looking for robotics programs in your community. Alternatively, you can check for online courses — especially if your obligations make it difficult to commit to attending in-person classes.

Also, prioritize the programs that will put you in an excellent position to capitalize on the existing skills shortage. According to an EY report, 80% of respondents said an AI talent shortage prevented enterprise-level adoption of that technology. If you have AI knowledge that applies to robotics, you could find yourself in exceptional demand, provided your educational program makes you well equipped.
2) Understand your role in a changing workforce
It’s likely that fear of losing your job to robots is what compelled you to upskill and embrace the robotics trend by moving into the industry itself instead of shying away from it.

Study patterns within the workforce and how robots relate to them. Once you are aware of the impact of robotics on the future workforce, you can better determine how you fit within it, and how robots could be a career booster for you instead of a hindrance.

For example, if you previously worked as a civil engineer, many of the skills you already possess will lend themselves nicely to your goal of landing a robotics engineering role. In your profession as a civil engineer, you may have designed things like roads and bridges, which translates well into designing robots.

Plus, your former engineering work equipped you to realize the various phases of the design process, and you’re probably comfortable working with others to get jobs done. In short, think about the skills you already have and how they could help you excel as the workplace evolves.
3) Browse job listings to find essential skills for robot-related roles
Once you have an idea of sources for robotics degrees, and how you can fit into the future workforce, start looking through job listings. While it may seem premature to do this, you can learn valuable information that can guide a career move. First, look for companies that most often have technology job openings that appeal to you.

Then, dig deeper and see what skills they require. Figure out how you could sharpen those skills and create an action plan for encouraging meaningful personal growth. Remember, making your skills applicable to the future is a substantial part of enjoying long-term job security.

Check out the newly launched website that seeks to address the robotics skills shortage by connecting qualified candidates with robotics companies. It’s called Robots.Jobs and has both featured jobs and companies, giving you an idea of the know-how that you need to progress in the field. (Editor’s note: Robotics Business Review is a partner with Robots.Jobs)

Making connections at local robotics gatherings can help generate job opportunities.
4) Become a confident networker
Research indicates that up to 85% of people find their jobs through networking. If you don’t know anyone currently working in robotics, it’s time to change that. Search for local gatherings of individuals working in robotics and other tech fields. Be prepared to talk about your career goals and what you can offer to employers.

At this stage in your career switch, it may not feel like you can bring much to the table concerning relevant robotics expertise. But think about the other things you have to offer from your former role. For example, maybe you were a marketing professional and worked on projects that required you to assess and meet client needs. In that case, you’re probably great at seeing the big picture and knowing how to achieve successful outcomes.
5) Improve critical thinking skills by staying abreast of robotics news
Robotics is a fast-moving sector, and you cannot assume things in the industry will stay relatively constant up to and beyond the time when you’re ready to enter the employment market and start hunting for a robotics job. It’s smart to build a collection of robotics resources such as websites and blogs. Aim to read them daily, or at least several times a week.

More on workforce, skills:

Why Humans Will Remain at Center of an Automated Workforce
Moving Your Workforce Forward: AI and Automation Action Plan
Report: How to Prepare Your Future Workforce for Robotics Disruption
Robotics, Workforce Development Go Hand in Hand, Say A3 Leaders at RoboBusiness
The Time Is Now for Conversations Around Workforce Retraining
How to Start (or Change to) a Robotics Career (free download)

As you digest the information, think about it critically by asking yourself how the things you read about could have short- or long-term effects on the robotics industry. Similarly, as you read about exceptionally innovative robotics achievements, ponder the pros and cons of such progress. Challenging yourself to analyze the news like this keeps your knowledge current, plus encourages you to think beyond the words on the page.
6) Look for a tech internship
It’s one thing to earn a degree from a robotics program, but you will also need hands-on experience, particularly if you enrolled in an online degree program that offered few or no lab experiences. An internship can help you discover what it’s really like to work at a robotics or other tech company.

When you’re getting your resume ready before applying for an internship, make sure to list any relevant projects completed, even the ones where you built robots with DIY kits. The companies looking for interns want well-rounded applicants, and that often means people who aren’t afraid to roll up their sleeves and get to work by carefully following instructions, and realizing that they may not get everything right the first time.

Speaking of DIY kits, they can be excellent for helping you apply skills learned in formal coursework. Even if your robotics degree program doesn’t recommend using them, you should because you’ll gain a richer understanding of how robots function.

Of course, the internships at the biggest, most well-known companies will be very competitive. But keep in mind that you could learn just as much — or more — by working for a small startup.
Stay determined during your quest
Besides keeping these six tips in mind, don’t forget that you need a diligent mindset as you strive to build a robotics career without a Ph.D. Although many other applicants may have more education than you, the other traits you offer could make you the candidate of choice.

Using the physics of airflows to locate gaseous leaks more quickly in complex scenarios

Engineers are developing a smart robotic system for sniffing out pollution hotspots and sources of toxic leaks. Their approach enables a robot to incorporate calculations made on the fly to account for the complex airflows of confined spaces rather than simply ‘following its nose.’

Under the Sea: Drone Startup Boxfish Helps Antarctic Researchers

Robots and drones have long tackled tasks that are dangerous for humans, and now they can add another item to the list – exploring the waters under the ice shelf in Antarctica.

A team from New Zealand startup Boxfish Research completed a five-week stay in Antarctica, assisting Dr. Regina Eisert and a research team from the University of Canterbury. The research team was participating in the Antarctic Top Predator program, with the goal of studying and capturing footage of the Orca and Minke whales as they congregated in the ice channel that resupplies the McMurdo research station.

Ben King, Boxfish Research

Boxfish Research Co-founder Ben King and his team supplied the scientists with the Boxfish ROV and Boxfish 360 underwater vision systems, designed to handle extreme weather conditions.

Robotics Business Review recently spoke with King about the journey to Antarctica and the lessons learned for using underwater drones in harsh conditions.
Origins of the project
Q: Give us a quick background on how this all started and how Boxfish Research got involved in the project.

King: Boxfish Research is a premium mini-ROV manufacturer located in New Zealand. We make an ultra-maneuverable underwater drone that is specifically designed for working in tough conditions and capturing ultra-high-definition video with cinema quality. One of the scientific researchers, Dr. Regina Eisert, got in touch with us about how she could get some of our equipment down there. We came to an arrangement to collaborate on the project so I could join the team and be on the ground down there in Antarctica.

Q: What were they hoping to get out of the project from using the drone? Was it just video, or other types of data?

King: At this stage, their research focuses on the top predators in Antarctica, primarily Type C killer whales and Minke whales, and they’re particularly interested in feeding behavior, social behavior and range. This is done in order to better understand the overall fishery and biomass in the Ross Sea region as part of the marine protected area. Having eyes under the water is an amazing way to get a better understanding of the social behavior of the animals, and ideally the feeding behavior as well.

Whales spotted in Antarctica. Source: Ben King, Boxfish Research

Q: Did they have any sensors that they wanted to use on your equipment to measure water data, temperature, things like that?

King: No, because this project came together rather quickly. We didn’t have time to integrate anything additional, but certainly for future years there’s discussions already in place about returning next summer for hydrophone recordings and other aspects of research down there, like studying the bottom around Scott which would benefit from USBL [ultra-short baseline underwater acoustic positioning] and other navigation systems to have a better geo-reference for all the footage.
Getting to Antarctica
Q: How long did it take to get the project rolling? Were there a lot of hoops to jump through in order to get to Antarctica?

King: We thought that it would, but it all happened remarkably quickly, and that’s why we didn’t have a chance to integrate any additional equipment. The first mention of us doing this was in late August, and we had to ship the gear in November down to the ice. We’re a small company, so we just did everything we could to get the gear we had ready to go down to the ice.

Q: How long did it take for you to get to Antarctica from New Zealand?

King: I flew to Christchurch, got kitted out with all my cold weather gear for Antarctica, and then stayed overnight and we left the next day. But one of the team members got bumped off the flight and ended up taking five days to get there because of the weather conditions. Getting to and from Antarctica is the real challenge – some people can be delayed up to a week or more in extreme cases.

Q: Did you have to do any other kind of special training to be ready for those weather conditions?

King: Everyone who goes to Antarctica with the New Zealand Antarctic program is given field training. It’s basically a 24-hour course on the ice, camping out, eating dehydrated food, and learning how to survive in case you get stuck out there for whatever reason. It was good fun.

I have a lot of outdoor experience myself. It was a fairly comfortable environment, but that’s not the case for all.  The staff down there did a fantastic job of really making sure that everybody had the skills and the equipment and the support they needed to operate safely.

Q: Once you got there, how long were you there with the research team?

King: About five weeks in total. We were working night operations. We traveled each night from the base to McMurdo Station, where the helicopter pad and the New Zealand helicopter are, and we flew out to the ice at about 10 p.m. each night and were back at the base at about 9:30 a.m.

The research team traveled by helicopter to reach its location for the dives. Source: Ben King, Boxfish Research

Q: Was there a particular reason why you were conducting the research at night? Was it because that’s when the whales were likely going to be in the area?

King: There were a number of reasons. We were able to have close support from the helicopters so they could stick with us for the entire evening, because there’s very little going on at night, so we had our own pilot. The light is better for doing photo identification above the water because the sun is a bit lower. And the whales do seem to be most active around then – they talked about ‘whale o’clock’, between 10:30 p.m. and 12 a.m. – we’d often see a lot of whales where they’d become very active.
Extreme weather, rugged conditions
Q: From your company’s standpoint, what were some of the goals you were hoping to attain? Was it to see whether the equipment could withstand these temperatures and weather extremes? Was it to see if you could achieve high quality video?

King: Our goal was basically to get the ROV out in the field and use it as much as possible and capture the most stunning footage we possibly could. In addition, we wanted to get field experience in such an extreme environment with our equipment.

We exceeded all expectations on all fronts. The equipment performed exceptionally well. We had no loss of dive time because of equipment failure at all. We had 100% uptime, which was amazing. We did 15 dives with 21 hours under the water, down to 210 meters (about 700 feet).

We also spooled our entire tether out – we had 440 meters of tether that we maxed out – we pushed everything. The ROV was handled about 10 times a day in and out of vehicles, and into helicopters and out of helicopters and across the ice and back. It stood up to that really, really well.

The cold weather didn’t affect the performance, and we were able to stay underwater longer than what the humans could endure above the water.

Minke whale cruising the Ross Sea from Boxfish Research on Vimeo.
We were also able to do a world’s first – we did a live dive from the bar at Scott Base. We convinced the telecom tech to loan us 500 meters of fiber optic cable, which we rolled out across the ice to the hole where we were diving when we couldn’t get out to the ice edge with the helicopter. I was able to pilot the ROV from inside the bar, so that was pretty cool.

One of the bonuses of the trip was that when we weren’t out with the whales, we had this hole so we could just go out there and dive. It was about 66 meters deep at the hole, and about 100 meters offshore. From there we could explore the sea floor, which was just teeming with life – creatures and critters and fish and octopus and sponges – all manner of life.

Q: Did any issues or challenges pop up where you needed to adapt quickly, being in Antarctica without having to drive to a store to get gear?

King: Probably the most annoying thing was when we did the first dive, we found the friction increased slightly, and the tether reel was a little bit challenging to wind in on the snow. So I got the carpenter to help me make a plywood base plate that we strapped the tether reel to, and then we just screwed it down to the ice each day with ice screws, and that completely solved the problem. It was a really minor thing, but a very simple solution and we were good to go.
How robots can help researchers
Q: What message do you want to bring to customers that might be thinking about deploying these types of systems?

King: What we really proved – and I even proved to myself – was just how incredible the propulsion system is that we have on our ROV, coupled with the really high-end video. The eight-vectored thruster propulsion system really gives you full freedom to capture the footage.

More on research robotics:

3 Teams Compete for $1M in NOAA Bonus XPRIZE
MIT’s Soft Robotic Fish Can Swim With Real Ones
Recycling Robot Learns Through System of Touch
Video: Watch a Robot Assist With Hurricane Disaster Cleanup
6 Experimental Uses for Robotics in 2019

The perfect example was outside Scott Base. There was always a reasonably strong current, and it didn’t affect operations because I was able to maneuver the ROV as if there wasn’t any current, simply because we can direct that thrust in any direction. So I could pan around an object simply by compensating for the current as I moved, which was very easy to do. I just really saw the power of that.

Additionally, the sea floor was sloping at about a 40-degree angle, and being able to just pitch the entire vehicle down to look at it made it really amazing to move around under there with total freedom.

Q: Have you been able to step back and really appreciate the type of opportunity you had in visiting Antarctica?

King: Oh, absolutely. I’ve been looking through the footage, and we have been slowly releasing it – it’s just incredible. At first everyone was telling us that you don’t achieve any results in your first season – it’s always treated as a pilot. We obviously didn’t go into it with that mindset, but what I discovered is that it’s a common way of looking at it – the scientists have that kind of attitude, which was a bit surprising. But we certainly didn’t come home feeling like we needed to think that way.

Q: But on the other hand, you want to go back, correct?

King: Absolutely. There’s always more to capture. We got some stunning footage of penguins swimming through the water, and we also want to work with other scientists. Antarctica specifically, there’s so much we could have done – exploring the ice sheet, there’s people looking at ice. We can put a manipulator on the ROV to grab samples – we could do gas samples from some of the volcanic vents. We’re talking to a guy about sponges to take core samples of the sponges. We could do push cores in the sediment – there’s all kinds of things we could do over and above capturing the fantastic video.

Q: So how cold did it get?

King: In the water it was -2 C (28.4 F). The lowest temperature before we put the ROV in the water was –6 C (21.2 F) from it sitting out in the cold. And the air temperature, the coldest we had was -14 or -15 (about 5 F), and with the wind chill it was well below -20 C (-4 F).

Giving robots a better feel for object manipulation

A new learning system improves robots’ abilities to mold materials into target shapes and make predictions about interacting with solid objects and liquids. The system, known as a learning-based particle simulator, could give industrial robots a more refined touch — and it may have fun applications in personal robotics, such as modelling clay shapes or rolling sticky rice for sushi.

How IoT Condition Monitoring Maintains Machine Health

A primary manufacturing goal is to maintain high product quality. For many enterprises, however, this objective still seems hardly achievable. Too often, product quality issues are uncovered only when a product fails in testing, or worse, when a customer makes a return or triggers a recall.

A common cause behind reduced product quality is faulty equipment, which has not been properly maintained or calibrated. Manufacturers are increasingly turning to IoT-driven machine condition monitoring, which helps reveal equipment issues that can affect the quality of products so they can be fixed before things get worse.
The IoT-driven approach to product quality control
Condition monitoring enables product quality control by detecting combinations of equipment health parameters, such as spindle vibration frequency, engine temperature, and cutting speed, and ambient parameters, such as temperature and humidity, that together can cause deterioration in the quality of a product output.

A historical data set that contains equipment condition records gathered over a period of time (say, a year) is combined with data about product quality deviations and context data (for example, equipment maintenance history) from ERP, PIMS, or DCS systems. The combined data set is then fed into advanced machine learning algorithms, which detect causal correlations in the incoming data records. Uncovered correlations are reflected in predictive models, which are then used to identify combinations of equipment condition and environmental parameters that can lead to product quality issues.
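
As a rough illustration of that workflow, the sketch below joins a hypothetical export of condition records with quality-deviation records and fits a simple classifier as the predictive model. The file names, column names, and model choice are assumptions made for illustration, not part of any specific vendor’s solution.

```python
# Minimal sketch of the workflow described above, assuming hypothetical CSV
# exports: one with time-stamped equipment/ambient readings, one with
# quality-deviation records from an ERP or PIMS system.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Join a year of condition records with quality outcomes on the timestamp.
conditions = pd.read_csv("equipment_conditions.csv", parse_dates=["timestamp"])
quality = pd.read_csv("quality_deviations.csv", parse_dates=["timestamp"])
data = pd.merge_asof(conditions.sort_values("timestamp"),
                     quality.sort_values("timestamp"),
                     on="timestamp", direction="nearest")

features = ["spindle_vibration_hz", "engine_temp_c", "cutting_speed_mm_s",
            "ambient_temp_c", "ambient_humidity_pct"]   # assumed column names
X = data[features]
y = data["quality_deviation"]  # 1 = defect recorded, 0 = within spec

# Fit a simple classifier as the "predictive model"; real deployments would
# validate far more carefully and may prefer more interpretable models.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```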

For example, in pulp processing, quality issues include deviations in the concentration of dissolved alkali. The machine learning component of an IoT solution detects hidden patterns in the data and can reveal that a higher concentration of alkali stems from a deviation in two process parameters: reduced processing temperature and increased white liquor flow.
Use cases across industries
Manufacturers across industries can leverage IoT for monitoring the condition of machines and controlling the quality of products and components manufactured on them. Here are a few examples:
Pulp and paper
In the pulp and paper industry, IoT makes it possible to monitor the condition of rollers in paper machines. A defect in just one roller bearing can significantly affect the quality of the produced paper, causing fluffing and changes in paper thickness. Monitoring the condition of roller bearings with vibration sensors is enough to avoid a large percentage of quality issues. Vibration sensors on each end of the roller continuously gather real-time data about the roller’s health and relay it to cloud software. If a roller does not function properly, the IoT solution alerts an operator.
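
A minimal sketch of that alerting step might look like the following, assuming a stream of rolling RMS vibration readings per roller; the threshold, window size, and notify hook are hypothetical values chosen for illustration.

```python
# Illustrative sketch only: a rolling average of roller-bearing vibration,
# alerting an operator when the level exceeds a baseline threshold.
# Sensor names, threshold, and the alert hook are assumptions, not a real API.
from collections import deque
from statistics import mean

WINDOW = 50            # number of recent samples to average
THRESHOLD_G = 1.8      # assumed alarm level in g for this bearing

def monitor(vibration_stream, notify):
    """vibration_stream yields (roller_id, rms_g) tuples; notify sends the alert."""
    windows = {}
    for roller_id, rms_g in vibration_stream:
        window = windows.setdefault(roller_id, deque(maxlen=WINDOW))
        window.append(rms_g)
        if len(window) == WINDOW and mean(window) > THRESHOLD_G:
            notify(f"Roller {roller_id}: vibration {mean(window):.2f} g exceeds {THRESHOLD_G} g")
            window.clear()  # avoid repeated alerts for the same excursion
```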
Electronics
In electronics, in the process of mounting semiconductors on circuit boards, tiny puffs of air are used to direct the placement of chips. The placement machines are calibrated according to current environmental conditions (temperature, humidity, etc.). A minor change in, say, temperature generates a heat profile that can cause placement defects. Temperature and humidity sensors are used to monitor the environment in which the machines operate. Once a change is detected, a machine can receive a command to make calibration adjustments to meet the quality standards.
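
In code, that trigger could be as simple as the sketch below; the baseline values, drift limits, and the send_command() hook are assumptions, since a real placement machine would expose its own vendor-specific calibration interface.

```python
# Hedged sketch of the recalibration trigger described above. The drift limits
# and the send_command() hook are hypothetical placeholders.
BASELINE = {"temp_c": 23.0, "humidity_pct": 45.0}
MAX_DRIFT = {"temp_c": 1.5, "humidity_pct": 5.0}

def check_environment(reading, send_command):
    """reading is a dict of current ambient values from the booth sensors."""
    drift = {key: abs(reading[key] - BASELINE[key]) for key in BASELINE}
    if any(drift[key] > MAX_DRIFT[key] for key in MAX_DRIFT):
        # Ask the placement machine to re-run its calibration routine with
        # the current heat profile before the next board is populated.
        send_command({"action": "recalibrate", "ambient": reading})
```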

Steelmaking
In the steel industry, IoT helps detect equipment issues that affect the quality of steel during the metal forming process. During the process, a slab – an output of casting – is reheated and run through the rolling mills so that its thickness is reduced to less than an inch. Problems in the condition and alignment of the rolling mills can lead to significant quality issues. The most common causes include the rolls failing to catch the metal, so that it piles up, or the rolls not rolling evenly, which results in one side of the metal sheet being thicker than the other. To prevent these issues, the condition and alignment of the rolling mills’ bearings are monitored with vibration and magnetic guide sensors.
Automotive
In the automotive industry, the penetration of moisture into the spaces and gaps in welded spots can lead to porosity, while temperature variations in welding machines can lead to weld joint failures. IIoT is applied to monitor the temperature and humidity around a machine to avoid these defects and ensure the high quality of welded products.
The benefits of product quality control based on IoT-driven condition monitoring
Monitoring the condition and environment of the machines on which products are manufactured delivers the following benefits:

Compared to traditional production quality control techniques (for example, test checks carried out at the end of a production cycle), IoT-driven equipment condition monitoring lets users pinpoint quality issues at the production stage, when an issue can still be mitigated.
The analytics capabilities of IoT-driven condition monitoring solutions lay a foundation for improvements in product quality. For example, by combining historical data from vibration sensors attached to the milling rolls’ bearings with data about past quality losses, manufacturers might conclude that an 8% increase in a roller bearing’s vibration causes a metal sheet’s left side to be 0.1 inch thicker than the right side (see the sketch below). Manufacturers can then use these insights to improve the quality of the output products.
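
A minimal sketch of that kind of retrospective analysis, assuming a hypothetical CSV of historical records with vibration and thickness-deviation columns:

```python
# Fit a simple linear relationship between bearing vibration and thickness
# deviation from historical records. Column names and the file are assumptions.
import numpy as np
import pandas as pd

history = pd.read_csv("roller_history.csv")    # hypothetical export
x = history["bearing_vibration_pct_increase"]  # percent above baseline vibration
y = history["left_right_thickness_diff_in"]    # inches, left minus right

slope, intercept = np.polyfit(x, y, 1)
print(f"Each 1% rise in vibration adds roughly {slope:.3f} in of thickness skew")
```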

Getting started
Since the technology market does not yet offer out-of-the-box IoT-driven condition monitoring solutions, enterprises need to design and implement custom IoT applications. Given the complexity of IoT implementation, it is often more efficient for enterprises to collaborate with external parties: an IoT platform vendor or an independent IoT integrator.

More on IoT, analytics:

Component Roundup: Variable Frequency Drives, IoT-based Energy Harvester
2019 Predictions from Robotics, Automation, and AI Industry Experts
3 Cybersecurity Challenges for IIoT Devices
Infographic: The Influence of AI and Automation on Manufacturing
Oden Aims to Help Manufacturers Add AI Framework to Operations
Analytics at the Edge Could Help Manufacturers With Predictive Maintenance, Says Greenwave VP

Opting for an IoT platform vendor has the following advantages:

Lower implementation cost;
Simpler integration with enterprise and shop floor management systems;
More comprehensive upgrades.

However, by collaborating with a single IoT platform vendor, enterprises are unlikely to get best-of-breed functionality, as they often get locked into the vendor’s solution ecosystem with limited options to test alternative solution components that may be a better fit.

Collaborating with an IoT integrator, on the other hand, offers the possibility to ‘build’ an IoT solution from components tailored to the enterprise’s needs. Still, the cost of implementation will rise, as enterprises have to buy individual modules from multiple vendors and partner with an integrator to bring these modules together.
A point to consider
Although IoT-based condition monitoring paves the way to improvements in production quality, the approach has certain limitations, as data about machine conditions may not be enough for well-rounded quality assurance. Monitoring the condition of machines, for instance, cannot identify issues arising from the use of defective or misidentified components, or from improper material handling.

Controlling the quality of products by monitoring the condition of the machines on which they are manufactured helps drive yield improvement, reduce scrap, and minimize rework. Compared to other quality assurance techniques (for example, inspecting parts and semi-finished products as they move through the production cycle), the condition monitoring-based approach may offer less breadth in terms of quality control scope, but it helps identify quality issues at their incipient stage and predict potential ones.

About the author: Boris Shiklo is the CTO at ScienceSoft, responsible for the company’s long-term technological vision and innovation strategies. Under his supervision, the company’s development team has successfully fulfilled complex projects of more than 80,000 man-hours in healthcare, banking & finance, retail, telecommunications, public sector, and other domains. He has a solid background in IT consulting, software development, project management, and strategic planning.

Cobot Roundup: Makers Show New Applications for Collaborative Uses

CHICAGO – Cobots continue to grow in popularity in manufacturing and other enterprises, with many of their newest capabilities and features on display last week at Automate and ProMat.

Universal Robots took center stage with its cobots in collaborative displays throughout the two shows, with exhibitors demonstrating more than 80 collaborative robot applications in total. These included vision-guided product inspection, picking parts off conveyors on the fly, riding on top of mobile robots while performing machine tending, live robotic sorting and induction into put walls and pouches, adding 7th-axis capabilities, handing giveaways to attendees, and playing golf on a putting green.

“We have successfully developed a rapidly expanding ecosystem around our collaborative robots,” said Stuart Shepherd, regional sales director of Universal Robots’ Americas division.
Cobots on display
While many of the cobots were performing tasks that had been on display at other recent trade shows, there were some demonstrations that certainly caught the eye.

The show ended just before the opening round of the Masters, which Tiger Woods won Sunday thanks in no small part to making the putts he needed to, recording three birdies on the back nine. While robots can’t read a green yet, they are already putting, as demonstrated by Ready Robotics, which featured a UR10 cobot in a putting green demo that also featured Forge/OS and Forge/Ctrl programming, enabling attendees to get intuitive, hands-on experience programming cobots in real time.

At the PHD Inc. booth, visitors saw a UR5 mounted on the saddle/carriage of a two meter-long PHD Series ESU electric base slide traveling back and forth simulating pick and place on both sides of the slide. The seven-foot slide, the largest the company has built to date, enables companies to use only a single robot – if capacity doesn’t require two robots – rather than buying an additional unit to perform a second task seven feet or less away from where the first task was performed, said Kaleb Hoot, PHD applications engineer.

“It means that a company can spend $10,000 on a slide rather than $50,000 on an additional robot,” said Hoot, adding that the market had been looking for slides offering longer reaches as well as ones that can handle heavier payloads. Heavier payloads were also a feature of some of the newer cobots, as well as some of the newer mobile industrial robots featured at Automate and ProMat.

The newest slide is the 12th in the company’s portfolio. The slides can handle loads ranging from a few ounces up to 300 pounds, and they use different types of bearings to provide the required precision, force, and load capabilities.

PHD offers electric and pneumatic slides ranging from compact to gantry styles.

Also at PHD, the UR5e was featured using the new UR+ certified Pneu-Connect dual gripper in a hands-on display to demonstrate the ease of programming and capabilities of the UR. Both demos featured an analog sensor, now available for PHD Series GRH kits, and the Pneu-Connect X2 kits, which provide two PHD grippers mounted to the UR robot for maximum efficiency in automation performance. The X2 dual gripper kits include the Freedrive feature, which interfaces with the UR for simplified positioning and programming.

FerRobotics AOK/905

Additional Universal Robots cobot displays included:

FerRobotics: Showcased sanding technology, the FerRobotics AOK/905 in a live demonstration with a UR10 polishing a wooden chair. The FerRobotics AOK/905 is a UR+ certified package, designed as a plug & play sanding and polishing solution for UR cobots.
Advanced Handling Systems: Showed a UR10e moving small giveaway boxes back and forth between totes using UR+ certified products: the Schmalz CobotPump ECBPi and a PickIt3D vision system. When the robot received a signal, the next box it picked was presented to a booth visitor instead of being placed in a tote, and then it resumed its previous routine.
Bimba Manufacturing: A UR3 with Bimba’s UR+ certified Collaborative Robot Vacuum Tool (CRVT) was on display. The CRVT combines a maintenance free non-clogging single stage venturi vacuum pump, vacuum switch or sensors, and a valve in a simple package allowing users to quickly start moving parts with their UR cobots. Ideal applications for the CRVT include pick and place operations such as CNC machine automation, packaging and palletizing, and assembly.
LMI Technologies: Showed a UR5 working collaboratively with the UR+ certified LMI Gocator 3210 snapshot sensor, which uses stereo structured light technology to measure the shape and orientation of parts for automating inspection, part movement, and guidance applications. The UR5 guided a Gocator around a large part to capture 3D data from different angles. By applying inspection tools embedded within the Gocator firmware, users could generate a complete 3D point cloud and measure specific features, such as the diameter and depth of holes.
Mobile Industrial Robots (MiR): Showcased a MiR200 autonomous mobile robot (AMR) with a UR5 and an OnRobot UR+ certified RG2 gripper application. This application picked up circuit boards at a stationary table, drove around autonomously and delivered the circuit boards to the same table. To be as precise as this task requires, the MiR200 connected to a precision docking station built into the stationary table, demonstrating how the MiR200 adds mobility with extreme precision to the UR5, enabling the cobot to service multiple work stations.

Though Universal certainly has a large portion of the market, there were other companies that were offering their cobots as well.
Doosan Robotics
Coming to North America after initially launching in the European markets was the lineup of cobots from Korea-based Doosan Robotics.

These four cobot models enable customers to experience first-hand safe, versatile and easy-to-use automation. Doosan’s ergonomically designed cobots can serve a wide variety of customer needs, offering a broad range of capabilities – a working radius of 35.4 to 66.9 inches (900 to 1,700 millimeters) and a load capacity of 13.2 to 33.1 pounds (6 to 15 kilograms).

Doosan cobots are equipped with proprietary torque sensors on all six joints, enabling the robots to be used in diverse applications that utilize advanced force and compliance control algorithms.

Programmed through a teach pendant, a human-centered touchscreen control running Doosan’s award-winning software, Doosan cobots are extremely intuitive to teach and easy to operate, enabling customers to take full control without having to write complicated programming scripts.

Applications: To showcase the full range of their capabilities, Doosan cobots were aligned to track a conveyor, assemble gears, and arrange letters to spell words as programmed. The main demonstration highlighted automotive composite solutions, with six cobots collaborating with two human workers to execute fine motor activities including inspection, assembly, and placement of parts on an actual automobile.

More ProMat and Automate coverage:

At ProMat 2019, Companies Pitch Efficiency for Warehouses
Analysis: 5 Key Robotics Trends from ProMat and Automate 2019
At Automate 2019, Robot Vendors Tout Simplicity Across Products
ProMat and Automate Day 3 News and Notes: R2-D2 and 3 Tons of Fun!
ProMat and Automate Day 2 News, Notes, and Forklifts
News and Notes from Day 1 at ProMat/Automate 2019
MiR Launches MiR1000 for Autonomous Transport of up to 1 Ton Loads
Robotiq Unveils New Vacuum Grippers, Sanding Kit
Epson Robots Launches New Robots, Intelligent Feeding System
IAM Robotics Redesigns, Expands Swift System for Mobile Fulfillment
ProMat and Automate Show Guide: Robot Company Showcase

The demos highlighted a wide range of accessories that empower the Doosan cobot experience, maximizing performance and production efficiency. These accessories include the Mobile Base, a solution for flexible relocation and movement that is equipped with a direct teaching unit, and the Smart Vision Module System, which allows cobots to inspect the surrounding area using mounted cameras.

“We are in the middle of the Fourth Industrial Revolution, where collaborative robots play a key role. Humans will be increasingly empowered to achieve higher levels of efficiency and immediate productivity gains through harnessing technology,” said Byungseo Lee, CEO of Doosan Robotics. “Doosan is leading this transformation with the creation of our innovative cobots, which will help unlock productivity for our customers in North America.”
Productive Robotics
Productive Robotics also introduced new cobots, featuring enhanced human-like vision, building on the OB7 cobot the company first introduced to the market two years ago.

The new OB7-Max 8 and the OB7-Max 12 offer the same teaching platform as the OB7, enabling users to show the robot how to do the job rather than programming the unit. The new robots can handle larger payloads and offer longer reaches than the base model. The OB7-Max 8 can handle payloads of up to 8 kg and has a reach of up to 1,700 mm, while the OB7-Max 12 has a 12 kg payload capacity and a maximum reach of 1,300 mm.

Each model can also be equipped with the company’s proprietary vision system, OB7 Vision, enabling the robot to learn to recognize and pick up objects at the push of a button.

According to the company, the OB7 series of robots will have an improved sense of touch by the end of the year, enabling them to handle a wider variety of objects.
SNAP Resources
The Fremont, Ohio-based SNAP, a Motion Controls Robotics-affiliated company, joined Ready Robotics at its booth to display the mobile SNAPMate Station, featuring a FANUC CR-15iA collaborative robot. SNAP demonstrated how to simplify cobot automation using the Ready Robotics Forge software and controller. The Forge controller makes programming a FANUC cobot on a mobile workstation intuitive by combining hand guidance with drag-and-drop programming.