4 Reasons Why ‘Made in China’ Isn’t Cost-Effective

It’s no secret why many companies choose to outsource some or all of their manufacturing processes to China. The availability of cheap labor and the lack of stringent regulations in the country mean U.S. companies can save significantly on fixed costs by outsourcing – at least in the short term. In practice, however, offshore manufacturing tends to cost far more in the long run. Here are some of the main reasons why.
1) The current trade war with China
The U.S. is currently imposing tariffs on approximately $250 billion worth of Chinese imports. For the first time in a long time, many manufacturers are finding that domestically manufactured products are now more cost-effective than their outsourced counterparts. It’s unclear how this trade war will play out, but at least in the short term, offshore manufacturing is no longer necessarily less expensive from a fixed-cost perspective.
2) Quality assurance overseas is easier said than done
Many companies have discovered the hard way that contractual quality clauses are extremely difficult to enforce from overseas. In 2013, after a faulty airbag inflator led to a number of deaths and injuries in cars equipped with Takata airbags, the company recalled 3.6 million cars. Takata was a successful Japan-based automotive parts company, but the faulty inflators that caused the airbag malfunctions were manufactured in Mexico. Ultimately, upwards of 42 million cars with Takata airbags were recalled by order of the National Highway Traffic Safety Administration, and the company went bankrupt in 2017.
3) Hidden costs add up

Not surprisingly, overseas shipping and transport (including air freight) can cost a great deal of time and money. With lag times of two to three months, to say nothing of unanticipated delays, the total landed costs can add up quickly. Many manufacturers recommend engaging legal guidance both in the U.S. and in the place of manufacture. This, too, can be costly, though probably less expensive than moving forward without legal counsel and needing it later. There can also be hidden costs associated with production quantity; in many cases, outsourcing is only cost-effective at high production volumes.
4) Automation is changing the manufacturing landscape in the U.S.
The next frontier of cost efficiency is happening right here in the U.S. Incorporating automation into manufacturing is helping companies save costs and increase efficiency across a wide range of industries. The flip side, of course, is that many people are concerned robots will end up replacing human workers, threatening to negate one of the many benefits of reshoring: providing reliable jobs to domestic workers.

This concern is not an unreasonable one. Various studies, including a 2017 report from the McKinsey Global Institute and a study from the University of Oxford, have suggested that anywhere from 20% to 50% of jobs are threatened by automation. The reality is more nuanced and, thankfully, much more encouraging.

As many companies are discovering, automation actually provides an opportunity for different kinds of employment (and in many cases, more employment in general). Ray Products, where I work, is an excellent example. When we introduced a fully robotic six-axis trimmer to our thermoforming workflow, we ultimately ended up increasing our workforce by 20%. This isn’t just anecdotal — a recent Brookings Institution report found that in Germany, where manufacturers are using three times more robots than their U.S. counterparts, they’re also employing more people.

While it’s impossible to predict the future, my money’s on a combination of robots and reshoring.

About the author: Jason Middleton is vice president of sales at Ray Products, an Ontario, Calif.-based plastics manufacturer.

How IoT Condition Monitoring Maintains Machine Health

A primary manufacturing goal is to maintain high product quality. For many enterprises, however, this objective remains hard to achieve. Too often, product quality issues are uncovered only when a product fails in testing or, worse, when a customer makes a return or triggers a recall.

A common cause of reduced product quality is faulty equipment that has not been properly maintained or calibrated. Manufacturers are increasingly turning to IoT-driven machine condition monitoring, which helps reveal equipment issues that can affect product quality so they can be fixed before things get worse.
The IoT-driven approach to product quality control
Condition monitoring enables product quality control by detecting combinations of equipment health parameters, such as spindle vibration frequency, engine temperature, and cutting speed, and ambient parameters, such as temperature and humidity. In combination, these parameters can cause deterioration in the quality of the product output.

A historical data set containing equipment condition records gathered over a period of time (say, a year) is combined with data about product quality deviations and context data (for example, equipment maintenance history) from ERP, PIMS, or DCS systems. The combined data set is then fed into machine learning algorithms, which can detect correlations in the incoming data records. Uncovered correlations are reflected in predictive models, which are then used to identify combinations of equipment condition and environmental parameters that can lead to product quality issues.
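
To make the workflow concrete, here is a minimal sketch of how such a pipeline might look in Python. The file names, column names, and the choice of a random forest classifier are illustrative assumptions, not a description of any particular vendor’s product.

# Minimal sketch: link historical equipment/ambient conditions to product
# quality deviations and train a predictive model. All file and column
# names below are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Equipment condition records gathered over a period (e.g., hourly aggregates).
conditions = pd.read_csv("equipment_conditions.csv", parse_dates=["timestamp"])
# Quality deviation records and context data exported from ERP/PIMS/DCS.
quality = pd.read_csv("quality_deviations.csv", parse_dates=["timestamp"])

# Align each quality record with the nearest preceding condition record.
data = pd.merge_asof(quality.sort_values("timestamp"),
                     conditions.sort_values("timestamp"),
                     on="timestamp", direction="backward")

features = ["spindle_vibration_hz", "engine_temp_c", "cutting_speed_mps",
            "ambient_temp_c", "humidity_pct"]
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["quality_deviation"], test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

# Feature importances hint at which condition/ambient parameters matter most.
print(sorted(zip(features, model.feature_importances_),
             key=lambda pair: pair[1], reverse=True))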

For example, in pulp processing, quality issues include deviations in the concentration of dissolved alkali. The machine learning component of an IoT solution detects hidden patterns in the data and determines that a higher concentration of alkali stems from a deviation in two process parameters: reduced processing temperature and increased white liquor flow.
Use cases across industries
Manufacturers across industries can leverage IoT for monitoring the condition of machines and controlling the quality of products and components manufactured on them. Here are a few examples:
Pulp and paper
In the pulp and paper industry, IoT makes it possible to monitor the condition of rollers in paper machines. A defect in just one roller bearing can significantly affect the quality of the produced paper, causing fluffing and changes in paper thickness. Monitoring the condition of roller bearings with vibration sensors is enough to avoid a large percentage of quality issues. Vibration sensors on each end of the roller continuously gather real-time data about the roller’s health and relay it to the cloud software. If a roller does not function properly, the IoT solution alerts an operator.
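
As a simple illustration of how such an alert might be generated, the sketch below checks vibration readings from both ends of a roller against an assumed limit; the threshold value and the alert payload are illustrative only.

# Minimal sketch of a roller-bearing vibration alert. The 4.5 mm/s limit and
# the alert payload are assumed values for illustration.
from dataclasses import dataclass
from typing import Optional

VIBRATION_LIMIT_MM_S = 4.5  # assumed alert threshold for this roller class

@dataclass
class RollerReading:
    roller_id: str
    drive_end_rms: float    # mm/s, vibration sensor on one end of the roller
    tending_end_rms: float  # mm/s, vibration sensor on the other end

def check_roller(reading: RollerReading) -> Optional[dict]:
    """Return an alert payload if either bearing exceeds the vibration limit."""
    worst = max(reading.drive_end_rms, reading.tending_end_rms)
    if worst > VIBRATION_LIMIT_MM_S:
        return {"roller": reading.roller_id,
                "rms_mm_s": worst,
                "message": "Bearing vibration above limit - inspect roller"}
    return None

# Example: the second sensor reading exceeds the limit, so an alert is raised.
alert = check_roller(RollerReading("PM1-R17", 3.1, 5.2))
if alert:
    print(alert)

In a real deployment, the alert would be published to the cloud back end rather than printed.
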
Electronics
In electronics, in the process of mounting semiconductors on circuit boards, tiny puffs of air are used to direct the placement of chips. The placement machines are calibrated according to current environmental conditions (temperature, humidity, etc.). A minor change in, say, temperature generates a heat profile that can cause placement defects. Temperature and humidity sensors are used to monitor the environment in which the machines operate. Once a change is detected, a machine can receive a command to make calibration adjustments to meet quality standards.
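
A simplified version of that decision logic might look like the sketch below; the drift limits and the notion of a calibration command are assumptions for illustration rather than the behavior of any specific placement machine.

# Minimal sketch: request recalibration when ambient conditions drift too far
# from the conditions at the machine's last calibration. Limits are assumed.
TEMP_DRIFT_LIMIT_C = 1.5
HUMIDITY_DRIFT_LIMIT_PCT = 5.0

def needs_recalibration(calibrated, current):
    """Both arguments are (temperature_c, humidity_pct) tuples."""
    temp_drift = abs(current[0] - calibrated[0])
    humidity_drift = abs(current[1] - calibrated[1])
    return (temp_drift > TEMP_DRIFT_LIMIT_C or
            humidity_drift > HUMIDITY_DRIFT_LIMIT_PCT)

calibrated_at = (22.0, 45.0)   # conditions recorded at the last calibration
latest_reading = (24.1, 47.0)  # latest values from the line's sensors

if needs_recalibration(calibrated_at, latest_reading):
    # In practice this would send a calibration-adjustment command to the
    # placement machine's controller; here we simply report it.
    print("Ambient drift detected - request calibration adjustment")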

Steelmaking
In the steel industry, IoT helps detect equipment issues that affect the quality of steel during the metal forming process. During the process, a slab – the output of casting – is reheated and run through the rolling mills so that its thickness is reduced to less than an inch. Problems in the condition and alignment of the rolling mills can lead to significant quality issues. The most common issues include the rolls failing to catch the metal, causing it to pile up, or the rolls not rolling evenly, which results in one side of the metal sheet being thicker than the other. To prevent these issues, the condition and alignment of the rolling mills’ bearings are monitored with vibration and magnetic guide sensors.
Automotive
In the automotive industry, the penetration of moisture into the spaces and gaps in welded spots can lead to porosity, while temperature variations in welding machines can lead to weld joint failures. IIoT is applied to monitor the temperature and humidity around a machine to help avoid these defects and ensure the high quality of welded products.
The benefits of product quality control based on IoT-driven condition monitoring
Monitoring the condition and environment of the machines on which products are manufactured delivers the following benefits:

Compared to traditional production quality control techniques (for example, test checks carried out at the end of a production cycle), IoT-driven equipment condition monitoring lets users pinpoint quality issues at the production stage, when an issue can still be mitigated.
The analytics capabilities of IoT-driven condition monitoring solutions lay a foundation for improvements in product quality. For example, by combining historical data from vibration sensors attached to the milling rolls’ bearings with data about past quality losses, manufacturers can conclude that an 8% increase in a roller bearing’s vibration causes a metal sheet’s left side to be 0.1 inch thicker than the right side. Manufacturers can then use these insights to improve the quality of the output products.

Getting started
Since the technology market does not yet offer out-of-the-box IoT-driven condition monitoring solutions, enterprises need to design and implement custom IoT applications. Given the complexity of IoT implementation, it is often more efficient for enterprises to collaborate with external parties: an IoT platform vendor or an independent IoT integrator.

Opting for an IoT platform vendor has the following advantages:

Lower implementation cost;
Simpler integration with enterprise and shop floor management systems;
More comprehensive upgrades.

However, by collaborating with a single IoT platform vendor, enterprises are unlikely to get best-of-breed functionality, as they often get locked into the vendor’s solution ecosystem, with limited options to test alternative solution components that may be a better fit.

Collaborating with an IoT integrator, on the other hand, offers the possibility of building an IoT solution from components tailored to the enterprise’s needs. Still, the cost of implementation will rise, as enterprises have to buy individual modules from multiple vendors and partner with an integrator to bring these modules together.
A point to consider
Although IoT-based condition monitoring paves the way to improvements in production quality, the approach has certain limitations, as data about machine conditions may not be enough for well-rounded quality assurance. Monitoring the condition of machines, for instance, cannot identify issues arising from the use of defective or misidentified components, or from improper material handling.

Controlling product quality by monitoring the condition of the machines on which products are manufactured helps drive yield improvement, reduce scrap, and minimize rework. Compared to other quality assurance techniques (for example, inspecting parts and semi-finished products as they move through the production cycle), the condition monitoring-based approach may offer a narrower quality control scope, but it helps identify quality issues at their incipient stage and predict potential ones.

About the author: Boris Shiklo is the CTO at ScienceSoft, responsible for the company’s long-term technological vision and innovation strategies. Under his supervision, the company’s development team has successfully fulfilled complex projects of more than 80,000 man-hours in healthcare, banking & finance, retail, telecommunications, public sector, and other domains. He has a solid background in IT consulting, software development, project management, and strategic planning.

Enhanced Robot ‘Vision’ Enables More Natural Interaction With Humans

TROY, N.Y. — The wide-eyed, soft-spoken robot named Pepper motors around the Intelligent Systems Lab at Rensselaer Polytechnic Institute. One of the researchers tests Pepper, making various gestures as the robot accurately describes what he’s doing. When he crosses his arms, the robot identifies from his body language that something is off.

“Hey, be friendly to me,” Pepper says. Pepper’s ability to pick up on non-verbal cues is a result of the enhanced “vision” the lab’s researchers are developing. Using advanced computer vision and AI, the team is enhancing the ability of robots like this one to more naturally interact with humans.

“What we have been doing so far is adding visual understanding capabilities to the robot, so it can perceive human action and can naturally interact with humans through these non-verbal behaviors, like body gestures, facial expressions, and body pose,” said Qiang Ji, professor of electrical, computer, and systems engineering, and the director of the Intelligent Systems Lab.
Detecting non-verbal clues and emotion
With the support of government funding over the years, researchers at Rensselaer have mapped the human face and body so that computers, with the help of cameras built into the robots and machine-learning technologies, can perceive non-verbal cues and identify human action and emotion.

Among other things, Pepper can count how many people are in a room, scan an area to look for a particular person, estimate an individual’s age, recognize facial expressions, and maintain eye contact during an interaction.

Another robot, named Zeno, looks more like a person and has motors in its face making it capable of closely mirroring human expression. The research team has been honing Zeno’s ability to mimic human facial communication in real time right down to eyebrow – and even eyeball – movement.

Ji sees computer vision as the next step in developing technologies that people interact with in their homes every day. Currently, most popular AI-enabled virtual assistants rely almost entirely on vocal interactions.

“There’s no vision component. Basically, it’s an audio component only,” Ji said. “In the future, we think it’s going to be multimodal, with both verbal and nonverbal interaction with the robot.”
Other applications of research
The team is also working on other vision-centered developments, like technology that would be able to track eye movement. Tools like that could be applied to smartphones and tablets.

Ji said the research being done in his lab is currently being supported by the National Science Foundation and Defense Advanced Research Projects Agency. In addition, the Intelligent Systems Lab has received funding over the years from public and private sources including the U.S. Department of Defense, the U.S. Department of Transportation, and Honda.

What Ji’s team is developing could also be used to make roads safer, he said, by installing computer-vision systems into cars.

“We will be able to use this technology to ultimately detect if the driver is fatigued, or the driver is distracted,” he said. “The research that we’re doing is more human-centered AI. We want to develop AI, machine-learning technology, to extend not only humans’ physical capabilities, but also their cognitive capabilities.”

That’s where Pepper and Zeno come in. Ji envisions a time when robots could keep humans company and improve their lives. He said that is the ultimate goal.

“This robot could be a companion for humans in the future,” Ji said, pointing to Pepper. “It could listen to humans, understand human emotion, and respond through both verbal and non-verbal behaviors to meet humans’ needs.”

3 Ways AI and Machine Learning Are Keeping Construction Workers Safer

The secret is out: working in construction is dangerous. Construction workers are killed on the job five times more often than any other workers, with an average of 14 workers dying on the job every day.

About the author: Matt Man is co-founder and CEO of indus.ai, a construction intelligence platform. Contact him here on LinkedIn.

In addition, struck-by deaths have risen by 34% over the past decade, bringing more urgency to site managers and workers to find ways to limit workplace injuries and deaths. Artificial intelligence solutions are providing construction managers more control over their job sites to reduce workplace hazards.

Here are three ways that AI is helping to keep construction safer for workers:
1) Increased visibility to prevent surprises
Increased visibility is crucial to improving on-site safety. Surprise injuries, such as falls, account for nearly 40% of construction deaths. Granted, there is no way to completely eliminate the risk of falling on a construction site, but increasing on-site visibility and awareness with AI software can help reduce these surprises.

AI-supported cameras provide real-time footage while also gathering and analyzing all inbound data concerning the job site. From the materials to the vehicles to the workers, everything on a job site is accounted for in real time. The data gathered by the AI gives construction managers insights into the site via interactive dashboards, helping them anticipate anything that may prove dangerous and make better decisions with regard to employee safety.

Construction managers can be proactive, using these insights to determine where to focus their planning, training, and instruction when they do their safety walks.

In a 2016 photo competition, humans were pitted against AI software to review photo submissions for potential job-site safety risks. The AI processed all 1,080 images in under five minutes, while the human experts took more than five hours to complete the same task.

2) AI can simplify tasks before the work begins
Construction sites are especially risky due to the number of variables involved. One way to mitigate the dangers on a construction site is to do as little as possible at the site – or move the construction site altogether. A combination of AI and robotics can produce prefabricated construction, which allows building elements to be built in a controlled factory and then transported to a construction site. This process controls many of the would-be hazards on a standard construction site, and completes the most dangerous tasks without risking human injury.

Thanks to AI and prefabricated construction, site managers can ensure the safest construction sites possible. Improvements in safety lead many to anticipate that there will be a 6% increase in modular construction by 2022.
3) Increased transparency and accountability
The introduction of AI-supported equipment on a construction site holds every stakeholder accountable in unprecedented ways. Lost paperwork, communication breakdowns, and misunderstandings are no longer acceptable excuses, with AI working to correct such mishaps. With AI software tracking and analyzing every piece of inbound data in real-time around-the-clock, all stakeholders are kept in the know regarding all progress or setbacks. Additionally, all stakeholders can see why problems arise and why workers are getting injured.

This level of workplace transparency goes a long way in keeping site managers accountable for the success of their job sites. With all stakeholders having an eye on the site, construction managers must go above and beyond to ensure that workers are kept safe.
Conclusion – going beyond ‘Be careful out there’
Construction sites are inherently dangerous, but construction managers and designers owe it to their workers to ensure the sites they work on are as safe as possible. “Being careful” is no longer an acceptable safety technique, and has proved to be woefully ineffective as the rate of workers hurt or killed on-site continues to skyrocket. AI can give construction managers the ability to protect their workers like never before, and ensure the sites are as safe as possible.

Recycling Robot Learns Through System of Touch

Every year, trash companies sift through an estimated 68 million tons of recycling, which is the weight equivalent of more than 30 million cars.

A key step in the process happens on fast-moving conveyor belts, where workers have to sort items into categories such as paper, plastic and glass. Such jobs are dull, dirty, and often unsafe, especially in facilities where workers also have to remove normal trash from the mix.

The RoCycle robotic system uses touch sensors to detect whether an item is paper, plastic, or metal. Source: Jason Dorfman, MIT CSAIL

With that in mind, a team led by researchers at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has developed a robotic system that can detect if an object is paper, metal or plastic.

The team’s “RoCycle” system includes a soft Teflon hand that uses tactile sensors on its fingertips to detect an object’s size and stiffness. Compatible with any robotic arm, RoCycle was found to be 85% accurate at detecting materials when stationary, and 63% accurate on a simulated conveyor belt. Its most common error was identifying paper-covered metal tins as paper, which the team says would be improved by adding more sensors along the contact surface.

“Our robot’s sensorized skin provides haptic feedback that allows it to differentiate between a wide range of objects, from the rigid to the squishy,” said MIT professor Daniela Rus, senior author on a new paper about RoCycle that will be presented later this month at the IEEE International Conference on Soft Robotics in Seoul, South Korea. “Computer vision alone will not be able to solve the problem of giving machines human-like perception, so being able to use tactile input is of vital importance.”

Developed in collaboration with Yale University, RoCycle directly demonstrates the limits of sight-based sorting: it can reliably distinguish between two identical-looking Starbucks cups, one made of paper and one of plastic, which would give vision systems trouble.
Incentivizing recycling
Rus says that the project is part of her larger goal to reduce the back-end cost of recycling, in order to incentivize more cities and countries to create their own programs. Today, recycling centers aren’t particularly automated: their main kinds of machinery include optical sorters that use different wavelengths of light to distinguish between plastics, magnetic sorters that separate out iron and steel products, and aluminum sorters that use eddy currents to remove non-magnetic metals.

Five different test objects. Source: MIT CSAIL

This is a problem for one very big reason: just last month China raised its standards for the cleanliness of recycled goods it accepts from the US, meaning that most of our single-stream recycling is now sent to landfills.

“If a system like RoCycle could be deployed on a wide scale, we’d potentially be able to have the convenience of single-stream recycling with the lower contamination rates of multi-stream recycling,” said Ph.D. student Lillian Chin, lead author on the new paper.

It’s surprisingly hard to develop machines that can distinguish between paper, plastic and metal, which shows how impressive a feat it is for humans. When we pick up an object, we can immediately recognize many of its qualities even with our eyes closed, like whether it’s large and stiff or small and soft. By feeling the object and understanding how that relates to the softness of our own fingertips, we are able to learn how to handle a wide range of objects without dropping or breaking them.

This kind of intuition is tough to program into robots. Traditional rigid robot hands have to know an object’s exact location and size to calculate a precise motion path. Soft hands made of materials like rubber are much more flexible, but have a different problem: because they’re powered by fluidic forces, they have a balloon-like structure that can puncture quite easily.
How RoCycle works
Rus’ team used a motor-driven hand made of a relatively new material called “auxetics.” Most materials get narrower when pulled on, like a rubber band when you stretch it; auxetics, meanwhile, actually get wider. The MIT team took this concept and put a twist on it, quite literally: they created auxetics that, when cut, either twist to the left or right. Combining a “left-handed” and “right-handed” auxetic for each of the hand’s two large fingers makes them interlock and oppose each other’s rotation, enabling more dynamic movement. The team calls this “handed-shearing auxetics”, or HSA.

“In contrast to soft robots, whose fluid-driven approach requires air pumps and compressors, HSA combines twisting with extension, meaning that you’re able to use regular motors,” said Chin.

After determining the type of object, the robot can place the object in the correct bin. Source: Jason Dorfman, MIT CSAIL

The team’s gripper first uses its “strain sensor” to estimate an object’s size, and then uses its two pressure sensors to measure the force needed to grasp it. These metrics, along with calibration data on the size and stiffness of objects of different material types, are what give the gripper a sense of what material an object is made of. Since the tactile sensors are also conductive, they can detect metal by how much it changes the electrical signal.

“In other words, we estimate the size and measure the pressure difference between the current closed hand and what a normal open hand should look like,” said Chin. “We use this pressure difference and size to classify the specific object based on information about different objects that we’ve already measured.”
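
One rough way to picture that classification step (not the team’s actual implementation) is a nearest-neighbor comparison against previously measured reference objects, with the conductivity reading used to catch metal outright. The reference values below are invented for illustration.

# Illustrative sketch of classifying material from gripper measurements.
# The reference samples stand in for the calibration data described above;
# all numbers are made up.
import math

# (estimated size in mm, grasp pressure delta in kPa) for known objects.
REFERENCE = {
    "paper":   [(70.0, 4.0), (85.0, 5.5)],
    "plastic": [(70.0, 9.0), (60.0, 11.0)],
    "metal":   [(66.0, 18.0), (75.0, 20.0)],
}

def classify(size_mm, pressure_delta_kpa, is_conductive):
    # The conductive tactile sensors flag metal directly via the change in
    # the electrical signal, so that case is handled first.
    if is_conductive:
        return "metal"
    best_label, best_dist = None, float("inf")
    for label, samples in REFERENCE.items():
        for ref_size, ref_pressure in samples:
            dist = math.hypot(size_mm - ref_size,
                              pressure_delta_kpa - ref_pressure)
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label

# Example: a largish object that compresses with little pressure reads as paper.
print(classify(82.0, 4.8, is_conductive=False))  # -> paper

A real system would of course rely on the team’s calibration data and a more robust model, but the principle of comparing new measurements against known material signatures is the same.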

RoCycle builds on a set of sensors that the team developed for a different paper to estimate an object’s size and stiffness. Those sensors could detect the radius of an object to within 30 percent accuracy, and tell the difference between “hard” and “soft” objects with 78 percent accuracy. The team’s hand is also almost completely puncture-resistant: it was able to be scraped by a sharp lid and punctured by a needle more than twenty times, with minimal structural damage.

As a next step, the researchers plan to build out the system so that it can combine tactile data with actual video data from a robot’s cameras. This would allow them to further improve its accuracy and potentially allow for even more nuanced differentiation between different kinds of materials.

Chin and Rus co-wrote the RoCycle paper alongside MIT postdoctoral associate Jeffrey Lipton, as well as PhD student Michelle Yuen and professor Rebecca Kramer-Bottiglio of Yale University.

This project was supported in part by Amazon, JD, the Toyota Research Institute and the National Science Foundation. The complete paper is available here.

MiR Launches MiR1000 for Autonomous Transport of up to 1 Ton Loads

ODENSE, DENMARK and HOLBROOK, N.Y. – Mobile Industrial Robots (MiR) today launched its MiR1000 autonomous mobile robot, with the ability to automatically pick up, transfer, and deliver pallets and other heavy loads up to 1,000 kg (2,200 lbs). The company will demonstrate the MiR1000 and other AMRs at this week’s Automate 2019 show in Chicago.

Like the company’s MiR500, which was introduced last year, the MiR1000 is “a collaborative, safe and flexible alternative to potentially dangerous and expensive forklifts on the factory floor,” the company said. MiR also announced it was releasing artificial intelligence capability across all of its AMRs for improved navigation.

The company’s MiR100, MiR200 and MiR500 have been installed in more than 45 countries, at companies such as Airbus, Flex, Honeywell, Toyota, Visteon, and Hitachi, MiR said. Thomas Visti, MiR’s CEO, said the company built the MiR1000 in response to strong demand from customers of the smaller robots, who also wanted to transport heavier components, such as those required in the aerospace and automotive industries.

The MiR1000 features two flexible pallet lifts for the two most commonly used types of pallets – the EU pallet and the 40-inch by 48-inch pallet. Like the company’s other AMRs, the MiR1000 can be programmed via its user interface or through the MiRFleet robot fleet management system. The company said its AMRs can also easily integrate different top modules, such as pallet lifts, conveyors, a robot arm, or other options to support several applications.

“With the MiR1000, we are once again extending the possibilities for automating internal logistics, especially for those who want to transport very large materials without reconfiguring their infrastructure,” said Visti. “Manufacturers today must deal with ever-changing customer demands, which means they need flexible and easily adaptable production facilities. Conventional logistics solutions like forklifts and conveyor belts, and even traditional automated guided vehicles (AGVs) haven’t been able to support this type of production.”

He added the company has made it easier to optimize the transportation of materials without requiring rebuilding infrastructure or extensive programming capabilities. “Customers have seen that with our other robots, and will experience the same efficiencies with the MiR1000 and much heavier loads.”
AI and mobile robot navigation
With the AI capabilities now incorporated into the company’s software, as well as strategically placed cameras that function as an extended set of robot sensors, MiR said its robots can now optimize their route planning and driving behavior. The cameras, called MiREyesight, enable the robots to “detect and recognize different moving obstacles and react accordingly.” As an example, the robots will continue driving if they detect a person who is not in their path, but will park if they detect an AGV so that it can drive by. MiR said the robots can also predict blocked or highly trafficked areas in advance and re-route instead of entering the blocked area and then re-routing.

The company plans to showcase all of its AMRs and software at booth #7368 at the Automate show.
About Mobile Industrial Robots
Founded in 2013 by Danish robotics industry professionals, MiR was acquired last year by Teradyne, the owner of cobot manufacturer Universal Robots. In addition to its Odense headquarters, the company has regional offices in Dortmund, Frankfurt, Shanghai, New York and San Diego. The company said its sales have risen 500% from 2015 to 2016, and 300% from 2016 to 2017, as well as from 2017 to 2018. Last year, it was awarded the EY Entrepreneur of the Year in Denmark.

Robotiq Unveils New Vacuum Grippers, Sanding Kit

QUEBEC CITY – Robotiq today announced the launch of three new tools and software for collaborative robots that help automate packaging, palletizing, and sanding processes. The company will showcase the new tools at this week’s Automate 2019 show in Chicago.

Robotiq’s AirPick includes options for one or two suction cups on its vacuum gripper. Source: Robotiq

The AirPick, EPick and Robotiq Sanding Kit are designed for manufacturers looking for less expensive and less complicated options for those processes, without having to build a custom-designed solution, the company said.

AirPick and EPick are customizable vacuum grippers aimed at several industrial applications, with plug-and-play features that make them easier to program and quick to install on cobots. Robotiq said the tools’ ability to handle objects of varying sizes, shapes, materials, and weights “makes them an effective solution for packaging, palletizing, pick-and-place, assembly, and machine-tending applications.” Both AirPick and EPick come with options for one or two suction cups for customers to choose from.

The EPick vacuum gripper also features one or two-suction cup options. Source: Robotiq

Robotiq said the two vacuum grippers complete the company’s lineup of grippers. It added that expanding into vacuum grippers was a natural step for the company, which is one of the leaders in end-of-arm tools and grippers for cobots.
Sanding kit made for UR cobots
The Robotiq Sanding Kit is the company’s first application-based package, built as the only hardware and software sanding solution for Universal Robots. The company said the kit increases quality and productivity while saving manufacturers hours of programming. The software’s built-in path generator ensures that “consistent force is applied at each cycle, which makes it easy to automate dirty and tedious finishing tasks,” the company said.

“The introduction of these solutions to the Robotiq product family is built on our expertise from supporting thousands of clients with their automation projects over the past 10 years,” said Jean-Philippe Jobin, CTO and co-founder at Robotiq. “AirPick, EPick and the Robotiq Sanding Kit were all engineered with helping manufacturers start production faster in mind. We wanted to support them in automating their cobot applications by offering solutions that are easy to use, safe, and flexible.”

In addition to showing all three new products at Automate (at booth #7165), Robotiq said it plans to show its complete product lineup of specially designed plug-and-play grippers, force sensors, camera technology, and related software.

IAM Robotics Redesigns, Expands Swift System for Mobile Fulfillment

PITTSBURGH – IAM Robotics today announced it is extending its Swift robot solution for e-commerce fulfillment operations by adding conveyor integration and a new transport robot that works with the Swift to exchange full or empty totes. The company plans to show the new offering at this week’s ProMat 2019 event in Chicago, April 8-11.

For the past three years, the company has demonstrated advances in autonomous mobile robots (AMRs) at the ProMat and Modex events. “This year, IAM continues to push those boundaries by introducing extended capabilities to the Swift Solution for the logistics industry in grocery, health and beauty, pharmaceuticals, and consumer-packaged goods,” the company said in a statement.

The new design for Swift includes a sleeker look with a smoother lift, and is available with either a fixed tote or an integrated motorized drive roller (MDR) for automatic tote transfer, IAM Robotics said. An adjustable carriage can match the height of a facility’s existing conveyor infrastructure for easier integration.

A graphic from IAM Robotics shows the benefits of adding the new Bolt robot to an existing Swift solution. Source: IAM Robotics

The new transport robot, named Bolt, is designed to improve the throughput and return on investment of Swift by exchanging totes with it and delivering full totes to the next process in the fulfillment cycle, such as packing and shipping. By leveraging Bolt for transportation, the company said, Swift can remain focused on picking.

Joel Reed, IAM Robotics president and CEO

“With the growing demands in e-commerce, the logistics industry is looking for autonomous robotic solutions that provide flexibility in their operational planning and execution,” said Joel Reed, CEO of IAM Robotics. “IAM is responding to these operational challenges by providing advanced innovative solutions in autonomous navigation, material selection and handling, and tote transport and transfer.”

Attendees can visit booth S4679 to experience the Swift system during the show.
About IAM Robotics
IAM Robotics, founded in 2012, is one of the leaders in flexible autonomous robotic material handling for e-commerce order fulfillment and material handling in logistics and manufacturing. The company’s Swift Product Suite provides companies with a robotic solution to address existing labor shortages, fast-moving e-commerce environments, and changing consumer expectations.

The company, one of the RBR50 2018 award winners, raised $20 million in new funding in November 2018. It also teamed up with global logistics provider DB Schenker to bring the robotics technology into its operations. More recently, the company announced a collaboration with TREW to integrate robotics into the material handling and order fulfillment operations.


Infographic: How AI is Being Deployed Across Industries

It is impossible to ignore the impact of artificial intelligence on our everyday lives. As presented in the infographic from techjury.net, there is hardly any critical sector or industry that does not rely on AI to perform specific tasks that humans find difficult or impossible to complete.

AI is an advanced field of computer science whereby computer systems are designed to exhibit or mimic characteristics associated with human behavior. These characteristics include the ability to learn (acquiring information and the rules for using the said information), reasoning (using these rules to make informed judgments), self-correction (learning from previous failures), understanding language, and other mental capabilities.
Forms of AI technology
To help differentiate between some of the terms thrown around when discussing AI, here are some general definitions of the technologies associated with it. These areas are not separate from each other – for example, robotics can utilize machine vision technologies, and robotic process automation can utilize natural language processing for customer service chatbots.

Robotics: Engineering involved in designing and manufacturing robots. The significant advantage of this technology is that some of the robots can be utilized to perform tasks that are difficult or impossible by human standards.
Robotic process automation: The use of specialized computer programs or software robots that automate and standardize high-volume repetitive and tedious tasks usually done by humans.
Machine learning: The science of making computer systems perform actions without being explicitly programmed. By using existing data, computers can forecast future behaviors, patterns, and outcomes without needing human intervention.
Natural language processing: Focused on the interactions between computers and human languages, especially how computer systems can be programmed to analyze, interpret, and manipulate a large amount of natural language data.
Machine vision: The science and technology of using computer systems to provide imaging-based automated inspection and analysis with the aid of a camera, analog-to-digital conversion, and digital signal processing.

Applications of AI in various industries
The influence of AI technology can be seen across sectors such as transportation, education, manufacturing, online shopping, communication, sports, media, healthcare, politics and government, banking and finance, aerospace, and so much more.

Below is a list of essential industries impacted by AI:

Transportation: An autonomous car, also known as a self-driving car, is a vehicle that can sense its environment and move without human intervention. This technology can transform the transportation system because it can analyze traffic and alternative routes, thus reducing travel times.
Manufacturing: High-performing robots work faster and complete tasks more efficiently than humans. They can also work for long periods nonstop, as long as the power required for them to function is available. By using 3D technology and machine vision, these machines can speed up the process of product manufacturing.
Healthcare: Applications such as autonomous surgical robots, virtual nursing assistants, automated image diagnosis, and dosage error reduction have been some of the ways AI has been crucial for the technological advancements in the health sector.
Entertainment: Machine learning can predict a user’s behavior to make recommendations on the types of movies, music, TV shows, and other content they’ll be interested in. Adverts can also be personalized based on the user’s preferences, thereby increasing the chances a marketer will make a sale.
Sports: AI technology like automation and predictive analysis can be used in business decisions, sponsorship activations, ticket sales, and determining athletes’ performance.

Future applications of AI include automated transportation, cyborg technology, solving problems associated with climate change, and deep-sea and space exploration.

If the projected growth of the AI software market from $1.4 billion in 2016 to $59.8 billion in 2025 is anything to go by, AI is set for a massive takeover in the coming years.

Source: Techjury.net (https://techjury.net/stats-about/ai/)

About Techjury.Net: Techjury.net is a team of software experts that tests and reviews software to help companies improve their offerings and help end users choose the best products for their needs. They subscribe to and thoroughly test all aspects of different software to create impartial and complete reviews, giving praise and honest, constructive criticism where needed. Visit Techjury.net for more information.

Silicon Sensing Systems’ Latest Gyros Guide World’s Largest Construction Vessel

Silicon Sensing Systems Ltd announced this week that its latest inertial measurement technology has been used in AD Navigation AS’s new pilot’s aid, the ADX XR, to successfully guide the world’s largest construction vessel, ‘Pioneering Spirit,’ into Maasvlakte in the port of Rotterdam.

ADX XR display showing the predicted turn into port. Source: Silicon Sensing Systems

With large vessels such as the Pioneering Spirit, which measures 372 x 124 meters (1,227 x 407 feet), final entrance and docking maneuvers in the close confines of a port are typically controlled by a pilot. The pilot uses the ADX XR as their ship-independent navigation aid.

“Our MEMS gyros are relied on in many maritime roles, including positioning, stabilisation, and navigation, but our team is particularly proud of this successful trial with AD Navigation on the Pioneering Spirit vessel. Our devices are based on our patented vibrating ring design which means they offer a unique combination of precision performance and robustness – a combination that is particularly appropriate in the tough and ever-changing maritime environment,” said Steve Capers, General Manager at Silicon Sensing Systems.

Silicon Sensing’s CRH02 silicon MEMS gyro. Source: Silicon Sensing Systems

In the trials, precise movement data from Silicon Sensing’s CRH02 all-silicon gyros allowed the ADX XR to deliver a highly accurate and detailed 3- to 5-minute ship course prediction to the pilot. The CRH02 is a compact, low-noise, single-axis gyroscope, similar to a fiber optic gyro but more rugged and with lower size and weight.

Following this successful performance, AD Navigation has placed a production order for CRH02 gyros with Silicon Sensing.

Lorentz Ryan, Managing Director of AD Navigation commented: “The compact form factor along with the extremely precise performance of the new CRH02 gyro makes it a perfect component in our ADX XR ultra-precise and portable navigation system. We appreciate our long-standing relationship with Silicon Sensing and the excellent support from all their staff.”
