Four Ways AI Boosts Cobot Performance

The growing synergy between artificial intelligence (AI) and collaborative robot (cobot) technologies can empower design engineers to build affordable cobot-based solutions — even for advanced industrial applications that incorporate machine learning, deep learning, and computer vision.

Intelligence from the Ground Up

AI is not new to the cobot realm. But incorporating AI into a cobot-based application using a standard teach pendant or graphical programming interface has been challenging and time-consuming, even for the most dedicated engineers.

In recent years, numerous companies have worked on this problem in collaboration with Universal Robots. The most recent is MathWorks, the company behind the MATLAB and Simulink mathematical computing software. Earlier this year, MATLAB and Simulink achieved UR+ Certification, meaning the software is approved for seamless use on Universal Robots cobots and is available through the UR+ ecosystem, enabling engineers to develop advanced cobot applications.

“MathWorks has been paying attention to the cobot space for many years, but this is our first official collaboration with a cobot maker. It shows the potential we see in the cobot space for applications that require AI, offline simulation, motion planning, and computer vision capabilities,” said YJ Lim, Technical Robotics Product Lead at MathWorks.

The collaboration between Universal Robots and MathWorks is more than a symbolic intertwining of cobots and AI. It enables robotics engineers to bring all the capabilities of MATLAB and Simulink into their cobot-based industrial applications and incorporate AI into their system design from the ground up. It also allows engineers to deploy algorithms and AI on a cobot by generating C++ code directly for embedded targets, such as GPU boards.
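However the control code is produced, it ultimately reaches the cobot as motion commands. As a rough, language-agnostic illustration of that last step (not the MATLAB/Simulink workflow itself), the sketch below sends a single URScript movej command to a UR controller's secondary client interface on TCP port 30002; the IP address and joint values are placeholders.

```python
# Minimal sketch: send an externally computed joint target to a UR cobot.
# Assumes the controller's secondary interface (which accepts URScript
# strings) is reachable at ROBOT_IP:30002; the IP and joints are placeholders.
import socket

ROBOT_IP = "192.168.0.10"   # hypothetical address of the cobot controller
SECONDARY_PORT = 30002      # UR secondary client interface

def send_joint_target(joints_rad, accel=1.4, vel=1.05):
    """Format a URScript movej command and send it to the controller."""
    q = ", ".join(f"{j:.4f}" for j in joints_rad)
    script = f"movej([{q}], a={accel}, v={vel})\n"
    with socket.create_connection((ROBOT_IP, SECONDARY_PORT), timeout=2.0) as s:
        s.sendall(script.encode("ascii"))

# A joint pose (in radians) that an offline planner might have produced:
send_joint_target([0.0, -1.57, 1.57, -1.57, -1.57, 0.0])
```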

Cobots and AI are not strangers; many cobot application kits already incorporate AI capabilities. However, the MathWorks and Universal Robots collaboration is unique, and a potential model for other cobot solution providers, because it gives engineers the tools they need to start building advanced industrial automation systems out of affordable cobot hardware.

“Traditional automation is limited to large organizations. The introduction of MATLAB and Simulink to the cobot arena will enable more emerging and medium-sized companies to enjoy the benefits of AI and automation,” said Lim.

Humanlike Perception

Humans can look at disordered objects, such as parts in a bin, and immediately understand which of them can be handled without interference from the other objects. Our hands trace paths that avoid collisions with the surrounding environment, even while grasping an object. We can even assemble multiple objects with high precision.

Automation engineers know that this isn’t always the case for robots, to put it mildly. As a result, bin-picking of unstructured items has traditionally been considered notoriously difficult to solve without huge capital investment.

MATLAB’s deep learning capabilities enable cobots to perform intelligent bin-picking applications. (Image: MathWorks)

Apera AI’s ‘4D Vision’ technology, which is also UR+ certified, is challenging that orthodoxy by providing cobots with “humanlike perception.” The claim sounds hyperbolic at first, but it is borne out on several levels and translates into faster, more effective robot performance, especially in bin-picking applications.

“Our system has total vision cycle time as low as 0.3 seconds (3Hz), which means that it can analyze a disordered situation and give instructions to a robot in the same time the human brain would take to process the same situation. Our vision system has to wait for the robot, not the other way around,” explained Eric Petz, Head of Marketing at Apera AI.

For perspective, the targeted rate for high-speed robotic bin-picking automation is 2,000 picks per hour, which works out to a total robot cycle time of 1.8 seconds per pick. Since there are limits on how fast a robot can move, the vision cycle time has to be reduced as much as possible.
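The arithmetic behind those figures is straightforward; a minimal sketch using only the numbers quoted in this article:

```python
# Timing budget implied by the figures above.
picks_per_hour = 2000
total_cycle_s = 3600 / picks_per_hour   # 3,600 s/hr / 2,000 picks = 1.8 s
vision_cycle_s = 0.3                    # Apera AI's stated vision cycle time
motion_budget_s = total_cycle_s - vision_cycle_s

print(f"Total cycle per pick: {total_cycle_s:.1f} s")              # 1.8 s
print(f"Left for robot motion and gripping: {motion_budget_s:.1f} s")  # 1.5 s
# If vision runs concurrently while the robot places the previous part
# ("the vision system has to wait for the robot"), the full 1.8 s
# remains available for motion.
```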

The process begins with training an AI neural network using CAD drawings or 3D scans of the product to be handled. The scene on the factory floor — a cluttered bin, for example — is captured by two 2D cameras and those images are combined into a 3D understanding of the scene. The 4D Vision system then identifies the “most pickable” objects and informs the cobot of the fastest and safest path to handle them. The cobot is provided with pose estimation and path planning data via Apera Vue software, which is embedded in its controller, ensuring that the robot takes a safe, collision-free path to accomplish its goal.
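Structurally, that pipeline reduces to a few steps: fuse two 2D images into a 3D scene, rank the detected objects by pickability, then hand a pose and a collision-free path to the controller. The sketch below mirrors that sequence; every function, data structure, and value is a hypothetical placeholder, not Apera’s actual Vue API.

```python
# Structural sketch of the vision pipeline described above.
# All names and return values are illustrative stand-ins.
from dataclasses import dataclass

@dataclass
class Pose:
    xyz: tuple   # estimated object position (m)
    rpy: tuple   # estimated orientation (rad)

def fuse_stereo(img_left, img_right):
    """Combine two 2D camera images into a 3D scene model (placeholder)."""
    return {"objects": [{"id": 0, "score": 0.94}, {"id": 1, "score": 0.71}]}

def rank_pickable(scene):
    """Order detected objects by 'pickability', most pickable first."""
    return sorted(scene["objects"], key=lambda o: o["score"], reverse=True)

def plan_collision_free_path(obj):
    """Return a grasp pose and waypoints that avoid the bin and other parts."""
    return Pose((0.4, 0.1, 0.25), (3.14, 0.0, 0.0)), ["approach", "grasp", "retreat"]

# One vision cycle: capture, fuse, rank, plan, hand off to the controller.
scene = fuse_stereo(img_left=None, img_right=None)  # stand-ins for camera frames
best = rank_pickable(scene)[0]
pose, path = plan_collision_free_path(best)
print(f"Pick object {best['id']} at {pose.xyz} via {path}")
```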

“Identifying and prioritizing pickable objects is a human trait and one that we train our AI neural networks to perform, shortening the time needed to identify objects and give the robot movement instructions,” explained Petz.

The Rapid Machine Operator runs millions of simulations to ensure speedy deployment and effective performance. (Image: Rapid Robotics)

Part of the initial CAD/3D scan and AI training step of the process involves running through approximately 1 million permutations of how the products to be picked could appear to a robot in real-world conditions. This is achieved via a digital twin environment and includes training for variations in ambient light from full sunlight to near darkness.
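This kind of synthetic training is commonly called domain randomization: render the part under many randomized poses and lighting levels inside the digital twin so the network has seen every variation before it meets the real bin. A minimal sketch of the idea, with render_view() standing in for a hypothetical digital-twin renderer:

```python
# Sketch of the domain-randomization idea described above.
# render_view() is a hypothetical stand-in for a digital-twin renderer.
import random

def render_view(pose, light_lux):
    """Placeholder for rendering the CAD model under given conditions."""
    return {"pose": pose, "light_lux": light_lux}

N = 10_000   # demo size; the article cites roughly 1 million permutations
samples = []
for _ in range(N):
    pose = [random.uniform(-3.1416, 3.1416) for _ in range(3)]  # random orientation
    light = random.uniform(1, 100_000)  # lux: near darkness to full sunlight
    samples.append(render_view(pose, light))
```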

“If a human can see it, we can help a robot see it. Most conventional systems require structured light systems, lasers or sensors to identify objects and guide the robots,” added Petz.

Enhanced Flexibility

With an increasing number of manufacturers looking for flexible automation that lets them adjust quickly to product customizations and changing mixes, the synergy between AI and cobots gives design engineers a unique opportunity to deliver systems that support high-mix/low-volume (HMLV) production.

Cobots are highly mobile, flexible, and easy to program, which allows them to be moved between applications as diverse as palletizing, inspection, sanding, and machine tending. Combined with the learning capabilities of AI systems, the result is highly flexible automation paired with intelligence that can apply itself to a wide variety of tasks.

“Our AI can be trained to understand whether a part is placed or assembled correctly. Just like a human can analyze whether a manufacturing step is done properly,” said Apera AI’s Petz.

Apera AI offers a total vision cycle time of just 0.3 seconds (3 Hz). This means that robotic work cells using the company’s vision software can achieve productivity levels that were never before possible; the vision system has to wait for the robot, not the other way around. The vision solution uses AI to conduct millions of simulated cycles before going into production, so the vision system deeply understands the object in every orientation and in combination with the chosen robot, end-of-arm tool, and operating environment. (Image: Apera AI)

In one deployment at a Fortune 500 manufacturing company, a cobot from Universal Robots paired with Apera AI vision intelligence is handling high-precision dispensing of liquid gasket material onto the edges of metal valve assemblies of various sizes and shapes.

The system has the flexibility to recognize the part and automatically dispense the material in a specific pattern. These capabilities eliminate the need to set up a specific fixture for each part to ensure it’s in the correct place for the dispensing process.
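The fixtureless approach can be pictured as a simple lookup from the recognized part type to its stored dispense pattern, offset by the pose the vision system reports. A minimal sketch; the part names, waypoints, and pose format are all made up for illustration:

```python
# Illustrative dispatch from a recognized part to its dispense pattern,
# replacing per-part fixtures. Part names and waypoints are invented.
DISPENSE_PATTERNS = {
    "valve_small": [(0.00, 0.00), (0.08, 0.00), (0.08, 0.05), (0.00, 0.05)],
    "valve_large": [(0.00, 0.00), (0.15, 0.00), (0.15, 0.10), (0.00, 0.10)],
}

def dispense_path(part_id, pose):
    """Offset the stored pattern by the part's detected pose; no fixture needed."""
    dx, dy = pose
    return [(x + dx, y + dy) for x, y in DISPENSE_PATTERNS[part_id]]

# Vision reports the part type and where it actually sits in the work area:
print(dispense_path("valve_small", pose=(0.42, 0.17)))
```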

Another Apera AI customer, Pennsylvania-based Precision Cobotics, has developed standardized machine-tending solutions for CNC machining and laser marking that combine cobots from Universal Robots with Apera AI technology.

The resulting systems provide bin-picking of randomized items and very precise placement of unmachined pieces into the machine. The cobot can then move finished parts to another manufacturing step or place them in a specified area, such as on a conveyor or pallet.
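Reduced to its essentials, that tending cycle is a simple loop. In the sketch below, every function is a hypothetical placeholder for a vision call or robot motion:

```python
# The tending cycle described above, reduced to a loop.
def pick_from_bin():
    """AI vision selects the most pickable blank from the bin (placeholder)."""
    return "unmachined_blank"

def load_machine(part):
    print(f"placing {part} precisely in the machine")

def run_machine_cycle():
    print("machining...")

def unload_to(destination):
    print(f"moving finished part to {destination}")

for _ in range(3):            # one iteration per part
    blank = pick_from_bin()
    load_machine(blank)
    run_machine_cycle()
    unload_to("conveyor")     # or a pallet, per the text
```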

“The status quo is to stage the unmachined parts in structured grids, which requires an operator, or additional automation. Taking items from bins removes the need for these fixtures and enables flexible, high-mix manufacturing, and more efficient labor utilization,” explained Petz.

Simplicity

AI can make it much easier for robot engineers to produce advanced, cobot-based applications. And, of course, end users derive the benefits of automation that is both intelligent and easy to use. That doesn’t mean AI has to clutter and complicate the end-user experience, however.

When speed is of the essence, and especially when the company is a small or medium-sized enterprise (SME) without much in-house robotics experience, it can be useful to keep all the AI capabilities in the background to ensure a smooth end-user experience and a faster deployment. Companies facing labor shortages want solutions they can put to work on a specific application quickly and easily. If AI can help that process, all the better.

That’s the premise behind Rapid Robotics’ ‘Rapid Machine Operator,’ a flexible collaborative automation system developed for speedy deployment. Prior to deployment, Rapid Robotics runs the product through millions of permutations in a digital twin environment using third-party AI software, effectively teaching the cobot the best “pick points” to select and the finer points of path planning. The end user doesn’t need to see or handle any of that complexity, though.
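That “black box” philosophy maps naturally onto a thin software facade: one simple entry point for the operator, with the simulation-derived pick points and path planning hidden behind it. A minimal sketch; the class and method names are illustrative, not Rapid Robotics’ actual interface:

```python
# Sketch of the "black box" idea: the end user sees one simple interface,
# while simulation-derived pick points and path planning stay hidden.
# Class and method names are invented for illustration.
class MachineOperator:
    def __init__(self, task):
        self._task = task
        self._model = self._load_pretrained(task)  # trained offline in a digital twin

    def _load_pretrained(self, task):
        """Load pick points and paths learned in simulation (opaque to the user)."""
        return {"pick_points": "...", "paths": "..."}

    def run(self):
        """The only call the end user ever makes."""
        print(f"Running '{self._task}' with pre-trained pick points and paths")

MachineOperator("press_tending").run()
```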

As John Novak, Director of Computer Vision at Rapid Robotics, puts it: “The customer doesn’t care what’s going on in the black box. All they want is some automation, because they don’t have enough staff and they need to run their machines.”

Novak makes an important point. Not every cobot-based application requires deep learning or machine learning functionality, and when one does, there is much to be said for keeping that complexity away from the end user.

This article was written by Joe Campbell, Senior Manager, Strategic Marketing & Applications Development at Universal Robots (Ann Arbor, MI).