Smart Machines & Factories
Game changing technology
Published: 13 December, 2017

Smart Machines & Factories looks at how Augmented Reality (AR) and cobots are driving the next wave of automation.

Augmented reality (AR) technology, usually incorporated into special headsets, eyewear, or projections, superimposes data or graphics over real-world images and uses sensors and cameras to capture the operator’s motions for feedback and control. Until now, AR’s primary application has been in gaming. But as the technology has become commoditised, it’s now finding a surprising new role in robotics research, and it may soon have a huge impact on manufacturing and logistics automation, and eventually even home and service robots.

Robot programming was traditionally done by writing code, which was time-consuming and expensive. As a result, robots were typically programmed for a single specific task that they performed over and over. Cobots made programming much easier by letting even untrained operators simply move the robot arm as desired and use a teach pendant to set waypoints and actions. This makes programming more intuitive and flexible, so the robot can be quickly reprogrammed for new tasks. But while the robot’s movements are precise and consistent, they’re generally not as smooth or as fast as human movements, and the robot still only knows how to perform the exact tasks it’s been programmed for.
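The teach-pendant workflow described above can be sketched in miniature. In this hypothetical Python sketch (the class and method names are illustrative, not any vendor’s API), `record` stands in for the operator jogging the arm to a pose and saving it, and `replay` stands in for the controller stepping through the stored waypoints in order:

```python
class WaypointProgram:
    """Minimal sketch of teach-pendant style programming:
    the operator moves the arm, records each pose, and the
    controller later replays the stored waypoints in order."""

    def __init__(self):
        self.waypoints = []  # list of joint-angle tuples (radians)

    def record(self, joint_angles):
        """Store the arm's current pose as the next waypoint."""
        self.waypoints.append(tuple(joint_angles))

    def replay(self, move):
        """Replay the program by passing each stored pose to a
        move function (on a real robot, a motion command)."""
        for pose in self.waypoints:
            move(pose)
```

Reprogramming the robot for a new task then amounts to clearing the list and recording a new sequence of poses, which is what makes this style so much quicker than rewriting code.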

AR changes the game

AR allows a human operator to get inside the robot’s head, so to speak. The operator uses AR to control the robot with natural, smooth movements, giving the robot precise instructions simply by performing the tasks he or she wants the robot to emulate. This new approach is ideal for cobots, which allow human operators to work directly with the robot arm without the interference of safety cages or fencing.

Demonstrating prototyping and collaboration

ITAMCO, an Indiana-based manufacturer of precision-machined components, recently demonstrated an augmented reality application using a UR robot and a Microsoft HoloLens headset, a device that builds on depth-sensing technology from Microsoft’s Kinect and is best known from gaming. The HoloLens is a wearable computer that projects information on top of the reality of the robot’s actions, and allows the operator to control the robot using hand movements. Because the HoloLens has a camera, it can record both real-life and virtual images to share with other individuals—for example, an engineer or operator could demonstrate a robot setup to someone in another plant or department.

Joel Neidig, business development & technology manager at ITAMCO, commented: “I think it’s going to bring a lot of collaboration between operators and engineers, even going out to the point-of-use on the manufacturing floor, where the UR robot is being used every day. You can capture workflows and the motion of the robot, and people can record their setups and display some of the virtual models inside the machine before they actually manufacture it.”

Using a virtual environment prior to manufacturing could be especially valuable for experimenting with setups for expensive parts, or for planning parts that haven’t been manufactured yet. The AR system can show the user how the part will be loaded in the machine without having the actual parts on hand. “It’s really important to have more prototype tools like this throughout the industry, and being able to rapidly prototype and test your design,” Neidig explained. This type of system will allow engineers, manufacturers, and operators to collaborate and make changes so that when parts go to production, the processes are as efficient as possible.

Neidig looked specifically for a collaborative robot so that people using the AR system could safely stand next to the robot while it was in action, without being separated by a safety cage. The robot needed to be lightweight enough to be easily moved, and ease of integration was also key. Neidig explained: “We chose the UR robot for this application because it’s an open platform. We can communicate with Python scripts and secure sockets, and it’s got a nice Ethernet port that’s already set up. UR brings it all together, and just being intuitive, it’s very easy to maneuver around, and we like the platform as a whole.”
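Neidig’s mention of Python scripts, sockets, and the Ethernet port refers to the script interface that UR controllers expose over the network. As a rough sketch of that idea (the robot IP address is hypothetical and site-specific; port 30002 is the controller’s secondary client interface, which accepts URScript commands over TCP), a joint move can be composed and sent like this:

```python
import socket


def movej_script(joints, a=1.4, v=1.05):
    """Build a URScript movej command for six joint angles (radians).

    The acceleration/velocity defaults mirror URScript's own defaults;
    the joint values passed in are purely illustrative.
    """
    joint_str = ", ".join(f"{j:.4f}" for j in joints)
    return f"movej([{joint_str}], a={a}, v={v})\n"


def send_script(script, robot_ip, port=30002):
    """Send a URScript line to the robot controller over Ethernet.

    Requires a reachable controller on the network; nothing here is
    specific to ITAMCO's setup.
    """
    with socket.create_connection((robot_ip, port), timeout=5) as s:
        s.sendall(script.encode("ascii"))


# Example (needs a real robot on the network; the address is hypothetical):
# send_script(movej_script([0.0, -1.57, 1.57, 0.0, 1.57, 0.0]), "192.168.0.10")
```

The open, text-based nature of this interface is what makes the kind of external integration Neidig describes—an AR headset driving the arm—comparatively straightforward.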

New level of collaboration

The combination of AR technology and cobots can also bring a whole new level of collaboration to the table. Kubica Corporation, a Michigan-based engineering firm, recently demonstrated an AR automobile door panel assembly programme using a UR10 robot and a Light Guide Systems projection system. Carol Choma, operations & new business development manager at Kubica, explained the application: “This is a great example of a true collaborative cell work environment where the UR10 robot from Universal Robots is working with the operator at an assembly cell for the automotive industry.”

In this application, the Light Guide system projects assembly instructions directly onto the work environment. The operator swipes a hand over the virtual “start button” projection and watches for further directions as the robot begins its process. The projection highlights the robot’s actions and prompts the operator’s own tasks, guiding correct assembly by lighting up and colour-coding the path so the operator can install a wire harness accurately while the robot works on another part of the assembly.

The operator can respond to quality control messages, such as a missing pin, and use the projection system’s virtual controls to instruct the robot on additional processes. Once all processes are complete, the projection system takes a picture of the assembly for traceability purposes.

AR research is still in its early stages, but promises to expand the use of robots into more complex applications, improve quality and consistency, and increase opportunities for collaboration with human workers.

For further information please visit: