Smart Machines & Factories
Robots, AI and the collaborative workplace
Published: 25 May, 2018

How can artificial intelligence be applied to accelerate the transition to more connected factories, where human and machine work collaboratively? What technological advances are being made – and how can we encourage robots (and humans) to learn? Steve Sands, head of product management at Festo, explains what the future workplace might look like.

The future of production is flexibility – not only in terms of the products manufactured, but also in the adaptability of the workplace and the design of the working environment. Ultimately, the aim is to be capable of producing down to batch size 1 in the factory of the future.

At the same time, it is becoming increasingly important to ensure that employees can rapidly and intuitively adapt to new tasks. This calls for new forms of collaboration between people, machinery and software. A key role will be played by self-learning systems with artificial intelligence and by robot-based automation solutions that can work hand in hand with the human operator and can form networks with each other.

One example of this concept is the BionicCobot, which Festo developed in 2017 as part of the Bionic Learning Network. Based on the human arm in terms of its construction, this pneumatic, lightweight robot can solve many tasks intuitively and can interact directly and safely with people. As the strict separation between worker and robot for safety reasons becomes obsolete, much closer collaboration between human and machine becomes possible.

Added to this, artificial intelligence and machine learning have the potential to transform workplaces into learning systems that constantly develop and optimally adapt themselves to the requirements at hand.

Interactive learning

Let us consider an example of an interactive workplace, in which a human works together with a robotic arm, along with numerous assistance systems and peripheral devices that are connected and communicate with each other.

Crucial to the success of this collaborative workplace is the gathering, sharing and interpretation of data. At the centre of the worker’s field of vision is a large projection screen. It supplies the worker with all the relevant information and reacts dynamically with its contents as required for the task in hand. Positioned around the screen, various camera systems constantly record the positions of the worker, components and tools.

The system is able to recognise the worker and their movements from their special clothing. This consists of a shirt which is equipped with inertial sensors and a work glove with integrated infrared markers. The interaction of cameras and wearables allows the entire workplace to be handled safely and intuitively. With the help of the recorded sensor data, the bionic robot arm is able to hand objects to its human colleague with pinpoint accuracy and move out of their way if necessary – an essential requirement for direct collaboration between human and robot.

A special 3D camera with depth perception registers the worker’s direction of view and head movements. The system uses eye tracking to constantly check whether the worker’s attention is on the workplace or not. If the person directs their eyes to a particular area of the projection screen, the content adjusts accordingly. If the robot is supposed to hand the worker a part or tool, another camera supports it by working out the co-ordinates for the ideal gripping points on the object.
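
The gaze-driven screen described above can be reduced to a simple mapping from a tracked gaze point to a named screen region. The sketch below illustrates that idea; the region names and layout are assumptions for illustration, not Festo's actual implementation.

```python
from typing import Optional

# Hypothetical screen layout in normalised coordinates (x0, y0, x1, y1).
# Region names are illustrative assumptions.
SCREEN_REGIONS = {
    "instructions": (0.0, 0.0, 0.5, 1.0),    # left half of the screen
    "part_overview": (0.5, 0.0, 1.0, 0.5),   # top-right quadrant
    "robot_status": (0.5, 0.5, 1.0, 1.0),    # bottom-right quadrant
}

def region_under_gaze(x: float, y: float) -> Optional[str]:
    """Return the screen region the gaze point falls in, if any."""
    for name, (x0, y0, x1, y1) in SCREEN_REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    # Off-screen gaze: the worker's attention may not be on the workplace.
    return None
```

In a real system this lookup would run continuously on the eye-tracker output, and the projection software would swap in the content block matched to the active region.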

Voice control is another key element in the collaborative workplace. With the appropriate software, it is possible to interpret semantic details as well as the relevant linguistic context, so the system can converse with a human in a natural way. So that the cobot can execute a command, the voice recognition system first turns the spoken sentence into text by comparing its frequency patterns against databases of words and their patterns. To extract the meaning of the sentence, the software then sends the text to a language interface.
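
The second stage of that pipeline, turning the transcribed text into a command the robot can act on, can be sketched as a minimal intent parser. The keyword matching below only stands in for a real natural-language interface; the action and object vocabularies are illustrative assumptions.

```python
def parse_intent(transcript: str) -> dict:
    """Turn a transcribed spoken command into a minimal intent structure.

    A toy stand-in for the language interface: real systems use trained
    NLU models rather than keyword lookup.
    """
    # Hypothetical command vocabulary for this workplace.
    actions = {"hand": "hand_over", "pick": "pick_up", "stop": "stop"}
    objects = {"wrench", "slice", "tool"}

    intent = {"action": None, "object": None}
    for word in transcript.lower().split():
        if word in actions:
            intent["action"] = actions[word]
        elif word in objects:
            intent["object"] = word
    return intent
```

The resulting dictionary plays the role of the raw command; the surrounding system would still have to validate it against the current state of the workplace.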

Once the interface has identified the meaning of the command, it supplies a context object: a structured piece of software data with which the robot’s control system can work. The clear handling instructions for the cobot are then generated by a special, self-learning automation software program with artificial intelligence. This intelligent software evaluates the contents of the context object and simultaneously processes all the data and inputs recorded by sensors on the various peripheral devices.
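
The "context object" idea can be made concrete with a small data structure that the control software checks against live sensor readings before acting. This is a hedged sketch only; the field names and the decision rule are assumptions, not Festo's actual software.

```python
from dataclasses import dataclass

@dataclass
class ContextObject:
    """Structured command supplied by the language interface (illustrative)."""
    action: str                            # e.g. "hand_over"
    target: str                            # e.g. "wrench"
    grip_point: tuple = (0.0, 0.0, 0.0)    # ideal gripping point from the camera

def plan_action(ctx: ContextObject, sensors: dict) -> str:
    """Combine the command context with peripheral sensor data into a decision."""
    if ctx.action == "hand_over" and sensors.get("worker_in_reach"):
        return f"move {ctx.target} to worker at {ctx.grip_point}"
    # Unsupported action, or the worker is out of reach: do nothing yet.
    return "wait"
```

The point of the separation is that the language layer never drives the actuators directly: every command is re-evaluated against the sensor picture at execution time.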

With every task it solves, the system learns something new, building up a so-called semantic map that grows continuously. Along the paths of this network, the stored algorithms constantly draw dynamic conclusions. As a result, a controlled, programmed and fixed sequence gradually turns into a much freer method of working.
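
A growing semantic map can be pictured as a weighted graph in which each solved task strengthens associations between concepts, and later queries reuse what has accumulated. The toy class below illustrates the principle under that assumption; it is not Festo's software.

```python
from collections import defaultdict

class SemanticMap:
    """Toy semantic map: concepts linked by how often they co-occur in solved tasks."""

    def __init__(self):
        # links[a][b] counts how often a and b appeared together.
        self.links = defaultdict(lambda: defaultdict(int))

    def record(self, concept: str, related: str) -> None:
        """Strengthen the association learned from one solved task."""
        self.links[concept][related] += 1
        self.links[related][concept] += 1

    def best_association(self, concept: str):
        """Return the most strongly linked concept, or None if unknown."""
        neighbours = self.links.get(concept)
        if not neighbours:
            return None
        return max(neighbours, key=neighbours.get)
```

Each `record` call corresponds to one completed action; the more often a pattern recurs, the more strongly the map prefers it, which is how a fixed sequence loosens into a freer way of working.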

Realising customised production

This may all sound fine in theory, but Festo demonstrated the BionicWorkplace in action at Hannover Messe 2018. To demonstrate how collaborative working and clever technology can enable the manufacture of customised products down to batch size 1, we set it the task of producing a unique model of a head made of acrylic glass.

For this purpose, a laser cutter was integrated into the worktop of the BionicWorkplace. The facial features of a person, previously scanned using a smartphone with a depth-sensing camera, were provided as input. A software program converts the stored facial features into a CAD model, which it then breaks down into separate slices. The laser cutter cuts the elements out of acrylic on the basis of this 3D template. The BionicCobot takes the slices directly from the cutter and hands them to the operator in the right sequence; the operator then assembles them into a unique model.
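
The slicing step in that workflow, dividing a 3D scan into horizontal layers for the cutter, can be sketched as grouping scanned points by height. The point-cloud input format and layer thickness here are illustrative assumptions, not the actual Festo toolchain.

```python
def slice_points(points, layer_height: float) -> dict:
    """Group (x, y, z) points into layers indexed upward from the lowest z.

    Each layer collects the (x, y) outlines a laser cutter would cut
    from one sheet of acrylic.
    """
    if not points:
        return {}
    z_min = min(p[2] for p in points)
    layers = {}
    for x, y, z in points:
        idx = int((z - z_min) / layer_height)
        layers.setdefault(idx, []).append((x, y))
    return layers
```

A production slicer works on a closed mesh and computes true cross-section contours, but the layering principle, one cut sheet per height band assembled in order, is the same.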

The constant automatic feed of material in this scenario is ensured by a Robotino (mobile robot), which autonomously travels back and forth between the stations and safely finds its way by means of a laser scanner. It is loaded by a refined version of the BionicMotionRobot, a soft robotic structure with pneumatic compartments and a 3D woven textile covering. This configuration thus combines the key branches of robot technology: collaborative, mobile and soft robotics.

Future adoption

Once learnt and optimised, the processes and skills of the BionicWorkplace can be very easily transferred to other systems of the same type in real time and made available worldwide. It will be possible in future, for example, to integrate workplaces into a global network in which knowledge modules can be shared, with the communication effected in the various national languages. Production will then become not only more flexible, but also more decentralised: the operators could call up production orders via Internet platforms, for instance, and carry them out autonomously in cooperation with the machinery – in keeping with individual customer desires and requirements.
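
Sharing a learnt, optimised skill between workplaces of the same type amounts to serialising its parameters and loading them elsewhere. The sketch below shows the idea with plain JSON; the skill format and field names are assumptions for illustration.

```python
import json

def export_skill(name: str, params: dict) -> str:
    """Package a learnt skill as a JSON string for worldwide distribution."""
    return json.dumps({"skill": name, "params": params})

def import_skill(payload: str) -> dict:
    """Reconstruct the skill on another workplace of the same type."""
    return json.loads(payload)
```

In practice such knowledge modules would also carry versioning and compatibility metadata so a receiving workplace can check that the skill applies to its own hardware.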

For further information please visit:

To download a video of the BionicWorkplace, please visit: