Cobot Safety Vision Systems

Cobot Safety Systems: The Importance of Vision Technology

The new generation of cobots incorporates vision systems. The vision system is integrated into the cobot's operating software and provides solutions for a wide variety of production tasks.

Vision capabilities include (but are not limited to):

  • Colour check
  • Size check
  • Orientation checking (to enable the robot to pick the component)
  • Checking whether the part bin is empty
  • Part production state (such as detecting unwanted flash on a moulded part)
  • Part presence (for example, in automated machine tooling)
  • Counting
  • QR code and barcode reading for traceability
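The simplest of these checks can be implemented with basic image thresholding. The sketch below shows a hypothetical bin-empty check using NumPy, assuming a grayscale image in which parts appear bright against a dark bin floor; the threshold values are illustrative and would be calibrated per installation.

```python
import numpy as np

def bin_is_empty(gray_image: np.ndarray, brightness_threshold: int = 128,
                 min_part_pixels: int = 50) -> bool:
    """Report the bin as empty when too few pixels exceed the brightness
    threshold. Assumes parts appear brighter than the bin floor; both
    thresholds are illustrative, not values from any real system."""
    part_pixels = int(np.count_nonzero(gray_image > brightness_threshold))
    return part_pixels < min_part_pixels

# Simulated 100x100 grayscale frames: one dark (empty), one with a bright blob.
empty_frame = np.full((100, 100), 20, dtype=np.uint8)
part_frame = empty_frame.copy()
part_frame[40:60, 40:60] = 200  # a 20x20-pixel "part"

print(bin_is_empty(empty_frame))  # True
print(bin_is_empty(part_frame))   # False
```

A production system would of course use a calibrated camera pipeline rather than raw arrays, but the decision logic reduces to a comparison like this one.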

3D machine vision is making it safer for humans and collaborative robots (cobots) to work together. Machine vision lets a robot sense the world, process the information, and mimic the way two humans can intuitively adjust to working around each other.

The first generation of collaborative robots senses human contact and stops to avoid harming human collaborators. This method works in many applications, but it is a crude solution: the robot reacts only after contact occurs. A sophisticated 3D machine vision system lets a robot stop moving before hitting a human worker. To be effective, such a system must detect, map, classify, and predict trajectories quickly.
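The "predict trajectory" step can be as simple as extrapolating a tracked worker's position forward in time and checking the resulting separation. The following is a minimal sketch under a constant-velocity assumption; the 0.5 s horizon and 1.0 m separation radius are illustrative values, not figures from a standard.

```python
def predict_position(pos, vel, horizon_s):
    """Constant-velocity extrapolation of a tracked point (x, y, z in metres)."""
    return tuple(p + v * horizon_s for p, v in zip(pos, vel))

def must_stop(worker_pos, worker_vel, robot_pos,
              horizon_s=0.5, min_sep_m=1.0):
    """Stop when the worker's predicted position falls inside the minimum
    separation radius around the robot. All numbers are illustrative."""
    future = predict_position(worker_pos, worker_vel, horizon_s)
    sep = sum((a - b) ** 2 for a, b in zip(future, robot_pos)) ** 0.5
    return sep < min_sep_m

# Worker 2 m away, walking toward a robot at the origin.
print(must_stop((2.0, 0.0, 0.0), (-1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # False: 1.5 m after 0.5 s
print(must_stop((2.0, 0.0, 0.0), (-3.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # True: 0.5 m after 0.5 s
```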

2D LIDAR sensors let robots "see" their environments, but they don't provide the richness of data that 3D machine vision offers. A 2D LIDAR setup may stop the robot for movement several meters away, in case a person unexpectedly stretches out an arm that could contact the robot, but a 3D system could keep operating until the person actually reaches toward the robot.
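The difference in the two stop decisions can be sketched as follows. A planar scan knows only horizontal range, so it must stop inside a conservative radius that allows for a possible outstretched arm; a 3D system that tracks body points can stop only when a limb actually comes within reach. The radii below are hypothetical examples, not safety-rated values.

```python
def lidar_2d_stop(ranges_m, conservative_radius_m=2.0):
    """A 2D scan knows only planar distance, so it must stop for anything
    inside a conservative radius that allows for an outstretched arm."""
    return min(ranges_m) < conservative_radius_m

def vision_3d_stop(body_points_m, robot_pos, actual_reach_m=0.5):
    """A 3D system tracks limbs and stops only when a body point
    actually comes within reach of the robot."""
    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(p, robot_pos)) ** 0.5
    return min(dist(p) for p in body_points_m) < actual_reach_m

# A person stands 1.5 m away; no limb is anywhere near the robot.
scan = [4.0, 3.2, 1.5, 2.8]                   # planar ranges in metres
body = [(1.5, 0.0, 1.0), (1.6, 0.2, 1.7)]     # torso and head points
print(lidar_2d_stop(scan))                    # True: stops pre-emptively
print(vision_3d_stop(body, (0.0, 0.0, 1.0)))  # False: keeps working
```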

Manufacturers demand that collaborative robots be faster and more powerful every year. But engineers must keep in mind that these cobots need to be safe for the employees working around them. Some may think that simply adding fencing makes a cobot safer, but there is an alternative that often works better.

AI

People are fast and clever, and they often find workarounds for physical barriers they wish to cross. If, for safety, a work cell is fenced in and designed so the robot must be shut down before a person can enter, a human worker may try to bypass the safety mechanism rather than interrupt the cobot's work.

Many fenced-in cobot systems still impose force and speed limitations as additional safety precautions. Artificial intelligence could remove the need for such limitations: speed-and-separation monitoring combined with advanced vision technology could instead allow a collaborative robot's capabilities to increase.
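The core idea of speed-and-separation monitoring is that the robot's permitted speed scales with the distance to the nearest person. A minimal sketch, with a linear ramp between a stop distance and a full-speed distance; all distances and speeds here are illustrative, not values taken from a safety standard.

```python
def allowed_speed_mps(separation_m, stop_dist_m=0.5,
                      full_speed_dist_m=2.0, max_speed_mps=1.5):
    """Speed-and-separation monitoring sketch: full speed when the nearest
    person is far away, a linear ramp down as they approach, and a full
    stop inside the protective distance. Illustrative values only."""
    if separation_m <= stop_dist_m:
        return 0.0
    if separation_m >= full_speed_dist_m:
        return max_speed_mps
    fraction = (separation_m - stop_dist_m) / (full_speed_dist_m - stop_dist_m)
    return max_speed_mps * fraction

print(allowed_speed_mps(3.0))   # 1.5  (full speed)
print(allowed_speed_mps(1.25))  # 0.75 (half speed)
print(allowed_speed_mps(0.4))   # 0.0  (stopped)
```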

AI systems could make any robot collaborative. Humans could work even closer to cobots than they do now without a threat to their safety. And since cobots could work faster and with more force, production cycles could improve as heavier loads are moved and manipulated more quickly. Most cobot maximum payloads are now limited to about 10 kg because of safety concerns, but that could increase with AI.

Vision Systems for Collaborative Robots

To give AI the information it needs about the work environment, the collaborative robot needs a way to "see." Machine vision and motion-sensing technology must be integrated into the automated system, with multiple cameras whose overlapping views monitor the work cell.

One solution is to perform this scanning with cameras and computer-vision software. Infrared light is flashed 30 times per second to map every object near the cobot. The system combines data from the cameras to look for occlusions (obstructions). When an occlusion is detected, it is assumed a human has entered the work cell, and custom procedures can then be followed so the human is not harmed.
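One common way to detect such occlusions is to compare a live depth map against a reference map captured when the cell was empty: pixels that are suddenly much closer to the camera indicate that something has entered the scene. A hypothetical NumPy sketch, with illustrative thresholds that would be tuned per installation:

```python
import numpy as np

def occlusion_detected(reference_depth_m: np.ndarray,
                       live_depth_m: np.ndarray,
                       depth_change_m: float = 0.1,
                       min_pixels: int = 100) -> bool:
    """Flag an occlusion when enough pixels are significantly closer to the
    camera than in the reference map of the empty cell. Thresholds are
    illustrative, not values from any real product."""
    closer = (reference_depth_m - live_depth_m) > depth_change_m
    return int(np.count_nonzero(closer)) >= min_pixels

reference = np.full((120, 160), 3.0)   # empty cell: floor 3 m from the camera
live = reference.copy()
live[30:60, 40:80] = 1.2               # an object 1.2 m away has entered

print(occlusion_detected(reference, reference))  # False
print(occlusion_detected(reference, live))       # True
```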

Artificial Intelligence – Not Machine Learning

Using AI with these vision systems instead of machine learning is an important distinction. Machine learning is probabilistic: it is based on probability and subject to chance variation. Deterministic AI classification makes occlusion analysis more efficient and is intended to ensure human safety at all times.

Current safety standards make clear that statistical approaches to triggering safety measures should not be allowed. A statistical approach would have a robot assess, "there is a 78% chance this human will be injured if no action is taken." At what point should the system act? 50%? 75%?
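The contrast the article draws can be made concrete: a deterministic rule fires on any detected occlusion, while the rejected statistical approach acts only when an estimated probability crosses an arbitrary threshold. A minimal sketch of the two trigger styles (the 78% figure is the article's own example):

```python
def deterministic_trigger(occlusion_detected: bool) -> bool:
    """Deterministic rule: any detected occlusion triggers the safety stop.
    There is no probability threshold to tune, so no 50%-vs-75% question."""
    return occlusion_detected

def probabilistic_trigger(injury_probability: float, threshold: float) -> bool:
    """The statistical approach the article rejects: act only when an
    estimated probability crosses a chosen threshold."""
    return injury_probability >= threshold

print(deterministic_trigger(True))       # True: always acts on an occlusion
print(probabilistic_trigger(0.78, 0.8))  # False: 78% is below an 80% threshold
```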

Humans should always be able to count on 100% safe working conditions. AI is the route to this assurance.
