Human-robot collaboration can be a powerful tool for increasing productivity in production systems by combining
the strengths of humans and robots. Assembly operations, in particular, have shown great potential for utilizing the
unique abilities of both parties. However, for robots to efficiently perform assembly tasks, components and parts
must be presented in a known location and orientation, which is achieved through a process called parts feeding.
Traditional automation methods for parts feeding, such as vibratory bowl feeders, are limited in their ability to
accommodate variations in part design, shape, location, and orientation, making them less flexible for use in
human-robot collaboration. Recent advancements in machine vision technology have opened up new possibilities for
flexible feeding systems in human-robot assembly cells. This paper explores the application of the vision system
integrated in the ABB YuMi collaborative robot and its object detection capability. The characteristics of the vision
system were determined experimentally by varying the light intensity on a test rig, and the system was further evaluated
to establish whether the angle of incidence of the light affects its detection stability. The results of the study demonstrate
the effectiveness of the vision system in a collaborative robot and provide insights into its industrial application.