Research Article

Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator

Year 2020, Volume: 24 Issue: 4, 703 - 711, 01.08.2020
https://doi.org/10.16984/saufenbilder.655716

Abstract

This work investigates the integration of a 3D sensor with a robotic manipulator, with the objective of automatically detecting, tracking and grasping an object and placing it at another location. To enhance flexibility and ease of use, the robot is controlled from MATLAB, a versatile and powerful programming environment. The application implements pick and place, a common industrial task in many factories. The robotic system consists of an ABB IRB120 robot equipped with a gripper and a Kinect for Windows 3D camera. Three-dimensional data acquisition, image processing and the relevant camera parameters are investigated. The image acquired from the camera is used to determine the robot’s workspace, to recognize workpieces and to calculate their positions. From this information, an automatic grasping procedure was designed and developed that computes a feasible trajectory to an object in real time. Workpieces are detected with object recognition techniques based on algorithms available in MATLAB’s Computer Vision Toolbox and Image Acquisition Toolbox, which provide the position and orientation of the object of interest. This information is then sent to the robot over a server-to-client connection on a computer network to generate the path in real time.
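
The abstract describes a processing chain of segmenting the workpiece in the camera image, looking up its depth, transforming the point into the robot's base frame and sending the target over the network. The sketch below illustrates that chain in MATLAB; it is not the authors' code. The image files, camera intrinsics, hand-eye transform T_cam2robot and the controller address and port are placeholder assumptions, and imbinarize/regionprops/tcpclient stand in for whichever toolbox algorithms were actually used.

```matlab
% Minimal sketch of the detect-locate-send pipeline (not the authors' code).
% Assumes a color frame and an aligned depth map (in millimetres) are already
% available, e.g. grabbed from the Kinect via the Image Acquisition Toolbox
% or the Kin2 wrapper cited as [11].

% --- 1. Segment the workpiece in the color image --------------------------
rgbFrame   = imread('workspace_color.png');   % placeholder for a live frame
depthFrame = imread('workspace_depth.png');   % placeholder aligned depth map

bw = imbinarize(rgb2gray(rgbFrame), 'adaptive');   % Image Processing Toolbox
bw = bwareaopen(bw, 200);                          % drop small noise blobs

stats = regionprops(bw, 'Centroid', 'Orientation', 'Area');
if isempty(stats), error('No workpiece detected.'); end
[~, idx]    = max([stats.Area]);              % keep the largest blob
centroid    = stats(idx).Centroid;            % [u v] pixel coordinates
orientation = stats(idx).Orientation;         % degrees, used for gripper yaw

% --- 2. Lift the centroid to a 3D point in the camera frame ---------------
u = round(centroid(1));  v = round(centroid(2));
Z = double(depthFrame(v, u)) / 1000;          % metres, assuming depth in mm
fx = 525; fy = 525; cx = 320; cy = 240;       % assumed intrinsics; replace with
                                              % values from camera calibration
P_cam = [(u - cx) * Z / fx; (v - cy) * Z / fy; Z; 1];

% --- 3. Map into the robot base frame and send the target over TCP --------
T_cam2robot = eye(4);                         % placeholder hand-eye calibration
P_robot = T_cam2robot * P_cam;

t   = tcpclient('192.168.125.1', 1025);       % assumed controller IP and port
msg = sprintf('%.1f %.1f %.1f %.1f\n', ...
              1000 * P_robot(1:3), orientation);   % mm and degrees
write(t, uint8(msg));                         % robot-side task parses the target
clear t                                       % closes the connection
```

On the robot side, a server task (e.g. written in RAPID using the controller's socket instructions) would parse these four values into a target pose and move the gripper accordingly; that side is not shown here.
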

References

  • [1] J. Hill and W. Park, "Real time control of a robot with a mobile camera," 9th International Symposium on Industrial Robots, pp. 233–246, 1979.
  • [2] K. Rezaie, S. Nazari Shirkouhi, and S. M. Alem, "Evaluating and selecting flexible manufacturing systems by integrating data envelopment analysis and analytical hierarchy process model," Asia International Conference on Modelling and Simulation, pp. 460–464, 2009.
  • [3] K. Hashimoto, Visual Servoing: Real-Time Control of Robot Manipulators Based on Visual Sensory Feedback, 1993.
  • [4] F. Chaumette and S. Hutchinson, "Visual servo control. I. Basic approaches," IEEE Robotics & Automation Magazine, vol. 13, no. 4, pp. 82–90, 2006.
  • [5] F. Chaumette and S. Hutchinson, "Visual servo control. II. Advanced approaches [Tutorial]," IEEE Robotics & Automation Magazine, vol. 14, no. 1, pp. 109–118, 2007.
  • [6] D. Kragic and H. I. Christensen, "Survey on visual servoing for manipulation," Computational Vision and Active Perception Laboratory, KTH, Tech. Rep., 2002.
  • [7] H. Wu, W. Tizzano, T. Andersen, N. Andersen, and O. Ravn, "Hand-Eye Calibration and Inverse Kinematics of Robot Arm using Neural Network," Springer, pp. 581–591, 2013.
  • [8] H. Wu, L. Lu, C.-C. Chen, S. Hirche, and K. Kühnlenz, "Cloud-based networked visual servo control," IEEE Transactions on Industrial Electronics, vol. 60, no. 2, pp. 554–566, 2013.
  • [9] C. Meyer and R. D. Schraft, "The need for an intuitive teaching method for small and medium enterprises," ISR Robotik, Germany, 2012.
  • [10] B. Akan, "Human Robot Interaction Solutions for Intuitive Industrial Robot Programming," Mälardalen University, Västerås, 2012.
  • [11] J. R. Terven and D. M. Cordova, Kin2 User Guide, 2016.
  • [12] MathWorks, "Image Processing Toolbox User’s Guide," 2014.
  • [13] N. B. Fernández, "Generación de trayectorias y evitación de obstáculos para el robot IRB120 en entorno Matlab" [Trajectory generation and obstacle avoidance for the IRB120 robot in a Matlab environment], Universidad de Alcalá, p. 47, 2015.

Details

Primary Language English
Subjects Electrical Engineering
Journal Section Research Articles
Authors

Tichaona Jonathan Makomo 0000-0002-9860-6179

Kenan Erin 0000-0003-4714-1161

Barış Boru 0000-0002-0993-3187

Publication Date August 1, 2020
Submission Date December 5, 2019
Acceptance Date May 24, 2020
Published in Issue Year 2020 Volume: 24 Issue: 4

Cite

APA Makomo, T. J., Erin, K., & Boru, B. (2020). Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator. Sakarya University Journal of Science, 24(4), 703-711. https://doi.org/10.16984/saufenbilder.655716
AMA Makomo TJ, Erin K, Boru B. Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator. SAUJS. August 2020;24(4):703-711. doi:10.16984/saufenbilder.655716
Chicago Makomo, Tichaona Jonathan, Kenan Erin, and Barış Boru. “Real Time Application for Automatic Object and 3D Position Detection and Sorting With Robotic Manipulator”. Sakarya University Journal of Science 24, no. 4 (August 2020): 703-11. https://doi.org/10.16984/saufenbilder.655716.
EndNote Makomo TJ, Erin K, Boru B (August 1, 2020) Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator. Sakarya University Journal of Science 24 4 703–711.
IEEE T. J. Makomo, K. Erin, and B. Boru, “Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator”, SAUJS, vol. 24, no. 4, pp. 703–711, 2020, doi: 10.16984/saufenbilder.655716.
ISNAD Makomo, Tichaona Jonathan et al. “Real Time Application for Automatic Object and 3D Position Detection and Sorting With Robotic Manipulator”. Sakarya University Journal of Science 24/4 (August 2020), 703-711. https://doi.org/10.16984/saufenbilder.655716.
JAMA Makomo TJ, Erin K, Boru B. Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator. SAUJS. 2020;24:703–711.
MLA Makomo, Tichaona Jonathan et al. “Real Time Application for Automatic Object and 3D Position Detection and Sorting With Robotic Manipulator”. Sakarya University Journal of Science, vol. 24, no. 4, 2020, pp. 703-11, doi:10.16984/saufenbilder.655716.
Vancouver Makomo TJ, Erin K, Boru B. Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator. SAUJS. 2020;24(4):703-11.