Research Article

A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space

Year 2022, Volume: 10 Issue: 5, 77 - 90, 26.12.2022
https://doi.org/10.29130/dubited.1109850

Abstract

Autonomous vehicles are increasingly used in daily life and in industrial applications, and mobile robot technologies lead the autonomous architectures in these areas. The path-planning methods of mobile robots differ according to the purpose they serve. Planning a trajectory from a given starting point to a target point draws on many techniques, from image processing to artificial intelligence. In this study, an application with a unique design was developed for the tracking of circular objects of different diameters and colors by a mobile robot. The moving object is detected in the CIE L*a*b* color space with an RGB-D camera, utilizing the ROS server-client architecture. The mobile robot tracks the detected object at a certain distance and at a constant speed. The image filtering parameters, together with the publisher-subscriber parameters, are processed for the mobile robot in the Matlab environment. Thus, two predetermined circular objects of different colors, detected through image processing, are continuously followed by the mobile robot at a certain speed. Experiments were carried out using different diameter, size-tolerance, and color parameters in the image, based on the CIE L*a*b* color space.
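The abstract's implementation runs in Matlab over ROS and is not reproduced here. Purely as an illustration of the color-space step it describes, the sketch below (hypothetical helper names, NumPy only) converts sRGB pixels to CIE L*a*b* under a D65 white point and thresholds on the a*, b* chromaticity channels — the separation of lightness L* from chromaticity is what makes L*a*b* attractive for lighting-tolerant color detection.

```python
import numpy as np

# Linear sRGB -> XYZ matrix (D65 white point)
M = np.array([[0.4124564, 0.3575761, 0.1804375],
              [0.2126729, 0.7151522, 0.0721750],
              [0.0193339, 0.1191920, 0.9503041]])

def srgb_to_lab(rgb):
    """Convert sRGB values in [0, 1], shape (..., 3), to CIE L*a*b*."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB gamma curve to get linear RGB
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    xyz = lin @ M.T
    # Normalize by the D65 reference white, then apply the CIE f() nonlinearity
    t = xyz / np.array([0.95047, 1.0, 1.08883])
    d = 6 / 29
    f = np.where(t > d ** 3, np.cbrt(t), t / (3 * d ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def color_mask(rgb_image, target_lab, tol):
    """Mask pixels whose (a*, b*) chromaticity lies within tol of the target,
    ignoring lightness L* so the match tolerates illumination changes."""
    lab = srgb_to_lab(rgb_image)
    dist = np.linalg.norm(lab[..., 1:] - np.asarray(target_lab, dtype=float)[1:], axis=-1)
    return dist < tol
```

In a pipeline like the one the abstract outlines, such a mask would feed a circle detector, and the detected object's image position and depth would drive the robot's velocity commands over a ROS publisher.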

Thanks

We would like to thank the Robot Technologies and Smart Systems Application and Research Centre (ROTASAM) for providing every opportunity to carry out this study. This study was also supported by Sakarya University of Applied Sciences Scientific Researches Coordinatorship with project number 010-2020.

References

  • [1] R. Munoz-Salinas, E. Aguirre and M. Garcia-Silvente, “People detection and tracking using stereo vision and color,” Image and Vision Computing, pp. 995-1007, 2007.
  • [2] A. Treptow, G. Cielniak and T. Duckett, “Real-time people tracking for mobile robots using thermal vision,” Robotics and Autonomous Systems, pp. 729-739, 2006.
  • [3] M. Munaro, F. Basso and E. Menegatti, “Tracking people within groups with RGB-D data,” International Conference on Intelligent Robots and Systems, Vilamoura, Algarve, Portugal, 2012.
  • [4] C. Martin, E. Schaffernicht and H. M. Gross, “Multi-modal sensor fusion using a probabilistic aggregation scheme for people detection and tracking,” Robotics and Autonomous Systems, pp. 721-728, 2006.
  • [5] K. A. Joshi and D. G. Thakore, “A Survey on Moving Object Detection and Tracking in Video Surveillance System,” International Journal of Soft Computing and Engineering (IJSCE), pp. 44-48, 2012.
  • [6] R. C. Luo, Y. J. Chen, C. T. Liao and A. C. Tsai, “Mobile Robot Based Human Detection and Tracking Using Range and Intensity Data Fusion,” 2007 IEEE Workshop on Advanced Robotics and Its Social Impacts, Taiwan, 2007.
  • [7] B. Jung and G. S. Sukhatme, “Detecting Moving Objects using a Single Camera on a Mobile Robot in an Outdoor Environment,” 8th Conference on Intelligent Autonomous Systems, Amsterdam, The Netherlands, 2004.
  • [8] I. Markovic, F. Chaumette and I. Petrovic, “Moving object detection, tracking and following using an omnidirectional camera on a mobile robot,” IEEE International Conference on Robotics & Automation (ICRA), Hong Kong, China, 2014.
  • [9] M. Yokoyama and T. Poggio, “A Contour-Based Moving Object Detection and Tracking,” Proceedings of the 2nd Joint IEEE International Workshop on VS-PETS, Beijing, 2005.
  • [10] M. Wu and J.-Y. Sun, “Moving Object Detecting and Tracking with Mobile Robot Based on Extended Kalman Filter in Unknown Environment,” International Conference on Machine Vision and Human-Machine Interface, 2010.
  • [11] A. Badar, F. I. Khawaja, A. Yasar and M. Naveed, “Human detection and following by a mobile robot using 3D features,” International Conference on Mechatronics and Automation, Takamatsu, Japan, 2013.
  • [12] N. Bellotto and H. Hu, “Multisensor-Based Human Detection and Tracking for Mobile Service Robots,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, pp. 167-181, 2013.
  • [13] X. Zou and B. Ge, “The Image Recognition of Mobile Robot Based on CIE Lab Space,” I.J. Information Technology and Computer Science, pp. 29-35, 2014.
  • [14] S. Pal, A. Pramanik, J. Maiti and P. Mitra, “Deep learning in multi-object detection and tracking: state of the art,” Applied Intelligence, 2021.
  • [15] M. Elhoseny, “Multi-object Detection and Tracking (MODT) Machine Learning Model for Real-Time Video Surveillance Systems,” Circuits, Systems, and Signal Processing, 2019.
  • [16] A. Pramanik, S. Pal, J. Maiti and P. Mitra, “Granulated RCNN and Multi-Class Deep SORT for Multi-Object Detection and Tracking,” IEEE Transactions on Emerging Topics in Computational Intelligence, pp. 1-11, 2021.
  • [17] P. R. Narkhede and A. V. Gokhale, “Color Particle Filter Based Object Tracking using Frame Segmentation in CIELab* and HSV Color Spaces,” IEEE ICCSP 2015 Conference, 2015.
  • [18] A. Mondal, A. Ghosh and S. Ghosh, “Partially Camouflaged Object Tracking using Modified Probabilistic Neural Network and Fuzzy Energy based Active Contour,” International Journal of Computer Vision, pp. 116–148, 2017.
  • [19] Y.-C. Li and S.-Y. Huang, “Fast-Moving Object Tracking in Air Hockey,” International Conference on Mechatronics and Automation, Takamatsu, Japan, 2017.
  • [20] Q. Peng, S.-d. Zhong and L.-f. Tu, “Cast shadow detection for moving objects based on binocular stereo vision,” Journal of Central South University, pp. 651–658, 2014.
  • [21] S. Sankarasrinivasan, E. Balasubramanian, F. Y. Hsiao and L. J. Yang, “Robust Target Tracking Algorithm for MAV Navigation System,” International Conference on Industrial Instrumentation and Control (ICIC), India, 2015.
  • [22] H. Kim, W. Chung and Y. Yoo, “Detection and tracking of human legs for a mobile service robot,” International Conference on Advanced Intelligent Mechatronics, Montreal, Canada, 2010.


Details

Primary Language English
Subjects Engineering
Journal Section Articles
Authors

Gökhan Atalı 0000-0003-1215-9249

Meltem Eyüboğlu 0000-0002-2268-6539

Publication Date December 26, 2022
Published in Issue Year 2022 Volume: 10 Issue: 5

Cite

APA Atalı, G., & Eyüboğlu, M. (2022). A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space. Düzce Üniversitesi Bilim Ve Teknoloji Dergisi, 10(5), 77-90. https://doi.org/10.29130/dubited.1109850
AMA Atalı G, Eyüboğlu M. A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space. DUBİTED. December 2022;10(5):77-90. doi:10.29130/dubited.1109850
Chicago Atalı, Gökhan, and Meltem Eyüboğlu. “A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space”. Düzce Üniversitesi Bilim Ve Teknoloji Dergisi 10, no. 5 (December 2022): 77-90. https://doi.org/10.29130/dubited.1109850.
EndNote Atalı G, Eyüboğlu M (December 1, 2022) A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space. Düzce Üniversitesi Bilim ve Teknoloji Dergisi 10 5 77–90.
IEEE G. Atalı and M. Eyüboğlu, “A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space”, DUBİTED, vol. 10, no. 5, pp. 77–90, 2022, doi: 10.29130/dubited.1109850.
ISNAD Atalı, Gökhan - Eyüboğlu, Meltem. “A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space”. Düzce Üniversitesi Bilim ve Teknoloji Dergisi 10/5 (December 2022), 77-90. https://doi.org/10.29130/dubited.1109850.
JAMA Atalı G, Eyüboğlu M. A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space. DUBİTED. 2022;10:77–90.
MLA Atalı, Gökhan and Meltem Eyüboğlu. “A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space”. Düzce Üniversitesi Bilim Ve Teknoloji Dergisi, vol. 10, no. 5, 2022, pp. 77-90, doi:10.29130/dubited.1109850.
Vancouver Atalı G, Eyüboğlu M. A Study on Object Detection and Tracking of a Mobile Robot Using CIE L*a*b* Color Space. DUBİTED. 2022;10(5):77-90.