Ball Detection and Color Identification for a Mobile Robot using a 2D Camera

Authors

  • Khac Trung Chu, Testing and Assessment Centre, Hanoi University of Industry, Vietnam
  • Minh Hieu Hoang, School of Mechanical and Automotive Engineering, Hanoi University of Industry, Vietnam
  • Quoc Bao Tran, School of Mechanical and Automotive Engineering, Hanoi University of Industry, Vietnam
Volume: 15 | Issue: 2 | Pages: 21665-21670 | April 2025 | https://doi.org/10.48084/etasr.9821

Abstract

In this study, a novel deep-learning method is developed to help a mobile robot system accurately detect a ball and recognize its color in environments with light disturbances. The YOLOv8 algorithm is applied to detect the ball and identify its color. The effectiveness of the algorithm is tested under various lighting conditions, both when the balls are inside a silo and when they are outside it. The developed algorithm identifies balls even when they are partially obscured by shadows.
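In the study, YOLOv8 is trained to output the color as part of the detection class itself, which is what makes it robust to light disturbances. For contrast, a minimal sketch of the classical alternative such a system replaces is shown below: classifying a detected ball crop by its dominant hue. This snippet is illustrative only (the function name, hue thresholds, and saturation cutoff are assumptions, not taken from the paper), and its sensitivity to lighting is precisely the weakness the learned approach addresses.

```python
import colorsys

def classify_ball_color(rgb):
    """Map an (R, G, B) triple in 0-255 to a coarse color name by hue angle.

    Hypothetical hand-tuned baseline: thresholds are illustrative and
    break down under shadows or colored lighting, motivating a learned
    detector such as YOLOv8 instead.
    """
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if s < 0.2:                      # low saturation: treat as achromatic
        return "white" if v > 0.7 else "black"
    deg = h * 360.0                  # hue as an angle in degrees
    if deg < 20 or deg >= 330:
        return "red"
    if deg < 70:
        return "yellow"
    if deg < 170:
        return "green"
    if deg < 260:
        return "blue"
    return "purple"
```

For example, `classify_ball_color((255, 0, 0))` returns `"red"`, while a shadowed red ball whose pixels average to a dark, desaturated value would be misread as `"black"` — the failure case the paper's shadow-robust detector is designed to avoid.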

Keywords:

YOLOv8, ball detection, 2D camera, disturbance environments, mobile robot




How to Cite

Chu, K.T., Hoang, M.H. and Tran, Q.B. 2025. Ball Detection and Color Identification for a Mobile Robot using a 2D Camera. Engineering, Technology & Applied Science Research. 15, 2 (Apr. 2025), 21665–21670. DOI: https://doi.org/10.48084/etasr.9821.
