Gesture-based robot manipulation using computer vision

Authors

Keywords:

Gestural control, human-robot interaction, computer vision, teleoperation, robotics

Abstract

This article addresses the gestural manipulation of articulated robots through computer vision, with the aim of enabling their remote control for the execution of specific tasks. Gestural manipulation has become an intuitive and efficient alternative within teleoperation systems, as it allows for natural interaction between humans and robots. A brief taxonomy of gestural control systems is presented, classifying them into two main approaches: those based on physical sensors (such as gloves or inertial devices) and those that use computer vision. This study analyzes the second approach, exploring various methodologies that enable the recognition of human gestures from images or video sequences processed by computer vision algorithms. Existing proposals that map these gestures into commands for the control of articulated robots are analyzed, considering aspects such as recognition accuracy, latency, and system adaptability. Through a comparative analysis of these methodologies, advantages, limitations, and potential areas for improvement are identified. The results highlight the potential of computer vision-based systems to offer greater freedom of movement and more natural interaction, without the need for additional equipment, which can lead to more accessible and versatile solutions.
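The mapping from recognized gestures to robot commands that the reviewed systems perform can be sketched in a few lines. The example below is a minimal, illustrative sketch only, not any cited system's method: it assumes hand landmarks in the common 21-point convention used by popular vision pipelines (wrist = 0, fingertips at indices 4, 8, 12, 16, 20, with image y growing downward), counts the extended fingers, and maps the count to a hypothetical set of teleoperation commands.

```python
# Illustrative gesture-to-command mapping of the kind surveyed in the article.
# Landmark indices follow the common 21-point hand model; the command names
# are hypothetical placeholders, not taken from any cited system.

FINGERTIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip indices
PIP_JOINTS = [6, 10, 14, 18]   # corresponding proximal interphalangeal joints

def count_extended_fingers(landmarks):
    """Count fingers whose tip lies above its PIP joint (image y grows downward)."""
    return sum(
        1
        for tip, pip in zip(FINGERTIPS, PIP_JOINTS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def gesture_to_command(landmarks):
    """Map a recognized hand pose to a symbolic teleoperation command."""
    commands = {
        0: "stop",          # closed fist
        1: "move_forward",  # one finger extended
        2: "turn_left",
        3: "turn_right",
        4: "open_gripper",  # open hand
    }
    return commands.get(count_extended_fingers(landmarks), "idle")
```

In a full system, the landmark list would come from a per-frame hand-tracking model, and the symbolic command would be translated into joint or velocity targets for the manipulator; recognition accuracy and per-frame latency, the metrics compared in this review, are determined by that upstream tracking stage.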

References

Tsarouchi, P., Athanasatos, A., Makris, S., Chatzigeorgiou, X., & Chryssolouris, G. (2016). High level robot programming using body and hand gestures. Procedia CIRP, 55, 1-5.

Padilla, A. F., Peña, C. A., & Moreno-Contreras, G. G. (2020, November). Advances in industrial robots programming applying gestural guidance techniques. In Journal of Physics: Conference Series (Vol. 1704, No. 1, p. 012001). IOP Publishing.

Qi, J., Ma, L., Cui, Z., & Yu, Y. (2024). Computer vision-based hand gesture recognition for human-robot interaction: a review. Complex & Intelligent Systems, 10(1), 1581-1606.

Almansour, A. M. (2024). The Effectiveness of Virtual Reality in Rehabilitation of Athletes: A Systematic Review and Meta-Analysis. Journal of Pioneering Medical Sciences, 13, 147-154.

Yavuz, E., Şenol, Y., Özçelik, M., & Aydın, H. (2021). Design of a String Encoder‐and‐IMU‐Based 6D Pose Measurement System for a Teaching Tool and Its Application in Teleoperation of a Robot Manipulator. Journal of Sensors, 2021(1), 6678673.

Gómez Echeverry, L. L., Jaramillo Henao, A. M., Ruiz Molina, M. A., Velásquez Restrepo, S. M., Páramo Velásquez, C. A., & Silva Bolívar, G. J. (2018). Sistemas de captura y análisis de movimiento cinemático humano: Una revisión sistemática [Human kinematic motion capture and analysis systems: A systematic review]. Prospectiva, 16(2), 24-34.

Xie, J., Xu, Z., Zeng, J., Gao, Y., & Hashimoto, K. (2025). Human–Robot Interaction Using Dynamic Hand Gesture for Teleoperation of Quadruped Robots with a Robotic Arm. Electronics, 14(5), 860.

Angelidis, G., & Bampis, L. (2025). Gesture-Controlled Robotic Arm for Small Assembly Lines. Machines, 13(3), 182.

Chen, L., Li, C., Fahmy, A., & Sienz, J. (2024). GestureMoRo: an algorithm for autonomous mobile robot teleoperation based on gesture recognition. Scientific Reports, 14(1), 6199.

Zick, L. A., Martinelli, D., Schneider de Oliveira, A., & Cremer Kalempa, V. (2024). Teleoperation system for multiple robots with intuitive hand recognition interface. Scientific Reports, 14(1), 1-11.

Bamani, E., Nissinman, E., Meir, I., Koenigsberg, L., & Sintov, A. (2024). Ultra-range gesture recognition using a web-camera in human–robot interaction. Engineering Applications of Artificial Intelligence, 132, 108443.

Cucurull, X., & Garrell, A. (2023). Continual Learning of Hand Gestures for Human-Robot Interaction. arXiv preprint arXiv:2304.06319.

Ramalingam, B., & Angappan, G. (2023). A deep hybrid model for human-computer interaction using dynamic hand gesture recognition. Computer Assisted Methods in Engineering and Science, 30(3), 263-276.

Bonci, A., Cen Cheng, P. D., Indri, M., Nabissi, G., & Sibona, F. (2021). Human-robot perception in industrial environments: A survey. Sensors, 21(5), 1571.

Mazhar, O., Ramdani, S., & Cherubini, A. (2021). A deep learning framework for recognizing both static and dynamic gestures. Sensors, 21(6), 2227.

Kobzarev, O., Lykov, A., & Tsetserukou, D. (2025). GestLLM: Advanced Hand Gesture Interpretation via Large Language Models for Human-Robot Interaction. arXiv preprint arXiv:2501.07295.

Published

2025-05-30

How to Cite

Gesture-based robot manipulation using computer vision. (2025). International Journal of Information Science and Technological Applications-UAS IJISTA, 1(1), 52-57. https://revistas.uas.edu.mx/index.php/IJISTA/article/view/1176