The International Arab Journal of Information Technology (IAJIT)



Convolutional Neural Network Based Hand Gesture Recognition in Sophisticated Background for Humanoid Robot Control

Hand gesture recognition is a preferred way for human-robot interaction. Conventional approaches are generally based on image processing and recognition of hand poses against simple backgrounds. In this paper, we propose deep learning models and humanoid robot integration for offline and online (real-time) recognition and control using hand gestures. A total of 1,200 hand images belonging to four participants are collected to construct the hand gesture database. Images of five classes (forward, backward, right, left, and stop) are captured against six sophisticated backgrounds with different illumination levels for the four participants, and one participant's images are held out as testing data. A lightweight Convolutional Neural Network (CNN) and transfer learning with VGG16 and MobileNetV2 are applied to this database to evaluate the user-independent performance of the hand gesture system. After offline training, a real-time implementation is designed using a mobile phone (Wi-Fi and camera), a Wi-Fi router, a computer running the embedded deep learning algorithms, and a NAO humanoid robot. Video streamed from the mobile phone is processed and recognized by the proposed deep learning algorithm on the computer, and the resulting command is transferred to the robot via the TCP/IP protocol. Thus, NAO humanoid robot control using hand gestures in the RGB and HSV color spaces is evaluated against sophisticated backgrounds, and the implementation of the system is presented. In our simulations, accuracy rates of 95% and 100% are obtained for the lightweight CNN and transfer learning, respectively.
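To make the transfer-learning branch of the described system concrete, the following is a minimal Python/Keras sketch of a MobileNetV2-based classifier for the five gesture classes. It is an illustrative assumption of how such a model can be set up, not the authors' exact configuration: the 224x224 input size, frozen backbone, dropout rate, optimizer, and classification head are all assumed here.

    import tensorflow as tf

    NUM_CLASSES = 5  # forward, backward, right, left, stop

    # ImageNet-pretrained MobileNetV2 backbone without its original classifier;
    # the backbone is frozen so only the new gesture head is trained.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights="imagenet")
    base.trainable = False

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dropout(0.2),  # assumed regularization, not from the paper
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()

An analogous setup with a VGG16 backbone (tf.keras.applications.VGG16) would follow the same pattern; the user-independent evaluation described above corresponds to training on three participants' images and testing on the held-out participant.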

 

