Gesture Recognition Using TensorFlow, OpenCV and Python
Gesture recognition is a fast-developing field within computer vision and human-computer interaction, with applications ranging from immersive gaming to sign language interpretation. This project combines OpenCV, Python, TensorFlow, and deep learning with an SSD (Single Shot MultiBox Detector) to build a reliable gesture detection system that captures and decodes hand and body movements in real time. Gesture recognition has numerous uses in fields such as human-computer interaction, sign language translation, and augmented reality. In human-computer interaction, it improves user experiences by enabling touchless control of devices such as gaming consoles and smartphones; in sign language translation, it helps narrow the communication gap between the deaf and hearing communities by converting signs into text or speech; and in augmented reality, it lets users manipulate virtual objects through natural hand motions [1]. In our research, we present a method for creating a gesture or sign detection dataset from webcam video, and we apply transfer learning to train a TensorFlow model, resulting in a real-time gesture recognition system. Notably, the system achieves strong accuracy even when trained on a small dataset. [ABSTRACT FROM AUTHOR]
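To make the dataset-creation step concrete, below is a minimal sketch of how gesture images might be collected from a webcam with OpenCV. The label names, image counts, and output paths are illustrative assumptions, not details taken from the paper; the paper's own class list and collection protocol may differ.

```python
import os
import time
import uuid

import cv2  # OpenCV for webcam capture

# Hypothetical gesture labels and image count; the paper does not list its classes here.
LABELS = ["thumbs_up", "thumbs_down", "thank_you", "live_long"]
IMAGES_PER_LABEL = 15
OUTPUT_DIR = "workspace/images/collected"

cap = cv2.VideoCapture(0)  # default webcam
for label in LABELS:
    os.makedirs(os.path.join(OUTPUT_DIR, label), exist_ok=True)
    print(f"Collecting images for '{label}' ...")
    time.sleep(3)  # give the user time to get the gesture ready
    for _ in range(IMAGES_PER_LABEL):
        ret, frame = cap.read()
        if not ret:
            continue
        # Unique filename per frame so repeated runs never overwrite images.
        path = os.path.join(OUTPUT_DIR, label, f"{label}.{uuid.uuid4()}.jpg")
        cv2.imwrite(path, frame)
        cv2.imshow("collection", frame)
        time.sleep(1)  # short pause so consecutive frames vary in pose
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

Frames collected this way would still need bounding-box annotation (e.g., with a labeling tool) before they can train an SSD-style detector.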
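For the real-time detection step, the sketch below shows one common way to run a fine-tuned SSD model from a webcam feed. It assumes the model was exported as a TensorFlow Object Detection API SavedModel; the model path, label map, and confidence threshold are assumptions for illustration, since the paper's trained checkpoint is not reproduced here.

```python
import cv2
import numpy as np
import tensorflow as tf

# Hypothetical export path and label map; substitute the fine-tuned model's own.
MODEL_DIR = "exported_model/saved_model"
LABELS = {1: "thumbs_up", 2: "thumbs_down", 3: "thank_you", 4: "live_long"}

detect_fn = tf.saved_model.load(MODEL_DIR)  # TF Object Detection API export

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    # SSD exports from the Object Detection API expect a batched uint8 RGB tensor.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    input_tensor = tf.convert_to_tensor(rgb[np.newaxis, ...], dtype=tf.uint8)
    detections = detect_fn(input_tensor)

    boxes = detections["detection_boxes"][0].numpy()     # normalized [ymin, xmin, ymax, xmax]
    scores = detections["detection_scores"][0].numpy()
    classes = detections["detection_classes"][0].numpy().astype(int)

    h, w, _ = frame.shape
    for box, score, cls in zip(boxes, scores, classes):
        if score < 0.6:  # confidence threshold (an assumption)
            continue
        ymin, xmin, ymax, xmax = box
        p1 = (int(xmin * w), int(ymin * h))
        p2 = (int(xmax * w), int(ymax * h))
        cv2.rectangle(frame, p1, p2, (0, 255, 0), 2)
        cv2.putText(frame, f"{LABELS.get(cls, cls)}: {score:.2f}",
                    (p1[0], max(p1[1] - 10, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

    cv2.imshow("gesture detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```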