Transfer Learning for Classifying Single Hand Gestures on Comprehensive Bharatanatyam Mudra Dataset

Authors: Rajshekhar Sunderraman, Heta P. Desai, Michael Weeks, Anuja P. Parameshwaran
Publication Year: 2019
Source: CVPR Workshops
DOI: 10.1109/cvprw.2019.00074
Description: For any dance form, whether classical or folk, visual expressions (facial expressions and hand gestures) play a key role in conveying the storyline of the accompanying music to the audience. Bharatanatyam, a classical dance form that originated in the southern states of India, is on the verge of being completely automated, partly due to an acute dearth of qualified and dedicated teachers/gurus. In an effort to speed up this automation process while preserving the cultural heritage, we identify and classify single hand gestures (mudras/hastas) against their true labels using two variations of convolutional neural networks (CNNs), demonstrating the effectiveness of transfer learning irrespective of the domain difference between the pre-training and training datasets. This work is primarily aimed at 1) building a novel dataset of 2D single hand gestures belonging to 27 classes, collected from the Google search engine (Google Images), YouTube videos (dynamic, with backgrounds considered), and professional artists under staged environment constraints (plain backgrounds); 2) exploring the effectiveness of CNNs in identifying and classifying the single hand gestures by optimizing their hyperparameters; and 3) evaluating the impact of transfer learning and double transfer learning, a novel concept explored in this paper, on classification accuracy.
Database: OpenAIRE
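
The description above refers to transfer learning and a "double transfer learning" scheme, in which a network pre-trained on a large generic dataset is fine-tuned through two successive target datasets before the final classification task. The record does not specify the authors' architecture, intermediate dataset, or hyperparameters, so the following is only a minimal PyTorch sketch: ResNet-18, the dataset paths, the intermediate class count, and the training settings are all illustrative assumptions, not the paper's configuration.

# Minimal sketch of double transfer learning: a CNN pre-trained on a
# large generic dataset (ImageNet) is fine-tuned on an intermediate
# hand-gesture dataset, then fine-tuned again on the 27-class mudra
# dataset. Architecture and paths are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

def finetune(model, data_dir, num_classes, epochs=5, lr=1e-3):
    """Replace the classifier head and fine-tune on a new dataset."""
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    tfm = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406],
                             [0.229, 0.224, 0.225]),
    ])
    loader = DataLoader(datasets.ImageFolder(data_dir, tfm),
                        batch_size=32, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

# Stage 0: generic pre-training (ImageNet weights shipped with torchvision).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
# Stage 1: first transfer -- fine-tune on an intermediate hand-gesture
# dataset (path and class count are hypothetical placeholders).
model = finetune(model, "data/intermediate_gestures", num_classes=10)
# Stage 2: second transfer -- fine-tune on the 27-class mudra dataset.
model = finetune(model, "data/mudras", num_classes=27)

In this double-transfer setting, the second finetune call starts from representations already adapted to hand-gesture data rather than from generic ImageNet features, which is the intuition behind fine-tuning in two stages instead of one.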