Music can be a powerful tool for describing human mood. Hand gestures and facial expressions are forms of fast, non-linguistic communication. Current research on music recommendation uses either a hand-gesture music controller (which only controls playback operations) or an emotion-based music player, but not both. In this work, a new hybrid approach is proposed that recommends and plays music using both hand gestures and facial emotions. A facial expression recognizer (FER) algorithm extracts features from the image for emotion detection, while the MediaPipe framework and the TensorFlow library are used for hand detection and gesture recognition, respectively. Music is played with pygame based on the most recent gesture and emotion, with priority given first to hand gestures and then to facial emotions. The accuracy of the proposed approach is also compared with that of existing music recommendation approaches.
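The priority rule described above (hand gestures first, then facial emotions) can be sketched as a small arbitration function. This is a minimal illustration, not the paper's implementation: the gesture and emotion labels and the track mapping are hypothetical assumptions, and in the full system the labels would come from the MediaPipe/TensorFlow gesture pipeline and the FER emotion detector, with pygame handling actual playback.

```python
# Illustrative sketch of the gesture-over-emotion priority rule.
# Labels and track names are assumptions for demonstration only.

GESTURE_TRACKS = {"thumbs_up": "upbeat.mp3", "fist": "pause"}
EMOTION_TRACKS = {"happy": "happy_playlist.mp3", "sad": "calm_playlist.mp3"}

def choose_action(gesture, emotion):
    """Pick the playback action from the most recent detections.

    Hand gestures take priority; if no recognized gesture is present,
    fall back to the detected facial emotion. Returns None when neither
    input is recognized, leaving current playback unchanged.
    """
    if gesture in GESTURE_TRACKS:
        return GESTURE_TRACKS[gesture]
    if emotion in EMOTION_TRACKS:
        return EMOTION_TRACKS[emotion]
    return None
```

For example, if the camera frame yields both a "thumbs_up" gesture and a "sad" emotion, the gesture wins and the upbeat track is selected; with no recognized gesture, the emotion-based choice applies.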