Detection of mood states in unrestricted environments

Keywords: SVM, KNN, Driver’s emotion, Facial expression, Face Landmarks, Temporal features

Abstract

An important factor contributing to car accidents is driving under non-optimal emotional conditions, such as stress, anger, fear, or depression, in which the driver faces situations that increase the probability of an accident. Consequently, several driver emotion detectors based on facial expressions have been proposed to date. However, most of them use only a single video frame (an image) and operate under restricted conditions that are rarely encountered in real driving. To address this problem, this paper presents an algorithm for driver emotion recognition from facial expressions, in which temporal feature vectors are extracted from the video sequence using the Face Landmarks of each frame. The extracted feature vectors are fed into different classifiers, namely the Support Vector Machine (SVM) and K-Nearest Neighbors (KNN), for performance comparison.
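
The pipeline described above (per-frame face landmarks, a temporal feature vector summarizing the sequence, and an SVM or KNN classifier) can be outlined in code. The sketch below is illustrative only: it assumes MediaPipe Face Mesh and OpenCV for landmark extraction and scikit-learn for the classifiers, and uses the per-coordinate mean and standard deviation over the frames as the temporal features; these library and feature choices are assumptions, not the authors' exact implementation.

```python
# Minimal sketch: per-frame face landmarks -> temporal feature vector -> SVM / KNN.
# MediaPipe Face Mesh, OpenCV, scikit-learn, and the mean/std temporal summary
# are illustrative assumptions, not the method reported in the paper.
import cv2
import numpy as np
import mediapipe as mp
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def landmark_sequence(video_path: str) -> np.ndarray:
    """Return an array of shape (n_frames, n_landmarks * 3) with (x, y, z) landmarks."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_face_landmarks:
                pts = result.multi_face_landmarks[0].landmark
                frames.append([c for p in pts for c in (p.x, p.y, p.z)])
    cap.release()
    return np.asarray(frames)


def temporal_features(seq: np.ndarray) -> np.ndarray:
    """Collapse the per-frame landmarks into one temporal feature vector
    (here: per-coordinate mean and standard deviation across frames)."""
    return np.concatenate([seq.mean(axis=0), seq.std(axis=0)])


def train_classifiers(videos, labels):
    """Fit SVM and KNN on the temporal feature vectors for performance comparison."""
    X = np.stack([temporal_features(landmark_sequence(v)) for v in videos])
    y = np.asarray(labels)
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
    knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)).fit(X, y)
    return svm, knn
```

In practice the mean/std summary could be replaced by richer temporal descriptors (for example, landmark velocities across frames), but the overall structure of the SVM versus KNN comparison remains the same.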

Published
2022-10-05
How to Cite
Sánchez-Ruiz, M., Flores-Monroy, J., Escamilla-Hernández, E., Nakano-Miyatake, M., & Perez-Meana, H. (2022). Detection of mood states in unrestricted environments. Pädi Boletín Científico De Ciencias Básicas E Ingenierías Del ICBI, 10(Especial4), 110-115. https://doi.org/10.29057/icbi.v10iEspecial4.9142