A Study of Stressed Facial Recognition Based on Histogram Information
DOI: https://doi.org/10.31449/inf.v46i2.3861

Abstract
Stress reflects our subconscious emotions. Most unconscious content is unacceptable or unpleasant, such as pain, anxiety, or conflict, and most individuals do not realize that they are experiencing stress. Prolonged stress is likely to lead to health problems and to affect one's facial appearance, most visibly as wrinkles on the face. This paper discusses the recognition of facial stress using histogram information. Recognizing the stress pattern on a face involves three stages: image registration, feature extraction, and classification. The registration process crops three important parts of the face: the eyes, nose, and mouth. Feature extraction was performed with two histogram-based methods, the Gabor filter and the HOG descriptor. Each extracted feature was used as model input to determine whether or not an individual is suffering from stress. Two classification methods were applied to learn stress patterns from the extracted features: an SVM with six kernel functions and a tree algorithm with three numbers of splits. Each model was trained using a ten-fold cross-validation strategy. The test results showed that the Gabor filter and the HOG feature achieved accuracies of 55% and 65%, respectively.
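The pipeline the abstract describes (registered face-region crops, Gabor and HOG feature extraction, SVM classification evaluated with ten-fold cross-validation) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-image and scikit-learn, random arrays stand in for the registered eye/nose/mouth crops, the Gabor frequency and orientations are assumed values, and only one of the six SVM kernels is shown.

```python
import numpy as np
from skimage.feature import hog
from skimage.filters import gabor
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for registered 64x64 grayscale face-region crops;
# in the paper these would be the cropped eyes, nose, and mouth regions.
images = rng.random((40, 64, 64))
labels = np.tile([0, 1], 20)  # 1 = stressed, 0 = not stressed (balanced toy labels)

def hog_features(img):
    # Histogram of Oriented Gradients descriptor for one crop.
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def gabor_features(img):
    # Magnitude statistics of Gabor responses at four orientations
    # (frequency and orientation set are illustrative assumptions).
    feats = []
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        real, imag = gabor(img, frequency=0.3, theta=theta)
        mag = np.hypot(real, imag)
        feats += [mag.mean(), mag.std()]
    return np.array(feats)

X_hog = np.array([hog_features(im) for im in images])
X_gab = np.array([gabor_features(im) for im in images])

# Ten-fold cross-validation with an SVM, mirroring the paper's evaluation;
# the RBF kernel here is just one of the six kernels the paper compares.
for name, X in (("HOG", X_hog), ("Gabor", X_gab)):
    scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=10)
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```

On real data, each feature matrix would be built from the registered crops of every subject, and the same cross-validation loop would be repeated per kernel (and for the tree classifier) to produce the reported accuracy comparison.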
License
I assign to Informatica, An International Journal of Computing and Informatics ("Journal") the copyright in the manuscript identified above and any additional material (figures, tables, illustrations, software or other information intended for publication) submitted as part of or as a supplement to the manuscript ("Paper") in all forms and media throughout the world, in all languages, for the full term of copyright, effective when and if the article is accepted for publication. This transfer includes the right to reproduce and/or to distribute the Paper to other journals or digital libraries in electronic and online forms and systems.
I understand that I retain the rights to use the pre-prints, off-prints, accepted manuscript and published journal Paper for personal use, scholarly purposes and internal institutional use.
In certain cases, I may ask to retain the publishing rights of the Paper. The Journal may grant or deny this request, to which I fully agree.
I declare that the submitted Paper is original, has been written by the stated authors and has not been published elsewhere nor is currently being considered for publication by any other journal and will not be submitted for such review while under review by this Journal. The Paper contains no material that violates proprietary rights of any other person or entity. I have obtained written permission from copyright owners for any excerpts from copyrighted works that are included and have credited the sources in my article. I have informed the co-author(s) of the terms of this publishing agreement.
Copyright © Slovenian Society Informatika