Research Article

Classification of Emotional State based on Eye Movements

Year 2020, Volume: 13 Issue: 2, 137 - 144, 30.04.2020
https://doi.org/10.17671/gazibtd.563830

Abstract

Emotional state is controlled by the autonomic nervous system (ANS). In the presence of a positive or negative stimulus, the ANS responds within a short time, and this response can be observed as various physical outputs depending on the type of emotion the stimulus triggers in the individual. One of these physical differences is the variation in pupil size, a physiological change that can be used to examine a person's emotional state. Previous studies have shown pupil size and eye movement measurements to be useful input signals. Building on this, the present study aims at emotion recognition by extracting eye gaze patterns. When a person is exposed to a negative stimulus, the pupil tends to dilate, whereas in the presence of a positive stimulus it tends to constrict. Based on this, stimuli were presented to male and female volunteers. The stimuli, 60 pictures in total, were selected from the IAPS image database; stimulus sets were chosen according to their valence scores to form positive, neutral and negative classes. Thirteen volunteers performed the test paradigm while eye tracker measurements were recorded. Left and right pupil size values and fixation time were used as classification features. These features were classified into three classes using kNN, Naive Bayes, Support Vector Machine (SVM), Linear Discriminant Analysis (LDA), decision tree and logistic regression techniques. Because the three-class accuracy was low, classification was repeated for positive versus negative stimuli only. For these two emotion groups, the kNN algorithm achieved the best success rate at 68%, while Naive Bayes and SVM reached 55%, LDA 50%, and decision tree and logistic regression 48%. In conclusion, eye movements can reflect a subject's emotional responses, and the arousal level of subjects might be predicted from them.
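As a concrete illustration of the classification step described above, the sketch below benchmarks the six classifiers named in the abstract on a feature matrix of left pupil size, right pupil size, and fixation time, using scikit-learn. It is a minimal sketch under stated assumptions: the synthetic placeholder data, the 10-fold cross-validation protocol, and the hyperparameters (e.g., k = 5 for kNN) are illustrative choices, not the authors' exact pipeline.

# Minimal sketch: compare the six classifiers from the study on
# eye-movement features. The data below is a synthetic placeholder;
# replace X and y with real eye-tracker measurements.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# One row per stimulus presentation:
# [left_pupil_size, right_pupil_size, fixation_time]
n_trials = 260
X = rng.normal(size=(n_trials, 3))
y = rng.integers(0, 2, size=n_trials)  # 0 = positive, 1 = negative stimulus

classifiers = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
    "LDA": LinearDiscriminantAnalysis(),
    "Decision tree": DecisionTreeClassifier(max_depth=5),
    "Logistic regression": LogisticRegression(max_iter=1000),
}

for name, clf in classifiers.items():
    # Standardize features so pupil size and fixation time are on
    # comparable scales before fitting each classifier.
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.2f} (+/- {scores.std():.2f})")

Standardizing the features first matters here because pupil diameters and fixation times live on very different scales, which would otherwise dominate distance-based methods such as kNN and the RBF-kernel SVM.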

References

  • P. R. Kleinginna, A. M. Kleinginna, "A categorized list of emotion definitions, with suggestions for a consensual definition", Motivation and Emotion, 5(4), 345–379, 1981.
  • A. S. R. Manstead, Psychology of Emotions, Sage Publications Ltd, 89, 2008.
  • M. Uğur, Medikal psikoloji, Sahhaflar Kitap Sarayı, 475-476, 1994.
  • P. Ekman, "An argument for basic emotions", Cognition & Emotion, 6(3), 169-200, 1992.
  • J. A. Russell, "A circumplex model of affect", Journal of Personality and Social Psychology, 39, 1161–1178, 1980.
  • R. F. Stanners, M. Coulter, A. W. Sweet, P. Murphy, "The pupillary response as an indicator of arousal and cognition", Motivation and Emotion, 3(4), 319-340, 1979.
  • G. S. Dichter, A. J. Tomarken, B. R. Baucom, “Startle modulation before, during and after exposure to emotional stimuli”, International Journal of Psychophysiology, 43, 191-196, 2002.
  • L. Ruiz-Padial, J. J. Sollers, J. Vila, J. F. Thayer, “The rhythm of the heart in the blink of an eye: Emotion-modulated startle magnitude covaries with heart rate variability”, Psychophysiology, 40, 306–313, 2003.
  • M. G. Calvo, P. J. Lang, “Gaze patterns when looking at emotional pictures: Motivationally biased attention”, Motivation and Emotion, 28, 221–243, 2004.
  • V. L. Kinner, L. Kuchinke, A. M. Dierolf, C. J. Merz, T. Otto, O. T. Wolf, “What our eyes tell us about feelings: Tracking pupillary responses during emotion regulation processes”, Psychophysiology, 54(4), 508-518, 2017.
  • R. Ambrosio, S. C. Schallhorn, S. E. Wilson, "The importance of pupil size in refractive surgery", Refractive Surgery Outlook, American Academy of Ophthalmology, 2002.
  • R. Duffey, D. Leaming, "Trends in refractive surgery in the United States", Journal of Cataract & Refractive Surgery, 30(8), 1781-1785, 2004.
  • I. E. Loewenfeld, The Pupil: Anatomy, Physiology, and Clinical Applications, Iowa State University Press, Ames, IA, 1993.
  • E. H. Hess, J. M. Polt, "Pupil size as related to interest value of visual stimuli", Science, 132(3423), 349-350, 1960.
  • J. L. Armony, R. J. Dolan, "Modulation of spatial attention by fear-conditioned stimuli: An event-related fMRI study", Neuropsychologia, 40, 817–826, 2002.
  • K. Mogg, J. MacNamara, M. Powys, H. Rawlinson, A. Seiffer, B. P. Bradley, “Selective attention to threat: A test of two cognitive models of anxiety”, Cognition and Emotion, 14, 375–399, 2000.
  • E. H. Koster, G. Crombez, S. Van Damme, B. Verschuere, J. De Houwer, "Does imminent threat capture and hold attention?", Emotion, 4(3), 312-317, 2004.
  • A. Voßkühler, V. Nordmeier, L. Kuchinke, A. M. Jacobs, “OGAMA-Open Gaze And Mouse Analyzer: Open source software designed to analyze eye and mouse movements in slideshow study designs”, Behavior Research Methods, 40(4), 1150-1162, 2008.
  • P. J. Lang, M. M. Bradley, B. N. Cuthbert, “International affective picture system (IAPS): Affective ratings of pictures and instruction manual”, Technical Report A-8, University of Florida, Gainesville, FL, 2008.
  • N. Bhatia, Vandana, "Survey of Nearest Neighbor Techniques", International Journal of Computer Science and Information Security (IJCSIS), 8(2), 2010.
  • I. Rish, "An empirical study of the naive Bayes classifier", IJCAI 2001 Workshop on Empirical Methods in Artificial Intelligence, 2001.
  • P. Domingos, M. Pazzani, "On the optimality of the simple Bayesian classifier under zero-one loss", Machine Learning, 29, 103–130, 1997.
  • C. Cortes, V. Vapnik, “Support-vector networks”, Machine Learning, 20(3), 273-297, 1995.
  • A. Ben-Hur, J. Weston, "A User's Guide to Support Vector Machines", Methods in Molecular Biology, 609, 223-239, 2010.
  • R. Coelho, et al., "Survey of Evolutionary Algorithms for Decision-Tree Induction", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 42(3), 2012.
  • L. Rokach, O. Maimon, "Top-down induction of decision trees classifiers", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 35(4), 476-487, 2005.
  • J. R. Quinlan, "Induction of Decision Trees", Machine Learning, 1(1), 81-106, 1986.
  • M. Soleymani, M. Pantic, T. Pun, “Multimodal emotion recognition in response to videos”, IEEE Transactions on Affective Computing, 3(2), 211–223, 2012.
  • T. Balcıoğlu, D. Şahin, M. Assem, S. B. Selman, D. Göksel Duru, “Analysis of Gaze Characteristics with Eye Tracking in Elite Athletes: A Pilot Study”, Proc. of IEEE BIYOMUT 18th National Biomedical Engineering Conference, 1-4, 2014.
  • Ö. Battal, T. Balcıoğlu, A. D. Duru, “Analysis of gaze characteristics with eye tracking system during repeated breath holding exercises in underwater hockey elite athletes”, Proc. of IEEE BIYOMUT 20th National Biomedical Engineering Meeting, 2016.
  • Y. Kaya, A. D. Duru, “Masa Tenisi Çok Top Antrenmanının 9-12 Yaş Grubu Çocuklarda Görsel Reaksiyon Performansındaki Etkisinin İncelenmesi”, Marmara Üniversitesi Spor Bilimleri Dergisi, 1(2), 2016.
  • D. E. Barkana, A. Açık, D. Goksel Duru, A. D. Duru, “Erratum to: improvement of design of a surgical interface using an eye tracking device”, Theoretical Biology and Medical Modelling, 11(1), 48, 2014.
  • T. Partala, M. Jokiniemi, V. Surakka, “Pupillary responses to emotionally provocative stimuli”, Proceedings of the Eye Tracking Research & Application Symposium, ETRA 2000, Palm Beach Gardens, Florida, USA, November 6-8, 2000.
  • M. D. Basar, A. D. Duru, A. Akan, "Emotional state detection based on common spatial patterns of EEG", Signal, Image and Video Processing (SIViP), 2019. https://doi.org/10.1007/s11760-019-01580-8
  • Y. İnal, N. Özen Çınar, K. Çağıltay, “Kamu İnternet Sitelerinde Yer Alan Arama Alanlarının Kullanılabilirliği ve Buna Yönelik Kullanıcı Davranışlarının Belirlenmesi”, Bilişim Teknolojileri Dergisi, 9(1), 41-54, 2016.
  • E. Koç, O. Bayat, D. Göksel Duru, A. D. Duru, “Göz Hareketlerine Dayalı Beyin Bilgisayar Arayüzü Tasarımı”, International Journal of Engineering Research and Development, 12(1), 176-188, DOI: 10.29137/umagd.555494, 2020.
  • B. Karaöz Akın, U. T. Gürsoy Şimşek, “Adaptif Öğrenme Sözlüğü Temelli Duygu Analiz Algoritması Önerisi”, Bilişim Teknolojileri Dergisi, 11(3), 245-253, 2018.

Turkish Title: Gözbebeği Hareketleri Temelli Duygu Durumu Sınıflandırılması



Details

Primary Language Turkish
Subjects Computer Software
Journal Section Articles
Authors

Samet Mete 0000-0001-7788-7883

Oğuz Çakır

Oğuz Bayat 0000-0001-5988-8882

Dilek Göksel Duru 0000-0003-1484-8603

Adil Deniz Duru 0000-0003-3014-9626

Publication Date April 30, 2020
Submission Date May 13, 2019
Published in Issue Year 2020 Volume: 13 Issue: 2

Cite

APA Mete, S., Çakır, O., Bayat, O., Göksel Duru, D., & Duru, A. D. (2020). Gözbebeği Hareketleri Temelli Duygu Durumu Sınıflandırılması. Bilişim Teknolojileri Dergisi, 13(2), 137-144. https://doi.org/10.17671/gazibtd.563830