Facial Expression Based Real-Time Emotion Recognition Mechanism for Students with High-Functioning Autism


Authors

Hui-Chuan Chu, Department of Special Education, National University of Tainan, Taiwan; William Wei-Jen Tsai, Institute of Manufacturing Information and Systems, National Cheng Kung University, Taiwan; Min-Ju Liao, Department of Psychology, National Chung Cheng University, Taiwan; Wei-Kai Cheng, Yuh-Min Chen, Su-Chen Wang, Institute of Manufacturing Information and Systems, National Cheng Kung University, Taiwan

EdMedia + Innovate Learning, Jun 24, 2013, Victoria, Canada. ISBN 978-1-939797-03-2

Abstract

The emotional problems of students with autism may greatly affect their learning in e-learning environments. This paper presents the development of an emotion recognition mechanism, based on a proposed emotional adjustment model, for students with high-functioning autism in a mathematics e-learning environment. Physiological signals and facial expressions were collected by evoking autistic students’ emotions in the mathematics e-learning environment, and were used both to train the emotion classification model and to verify the performance of the emotion classification mechanism. In total, 34 facial features were obtained from experiments conducted with a counterbalanced design, and 46% of these features were further selected using the chi-square, Information Gain (IG), and Wrapper feature selection methods. A Support Vector Machine (SVM) was used to train the emotion recognition model and to assess the performance of the proposed emotion recognition mechanism. Four emotional categories …
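
To illustrate the kind of pipeline the abstract describes, the sketch below shows a comparable setup in Python with scikit-learn: chi-square feature selection over 34 facial features followed by a four-class SVM. It uses synthetic stand-in data and assumed parameters; it is not the paper's actual mechanism, feature set, or training data.

# Minimal, illustrative sketch only (not the authors' implementation):
# chi-square feature selection followed by an SVM classifier for four
# emotion categories, on synthetic stand-in data.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((200, 34))           # 200 samples x 34 facial features (stand-in values)
y = rng.integers(0, 4, size=200)    # 4 emotion categories (stand-in labels)

k = int(0.46 * X.shape[1])          # keep roughly 46% of the features, as reported in the abstract

model = Pipeline([
    # chi2 requires non-negative feature values; the Information Gain criterion
    # could be approximated with sklearn.feature_selection.mutual_info_classif.
    ("select", SelectKBest(score_func=chi2, k=k)),
    ("svm", SVC(kernel="rbf", C=1.0)),
])

scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")

On random stand-in data the accuracy will hover near chance (about 25% for four classes); the sketch is only meant to show where the feature-selection ratio and the four-category SVM fit in such a pipeline.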

Citation

Chu, H.C., Tsai, W.W.J., Liao, M.J., Cheng, W.K., Chen, Y.M. & Wang, S.C. (2013). Facial Expression Based Real-Time Emotion Recognition Mechanism for Students with High-Functioning Autism. In J. Herrington, A. Couros & V. Irvine (Eds.), Proceedings of EdMedia 2013--World Conference on Educational Media and Technology (pp. 1165-1173). Victoria, Canada: Association for the Advancement of Computing in Education (AACE). Retrieved August 13, 2024 from https://www.learntechlib.org/p/112105.