Facial recognition technology was introduced in the 1960s, languished through the AI winter, and in recent years has taken off, boosted by increasingly powerful deep neural networks. It has been applied to device unlocking (as in Apple's Face ID), public security services, smart payment systems and more. During Taylor Swift's 2018 "Reputation" tour, the American singer-songwriter's security team utilized the tech to safeguard her from stalkers.
Now, a research team from the Hong Kong University of Science and Technology and Harbin Engineering University has adopted facial recognition technology to analyze students’ emotions in the classroom through a visual analytics system called “EmotionCues.”
Paper co-author Huamin Qu says the system “provides teachers with a quick and convenient measure of students’ engagement level in a class. Knowing whether the lectures are too hard and when students get bored can help improve teaching.”
But is it really that simple?
The proposed EmotionCues system includes a data processing phase and a visual exploration phase. The system first processes a series of raw video inputs, using computer vision algorithms to extract emotion information through steps such as face detection, face recognition, emotion recognition and feature extraction. In phase two, the interactive visual system applies granular visual analysis to classroom videos to infer each student's emotional state and how it evolves over time (i.e., is Lily losing interest?).
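The per-frame-to-timeline aggregation idea can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the `FRAMES` data, the `emotion_timeline` function and the majority-vote windowing are all hypothetical stand-ins for whatever the paper's pipeline actually produces after its emotion recognition step.

```python
from collections import Counter

# Hypothetical per-frame output of an emotion recognition step:
# (frame_index, student_id, emotion_label, confidence)
FRAMES = [
    (0, "lily", "happy", 0.9),
    (1, "lily", "happy", 0.8),
    (2, "lily", "neutral", 0.7),
    (3, "lily", "sad", 0.6),
]

def emotion_timeline(frames, student_id, window=2):
    """Collapse noisy per-frame labels for one student into a coarse
    timeline by majority vote over fixed-size windows of frames."""
    labels = [emo for _, sid, emo, _ in frames if sid == student_id]
    timeline = []
    for i in range(0, len(labels), window):
        chunk = labels[i:i + window]
        # most_common(1) breaks ties by insertion order (Python 3.7+)
        timeline.append(Counter(chunk).most_common(1)[0][0])
    return timeline
```

Smoothing over windows like this is one simple way a visual summary could suppress single-frame misclassifications (such as a momentary lip purse being read as "anger") while still surfacing sustained shifts in a student's state.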
The research team tested their EmotionCues system at the Hong Kong University of Science and Technology and at a Japanese kindergarten. Results show that EmotionCues performs better at detecting "obvious emotions" such as the joy students display during a particularly interesting or intense learning moment. The system's ability to interpret "anger" or "sadness" however still needs improvement. Students who are actually very focused on class content may for example purse their lips in contemplation, which unfortunately the system might easily interpret as "anger."
The new study is not the first use of tech to analyze students' emotional states. Last year, students in a primary school in Jinhua, Zhejiang wore smart headbands which measured electrical signals from neurons in the brain and translated the collected information into an attention score. The headbands on students who were focused displayed a red light, while the less focused students' headbands glowed blue. Student attention scores were sent to the teacher's laptop every 10 minutes and synchronized to a WeChat group so parents could remotely monitor their child's status at any time.
Although the aim of the project was to help students study more efficiently and help teachers improve their teaching quality, concerns were raised regarding both student privacy and system effectiveness.
The paper EmotionCues: Emotion-Oriented Visual Summarization of Classroom Videos is available on IEEE Xplore.
Author: Yuqing Li | Editor: Michael Sarazen