Procedia Computer Science 104 (2017) 437–444
1877-0509 © 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the scientific committee of the international conference ICTE 2016.
doi: 10.1016/j.procs.2017.01.157
ICTE 2016, December 2016, Riga, Latvia

Emotion Recognition in Affective Tutoring Systems: Collection of Ground-Truth Data

Sintija Petrovica^a,*, Alla Anohina-Naumeca^a, Hazım Kemal Ekenel^b
^a Riga Technical University, Kalku street 1, Riga, LV-1658, Latvia
^b Istanbul Technical University, 34469 Maslak, Istanbul, Turkey
Abstract
For the last 50 years, intelligent tutoring systems have been developed with the aim of supporting one of the most successful educational forms – individual teaching. Recent research has shown that emotions can influence human behavior and learning abilities; as a result, developers of tutoring systems have also started to follow these ideas by creating affective tutoring systems. However, the adaptation skills of such systems are still imperfect. The paper presents an analysis of the emotion recognition methods used in existing systems to support ongoing research on the improvement of tutoring adaptation. Regardless of the method chosen, accurate emotion recognition requires the collection of ground-truth data. To provide ground-truth data for emotional states, the authors have implemented a self-assessment method based on the Self-Assessment Manikin.
Keywords: Affective computing; Intelligent tutoring systems; Emotion recognition; Ground-truth data; Self-Assessment Manikin
1 Introduction
Progress in the field of affective computing and research carried out in education and psychology, which uncovers a close relationship between emotions and learning, have led to the emergence of a new generation of intelligent tutoring systems (ITSs) – affective tutoring systems (ATSs). In general, understanding emotions is quite a complicated process even for humans, because each emotional state can have its own possible causes and it may
* Corresponding author. Tel.: +371-26569654; fax: +371-67089584.
E-mail address: sintija.petrovica@rtu.lv
influence a person's behavior differently. However, teachers can evaluate a student's emotions with rather high reliability based on facial expressions, body language, and speech. Consequently, experienced teachers can adjust the teaching process by evaluating not only the student's knowledge level but also other characteristics of the student, including the emotional state1. Similarly, computerized tutoring systems must be capable of assessing students' emotions and of using this information to promote learning and the achievement of better learning outcomes2. However, compared with the adaptation skills of human teachers, these systems still have a drawback: they lack emotional intelligence3. Since there is a close correlation between the classification of the various student's states (both knowledge and emotional) and the appropriate adaptation of the tutoring process (selection of tutoring strategies and tactics), each of the classifiers involved in the process must be highly accurate and must work in real time to manage the student's emotions. Only in this way will it be possible to model an individual student and to provide a truly customized tutoring process for him/her4.
2 Emotional intelligence and intelligent tutoring systems
This section is divided into two parts. The first part reviews the concept of ITSs and their evolution towards systems possessing emotional intelligence. The second part provides an analysis of ATSs and their operating principles, as well as describes the differences between these two types of tutoring systems from the architectural point of view.
2.1 Intelligent tutoring systems and emotions
ITSs are a generation of software systems that aim to support and improve the teaching and learning process in a certain domain. ITSs simulate human tutors and provide the benefits of individual teaching by exploiting methods of artificial intelligence to provide a learning environment adapted (personalized content, feedback, navigation, etc.) to the characteristics of an individual student2. Adaptation is possible because of the special types of knowledge integrated into the traditional architecture of such systems (see Fig. 1), which includes: a) a student diagnosis module collecting and processing information about the student (his/her learning progress, behavior, psychological characteristics, etc.) and a student model storing this information; b) a pedagogical module responsible for the implementation of the teaching process and a pedagogical model storing pedagogical knowledge; c) a problem domain module able to generate and solve problems in the domain and a domain model storing the knowledge intended to be taught; d) an interface module managing the interaction between the ITS and students through various input/output devices.
Fig. 1. The traditional architecture of ITSs.
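To make the division of responsibilities concrete, the following minimal sketch (in Python) mirrors the modules of Fig. 1; all class and method names are illustrative assumptions, not taken from any concrete ITS.

```python
# Illustrative sketch of the traditional ITS architecture (Fig. 1).
# Names and signatures are assumptions for illustration only.
from dataclasses import dataclass, field


@dataclass
class StudentModel:
    """Stores what the student diagnosis module learns about the student."""
    knowledge_level: dict = field(default_factory=dict)  # concept -> mastery in [0, 1]
    behavior_log: list = field(default_factory=list)     # raw interaction events
    characteristics: dict = field(default_factory=dict)  # e.g., psychological traits


class DomainModule:
    """Problem domain module: generates (and could solve) tasks from the domain model."""
    def generate_task(self, concept: str) -> str:
        return f"task on {concept}"


class PedagogicalModule:
    """Pedagogical module: picks the next tutoring action from the student model."""
    def next_action(self, student: StudentModel) -> str:
        if not student.knowledge_level:
            return "start first topic"
        weakest = min(student.knowledge_level, key=student.knowledge_level.get)
        return f"practice {weakest}"
```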
For decades, the field of ITSs inherited ideas from such learning theories as cognitivism and constructivism, focusing more on the student's cognitive processes. However, in recent years researchers have shifted their emphasis from cognitive processes to emotional-cognitive processes5. These changes can mostly be explained by the increased attention paid to the relationship between emotions and learning. Research results show that emotions are a significant factor in the learning process and that they can affect the student's motivation and learning abilities. Studies demonstrate that students experience a wide diversity of positive and negative emotions during the learning process, e.g., interest, flow, surprise, and pride, as well as anger, boredom, frustration, anxiety, confusion, and shame6,7. This implies that more attention should also be given to these emotions in the development of ITSs.
2.2 Affective tutoring systems
The direction of affective computing (AC) started to evolve at the end of the 1990s, when the book "Affective Computing" was published by Rosalind Picard8. AC is a branch of artificial intelligence that focuses on the design of systems and tools able to process, recognize, and explain human emotions. Around 10 years ago, ideas from AC also entered the development of ITSs and, as a result, a new generation of educational systems appeared – ATSs. Different reasons are found in the literature for the necessity to consider emotions as one of the parameters in computerized tutoring systems. Some of these reasons are, for example, the possibility to ensure an optimal emotional state for the achievement of better learning results9,10, reduced risks of uncertainty and disturbing interventions of ITSs and improved adaptation abilities11, a more effective imitation of pedagogical decisions made by human teachers12, a timely recognition of negative emotions and minimization of their negative effect with the aim of increasing students' motivation13, and the promotion of students' involvement and confidence14. Taking into account the previously mentioned information about ATSs and their development purposes, an ATS can be defined as follows: an ATS is an ITS that imitates a human teacher and his/her ability to adapt not only to the student's knowledge but also to his/her emotional state, with the aim of intervening (reacting accordingly) only in those situations when an emotional state can become a threat to the student's willingness to engage in the learning process, thus leaving a negative impact on knowledge acquisition and learning outcomes. Supporting the functionality of ATSs requires the extension of the traditional architecture of ITSs (see Fig. 1) by adding additional components. Commonly, three additional parts are incorporated into the architecture to form the so-called affective behavior model, which allows providing appropriate responses considering both the student's knowledge and emotions13,15. The first component is usually responsible for the automatic identification of the student's emotional state15. Emotion recognition is carried out by detecting and analyzing different features (e.g., facial expressions, body gestures, speech, physiological characteristics, etc.) and applying various classifiers to identify the student's emotions15,16. Typically, the acquired emotional state is stored in the student model, thus expanding the available information about the student and forming the so-called affective student model. An emotion response module, or an affective (behavior) pedagogical model, is often defined as the second component. It can be considered an extension of the pedagogical module13,17. This component provides reasoning on the current tutoring situation and allows for further adaptation of the tutoring process on the basis not only of the student's current knowledge level and learning characteristics but also of the student's emotional state18. Therefore, the main task of the affective pedagogical model is to match data about the student's emotions and the tutoring situation, acquired from the student model, with appropriate responses of the ATS12,17. By analyzing variations in the architecture of different existing ATSs, an emotion expression module can be found as the third component. This module can be regarded as an extension of the interface module that allows the ATS to express its own emotions as a response to the student's actions and emotions. Usually, this component is represented as a virtual tutor or a pedagogical agent (PA) with its own emotions18.
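Continuing the sketch from Section 2.1, the fragment below illustrates how these three affective components could extend the base modules. The names and the simple frustration/boredom rule are our illustrative assumptions, not the design of any cited ATS.

```python
# Sketch of the affective behavior model's three components, building on the
# StudentModel/PedagogicalModule sketch above. Illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class AffectiveStudentModel(StudentModel):
    """Affective student model: the student model extended with emotional state."""
    emotional_state: str = "neutral"


class EmotionRecognizer:
    """First component: classifies observed features into an emotion label."""
    def recognize(self, features: dict) -> str:
        raise NotImplementedError  # e.g., facial features -> 'frustrated'


class AffectivePedagogicalModel(PedagogicalModule):
    """Second component: matches emotion and tutoring situation to a response."""
    def next_action(self, student: AffectiveStudentModel) -> str:
        if student.emotional_state in ("frustrated", "bored"):
            return "give a hint and an encouraging message"
        return super().next_action(student)


class EmotionExpressionModule:
    """Third component: lets the pedagogical agent express its own emotion."""
    def express(self, tutor_emotion: str) -> None:
        print(f"[pedagogical agent displays: {tutor_emotion}]")
```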
3 Related work
ATSs would not be able to respond to a student's emotions if they were unable to determine them. Consequently, most of the research carried out so far is directed at the identification of emotions19. This section provides an analysis of different ATSs integrating emotion recognition. Several aspects are examined, e.g., the sensors used for the acquisition of emotional data, the methods used for emotion classification, and the most commonly modeled emotions.
3.1 Review of affective tutoring systems
For this research, different ATSs were selected to cover various problem domains – both from "hard" sciences (for example, mathematics, physics, natural sciences, computer science) and "soft" sciences or humanities (e.g., the study of languages). However, it must be noted that most ITSs have been developed for well-defined problem domains, where more rules exist regarding task generation and solving, whereas the development of ITSs for ill-defined domains still remains a challenge20. This is also the case for ATSs: most such systems have been developed for well-defined domains. This can also be explained by the fact that during the learning process in these areas, students are most likely to encounter difficulties and to experience emotions inherent in the learning process, such as confusion or frustration21. A summary of the analyzed ATSs is provided in Table 1, reviewing such systems as MathSpring22, Prime Climb23, WaLLis24, Easy with Eve25, FERMAT26, Cognitive Tutor Algebra27 and PAT2Math21 intended for teaching mathematics, ITSPOKE28 and AutoTutor7 used for physics, EER-Tutor29 developed for the computer science field, CRYSTAL ISLAND30 and Guru Tutor31 intended for biology, Inq-ITS32 developed for natural sciences, INES33 and MetaTutor34 for teaching topics related to medicine, and VALERIE35 used for the French language.
Table 1. A summary of emotion recognition aspects in existing ATSs.
| ATS | Sensors | Detection of emotional data and emotion classification | Recognized emotions |
|---|---|---|---|
| AutoTutor | Video camera, pressure-sensitive chair | Posture and eye-pattern extraction, analysis of log files. Classifiers: Naïve Bayes, neural networks, logistic regression, nearest neighbour, C4.5 decision trees | Flow, confused, bored, frustrated, eureka, neutral |
| Cognitive Tutor Algebra | Not used | Analysis of log files recording features related to the student's behaviour, event and activity history in the learning process. Classifiers: J48 decision trees, K* algorithm, step regression, JRip, Naïve Bayes, REP-Trees | Bored, concentrated, frustrated, confused |
| CRYSTAL ISLAND | Not used | Analysis of surveys, interviews, and log files. Emotions are modelled using a Dynamic Bayesian Network | Anxious, bored, confused, curious, enthusiastic, focused, frustrated |
| Easy with Eve | Video camera | Facial feature extraction. Classifier: support vector machines (SVM) | Smiling, laughing, surprised, angry, scared, sad, disgusted, neutral |
| EER-Tutor | Video camera | Facial feature (eyes, eyebrows, lips) tracking and extraction. Features are classified by analyzing the calculated distances for each facial feature compared to the neutral face | Happy, smiling, angry, frustrated, neutral |
| FERMAT | Video camera | Extraction of facial feature points and regions of interest. Classifiers: neural network, fuzzy expert system | Angry, disgusted, scared, happy, sad, surprised, neutral |
| Guru Tutor | Eye tracker, video camera | Eye tracking and gaze-pattern extraction, analysis of log files. Analysis of the attention time paid to the screen | Disinterested, bored |
| Inq-ITS | Not used | Analysis of log files. Classifiers: J48 decision trees, step regression, JRip | Bored, confused, frustrated, concentrated |
| INES | Not used | Analysis of the student's activity level, difficulty of the task, previous progress, number of errors, severity of the error. Emotions are predicted by appraisal rules | Worried, confident, depressed, enthusiastic |
| ITSPOKE | Microphone | Extraction of acoustic-prosodic and lexical features (speech intensity, energy, volume, duration, and pauses) and dialogue features (e.g., the accuracy of the answer). Semantic analysis is used for the assessment of answer accuracy and linear regression for confidence evaluation | Negative, positive, and neutral emotions |
| MathSpring | Not used | Analysis of log files, self-assessment reports, behaviour patterns, etc. Classifier: linear regression | Confident, worried, excited, inactive, satisfied, frustrated, interested, bored |
| MetaTutor | Eye tracker | Extraction of gaze-data features and features related to areas of interest within the system's interface. Classifiers: random forests, Naïve Bayes, logistic regression, SVMs | Bored, curious, interested |
| PAT2Math | Video camera | Analysis of log files and extraction of facial feature points. Emotions are identified based on the Facial Action Coding System and a psychological model of emotions (the OCC model) | Satisfied, disappointed, happy, sad, grateful, angry, ashamed |
| Prime Climb | Physiological sensors | Determination of skin conductivity, heart rate, muscle activity, and analysis of log files. Biometric data is analyzed via unsupervised clustering | Happy, sad (for the game); admiration, criticism (for the PA); pride, shame (for himself/herself) |
| VALERIE | Video camera, microphone, mouse, physiological sensors | Determination of skin conductivity, heart rate, extraction of facial and speech features, analysis of mouse movements. Classifiers: nearest neighbour, discriminant function analysis, Marquardt back-propagation algorithm | Sad, angry, surprised, scared, frustrated, amused |
| WaLLis | Not used | Analysis of log files. Classifier: J4.8 decision tree algorithm | Frustrated, confused, bored, confident, happy, enthusiastic |
3.2 Analysis: sensors, emotional features, and emotion classification
In general, the determination of a student's emotional state is implemented by analyzing various data sources that provide features giving information about emotions. Ideally, a quantitative and continuous measurement of emotional experience is required in an objective and unobtrusive manner, e.g., analysis of interactional content6. The two most commonly used categories of features for the identification of emotions are facial features and features acquired from log files. Regarding facial features, mostly the patterns of different facial features (eyes, eyebrows, and lips) are extracted25,29. Eye movement is tracked and gaze patterns are acquired, indicating the regions of interest to which the student is paying attention26,34. Features recorded in log files relate mainly to the student's interaction with the system27,33. The acquired features include both information linked to the student (e.g., behavior patterns, action history, activity level, etc.) and data characterizing the current tutoring situation (e.g., progress, content difficulty, errors made, etc.). This category of features can be considered the least disturbing one from the student's point of view, because it does not require additional actions from the student. However, the selection of features can be a challenging task for developers with respect to achieving sufficient emotion classification results. Besides these two most common categories of features, other characteristics are also acquired for emotion classification, e.g., body language7, speech features (intensity, volume, duration)28, physiological signals (heart rate, muscle movement, skin conductance)23, and usage patterns of input devices, e.g., the mouse35.
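As a rough illustration of how such log-file features might be derived, the sketch below computes a few student- and situation-linked features from an ordered event log. The event fields (t, action, correct, difficulty) are assumed for illustration and do not reflect the log format of any cited system.

```python
# Illustrative sketch: deriving emotion-relevant features from an interaction
# log. The event schema is an assumption, not any cited system's format.
from statistics import mean


def log_features(events: list[dict]) -> dict:
    """events: chronologically ordered interaction records with fields
    't' (seconds), 'action', and optionally 'correct' and 'difficulty'."""
    duration = max(events[-1]["t"] - events[0]["t"], 1)
    gaps = [b["t"] - a["t"] for a, b in zip(events, events[1:])]
    attempts = [e for e in events if e["action"] == "attempt"]
    return {
        # student-linked features: behavior patterns, activity level
        "actions_per_min": 60 * len(events) / duration,
        "mean_pause_s": mean(gaps) if gaps else 0.0,
        "hint_requests": sum(e["action"] == "hint" for e in events),
        # situation-linked features: progress, difficulty, errors made
        "error_rate": 1 - mean(e["correct"] for e in attempts) if attempts else 0.0,
        "task_difficulty": events[-1].get("difficulty", 0),
    }
```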
To perceive these features, various sensors are used. Cooper et al.36 have grouped sensors into three categories according to the level of discomfort they cause to the student. Physiological sensors (e.g., a skin conductivity sensor, a heart rate sensor, an electromyograph) cause the greatest discomfort because they require direct contact with certain parts of the body and must be connected to the computer that registers the data. Touch or haptic sensors (e.g., a pressure-sensitive mouse or chair) induce less discomfort, and very often students do not notice them; however, using such sensors for emotion recognition requires the student to touch them, thereby limiting his/her freedom of movement. Observational sensors (e.g., video cameras, eye trackers, microphones) are not physically intrusive; however, they can distract students and make them feel uncomfortable knowing that all actions are recorded. Besides the usage of sensors, emotion identification in some ATSs is based on the results of surveys or self-assessment reports filled in by students, where students report their own feelings, emotions, or mood in a particular learning situation22,30. This can be considered an "accurate" method for emotion acquisition if students are aware of their emotions; however, there is a possibility that students will consider such surveys redundant and will not provide correct information about their emotions, e.g., if their goal is to complete the surveys faster. In general, from the developers' point of view, this can be regarded as one of the less time-consuming methods, since its implementation does not require the use of sensors or the application of classification algorithms. Another method used for emotion prediction is based on the analysis of the causes of emotions21. For this purpose, the OCC emotion model37 is used, which includes an appraisal of the world in self-relevant terms, mental representations, and factors. The classification of emotions and the classifiers used mainly depend on whether sensors are used for feature extraction or not. If emotions are recognized on the basis of data received from a video camera, then algorithms are used to analyze the distances between features in relation to a neutral facial expression. Neural networks, Naïve Bayes, logistic and linear regression, SVMs, the nearest neighbor algorithm, various types of decision trees, and other methods are used for feature classification and emotion recognition. In most cases, more than one classifier is applied, because some classifiers provide higher classification results for specific emotions7.
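A hedged sketch of such a comparison, using scikit-learn implementations of several of the classifier families named above, might look as follows; the feature matrix X and emotion labels y are assumed inputs, e.g., vectors produced from log files.

```python
# Illustrative sketch (not any cited system's pipeline): comparing several
# classifier families on the same emotion-labelled feature vectors, since a
# single classifier often performs best only for specific emotions.
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

CLASSIFIERS = {
    "naive_bayes": GaussianNB(),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm": SVC(),
    "decision_tree": DecisionTreeClassifier(),
}


def compare_classifiers(X, y):
    """X: feature matrix; y: emotion labels such as 'bored' or 'frustrated'.
    Returns the mean 5-fold cross-validation accuracy per classifier."""
    return {name: cross_val_score(clf, X, y, cv=5).mean()
            for name, clf in CLASSIFIERS.items()}
```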
Regarding the most commonly modeled student emotions, a part (although a minor one) of the existing ATSs carry out the recognition of facial expressions to identify basic emotions (anger, disgust, happiness, fear, sadness, and surprise)25,26,35. It should be noted that most of these emotions (except happiness and sadness, which are directed towards learning outcomes) rarely appear in the learning process; consequently, the identification of basic emotions is largely insignificant for the adaptation of the tutoring process. However, this concerns only a small part of ATSs; emotion modeling trends are improving, and developers have started focusing their attention on emotions that are felt during the learning process and can directly influence it. Therefore, most of the analyzed ATSs are aimed at learning-specific emotions and are able to determine whether the student is, for example, concentrated (interested or in a flow state), confused, bored, frustrated, anxious, ashamed, etc.7,22,24,27,30
4 Determination of an affective state through self-assessment
This section is divided into two parts. The first part reviews the Self-Assessment Manikin (SAM) used for acquiring three emotional dimensions, and the second part describes the implementation of SAM for collecting ground-truth data.
4.1 Self-Assessment Manikin
Currently, a popular direction in emotion recognition is the analysis of log files, which record the interaction between students and the tutoring system and allow identifying behavior patterns in a particular learning situation and linking them with potential students' emotions. This is considered a sensor-free approach27,32. The development of new ATSs, or the modification of existing ones, with the sensor-free approach can mainly be explained by the limited availability of sensors in real learning conditions (in the best case, computer classes or students' laptops are equipped with microphones and video cameras, but not with physiological sensors22). Since the sensor-free approach does not provide very high accuracy of emotion recognition and can crucially decrease the accuracy of the adaptation of the tutoring process, one possible solution to this problem is the so-called sensor-lite approach, which requires a minimal use of available sensors, e.g., built-in cameras or microphones19.
Whatever approach is chosen, the first step towards emotion recognition that is as accurate as possible is collecting a ground-truth emotion data set, which can later be used for training and for comparing the results of automatic measurement of affect38. Regarding this issue, SAM, one of the most popular self-assessment methods, is analyzed. It can be used independently of sensor-based approaches (e.g., without requiring video cameras). The method allows students themselves to report their feelings using a graphic representation of three fundamental emotional dimensions – pleasure, arousal, and dominance (PAD)39. Pleasure indicates how pleasant a person feels about something; arousal describes the person's level of mobilization or energy; and dominance symbolizes the ability to cope with the situation. After the self-assessment, it is possible to represent all three emotional dimensions in the PAD space, where each graphic depiction can have its own value in the range [–1, 1]. By combining the values from all three dimensions in the PAD emotion space, emotions can be determined. Russell and Mehrabian40 have published a complete list of emotions and their corresponding PAD values.
In this research, it was decided that the initial step for emotion recognition is the implementation of SAM, which will be used as an independent method for the acquisition of emotional data to identify students' emotions while they are going through various instructional activities (e.g., starting a new topic, solving tasks, receiving feedback, etc.). The collected data will serve as ground truth for sensor-based emotion classification studies and will be applied to testing the functionality of the pedagogical module in relation to the adaptation of the tutoring process.
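One possible shape for such a ground-truth record, assuming each SAM report is stored with a timestamp and the instructional activity in progress so that it can later be aligned with sensor streams, is sketched below; the CSV schema is our assumption.

```python
# Illustrative sketch of logging SAM self-reports as ground-truth records.
# The schema (timestamp, student, activity, PAD triple) is an assumption.
import csv
import time

FIELDS = ["timestamp", "student_id", "activity", "pleasure", "arousal", "dominance"]


def record_self_report(path: str, student_id: str, activity: str, pad: tuple) -> None:
    """Append one SAM report (PAD values in [-1, 1]) to a CSV log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # empty file: write the header first
            writer.writeheader()
        writer.writerow({"timestamp": time.time(), "student_id": student_id,
                         "activity": activity, "pleasure": pad[0],
                         "arousal": pad[1], "dominance": pad[2]})
```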
4.2 Emotion identification
In relation to SAM, possible solutions were analyzed that could be adopted for research purposes. One of the existing SAM implementations is the AffectButton tool, which is freely available and can be customized and used in other research projects to acquire emotional data from a system's users41. The AffectButton is a measurement tool that enables a user to give detailed emotional feedback about his/her feelings, mood, and attitudes towards different objects. After clicking the button, three values are generated, corresponding to each of the emotional dimensions in the PAD emotional model. The source code of the AffectButton tool has already been adapted and integrated into the environment for the research requirements. Since this method provides only the PAD values characterizing specific emotions, but not the emotions themselves, a discrete emotion calculation based on the acquired PAD values is implemented to determine which of the learning-specific emotions the acquired PAD values correspond to the most (see Fig. 2). Formula (1) is applied to determine the distance d between the acquired PAD values for emotion e_j and a defined learning-specific emotion e_i. The smaller the distance value, the more similar the emotions are. In total, 15 different emotions are incorporated (angry, anxious, bored, concentrated, confused, curious, excited, fearful, frustrated, happy, helpless, interested, relaxed, sad, surprised), but only the closest five are represented on the screen:
$d(e_i, e_j) = \sqrt{(P_i - P_j)^2 + (A_i - A_j)^2 + (D_i - D_j)^2}$  (1)

where $P_i$, $A_i$, $D_i$ are the PAD values corresponding to emotion $e_i$, and $P_j$, $A_j$, $D_j$ are the PAD values corresponding to emotion $e_j$.
Fig. 2. AffectButton and the calculation of emotions based on the generated PAD values.
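A compact sketch of this calculation is given below. The distance function follows Formula (1) directly; the PAD prototype coordinates, however, are placeholders for illustration, and an actual implementation would substitute the values published by Russell and Mehrabian40.

```python
# Sketch of the discrete-emotion calculation around Formula (1).
import math

# Prototype PAD coordinates for the learning-specific emotions.
# PLACEHOLDER numbers for illustration; substitute the published values.
PROTOTYPES = {
    "bored":        (-0.6, -0.6, -0.3),
    "frustrated":   (-0.6,  0.4, -0.3),
    "confused":     (-0.4,  0.3, -0.3),
    "concentrated": ( 0.4,  0.3,  0.3),
    "happy":        ( 0.8,  0.5,  0.4),
    # ... the remaining emotions of the 15 listed above
}


def distance(e_i: tuple, e_j: tuple) -> float:
    """Formula (1): Euclidean distance between two points in PAD space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(e_i, e_j)))


def closest_emotions(pad: tuple, k: int = 5) -> list:
    """Return the k prototype emotions nearest to the AffectButton's PAD output."""
    ranked = sorted(PROTOTYPES, key=lambda name: distance(pad, PROTOTYPES[name]))
    return ranked[:k]


print(closest_emotions((-0.5, 0.5, -0.2)))  # e.g., frustrated, confused, ...
```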
Emotion self-assessment can be carried out during the whole learning process, allowing students to report their emotional changes at any time they prefer, or when the ATS itself prompts them to do so while they are going through learning activities. Despite the possible inconveniences this method can cause, it allows identifying emotions during the learning process.
5 Conclusion
A detailed analysis of the emotion recognition methods applied in existing ATSs has been performed, covering the sensors used for the acquisition of features characterizing specific emotions, the most often extracted features, the methods used for feature classification, and the most commonly identified emotions. To provide ground-truth data for automatic emotion identification, SAM has been analyzed and its available implementations examined. The existing SAM solution, AffectButton, has been adopted for research purposes, and additional functionality has been implemented to calculate 15 possible discrete emotions based on the acquired PAD values. However, one of the problems is related to the close PAD values of some emotions. One possible solution could be to identify the "mood type", i.e., one of the eight octants in the PAD space to which the emotion belongs42. This could narrow the range of possible similar emotions, as well as reduce the calculation time.
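As a minimal sketch of this idea, the sign pattern of the three PAD values selects one of the eight octants, so only prototypes sharing the octant of the reported PAD triple need to be compared; the helper names are illustrative.

```python
# Illustrative sketch of octant-based pre-filtering in PAD space.
def pad_octant(pad: tuple) -> tuple:
    """Map a PAD triple to one of the 8 octants via its sign pattern, e.g. (+,-,+)."""
    return tuple("+" if v >= 0 else "-" for v in pad)


def candidates_in_octant(pad: tuple, prototypes: dict) -> dict:
    """Keep only emotions whose prototype shares the octant of the reported PAD."""
    octant = pad_octant(pad)
    return {name: p for name, p in prototypes.items() if pad_octant(p) == octant}
```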
Acknowledgements
This work was supported by the COST Action IC1303 "Algorithms, Architectures and Platforms for Enhanced Living Environments" Short-Term Scientific Mission grant No. IC1303-161115-068104, by TUBITAK Project No. 113E067, and by a Marie Curie FP7 Integration Grant within the 7th EU Framework Programme.
References
1. Meyer DK, Turner JC. Scaffolding Emotions in Classrooms. In: Emotions in Education; 2007. p. 243-258.
2. Petrovica S, Pudane M. Simulation of Affective Student. Int J Educ Learn Syst 1; 2016. p. 99-108.
3. Zakharov K et al. Towards Emotionally-Intelligent Pedagogical Agents. In: Proceedings of the ITS'2008; 2008. p. 19-28.
4. Sottilare R et al. A Guide to Instructional Techniques, Strategies and Tactics to Manage Learner Affect, Engagement, and Grit. In: Design Recommendations for Intelligent Tutoring Systems. Vol. 2. U.S. Army Research Laboratory; 2014. p. 7-33.
5. Petrovica S. Tutoring Process in Emotionally Intelligent Tutoring Systems. Int J Technol Educ Mark 4 (1); 2014. p. 72-85.
6. Afzal S, Robinson P. Modelling Affect in Learning Environments. In: Proceedings of the ICALT'2010; 2010. p. 438-442.
7. D'Mello SK, Picard RW, Graesser AC. Toward an Affect-Sensitive AutoTutor. IEEE Intell Syst 22 (4); 2007. p. 53-61.
8. Picard RW. Affective Computing. Cambridge, MA: MIT Press; 1997.
10. Hickey TJ, Tarimo WT. The Affective Tutor. J Comput Sci Coll 29 (6); 2014. p. 50-56.
11. Landowska A. Affect-awareness Framework for Intelligent Tutoring Systems. In: Proceedings of the HIS'2013; 2013. p. 540-547.
12. Lin HCK, Wu CH, Hsueh YP. The Influence of Using Affective Tutoring System in Accounting Remedial Instruction on Learning Performance and Usability. Comput Hum Behav 41; 2014. p. 514-522.
13. Kaklauskas A et al. Affective Tutoring System for Built Environment Management. Comput Educ 82; 2015. p. 202-216.
14. Spaulding SL. Developing Affect-Aware Robot Tutors. Master's thesis. USA: Massachusetts Institute of Technology; 2015.
15. Malekzadeh M et al. A Review of Emotion Regulation in Intelligent Tutoring Systems. Educ Technol Soc 18 (4); 2015. p. 435-445.
16. Sarrafzadeh A et al. E-learning with Affective Tutoring Systems. In: Intelligent Tutoring Systems in E-learning Environments. IGI Global; 2011. p. 129-140.
17. Hernández Y, Sucar E, Conati C. An Affective Behavior Model for Intelligent Tutors. In: Proceedings of the ITS'2008; 2008. p. 819-821.
18. Gu X et al. Design of Emotional Intelligent Tutor System based on HMM. In: Proceedings of the ICNC'2010; 2010. p. 1984-1988.
19. D'Mello SK, Graesser AC. Feeling, Thinking, and Computing with Affect-Aware Learning Technologies. In: Calvo RA, D'Mello SK, Gratch J, Kappas A (eds.). The Oxford Handbook of Affective Computing. UK: Oxford University Press; 2015. p. 419-434.
20. Nye BD, Goldberg B, Hu X. Generalizing the Genres for ITS: Authoring Considerations for Representative Learning Tasks. In: Design Recommendations for Intelligent Tutoring Systems. Vol. 3. U.S. Army Research Laboratory; 2015. p. 47-64.
21. Jaques PA et al. Rule-Based Expert Systems to Support Step-by-Step Guidance in Algebraic Problem Solving: The Case of the Tutor PAT2Math. Expert Syst Appl 40 (14); 2013. p. 5456-5465.
22. Wixon M et al. The Opportunities and Limitations of Scaling Up Sensor-Free Affect Detection. In: Proceedings of the EDM'2014. p. 145-152.
23. Amershi S et al. Using Feature Selection and Unsupervised Clustering to Identify Affective Expressions in Educational Games. In: Ikeda M, Ashley KD, Chan TW (eds.). Proceedings of the ITS'2006. Springer-Verlag; 2006. p. 21-28.
24. Porayska-Pomsta K et al. Diagnosing and Acting on Student Affect. User Model User-Adap 18 (1-2); 2008. p. 125-173.
25. Sarrafzadeh A et al. How Do You Know that I Don't Understand? Comput Hum Behav 24; 2008. p. 1342-1363.
26. Zataraín-Cabada R et al. Affective Tutoring System for Android Mobiles. In: Proceedings of the ICIC'2014; 2014. p. 1-10.
27. Baker RSJ et al. Towards Sensor-free Affect Detection in Cognitive Tutor Algebra. In: Proceedings of the EDM'2012. p. 126-133.
28. Litman D, Forbes-Riley K, Silliman S. Towards Emotion Prediction in Spoken Tutoring Dialogues. In: Hearst M, Ostendorf M (eds.). Proceedings of the HLT/NAACL'2003. USA: Association for Computational Linguistics; 2003. p. 52-54.
29. Zakharov K. Affect Recognition and Support in Intelligent Tutoring Systems. Master's thesis. New Zealand: University of Canterbury; 2007.
30. Sabourin JL et al. Considering Alternate Futures to Classify Off-Task Behavior as Emotion Self-Regulation: A Supervised Learning Approach. J Educ Data Mining 5 (9); 2013. p. 9-38.
31. Olney A et al. Guru: A Computer Tutor that Models Expert Human Tutors. In: Proceedings of the ITS'2012. p. 256-261.
32. Paquette L et al. Sensor-Free Affect Detection for a Simulation-Based Science Inquiry Learning Environment. In: Trausan-Matu S, Boyer KE, Crosby M, Panourgia K (eds.). Proceedings of the ITS'2014. Switzerland: Springer International Publishing; 2014. p. 1-10.
33. Heylen D, Nijholt A, Akker HJ. Affect in Tutoring Dialogues. J Appl AI 19 (3-4); 2005. p. 287-310.
34. Jaques N et al. Predicting Affect from Gaze Data during Interaction with an Intelligent Tutoring System. In: Trausan-Matu S, Boyer KE, Crosby M, Panourgia K (eds.). Proceedings of the ITS'2014. Switzerland: Springer International Publishing; 2014. p. 29-38.
35. Paleari M, Lisetti C, Lethonen M. VALERIE: Virtual Agent for Learning Environment Reacting and Interacting Emotionally. In: Proceedings of the AIED'2005. Amsterdam: IOS Press; 2005.
36. Cooper D, Arroyo I, Woolf BP. Actionable Affective Processing for Automatic Tutor Interventions. In: Calvo RA, D'Mello SK (eds.). New Perspectives on Affect and Learning Technologies. New York: Springer-Verlag; 2011. p. 127-140.
37. Ortony A, Clore GL, Collins A. The Cognitive Structure of Emotions. Cambridge University Press; 1988.
38. Gunes H, Nicolaou MA, Pantic M. Continuous Analysis of Affect from Voice and Face. In: Salah AA, Gevers T (eds.). Computer Analysis of Human Behaviour. UK: Springer-Verlag London; 2011. p. 255-291.
39. Bradley M et al. Measuring Emotion: The Self-Assessment Manikin and the Semantic Differential. J Behav Ther Exp Psy; 1994. p. 49-59.
40. Russell JA, Mehrabian A. Evidence for a Three-Factor Theory of Emotions. J Res Pers 11 (3); 1977. p. 273-294.
41. Broekens J, Brinkman WP. AffectButton: A Method for Reliable and Valid Affective Self-Report. Int J Hum-Comput St 71 (6); 2013. p. 641-667.
42. Mehrabian A. Pleasure-Arousal-Dominance: A General Framework for Describing and Measuring Individual Differences in Temperament. Curr Psychol 14 (4); 1996. p. 261-292.
Sintija Petrovica obtained her Master's degree in Computer Systems in 2011 at Riga Technical University (RTU), Latvia. In 2011, she started to work as a scientific assistant in the Department of Artificial Intelligence and Systems Engineering at RTU. She is a PhD student of the study program "Computer Systems" at RTU and is developing her PhD thesis on a pedagogical module for affective tutoring systems that adapts the tutoring process to the student's emotions. Her research interests include intelligent tutoring systems and affective computing. Contact her at sintija.petrovica@rtu.lv.