  • How Are Students’ Emotions Associated with the Accuracy of Their Note Taking and Summarizing During Learning with ITSs?

    Gated

    Abstract: The goal of this study was to examine 38 undergraduate and graduate students’ note taking and summarizing, and the relationship between their emotions, the accuracy of those notes and summaries, and proportional learning gain during learning with MetaTutor, an ITS that fosters self-regulated learning as students study complex science topics. Results revealed that students expressed both […]

  • Investigating multimodal affect sensing in an Affective Tutoring System using unobtrusive sensors

    Gated

    Abstract: Affect plays a critical and inextricable role in the learning process. In this study, we investigate the multimodal fusion of facial expressions, keystrokes, mouse clicks, head posture, and contextual features for the detection of students’ frustration in an Affective Tutoring System. The results (AUC = 0.64) demonstrated empirically that a multimodal approach offers higher accuracy and better robustness […]
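
    A minimal sketch of the feature-level (early) fusion this abstract describes: concatenate per-modality features and train one classifier, scored with AUC as in the paper. It uses scikit-learn on synthetic data; every feature name, dimensionality, and the classifier choice are illustrative assumptions, not details taken from the paper.

```python
# Early (feature-level) fusion: concatenate per-modality features and train
# one classifier. All names, shapes, and labels below are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500  # hypothetical interaction windows from tutoring sessions

# Hypothetical pre-extracted, time-aligned features per modality.
facial  = rng.normal(size=(n, 17))  # e.g., facial action-unit intensities
keys    = rng.normal(size=(n, 6))   # e.g., keystroke timing statistics
mouse   = rng.normal(size=(n, 4))   # e.g., click rate and dwell
posture = rng.normal(size=(n, 3))   # e.g., head pose angles
context = rng.normal(size=(n, 5))   # e.g., task difficulty, progress
y = rng.integers(0, 2, size=n)      # frustrated vs. not (synthetic labels)

# Fuse by concatenation, then fit a single classifier on the joint vector.
X = np.hstack([facial, keys, mouse, posture, context])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Evaluate with AUC, the metric the paper reports (0.64 for its fused model).
print(f"fused AUC: {roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]):.2f}")
```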

  • Subtle behavioural responses during negative emotion reactivity and down-regulation in bipolar disorder: A facial expression and eye-tracking study

    Gated

    Abstract: Abnormal processing of emotional information and regulation are core trait-related features of bipolar disorder (BD), but evidence from behavioural studies is conflicting. This study aimed to investigate trait-related abnormalities in emotional reactivity and regulation in BD using novel, sensitive behavioural measures including facial expressions and eye movements. Fifteen patients with BD in full or […]

  • Sensing and Learning Human Annotators Engaged in Narrative Sensemaking

    Gated

    Abstract: While labor issues and quality assurance in crowdwork are increasingly studied, how annotators make sense of texts and how they are personally impacted by doing so are not. We study these questions via a narrative-sorting annotation task, where carefully selected (by sequentiality, topic, emotional content, and length) collections of tweets serve as examples of […]

  • Found in Translation: Learning Robust Joint Representations by Cyclic Translations Between Modalities

    Gated

    Abstract: Multimodal sentiment analysis is a core research area that studies speaker sentiment expressed from the language, visual, and acoustic modalities. The central challenge in multimodal learning involves inferring joint representations that can process and relate information from these modalities. However, existing work learns joint representations by requiring all modalities as input and as a […]
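
    A minimal sketch of the cyclic-translation idea this abstract names: translate a source modality into a target modality and back, and use the intermediate encoding as the joint representation, so only the source modality is needed at test time. The plain PyTorch MLPs, synthetic tensors, and dimensions below are illustrative assumptions; the paper itself uses sequence-to-sequence translation models.

```python
# Cyclic translation: encode the source modality, translate it to the target
# modality, then translate back. The encoding z serves as a joint
# representation that needs only the source modality at test time.
import torch
import torch.nn as nn

class CyclicTranslator(nn.Module):
    def __init__(self, src_dim, tgt_dim, joint_dim=64):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(src_dim, joint_dim), nn.ReLU())
        self.to_tgt = nn.Linear(joint_dim, tgt_dim)  # forward translation
        self.back = nn.Sequential(                   # back-translation
            nn.Linear(tgt_dim, joint_dim), nn.ReLU(),
            nn.Linear(joint_dim, src_dim))

    def forward(self, src):
        z = self.encode(src)           # joint representation
        tgt_hat = self.to_tgt(z)       # predicted target modality
        src_hat = self.back(tgt_hat)   # cycle back to the source modality
        return z, tgt_hat, src_hat

model = CyclicTranslator(src_dim=300, tgt_dim=74)  # e.g., language -> visual
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

lang = torch.randn(32, 300)  # synthetic language features (one batch)
vis = torch.randn(32, 74)    # synthetic paired visual features

z, vis_hat, lang_hat = model(lang)
loss = mse(vis_hat, vis) + mse(lang_hat, lang)  # translation + cycle loss
loss.backward()
opt.step()
# At test time only `lang` is needed; z can feed a sentiment prediction head.
```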

  • Selective eye fixations on diagnostic face regions of dynamic emotional expressions: KDEF-dyn database

    Gated

    Abstract: Prior research using static facial stimuli (photographs) has identified diagnostic face regions (i.e., functional for recognition) of emotional expressions. In the current study, we aimed to determine attentional orienting, engagement, and time course of fixation on diagnostic regions. To this end, we assessed the eye movements of observers inspecting dynamic expressions that changed from […]

  • Words Can Shift: Dynamically Adjusting Word Representations Using Nonverbal Behaviors

    Gated

    Abstract: Humans convey their intentions through the use of both verbal and nonverbal behaviors during face-to-face communication. Speaker intentions often vary dynamically depending on different nonverbal contexts, such as vocal patterns and facial expressions. As a result, when modeling human language, it is essential not only to consider the literal meaning of the words but […]
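
    A minimal sketch of adjusting a word representation with nonverbal behavior, per this abstract: project co-occurring visual and acoustic features into the word-embedding space, gate them, and add the result as a shift to the literal word vector. Dimensions, names, and the gating form are illustrative assumptions, not the paper’s exact model.

```python
# Shift a word vector by a gated combination of co-occurring visual and
# acoustic features, so the same word gets context-dependent representations.
import torch
import torch.nn as nn

class NonverbalShift(nn.Module):
    def __init__(self, word_dim=300, vis_dim=74, ac_dim=80):
        super().__init__()
        self.vis_proj = nn.Linear(vis_dim, word_dim)  # visual -> word space
        self.ac_proj = nn.Linear(ac_dim, word_dim)    # acoustic -> word space
        self.gate = nn.Linear(word_dim * 3, 1)        # how strongly to shift

    def forward(self, word, vis, ac):
        v, a = self.vis_proj(vis), self.ac_proj(ac)
        g = torch.sigmoid(self.gate(torch.cat([word, v, a], dim=-1)))
        return word + g * (v + a)  # nonverbally shifted word representation

shift = NonverbalShift()
word = torch.randn(32, 300)  # e.g., GloVe vectors for one spoken word each
vis = torch.randn(32, 74)    # facial-expression features over the word span
ac = torch.randn(32, 80)     # acoustic features over the word span
shifted = shift(word, vis, ac)  # feeds a downstream sequence model
print(shifted.shape)  # torch.Size([32, 300])
```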

  • Learning Robust Joint Representations for Multimodal Sentiment Analysis

    Gated

    Abstract: Multimodal sentiment analysis is a core research area that studies speaker sentiment expressed from the language, visual, and acoustic modalities. The central challenge in multimodal learning involves inferring joint representations that can process and relate information from these modalities. However, existing work learns joint representations using multiple modalities as input and may be sensitive […]

  • Comparing the Affectiva iMotions Facial Expression Analysis Software with EMG

    Gated

    Abstract: People’s faces display emotions, informing others about their affective states. To measure facial displays of emotion, electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from videos. However, its validity and comparability to EMG are unclear. The aim of […]

  • An Analysis of Subliminal Static Images and Words Using Eye Tracking Techniques

    Gated

    Abstract: This research used a phenomenographic eye-tracking approach to study biometric changes when participants were subjected to eight static subliminal images hidden in seven differently designed packages. Embeds, or static subliminal stimuli in the form of pictures and words, were hidden in seven different perfume packages and used to study the changes […]
