Special Issue on Behavior Analysis "in-the-wild"

Call for papers: 

Robots and intelligent virtual agents are called upon to interact with people in increasingly naturalistic ways. To do so, they must automatically perceive, process, and respond appropriately to social signals in real time. Social signals communicate people’s emotions, appraisals, and intentions; subtle differences in their timing and in the complex packaging of multimodal signs can dramatically alter their meaning. Because people operate in diverse contexts, processing must be simultaneously sensitive and robust to the contexts in which social signals are exchanged. Until recently, affective computing was limited to posed behavior in highly controlled settings, with little attention to the onsets and offsets of social signals, their timing or dynamics, their multimodal coordination, or their context. Recent work has turned to un-posed, unscripted behavior, but contexts have remained relatively constrained, with relatively little attention to multimodal communication and the dynamics of displays. As an example, automatic spotting of subtle and fleeting expressions (i.e., micro-expressions) that may powerfully communicate emotion has only recently attracted attention. To meet the need for advanced human behavior understanding that is simultaneously sensitive and robust to context and that accurately represents the flow and meaning of social displays, advances in databases and algorithms are critical.

This special issue will bring together leading efforts in affective computing in-the-wild. We seek advances in databases and algorithms for human behavior understanding in diverse contexts beyond the laboratory, across the full range of modalities, social signals, and levels of analysis. We are especially interested in efforts that consider the “packaging” of multimodal signals and interpersonal coordination. Modalities include facial expression, body movement, and gesture from video; acoustics and prosody from audio; and physiology and motion from wearable sensors and infrared imaging. This special issue will present advances in databases, algorithms, benchmarks, and findings in support of the next generation of affective computing.

Topics include, but are not limited to, the following “in-the-wild” areas:

  • Unimodal and multimodal databases and benchmarks.
  • Supervised and unsupervised learning.
  • Micro-expression detection.
  • Facial expression, gesture, and body movement for human behavior understanding.
  • Timing and dynamics of intra- and interpersonal communication.
  • Verbal and nonverbal communication.
  • Integration of multiple modalities and sensor information.
  • Multimodal communication.

Human behavior understanding in the wild is the common theme and will be a criterion in evaluating submissions.

Guest Editors: 

 

Paper submission: 

Please submit your paper through IEEE Transactions on Affective Computing manuscript central by selecting SI - Human Behavior Analysis “in-the-wild”:  https://mc3.manuscriptcentral.com/taffc-cs

Papers should adhere to the same formatting guidelines as IEEE Transactions on Affective Computing. For important dates, please see below. 

Important Dates: 

  • Submission deadline: December 15, 2016
  • First review: March 16, 2017
  • Notification of acceptance: June 15, 2017
  • Tentative publication date: October 2017