Mind - Emotion


In cognitive terms, emotion is usually defined as the near-instant (millisecond to second duration) response to an event. That emotion may then influence (and be influenced by) the longer-term ‘mood’ of the person, which will in turn be affected by their overall personality (Wilson, 1999).
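One common way to realise this layering in software (not described here, and purely illustrative) is to treat mood as a slow-moving average of recent emotional responses, with personality supplying a fixed bias. In the Python sketch below every class, field name and constant is an assumption chosen for clarity rather than drawn from any particular model.

from dataclasses import dataclass

@dataclass
class AffectiveState:
    personality_baseline: float = 0.0   # stable trait, roughly -1 (negative) to +1 (positive)
    mood: float = 0.0                   # slow-changing state
    mood_inertia: float = 0.95          # how slowly mood tracks new emotions

    def feel(self, emotion_valence: float) -> float:
        """Blend an instantaneous emotional response into the longer-term mood."""
        # The momentary emotion is biased by the current mood and the personality...
        felt = emotion_valence + 0.3 * self.mood + 0.1 * self.personality_baseline
        # ...and the mood in turn drifts slowly towards what was just felt.
        self.mood = self.mood_inertia * self.mood + (1 - self.mood_inertia) * felt
        return felt

agent = AffectiveState(personality_baseline=0.5)   # a broadly positive character
print(agent.feel(-0.8))   # a negative event, softened slightly by personality
print(agent.mood)         # mood has shifted a little towards the negative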

Models of emotion have been categorized as Adaptational, Discrete, Motivational, Dimensional, Appraisal and Constructivist (Scherer, 2010). Of these, the Discrete and Appraisal categories have produced the two best-known models of emotion. Ekman (1989) is representative of the Discrete category and proposed a commonly accepted set of ‘primary’ emotions:

happiness,
sadness,
anger,
fear,
disgust, and
surprise (although surprise is sometimes treated as a separate category).

Other secondary emotions can be considered as combinations of these.
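As a minimal illustration of the discrete view, the sketch below enumerates the six primary emotions and expresses a few secondary emotions as combinations of them. The particular blends shown are assumptions made for illustration, not Ekman's own definitions.

from enum import Enum, auto

class Primary(Enum):
    HAPPINESS = auto()
    SADNESS = auto()
    ANGER = auto()
    FEAR = auto()
    DISGUST = auto()
    SURPRISE = auto()

# Secondary emotions as (hypothetical) combinations of primaries.
SECONDARY = {
    "contempt":       {Primary.ANGER, Primary.DISGUST},
    "disappointment": {Primary.SADNESS, Primary.SURPRISE},
    "delight":        {Primary.HAPPINESS, Primary.SURPRISE},
}

def secondary_from(primaries: set) -> list:
    """Return any secondary emotions whose components are all currently active."""
    return [name for name, parts in SECONDARY.items() if parts <= primaries]

print(secondary_from({Primary.ANGER, Primary.DISGUST}))   # ['contempt']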

The Ortony, Clore & Collins (OCC) model (Ortony, 1990) is of the appraisal type and has been influential in AI research; it considers emotions as valenced reactions to events, people and objects.
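The core OCC idea can be sketched in a few lines of code: an emotion label is chosen from the kind of thing being reacted to (an event, a person or an object) together with the valence of that reaction. The labels used below are a simplified selection for illustration, not the full OCC taxonomy.

from enum import Enum

class Target(Enum):
    EVENT = "event"     # consequences of events
    PERSON = "person"   # actions of people/agents
    OBJECT = "object"   # aspects of objects

# (target kind, valence) -> a representative emotion label
OCC_REACTION = {
    (Target.EVENT, +1): "joy",
    (Target.EVENT, -1): "distress",
    (Target.PERSON, +1): "admiration",
    (Target.PERSON, -1): "reproach",
    (Target.OBJECT, +1): "liking",
    (Target.OBJECT, -1): "disliking",
}

def appraise(target: Target, desirable: bool) -> str:
    """Map what was reacted to, and whether the reaction was positive, onto an emotion."""
    return OCC_REACTION[(target, +1 if desirable else -1)]

print(appraise(Target.EVENT, desirable=False))   # 'distress'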

Derived from OCC is the E-AI Emotional Architecture model developed at Wolverhampton University (Slater, 2009), which has been implemented in part by Daden in the Halo virtual human discussed in Chapter 8. It includes the following stages, which are reflected in many other models (see the sketch after this list):

detection of stimulus (Sensation/perception),
appraisal of stimulus against an Emotion Alert Database (EAD),
unconscious reaction,
physiological changes,
motivation to act, and
conscious realization (Feelings).
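A rough sketch of such a staged pipeline is given below, assuming a toy Emotion Alert Database of stimulus-to-emotion mappings. The function and field names are illustrative only; the real E-AI architecture is considerably richer.

# Toy Emotion Alert Database: stimulus -> (emotion, intensity)
EAD = {
    "loud_bang": ("fear", 0.9),
    "compliment": ("happiness", 0.6),
    "insult": ("anger", 0.7),
}

def process_stimulus(stimulus: str) -> dict:
    # 1. Detection of stimulus (sensation/perception)
    percept = stimulus.strip().lower()

    # 2. Appraisal of the stimulus against the Emotion Alert Database
    emotion, intensity = EAD.get(percept, ("neutral", 0.0))

    # 3. Unconscious reaction and 4. physiological changes (here just values
    #    that a body/animation layer could consume)
    physiology = {"arousal": intensity, "heart_rate_delta": 20 * intensity}

    # 5. Motivation to act - a crude withdrawal response to strong fear
    action = "withdraw" if emotion == "fear" and intensity > 0.5 else "none"

    # 6. Conscious realization (feelings) - what the agent could report
    return {"feeling": emotion, "intensity": intensity,
            "physiology": physiology, "motivated_action": action}

print(process_stimulus("Loud_Bang"))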

The WASABI system (Becker-Asano, 2014), discussed in more detail in Chapter 6, uses a 3D pleasure-arousal-dominance (PAD) space to represent secondary emotions, bringing together both appraisal and dimensional approaches. As with many affective systems, the dominant emotion has a decay rate that returns the virtual human to a neutral state in the absence of any further stimuli.
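The sketch below illustrates the general idea, assuming the emotional state is a point in PAD space that decays exponentially back towards a neutral origin. The decay constant and the stimulus values are illustrative assumptions, not WASABI's actual parameters.

import math

class PADState:
    def __init__(self, decay_half_life: float = 10.0):
        self.p = self.a = self.d = 0.0            # neutral origin
        self.half_life = decay_half_life          # seconds for intensity to halve

    def stimulate(self, dp: float, da: float, dd: float) -> None:
        """Push the state through PAD space in response to an appraised stimulus."""
        self.p += dp
        self.a += da
        self.d += dd

    def decay(self, dt: float) -> None:
        """Exponentially decay each dimension back towards neutral."""
        k = math.exp(-math.log(2) * dt / self.half_life)
        self.p *= k
        self.a *= k
        self.d *= k

state = PADState()
state.stimulate(dp=0.6, da=0.7, dd=0.2)   # e.g. a pleasant surprise
state.decay(dt=10.0)                      # ten seconds later, roughly half as intense
print(state.p, state.a, state.d)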