Emotions are a defining characteristic of mammals, and of humans in particular. They play a crucial role in interaction between people. Because humans are accustomed to interacting through emotions, computer interfaces can exploit this to become more intuitive. The most useful and practically justifiable application of "emotional systems" therefore lies in improving the interaction between humans and machines by making the AI more lifelike.
Humans communicate emotions in a wide variety of ways. Researchers are attempting to convey artificial emotions in the following forms:
Expressions— Static facial expressions carry a tremendous amount of information about emotions (for instance, a smile or a frown). In fact, one perspective on studying emotions uses facial expressions to distinguish the primary emotions [Ekman79].
Gestures— Body language and gestures are also strong indicators of emotion. For example, slouching is a sign of depression, and nodding shows acceptance [Perlin96].
Behaviors— Over longer periods of time, behaviors are stronger manifestations of emotional state [Bécheiraz98]. For example, ignoring someone is a sign of rejection, while taking care of people shows affection.
Language— The choice of words is an extremely strong indication of mood during a conversation (for instance, familiar versus formal wording). Rhythm also conveys emotion; shorter sentences are more authoritative and sound angrier (as reflected in the artificial language Lojban [Lojban03]).
Voice— The tone of voice in which a sentence is pronounced also reflects mood. Loud voices indicate anger, faster speech often implies anxiety, and so forth.
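In a game or agent architecture, these five channels can be treated as parallel outputs driven by a single emotional state. The following is a minimal sketch of that idea; all names here (EmotionCues, CUE_TABLE, portray) are hypothetical illustrations, not an API from any particular system:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EmotionCues:
    """Cues a synthetic creature emits on each channel described above."""
    expression: str  # static facial expression
    gesture: str     # body language
    behavior: str    # longer-term manifestation
    language: str    # word choice and sentence rhythm
    voice: str       # tone, loudness, speed

# Example mappings drawn from the cues mentioned in the text.
CUE_TABLE = {
    "anger": EmotionCues("frown", "clenched posture", "confrontation",
                         "short, authoritative sentences", "loud, fast"),
    "depression": EmotionCues("downturned mouth", "slouching", "withdrawal",
                              "sparse, formal wording", "quiet, slow"),
    "affection": EmotionCues("smile", "nodding", "taking care of others",
                             "familiar wording", "warm, soft"),
}

def portray(emotion: str) -> EmotionCues:
    """Look up the set of cues for a given emotional state."""
    return CUE_TABLE[emotion]
```

The point of such a table is that the AI layer only decides the emotion; the modeling, animation, and speech layers each consume their own field of the resulting cues.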
As indicated by the preceding list, the portrayal of emotions in synthetic creatures requires more than just AI. The fields of modeling, animation, linguistics, and speech synthesis are key aspects of the development.