Interfaces for Communicating Emotions
When developing a complete animat capable of emotional response, there is often non-AI technology involved in portraying emotions—as discussed in Chapter 36, "Emotive Creatures." To abstract the AI from the animation (for example), an interface is needed between the two components. This section covers the design of such an interface.
The next few chapters assume that the platform supports expressing emotions, and that the AI is only responsible for driving them. This assumption is consistent with the other parts of the book, where the engine always takes care of the lower-level details within the 3D world; portraying the emotions themselves is a challenge beyond the scope of this book.
Because the platform provides the functionality to express emotions, the AI must interact with it to drive them. The interface between the two should impose minimal overhead, and remain extensible, backward compatible, and portable across engines.
An interface with these properties enables both the AI engineers (creating emotions) and the game engine developers (portraying them) to program their functionality independently of each other. Naturally, they'll eventually need to agree on the particular emotions, but the framework will remain the same.
One option is to include the emotions with each effector: when the animat executes an action, the game engine is given the details of how to execute it, together with the current emotions. This provides the flexibility of per-action emotion control. However, because moods are consistent over time, such fine control is generally unnecessary. Including the emotional status in every effector interface would prove cumbersome, difficult to extend, and backward incompatible.
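As a sketch of why this gets cumbersome (the names and signatures here are hypothetical, not from any particular engine API), every effector call would have to carry the full emotional state, even though it rarely changes between actions:

```cpp
#include <map>
#include <string>

// Hypothetical per-action approach: the whole emotional state is
// repeated with each effector call.
struct EmotionalState {
    std::map<std::string, float> values;  // e.g. values["fear"] = 0.7f
};

// Every effector signature is now coupled to the emotion representation;
// adding a new emotion or changing the format touches every effector.
// The body is illustrative only: it just counts the emotions received.
int MoveTo(float x, float y, const EmotionalState& emotions) {
    (void)x; (void)y;  // a real engine would pick an animation variant here
    return static_cast<int>(emotions.values.size());
}
```

Note how even a simple movement command drags the emotion structure along with it on every call.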
A better approach is to separate the emotions from the effectors and sensors. This distinction allows the current mood to be set independently from the rest of the AI, and it reduces the overhead because the emotions need to be communicated only when they change. After the emotions have been set via this separate interface, the AI can assume the game engine does its best to portray them in every subsequent action.
With this approach, the game platform can simply ignore emotions it does not support, effectively providing backward compatibility. It also makes the interface portable, because it can be integrated into any game engine regardless of the technology available to portray emotions. Extending the system becomes a matter of exposing extra functionality via this single interface, which may not even require changing the interface itself.
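A minimal sketch of such a separate emotion interface might look like the following (the class and member names are illustrative assumptions, chosen to match the `GetValue` query shown later in this section):

```cpp
#include <map>
#include <string>

// Sketch of a standalone emotion interface: the AI writes emotions only
// when they change, and the engine reads them before portraying actions.
// Unsupported emotions are simply never read, which keeps the scheme
// backward compatible and portable across engines.
class EmotionInterface {
public:
    void SetValue(const std::string& emotion, float value) {
        emotions[emotion] = value;
    }
    // Unknown emotions default to 0 ("no emotion"), so an engine or AI
    // component can query names it does not know about safely.
    float GetValue(const std::string& emotion) const {
        auto it = emotions.find(emotion);
        return (it != emotions.end()) ? it->second : 0.0f;
    }
private:
    std::map<std::string, float> emotions;
};
```

The effectors themselves stay untouched; the mood is set once and implicitly colors every subsequent action.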
Each component of the AI that can be affected by emotions depends on the emotion interface. This interface provides information about the current emotional state that can be taken into account by the implementation.
The most obvious way to query emotions is via a function call that returns the value of an emotion, with 0 indicating no emotion and 1 corresponding to the full emotion (similar to a degree of membership in fuzzy logic):
float GetValue( const string& emotion );
This approach has a relatively low overhead per call, and is particularly suitable when components are not updated regularly, or do not need up-to-date emotion values. The polling could be replaced by event handling, whereby the implementation is passed a message whenever an emotion changes. This second approach is more efficient when values are read often, but requires memory to store copies of the emotion values passed by the events.
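The event-handling alternative can be sketched as follows (again with hypothetical names): components subscribe a handler, and each keeps a local copy of the values it is notified about, trading memory for cheaper reads.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Event-driven sketch: instead of polling GetValue(), components register
// a handler and are notified only when an emotion actually changes.
class EmotionBroadcaster {
public:
    using Handler = std::function<void(const std::string&, float)>;
    void Subscribe(Handler h) { handlers.push_back(std::move(h)); }
    void SetValue(const std::string& emotion, float value) {
        for (auto& h : handlers) h(emotion, value);  // notify on change only
    }
private:
    std::vector<Handler> handlers;
};

// Each subscriber caches its own copy, so reading an emotion later costs
// only a local lookup; this is the memory overhead mentioned above.
struct Component {
    std::map<std::string, float> cache;
    void OnEmotion(const std::string& e, float v) { cache[e] = v; }
};
```

Usage would be a one-time subscription, after which the component reads from its cache rather than querying the emotion interface each frame.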