Generating Emotive Expression on the Robot

The emotion system influences the robot’s facial expression. A human can read the robot’s face to judge whether the robot is “distressed” or “content” and adjust their interactions with it accordingly, by changing the type and/or the quality of the stimulus presented to Kismet. These emotive cues are critical for helping the human work with the robot to establish and maintain a suitable interaction: one where the robot’s drives are satisfied and where the robot is sufficiently challenged, yet largely competent in the exchange.
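To make this regulatory loop concrete, here is a minimal Python sketch of the dynamic described above: a drive tracks the stimulus, the face reports the drive’s state, and a cooperative human adjusts the stimulus in response. All names, thresholds, labels, and update rules here are illustrative assumptions, not Kismet’s actual implementation.

```python
# Hypothetical sketch of the regulatory loop described above. The robot's
# drive state produces a coarse affective reading, the face displays it,
# and the human adjusts the stimulus in response. Everything here is an
# illustrative assumption, not Kismet's architecture.

def drive_error(drive_level: float, setpoint: float = 0.5) -> float:
    """Signed distance of a drive from its homeostatic setpoint."""
    return drive_level - setpoint

def expression_for(error: float) -> str:
    """Map drive error to a coarse expression the human can read."""
    if error < -0.2:
        return "distressed"   # under-stimulated: drive unsatisfied
    if error > 0.2:
        return "overwhelmed"  # over-stimulated: too much challenge
    return "content"          # within the regulated regime

def human_adjustment(expression: str) -> float:
    """A cooperative human's change to the stimulus intensity."""
    return {"distressed": +0.1, "overwhelmed": -0.1, "content": 0.0}[expression]

# A few regulation cycles: the stimulus feeds the drive, the face
# reports the drive's state, and the human adapts the stimulus.
drive, stimulus = 0.1, 0.3
for _ in range(10):
    drive = 0.8 * drive + 0.2 * stimulus      # drive slowly tracks the stimulus
    face = expression_for(drive_error(drive))
    stimulus += human_adjustment(face)
    print(f"drive={drive:.2f} face={face} stimulus={stimulus:.2f}")
```

Run as written, the loop settles into the “content” regime: the human raises the stimulus while the face reads “distressed,” then holds it steady once the drive is near its setpoint, which is the kind of mutual regulation the paragraph above describes.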

The human observer perceives two broad affective categories on the face: arousal and pleasantness. This scheme maps several emotions and their corresponding expressions onto these two dimensions. It seems fairly limiting for Kismet, however. First, it is not clear how all of the primary emotions can be represented within it.

Second, it does not account for positively valenced yet reserved expressions such as a coy smile or a sly grin. More importantly, “anger” and “fear” reside in very close proximity to each other despite their very different behavioral correlates: from an evolutionary perspective, the behavioral correlate of anger is to attack, while that of fear is to escape. These are stereotypical responses derived from cross-species studies; human behavior, of course, can vary widely. The sketch below illustrates the problem.
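The following Python sketch places a few emotions in the two-dimensional (valence, arousal) space; the coordinates are illustrative assumptions, not measured values. “Anger” and “fear” land nearly on top of each other even though their behavioral correlates are opposite, which is exactly the limitation noted above.

```python
# A minimal sketch of the two-dimensional affect space discussed above,
# using illustrative (valence, arousal) coordinates in [-1, 1]. These
# placements are assumptions for demonstration, not Kismet's values.
from math import dist

affect_space = {
    # emotion: (valence, arousal)
    "joy":    ( 0.8,  0.5),
    "sorrow": (-0.6, -0.4),
    "calm":   ( 0.4, -0.6),
    "anger":  (-0.7,  0.8),
    "fear":   (-0.6,  0.9),   # nearly collocated with anger...
}

behavior = {"anger": "attack", "fear": "escape"}  # ...yet opposite correlates

d = dist(affect_space["anger"], affect_space["fear"])
print(f"anger-fear distance in affect space: {d:.2f}")
for emotion in ("anger", "fear"):
    print(f"{emotion}: point={affect_space[emotion]} "
          f"behavioral correlate={behavior[emotion]}")
```

Because the two points are separated by a small distance while their behavioral correlates are opposite, any expression generator driven only by these two dimensions would render anger and fear almost identically.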
