MetaChimp is an interactive media exhibit designed to teach museum and zoo visitors chimpanzee facial expressions, vocalizations, and some accompanying gestures and movements. At the installation, players see themselves as a chimpanzee in a “mirror” on a screen. The player controls this on-screen chimpanzee avatar by moving their own face and body. Simple on-screen text and visual examples guide players to identify different chimp facial expressions and to see how these map onto social communication and mood.
A player interacting with this chimp mirror would be presented with a series of prompts to imitate a chimp’s expressions and even sounds. The player would learn that a human smile is similar to a chimpanzee’s bared-teeth grin, which can signify that the animal feels afraid or submissive. The player could learn that chimps don’t furrow their brows in a frown when they don’t get what they want, but they can pout, and that chimps laugh when tickled, with an expression similar to a human’s except that the lip covers the top teeth. With increased proficiency, the player could react to chimpanzee social situations such as greetings, grooming, and display behavior. Part of the fun of a public zoo exhibit is that visitors could watch other people acting like chimps, which could generate some genuine laughter. See DESIGN
MetaChimp is powered by facial-tracking technology that maps human expressions onto a realistic 3D model of a chimpanzee. Researchers have studied and compared human and chimpanzee facial anatomy, expressions, and emotions. Following Paul Ekman’s study of universal human expressions, the Facial Action Coding System (FACS), primatologists created ChimpFACS. MetaChimp would draw on research about both species so that players can learn the similarities and differences between the two.
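To make the FACS-to-ChimpFACS mapping concrete, the core of the exhibit logic could be sketched as a lookup from detected facial action units to the chimp expressions described above. This is a minimal illustrative sketch, not the actual MetaChimp implementation: the action-unit names, the expression groupings, and the `classify_expression` helper are all assumptions for demonstration, not taken from the real FACS or ChimpFACS specifications.

```python
# Hypothetical mapping from sets of human facial action units (as a face
# tracker might report them) to the chimp expressions the exhibit teaches.
# Unit names are illustrative placeholders, not official FACS codes.
CHIMP_EXPRESSIONS = {
    frozenset({"lip_corner_puller", "lips_part"}):
        "bared-teeth grin (fear/submission)",
    frozenset({"lip_pucker"}):
        "pout",
    frozenset({"jaw_drop", "upper_lip_covers_teeth"}):
        "play face (laughter when tickled)",
}


def classify_expression(active_units):
    """Return the chimp expression whose required action units are all active."""
    active = set(active_units)
    for required_units, label in CHIMP_EXPRESSIONS.items():
        if required_units <= active:  # all required units detected
            return label
    return "neutral"
```

In a real installation, `active_units` would come each frame from the face tracker, and the matched label would drive the corresponding pose on the 3D chimpanzee model.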