Japanese producer and technical director Nobumichi Asai has been slow to share the details behind a new system that combines projection mapping with real-time face tracking, capable of rendering animations and virtual makeup on a person's face. Asai and a team of CGI experts, makeup artists, and graphic designers used laser scans of a model's face, along with a 3D mesh they developed from them, to open up a wide range of effects. Their state-of-the-art development has earned the nickname "living makeup."

Dots placed on the model's face serve as motion-tracking markers, letting the system transform her appearance with robotic features, shifting makeup patterns, and even bubbling water. Asai has previously used similar projection mapping on cars, stages, and buildings, but this is the first time his face-tracking technology has been demonstrated on a human model. The digital makeup and illustrations stay locked to the model's face even as she moves left, right, up, and down.
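The article does not reveal how Asai's system keeps the projection registered to the moving face, but the general idea behind marker-based tracking can be sketched: fit a 2D transform between the tracked dot positions in successive frames, then warp the projected content through that same transform. The sketch below is a hypothetical illustration, not Asai's actual pipeline; the function names and the complex-number formulation of a similarity transform (rotation, uniform scale, and translation) are assumptions for the example.

```python
# Hedged sketch: estimate a 2D similarity transform from tracked marker
# dots, then re-map projected coordinates with it. In complex form the
# transform is z -> a*z + b, where a encodes rotation/scale and b the
# translation.

def fit_similarity(src, dst):
    """Least-squares fit of z -> a*z + b from marker correspondences.

    src, dst: lists of (x, y) marker positions in two successive frames.
    Returns the complex coefficients (a, b).
    """
    s = [complex(x, y) for x, y in src]
    d = [complex(x, y) for x, y in dst]
    n = len(s)
    ms, md = sum(s) / n, sum(d) / n          # marker centroids
    num = sum((di - md) * (si - ms).conjugate() for si, di in zip(s, d))
    den = sum(abs(si - ms) ** 2 for si in s)
    a = num / den                            # rotation + uniform scale
    b = md - a * ms                          # translation
    return a, b

def apply_transform(a, b, pt):
    """Map a projected-content coordinate through the fitted transform."""
    z = a * complex(*pt) + b
    return (z.real, z.imag)

if __name__ == "__main__":
    # Markers rotated 90 degrees, scaled 2x, and shifted by (1, 1):
    src = [(0, 0), (1, 0), (0, 1)]
    dst = [(1, 1), (1, 3), (-1, 1)]
    a, b = fit_similarity(src, dst)
    # A projected texture point is warped the same way, so the image
    # stays registered to the markers as the face moves.
    print(apply_transform(a, b, (1, 1)))
```

A production system would track the head in 3D and handle perspective and lens distortion, but the registration principle, markers in, transform out, projection warped to follow, is the same.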

The Omote team is unclear about the projection mapping system's real-world applications. For now it will not be available for general use and serves only as a tech demo. Omote draws its inspiration from the Noh masks often featured in Japanese theater, a nod to Asai's own theatrical background, and if the technique matures, the days of green screen and CGI in film could be numbered. A video demonstrating Omote's capabilities can be found below:

OMOTE / REAL-TIME FACE TRACKING & PROJECTION MAPPING. from something wonderful on Vimeo.