I collaborated with artists Michael Kontopoulos and Nova Jiang to create the interactive piece Moon Theater. It is a public artwork that lets people playfully participate and collaborate with each other, creating narratives through a digital shadow theater. My role in this group project was programming the software that detects the hands of the public and uses them to control the puppets, beautifully designed by Nova.
The piece was first exhibited at the Glow festival in Santa Monica in July 2008, then at the Night Lights event at the Scottsdale Waterfront, Arizona, in December. Very recently it was part of the lineup of the New Frontiers gallery at the Sundance 2009 Film Festival.
The piece was placed outdoors for 5 days, and many people walking down Main Street in Park City approached our spot to play with the moon puppets. We have been improving the hand tracking algorithm since the first exhibition of Moon Theater at Glow, and here at Sundance it worked quite well.
The algorithm is based on the blob detection library for Processing, which allowed us to detect the hand silhouettes in the live feed coming from an infrared camera (the physical setup for the camera and infrared lights was designed by Michael). Points along the silhouettes were assigned and tracked from frame to frame. These points were then mapped to the puppets' joints, so changes in the hands' contours drove the motions of the puppets that people saw on the screen.
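To give an idea of how points can be assigned along a detected silhouette, here is a minimal sketch (in plain Java rather than Processing, and with hypothetical names, not our actual code): a closed contour is resampled into a fixed number of points spaced evenly by arc length, so every frame yields the same count of points in the same order around the blob.

```java
import java.lang.Math;

public class ContourResample {
    // Resample a closed polygonal contour into n points spaced evenly
    // by arc length. Input: contour[i] = {x, y}. Illustrative sketch,
    // not the actual Moon Theater implementation.
    public static double[][] resample(double[][] contour, int n) {
        int m = contour.length;
        // Cumulative arc length around the closed contour.
        double[] cum = new double[m + 1];
        for (int i = 0; i < m; i++) {
            double dx = contour[(i + 1) % m][0] - contour[i][0];
            double dy = contour[(i + 1) % m][1] - contour[i][1];
            cum[i + 1] = cum[i] + Math.hypot(dx, dy);
        }
        double total = cum[m];
        double[][] out = new double[n][2];
        int seg = 0;
        for (int i = 0; i < n; i++) {
            double target = total * i / n; // arc-length position of point i
            while (cum[seg + 1] < target) seg++;
            // Interpolate inside the segment containing the target length.
            double len = cum[seg + 1] - cum[seg];
            double t = len > 0 ? (target - cum[seg]) / len : 0;
            double[] a = contour[seg];
            double[] b = contour[(seg + 1) % m];
            out[i][0] = a[0] + t * (b[0] - a[0]);
            out[i][1] = a[1] + t * (b[1] - a[1]);
        }
        return out;
    }
}
```

Because the spacing is uniform, the i-th point in one frame tends to land near the i-th point in the next frame as long as the hand moves smoothly, which is what makes frame-to-frame tracking workable.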
People at Sundance playing with the installation
This video shows the hand tracking program running, displaying the underlying hand silhouettes and the points that are placed on the edges of the silhouettes. These points are constructed by subdividing the boundaries of the blobs into a constant number of segments. In this way, it is possible to track these points from frame to frame. The joints of the puppets are attached to the tracked points. The video capture uses the GSVideo library.
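One simple way to keep point identities stable between frames, when the contour's starting index may drift, is to match each previously tracked point to its nearest neighbor among the newly resampled points. This is only a hedged sketch of one possible matching step, with made-up names; the actual correspondence logic in Moon Theater may well differ.

```java
public class PointTracker {
    // For each point tracked in the previous frame, return the index of
    // the nearest point in the current frame's resampled contour.
    // Brute-force nearest neighbor; fine for a few dozen points.
    public static int[] match(double[][] prev, double[][] curr) {
        int[] idx = new int[prev.length];
        for (int i = 0; i < prev.length; i++) {
            double best = Double.MAX_VALUE;
            for (int j = 0; j < curr.length; j++) {
                double dx = prev[i][0] - curr[j][0];
                double dy = prev[i][1] - curr[j][1];
                double d = dx * dx + dy * dy; // squared distance is enough
                if (d < best) {
                    best = d;
                    idx[i] = j;
                }
            }
        }
        return idx;
    }
}
```

A puppet joint attached to tracked point i would then follow `curr[idx[i]]` each frame, so the joint inherits the motion of that part of the hand's contour.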