VIP – Virtual Interactive Puppetry

The VIP project aims to create a user-friendly interface and portable technology system that allows disabled dancers and others with limited physical mobility to regain, or construct anew, a sense of empowerment through virtual movement with puppets and people in real time and space. The system is modular, with many different user outcomes and distribution opportunities. It is scalable and can be used in full or in part, depending on the total performance or experiential outcome desired.

• Construct a new performance environment and platform for virtual interactive puppetry: the CASS system (Collaborative Augmented Stage Set).
• Enable collaborative distributed performance over distance in synchronous and asynchronous modes.
• Create frameworks for collaborative awareness in distributed performance spaces linking audiences and performers.
• Apply multi-modal interfaces for capturing body expressions to be used for creating ‘liveness’ in telematic puppets.
• Develop a series of prototype systems enhancing visceral awareness and connections of performers and audiences, informing interaction and performance over distance.

The original plan for the project conceived of the following sequence:

  1. Puppeteering and playing with ‘input’ puppets generates motion and tactile information.
  2. Video-based gesture-capture tracks hand motions.
  3. Encoding and mapping onto a virtual actor (‘vactor’).
  4. Projection and re-embodiment of ‘vactor’ into a physical, full-size animatronic puppet.
  5. Performance with animatronic puppet in smart stage, testing collaborative awareness of participants at each VIP location.
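The capture-to-puppet chain in steps 2–4 can be sketched as a simple mapping from tracked hand motion to virtual-actor joint angles, and then to servo commands for the animatronic puppet. This is an illustrative sketch only: the class names, functions, joint ranges, and servo command format below are assumptions for exposition, not part of the actual VIP system.

```python
from dataclasses import dataclass

# Hypothetical sketch of VIP pipeline stages 2-4:
# hand-motion sample -> encoded 'vactor' pose -> animatronic servo commands.
# All names and value ranges are illustrative assumptions.

@dataclass
class HandSample:
    """One frame of video-based hand tracking (coordinates normalised 0..1)."""
    x: float
    y: float
    grip: float  # 0 = open hand, 1 = closed fist

@dataclass
class VactorPose:
    """Target pose for the virtual actor's puppet arm."""
    shoulder_deg: float
    elbow_deg: float
    hand_closed: bool

def encode(sample: HandSample) -> VactorPose:
    """Stage 3: map a tracked hand sample onto vactor joint angles.

    Deliberately simple mapping: horizontal hand position drives the
    shoulder, vertical position drives the elbow, and a grip value
    beyond a threshold closes the puppet's hand.
    """
    return VactorPose(
        shoulder_deg=sample.x * 180.0,   # 0..1 -> 0..180 degrees
        elbow_deg=sample.y * 150.0,      # 0..1 -> 0..150 degrees
        hand_closed=sample.grip > 0.5,
    )

def drive_animatronic(pose: VactorPose) -> dict:
    """Stage 4: re-embody the vactor pose as servo commands (degrees)
    for a physical animatronic puppet."""
    return {
        "servo_shoulder": round(pose.shoulder_deg),
        "servo_elbow": round(pose.elbow_deg),
        "servo_hand": 90 if pose.hand_closed else 0,
    }

# Example: a tracked hand at centre-screen with a closed grip
pose = encode(HandSample(x=0.5, y=0.5, grip=0.9))
commands = drive_animatronic(pose)
```

In a real system the encoding stage would also smooth the motion data over time and carry the tactile channel from the input puppet, but the one-frame mapping above captures the shape of the data flow.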

The collaboration partners in VIP's early development included:
The SMARTlab Digital Media Institute, Jakub Segan of Bell Labs, Media Lab Europe, UCL VRCentre, CATlab NYU, FhG/FIT, UniS AI Lab/CVSSP, NYU Tisch School of the Arts, V2 Lab, and EPFL.

In the testing of these ideas, the Butterfly Project was born. This project brought in a new collaboration partner, BBC Imagineering, and focused on the role of interactive music and dance with animated and robotic puppetry interfaces.

The collaboration team that brought the Butterfly Project and its first performance, the Flutterfugue, to life was: The SMARTlab, NYU CATlab, MLE Dublin, and BBC Imagineering.
