Augmented Pinocchio
Michele Cremaschi
I created a so-called "Augmented Comedy" theatre show in which, apart from myself, every character and prop on the stage is a hologram generated in real time.
This is possible by combining several pieces of software, including QLab, Syphon, and custom Quartz Composer patches that generate visuals from a webcam stream of the stage action. With my body position tracked by sensor data, the visuals are projected through a Pepper's ghost system, so they appear as ghosts sharing the stage with me.
The trick was to run each part of the system concurrently and make the parts communicate over OSC, MIDI and Syphon. NiMate outputs OSC data and a Syphon stream from a Kinect camera. A custom QC patch takes this, generates visuals, and maps them onto my body. QLab lets the operator trigger scenes and, with some custom Quartz Composer rendering, also displays the Syphon streams coming from that patch. Other software takes care of light control, Osculator translates OSC messages between apps, and a final QC patch handles the projection mapping.
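To give a concrete idea of what that OSC plumbing involves, here is a minimal Python sketch (not part of the actual show, which used Osculator and the apps' native OSC support): it hand-encodes an OSC message with only the standard library, and rewrites addresses between apps the way Osculator does. The addresses and the routing table are hypothetical examples.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded to a 4-byte boundary.
    return data + b"\x00" * (4 - len(data) % 4)

def encode_osc(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(args)).encode())  # type tag string
    for value in args:
        msg += struct.pack(">f", value)  # big-endian float32
    return msg

# Hypothetical address map, in the spirit of what Osculator did for us:
ROUTES = {
    "/nimate/skeleton/head": "/qc/head",
    "/nimate/skeleton/hand_l": "/qc/hand_left",
}

def translate(address: str) -> str:
    """Rewrite an incoming address for the receiving app; pass unknowns through."""
    return ROUTES.get(address, address)
```

A packet built this way is then sent as a UDP datagram to the receiving app's OSC port; the translation step is what lets apps with different address schemes talk to each other.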
All of this required a lot of GPU and CPU power, and the MacBook Pro we started with soon became inadequate. A Mac Pro would have been the ideal solution, but at the time the new Mac Pro line wasn't out yet, and our budget didn't allow for it anyway. So we networked in a Mac mini, ran half of the software on it, and had it communicate with the MacBook Pro via OSC over the network. We also split the graphical work between the two machines, sharing frames over the AirPlay protocol with AirParrot as the server on the Mac mini and AirServer as the client on the MacBook Pro. This way we got roughly the power of a higher-end machine at a third of the budget.
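Since OSC rides on plain UDP, splitting the rig across two machines mostly means getting datagrams across the LAN. As a sketch of that idea (the apps we used speak OSC over the network natively, so no such script was actually needed), a one-packet relay in Python looks like this:

```python
import socket

def relay_once(sock_in: socket.socket, sock_out: socket.socket,
               remote: tuple) -> bytes:
    """Receive one OSC datagram and forward it unchanged to another machine."""
    data, _ = sock_in.recvfrom(4096)  # OSC messages are small; 4 KB is plenty
    sock_out.sendto(data, remote)
    return data
```

A loop around `relay_once`, with `sock_in` bound to the local OSC port and `remote` set to the second machine's LAN address (e.g. the Mac mini's IP), is all the glue required; because UDP is connectionless, neither machine has to know whether the other is up yet.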