Video report of an Augmented Dance session test.
I finally managed to reduce the latency a bit, perceptually at least.
Using a Kinect for skeleton tracking,
Quartz Composer to render the lines (with the 1024_ParticleWarfare, 1024_Rope and 1024_Skeleton plugins),
MadMapper (fed via Syphon) to align the projected video with the real action (a rough sketch of how skeleton data can be bridged between such apps follows below),
Max for Live for the randomly generated piano sounds (the idea is sketched below as well).
And a Canon 5D DSLR to shoot the video.
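
The post doesn't say exactly how the skeleton data reaches the visual apps (the 1024_Skeleton plugin can talk to the Kinect directly inside Quartz Composer), but a common way to bridge tracking data between applications is OSC. Here is a minimal, purely illustrative Python sketch of that kind of bridge using python-osc; the joint names, OSC address scheme, host and port are all assumptions, and the Kinect read is stubbed out so the script runs on its own.

```python
# Hypothetical sketch: forwarding tracked joint positions over OSC.
# Not the actual setup described in the post -- just the general idea.
import random
import time

from pythonosc.udp_client import SimpleUDPClient

OSC_HOST = "127.0.0.1"   # machine running the visuals (assumed)
OSC_PORT = 9000          # arbitrary port chosen for this example

client = SimpleUDPClient(OSC_HOST, OSC_PORT)

JOINTS = ["head", "left_hand", "right_hand", "torso"]


def read_skeleton():
    """Placeholder for real Kinect skeleton tracking.

    Returns fake normalized (x, y, z) positions per joint so the
    sketch runs without any Kinect drivers installed.
    """
    return {j: (random.random(), random.random(), random.random())
            for j in JOINTS}


while True:
    for joint, (x, y, z) in read_skeleton().items():
        # e.g. /skeleton/head -> [0.42, 0.17, 0.88]
        client.send_message(f"/skeleton/{joint}", [x, y, z])
    time.sleep(1 / 30)  # roughly the Kinect's 30 fps tracking rate
```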
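
The "randomly generated piano" lives in a Max for Live device, which isn't shown here; as an illustration of the idea only, this small Python sketch streams random MIDI notes with mido. The note pool, dynamics and timing are assumptions, and the output port would need to be routed to a piano instrument.

```python
# Hypothetical sketch: random piano notes as a MIDI stream.
# The real project uses a Max for Live device; this is just the concept.
import random
import time

import mido

# Loose A-minor-ish pool of pitches, purely illustrative
NOTE_POOL = [57, 60, 62, 64, 67, 69, 72, 76]

out = mido.open_output()  # default MIDI output port

try:
    while True:
        note = random.choice(NOTE_POOL)
        velocity = random.randint(40, 90)           # gentle dynamics
        duration = random.choice([0.25, 0.5, 1.0])  # note length in seconds
        out.send(mido.Message("note_on", note=note, velocity=velocity))
        time.sleep(duration)
        out.send(mido.Message("note_off", note=note))
        time.sleep(random.uniform(0.1, 0.6))        # irregular gaps between notes
finally:
    out.close()
```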