Continuing to be busy over the last year - too distracted to be posting status updates, obviously - with projects involving the convergence of art, music, and technology, from LA to London and across Europe to NYC ...
The last time we had the sensor rig out for a performance/demo, the only failure was the sensor mounts on the heel and toe.
Now the FSRs (force-sensing resistors) are built into footbeds in these cool Capezio dance shoes - full dynamic range and a solid data stream ('cause they stay in place, right under the points where the dancer meets the floor).
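The basic signal conditioning for footbed FSRs like these can be sketched in a few lines. This is a hypothetical example, not the rig's actual code: it assumes a 10-bit ADC (0-1023) and per-sensor rest/peak calibration values, neither of which is specified in the post.

```python
# Hypothetical sketch: normalize raw ADC readings from a heel or toe
# FSR into a 0..1 pressure value. The 10-bit ADC range and the
# rest/peak calibration numbers are assumptions for illustration.

def normalize_fsr(raw, rest=20, peak=1000):
    """Map a raw ADC reading to a clamped 0..1 pressure value."""
    value = (raw - rest) / (peak - rest)
    return max(0.0, min(1.0, value))

# Example: one foot's heel and toe readings
heel = normalize_fsr(510)   # mid-range pressure
toe = normalize_fsr(1000)   # full pressure
```

Clamping matters here: a sensor at rest can drift slightly below its calibrated floor, and a hard stomp can exceed the calibrated peak.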
Somehow we managed to mangle one of the sensors on the last go-around; replacements are on the way and then we'll be good to finalize the instruments and programs for the next show.
Also this week was the first test of the full-body gesture recognition and translation system, downtown at our warehouse space.
With the audio system, video projection, lighting, and the laptops running the audio and video software all set up, we strapped the sensor system on dancer Forest and ran a short improvised performance in which her movements drove both the music and the visual composition.
Over the last few weeks I've been prototyping the sensor rig for the upcoming gesture-driven performance system.
Rotation at each major arm and leg joint is tracked by simple 50k potentiometers. Dual-axis accelerometers are mounted on each hand.
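The sensor-to-parameter mapping described above can be sketched roughly as follows. This is an illustrative assumption, not the rig's firmware: it presumes a 10-bit ADC, pots with about 270 degrees of travel, and accelerometer axes reported in g.

```python
import math

# Hypothetical mapping for the rig described above: each joint pot
# reads rotation as an ADC value, and each hand's dual-axis
# accelerometer gives two tilt angles. Ranges are assumptions.

def pot_to_angle(adc, adc_max=1023, travel_deg=270.0):
    """Map a potentiometer's ADC reading to degrees over its travel."""
    return (adc / adc_max) * travel_deg

def accel_to_tilt(ax, ay):
    """Convert two-axis acceleration (in g) to tilt angles in degrees."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return (math.degrees(math.asin(clamp(ax))),
            math.degrees(math.asin(clamp(ay))))

elbow_deg = pot_to_angle(512)        # joint rotation from one pot
hand_tilt = accel_to_tilt(0.0, 1.0)  # hand flat on one axis, vertical on the other
```

From there, each angle just becomes another control value to scale into whatever parameter range the audio or video software expects.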
Some descriptions of custom instruments recently developed for the Los Angeles band Moodorgan have been posted here.
These are custom MIDI controllers and hardware designed for live performance.
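For context on what controllers like these actually send: a MIDI Control Change message is three bytes - a status byte (0xB0 plus the channel), a controller number, and a value. The channel and CC number below are just examples, not the instruments' actual assignments.

```python
# Minimal sketch of a MIDI Control Change message as a controller
# would emit it. Channel and controller numbers here are examples.

def control_change(channel, controller, value):
    """Build the raw 3 bytes of a MIDI CC message.

    channel: 0-15, controller and value: 0-127.
    """
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

msg = control_change(0, 74, 100)   # e.g. a filter-cutoff CC on channel 1
```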
In 2005 I assembled a group of artists and musicians to create images and sounds in a live performance environment.
The NASA lunar mission experience is a centerpiece of this exhibit, allowing users to learn about NASA's plans to construct an outpost on the moon.
Interactive applications for individual experiences combine a variety of rich media types and visualization paradigms to provide the user with a deep understanding of a product or brand.
VS3: realtime visual synthesis - a visual instrument based on the architecture of a music synthesizer. Original images are created in performance by selecting sources (as one would select waveforms on a synthesizer) and applying a method (an algorithm) to combine and/or modulate them. Unlike a typical post-production environment, the VS3 system is designed so that creative decisions can be made on the fly in a live performance. This approach also goes beyond the typical 'VJ' paradigm: rather than just mixing existing sources, the system generates original footage in real time.
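The source-plus-method pipeline described above can be illustrated with a toy example. This is not VS3's implementation - the source generators and the time-modulated combine below are invented for the sketch - but it shows the shape of the idea: pick two sources, pick a method, render frames on the fly.

```python
import math

# Toy sketch of a synthesizer-style visual pipeline: two image
# "sources" (analogous to waveforms) combined by a time-modulated
# method into each output frame. All generators are illustrative.

W, H = 64, 64

def gradient(x, y):          # source 1: horizontal ramp, 0..1
    return x / (W - 1)

def rings(x, y):             # source 2: concentric rings, 0..1
    d = math.hypot(x - W / 2, y - H / 2)
    return 0.5 + 0.5 * math.sin(d * 0.5)

def combine(a, b, t):
    """Method: crossfade between source a and the product a*b over time."""
    mix = 0.5 + 0.5 * math.sin(t)
    return a * b * mix + a * (1.0 - mix)

def render_frame(t):
    """Render one W x H frame of brightness values at time t."""
    return [[combine(gradient(x, y), rings(x, y), t) for x in range(W)]
            for y in range(H)]

frame = render_frame(0.0)
```

Swapping either generator or the combine function changes the output entirely, which is the point: the performer's choices are source selection and modulation, made live, rather than edits to pre-rendered footage.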