Hi! I’m working on a Kinect/OSC-controlled MIDI setup, inspired by Soundbeam.
I’m already quite happy with it, but there are two additional features in particular I would like to implement:
Step-by-step playback of MIDI files. When the user’s hand hits a trigger point, the MIDI file advances one step. When I tried this using the MIDI Out help file, I got unreliable results: the playback start points seemed to snap to bar boundaries rather than smaller note increments. Alternatively, I could perhaps send some sort of play/stop signal to Ableton Live, but then I imagine I would need to send the BPM information from Live back to vvvv.
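To show what I mean by stepping, here’s a rough Python sketch of the logic (not a vvvv patch, just the idea). It assumes the MIDI file has already been parsed into a list of `(tick, note)` note-on events, e.g. with a library like mido; the names here are my own placeholders. Notes sharing a timestamp are grouped into one step (a chord), and each trigger hit advances exactly one step:

```python
from itertools import groupby

def build_steps(note_events):
    """Group note-on events that share a timestamp into single steps (chords)."""
    ordered = sorted(note_events, key=lambda e: e[0])
    return [[note for _, note in group]
            for _, group in groupby(ordered, key=lambda e: e[0])]

class StepPlayer:
    """Advance through a note sequence one step per trigger hit."""
    def __init__(self, note_events):
        self.steps = build_steps(note_events)
        self.pos = 0

    def trigger(self):
        """Called when the hand hits the trigger point; returns notes to send."""
        if not self.steps:
            return []
        notes = self.steps[self.pos]
        self.pos = (self.pos + 1) % len(self.steps)  # loop back at the end
        return notes
```

So with events `[(0, 60), (0, 64), (480, 62)]`, the first trigger plays the chord `[60, 64]` and the second plays `[62]`. This sidesteps the bar-boundary problem entirely, because timing is driven by the hand, not the transport.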
Gesture recording. I really enjoy FAAST and its very easy-to-use interface, but as far as I understand, it’s unfortunately not possible for two applications to use the Kinect simultaneously. What would be cool would be to record gestures using the timeline, and then check the current user input against them to see if it matches. Has anyone attempted this before? It sounds doable, but also very complex and challenging.
I’m gonna try seeing how the gestures in Mr. Vux’s example work with the Gestures (Kinect Microsoft) node, but it would be really cool to be able to record my own, as the GesturePak .NET library allows. Gotta learn some programming beyond the basics of Processing.
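For the matching part, one standard approach (not how the Gestures node or GesturePak works internally, just a technique I’ve read about) is dynamic time warping: record a gesture as a sequence of joint positions, then measure how far the live sequence is from the recording, tolerant of the gesture being performed faster or slower. A minimal sketch, with a made-up threshold:

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences of (x, y, z)
    joint positions; tolerates gestures performed at different speeds."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a live frame
                                 cost[i][j - 1],      # skip a recorded frame
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

def matches(recorded, live, threshold=1.0):
    """True if the live sequence is close enough to the recorded gesture."""
    return dtw_distance(recorded, live) <= threshold
```

The threshold would need tuning per gesture, and you’d probably want to normalize positions relative to the torso first so the match doesn’t depend on where the user stands.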
In addition, I plan to put more than one joint axis to use simultaneously, for generic MIDI control data (0-127), and to add support for more scales. I’m thinking this could perhaps be pulled off with an XML file or something, to avoid all that hardcoding.
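Sketching the XML idea in Python (the format and names below are entirely made up, just one way it could look): each scale is defined by its intervals in semitones, a normalized axis value picks a note from the scale, and another axis maps straight to a 0-127 CC value:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML format: intervals are semitones above the root note.
SCALES_XML = """
<scales>
  <scale name="major" intervals="0,2,4,5,7,9,11"/>
  <scale name="minor_pentatonic" intervals="0,3,5,7,10"/>
</scales>
"""

def load_scales(xml_text):
    """Parse scale definitions into {name: [intervals]}."""
    root = ET.fromstring(xml_text)
    return {s.get("name"): [int(i) for i in s.get("intervals").split(",")]
            for s in root.findall("scale")}

def axis_to_note(value, scale, root_note=60, octaves=2):
    """Map a normalized axis value (0.0-1.0) to a MIDI note on the scale."""
    degrees = [root_note + 12 * o + i for o in range(octaves) for i in scale]
    index = min(int(value * len(degrees)), len(degrees) - 1)
    return degrees[index]

def axis_to_cc(value):
    """Map a normalized axis value (0.0-1.0) to a MIDI CC value (0-127)."""
    return max(0, min(127, int(round(value * 127))))
```

Adding a new scale would then just mean adding one `<scale>` line to the file instead of touching the patch.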
Do you have any further suggestions for a setup like this one? Your input is greatly appreciated. Thanks!