I’m researching reactive visuals. My ultimate goal is to perform music with visuals that change according to the music, without a human controlling them. I’m still early in developing the idea, and one of the first things I’d like to come up with is a list of meaningful musical gestures/parameters that could drive the visuals.
I don’t want to get into implementation details just yet, but since Ableton Live is my musical tool, the ideas will be somewhat tied to its way of working.
So far I have the following:
The first category comprises the main aspects of working live with the sequencer:
- tempo change
- volume change
- pan change
- fx change
- clip start/stop
- scene start/stop
All of these would be transmitted from the audio computer to the video computer via MIDI/OSC.
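Just to make this concrete without committing to an implementation yet, here is a rough sketch of how the sequencer data could travel over OSC. Everything here is an assumption of mine: the python-osc library, the IP/port, and the /live/... address scheme are invented for illustration, not something Ableton defines:

```python
# Sketch of the audio computer -> video computer link over OSC.
# Assumptions: python-osc is installed, the video machine's IP/port
# are placeholders, and the /live/... addresses are invented here.
from pythonosc.udp_client import SimpleUDPClient

VIDEO_COMPUTER_IP = "192.168.1.50"  # placeholder LAN address
OSC_PORT = 9000                     # arbitrary port

client = SimpleUDPClient(VIDEO_COMPUTER_IP, OSC_PORT)

# Each gesture from the list above becomes one OSC message.
client.send_message("/live/tempo", 124.0)            # tempo change (BPM)
client.send_message("/live/track/1/volume", 0.8)     # volume change (0..1)
client.send_message("/live/track/1/pan", -0.3)       # pan change (-1..1)
client.send_message("/live/track/1/fx/dry_wet", 0.5) # fx change
client.send_message("/live/clip/start", [1, 3])      # clip start: track 1, scene 3
client.send_message("/live/scene/stop", 3)           # scene stop
```

MIDI would carry the same information; OSC just gives self-describing addresses and floating-point values for free.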
The second category comprises data that can be obtained by analysing the audio signal:
- spectrum change
- dynamics change
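As a sketch of what this second category could yield, here is a minimal analysis of one block of samples using only numpy. The block size, window, and three-band split are placeholder assumptions; in practice the block would come from an audio input callback (e.g. sounddevice or pyaudio):

```python
# Sketch: extract dynamics (RMS) and a coarse spectrum (band energies)
# from one mono block of samples. Block size and band edges are
# placeholder assumptions, for illustration only.
import numpy as np

SAMPLE_RATE = 44100

def analyse_block(samples: np.ndarray):
    """Return (rms, band_energies) for one mono block."""
    # Dynamics change: RMS level of the block.
    rms = float(np.sqrt(np.mean(samples ** 2)))

    # Spectrum change: windowed magnitude FFT, folded into
    # low / mid / high bands, enough to drive three visual parameters.
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), 1.0 / SAMPLE_RATE)
    bands = [(20, 250), (250, 2000), (2000, 16000)]
    band_energies = [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
                     for lo, hi in bands]
    return rms, band_energies

# Example with a synthetic block: a 440 Hz tone.
t = np.arange(1024) / SAMPLE_RATE
rms, bands = analyse_block(np.sin(2 * np.pi * 440 * t))
```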
For both categories, visuals could react immediately (such as drawing something on screen on every detected beat) or follow changes over time (such as a filter sweeping up or down).
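A small sketch of the two reaction styles side by side; the EnvelopeFollower class and the beat threshold are hypothetical placeholders, not a real beat-detection algorithm:

```python
# Immediate reaction: trigger when the level crosses a threshold.
# Over-time reaction: a one-pole envelope follower that smooths the
# level, so visuals ramp up quickly and decay slowly.
class EnvelopeFollower:
    def __init__(self, attack=0.5, release=0.05):
        self.attack, self.release, self.value = attack, release, 0.0

    def process(self, level: float) -> float:
        coeff = self.attack if level > self.value else self.release
        self.value += coeff * (level - self.value)
        return self.value

env = EnvelopeFollower()
BEAT_THRESHOLD = 0.6  # placeholder, would need tuning by ear

def on_audio_level(level: float):
    if level > BEAT_THRESHOLD:
        pass  # immediate: e.g. flash a shape on this frame
    smoothed = env.process(level)
    # over time: e.g. map `smoothed` to brightness, zoom, filter position
```

The fast-attack/slow-release shape is what usually makes smoothed visuals feel musical rather than jittery.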
I think this list gives me quite an interesting point of departure for developing the idea further. What do you think? Are there any other parameters that would be really interesting for driving the visuals?
Thanks a lot guys, any input will help this research!