I created this real-time music visualization software from scratch over the past few months and plan to use it in my upcoming live sets in Southern California. It generates the animation in real time as the music plays inside the DAW, so you can do all sorts of crazy live mix-up stuff in Ableton, FL Studio, etc. For example, you can play notes live on a MIDI controller and have them show up on screen mixed in with the other graphics.
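The MIDI-to-graphics idea above could work something like this minimal sketch. Everything here is hypothetical (the `Shape` type, the note/velocity mapping, the shape list are all my own placeholders, not the actual software's internals): note number picks the shape, velocity scales its size.

```python
from collections import namedtuple

# Hypothetical on-screen primitive: what to draw and how big.
Shape = namedtuple("Shape", ["kind", "size"])

# Placeholder palette of the "basic shapes implemented so far".
SHAPE_KINDS = ["circle", "square", "triangle", "star"]

def note_to_shape(note, velocity):
    """Map a MIDI note (0-127) and velocity (0-127) to a Shape.

    The note number cycles through the shape palette; velocity
    makes harder hits draw bigger shapes.
    """
    kind = SHAPE_KINDS[note % len(SHAPE_KINDS)]
    size = 10 + velocity
    return Shape(kind, size)

# Example: middle C (note 60) at velocity 100 -> a size-110 circle.
print(note_to_shape(60, 100))
```

In a real setup the note/velocity pairs would come from a MIDI input port (e.g. via a library like mido) instead of being called directly.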
What you see here is a bare-bones proof of concept I threw together using all the basic shapes implemented so far.
I'm seriously stumped for inspiration. Does anyone have interesting ideas for tying this into the audience's experience, to make the show even more engaging and live-feeling? I could theoretically make it interact with a Microsoft Kinect from an Xbox 360, say for a fake lightsaber battle?
Or the system could let people in the audience send text messages to a predefined number, and a few choice messages could get displayed on screen. For example, the crowd could text their favorite sports team, and the most-texted team gets shown (that kinda sounds cheesy, though).
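The tallying part of that SMS idea is simple to sketch. This is just a guess at how it could work (the `top_team` helper and the message format are mine, not anything that exists yet); incoming texts get normalized and counted, and the winner goes on screen.

```python
from collections import Counter

def top_team(messages):
    """Return the most-texted team name (case-insensitive), or None.

    `messages` is a list of raw SMS bodies; blank texts are ignored
    and whitespace/case differences are collapsed before counting.
    """
    tally = Counter(m.strip().lower() for m in messages if m.strip())
    if not tally:
        return None
    team, _count = tally.most_common(1)[0]
    return team

# Example: three texts come in during the set.
print(top_team(["Lakers", "  lakers ", "Dodgers"]))  # prints "lakers"
```

The harder part in practice is receiving the texts at all, which would need an SMS gateway service feeding messages into the visualizer.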
Cheers, -Arpeggio Junkie