Everyone, I’m caught up in a project at work that’s forcing me to put the release of this software on hold. However, I realize I need to put out some proof that this thing exists. So, above is a screen cap from the software I’ve named visp.
And I want it to be no secret that the name comes from two words: visual performer. I also want to say that this software was designed as a personal proof-of-concept prototype; it isn’t intended to be supported or sold. And I honestly haven’t taken the time to reflect on whether that concept was actually proven by building it. Anyway, enough jibber jabber. Here’s why I built visp, along with a general list of its features.
The Reason(s)
Too much UI management. That’s why I hate VJ software. More often than not, VJ software sucks you into managing the UI during your performance rather than actually performing. That’s something I never really understood. I always wanted to be gestural with my performance. I wanted to be able to physically say, “Now I want the visuals to blow up” or “Now let’s start highlighting this sound.” Instead, I found myself using the keyboard and mouse to manage some UI element, forcing me to miss those precious cues. Granted, yes, I could use some sort of controller (MIDI, OSC, or something hacked together). But that leads me to my next irk.
Too much focus on clip triggering. I don’t think that way. I’ve got an interactive media background. I think in dynamics. I think in variables and rule sets. I don’t (generally) think in terms of linear media. Almost all the software out there has this huge focus on layering video clips, and that’s just not for me. So, yes, while I can gesturally trigger clips with some sort of controller and adjust layer opacities and filters with sliders, I’m still limited to the linear, pre-rendered footage in my library. And it just doesn’t suit my tastes. So, you may ask, why not use something programmatic like Processing or Max/MSP/Jitter or Touch?
Maintaining visual continuity is so difficult. In a lot of these other environments, you can build very beautiful, very dynamic, and very interactive visuals. However, moving from one visual construct to another is no simple task. There’s no “meta” application that lets you string these pieces together in a unified environment (unless you build it yourself). I actually started off doing just that for Processing applets but was instantly thwarted by a bug in the latest build. Fortunately, I had access to Adobe’s new desktop platform, codenamed Apollo.
The Solution
Apollo came at just the right time. I could leverage my abundant Flash / ActionScript kung-fu, develop a standalone desktop application (which gave me access to native OS windows), and build on the very easy-to-use Flex paradigm of application development. The Flash 9 Player was miles ahead of the Flash 8 Player, so I thought I’d take her for a spin.
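Those native OS windows are what make the separate, chrome-less output window in the feature list below possible. Here’s a rough idea of what opening one looks like in ActionScript. This is a minimal sketch, not visp’s actual code: it assumes the NativeWindow / NativeWindowInitOptions API as documented for Adobe’s desktop runtime, which may differ slightly from the Apollo alpha builds, and someModuleSprite is just a placeholder for whatever visual is currently running.

```actionscript
// Sketch: open a chrome-less 640x480 output window (assumed desktop-runtime API).
import flash.display.NativeWindow;
import flash.display.NativeWindowInitOptions;
import flash.display.NativeWindowSystemChrome;
import flash.display.StageScaleMode;

var options:NativeWindowInitOptions = new NativeWindowInitOptions();
options.systemChrome = NativeWindowSystemChrome.NONE; // no OS title bar or borders
options.transparent = false;

var output:NativeWindow = new NativeWindow(options);
output.width = 640;
output.height = 480;
output.stage.scaleMode = StageScaleMode.NO_SCALE;     // don't scale the visuals
output.stage.addChild(someModuleSprite);              // placeholder: the current visual
output.activate();                                    // show and focus the window
```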
Also, I wanted as much of a hands-off-the-keyboard-and-mouse experience as possible. I still need to evaluate where I’ve succeeded and where I’ve failed. Frankly, right now, if you don’t have a MIDI controller, this thing is useless.
The Features
I’m just going to list these off in no particular order and with no real explanation. That’ll all come when it gets released.
- Support for developing your own “Modules”, which essentially give you a Flash canvas to play in and do whatever your heart desires (a rough sketch of what a module might look like follows this list)
- Module browser using thumbnails you make yourself
- Full MIDI support via an external, standalone Java applet (also open source)
- 6 slider-like MIDI inputs for use in your own custom “Modules”
- 4 velocity-aware button-like MIDI inputs for use in your own custom “Modules”
- Support for transitioning between Modules (using built-in Transitions). You choose the duration and the transition effect.
- Change the stage’s background color (wow)
- Two active filters can be going at a time, with up to three MIDI-assignable sliders and a UI that you build yourself (see the filter sketch after this list)
- Output your visuals in a separate, chrome-less window of 320×240 or 640×480 resolution (thank you Apollo team)
- Real-time 640×480 preview screen
- BPM Tempo Tapper for assigning some beat-driven automation to the input sliders and buttons. It’s not audio-driven but VJ-driven; you gotta tap the tempo yourself (a generic tap-tempo sketch follows this list)
- FPS (frames per second) display
- OS X and Windows compatible — seriously — I’m actually able to jump back and forth with the same codebase and same modules without any hiccups or platform-specific code!
- It’s internet aware – that’s the beauty of Apollo.
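Since visp hasn’t been released, the actual module contract is still under wraps, so here’s a purely hypothetical sketch of what the “Flash canvas to play in” idea means in practice. The class name and the setSlider / hitButton hooks are made up for illustration; all they assume is a Flash 9 / ActionScript 3 Sprite that the host hands normalized values from the 6 slider-like inputs and velocity-aware hits from the 4 button-like inputs.

```actionscript
package {
    import flash.display.Sprite;
    import flash.events.Event;

    // Hypothetical module: visp's real module contract is unpublished,
    // so the class name and hooks below are illustrative only.
    public class PulseModule extends Sprite {
        private var sliders:Array = [0, 0, 0, 0, 0, 0]; // 6 slider-like MIDI inputs, each 0..1
        private var radius:Number = 10;                  // current pulse size

        public function PulseModule() {
            addEventListener(Event.ENTER_FRAME, draw);
        }

        // The host would call this when one of the MIDI sliders moves.
        public function setSlider(index:int, value:Number):void {
            sliders[index] = value;
        }

        // The host would call this for one of the 4 velocity-aware buttons.
        public function hitButton(index:int, velocity:Number):void {
            radius = 10 + velocity * 200; // harder hit, bigger pulse
        }

        private function draw(e:Event):void {
            radius *= 0.95; // decay back toward rest
            graphics.clear();
            graphics.beginFill(0xFFFFFF, 0.2 + sliders[0] * 0.8);    // slider 0 = brightness
            graphics.drawCircle(320, 240, radius + sliders[1] * 50); // slider 1 = base size
            graphics.endFill();
        }
    }
}
```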
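On the filter feature: the UI you build yourself is up to you, but the underlying Flash 9 mechanics of driving filters from normalized slider values look roughly like this. The applyFilters function and the particular filters are illustrative assumptions, not visp internals.

```actionscript
import flash.display.DisplayObject;
import flash.filters.BlurFilter;
import flash.filters.GlowFilter;

// Illustrative only: map 0..1 slider values onto two stacked Flash filters
// by replacing the target's filter chain each time a slider moves.
function applyFilters(target:DisplayObject, blurAmount:Number, glowAmount:Number):void {
    target.filters = [
        new BlurFilter(blurAmount * 32, blurAmount * 32, 2),
        new GlowFilter(0x00FFCC, 1.0, glowAmount * 16, glowAmount * 16)
    ];
}
```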
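And for the BPM Tempo Tapper, the math is the standard tap-tempo calculation: average the intervals between the last few taps and convert to beats per minute. Again, a generic sketch, not visp’s actual implementation.

```actionscript
import flash.utils.getTimer;

// Standard tap-tempo math (generic sketch, not visp's code): average the
// intervals between the most recent taps and convert to beats per minute.
var tapTimes:Array = [];

function tap():Number {
    var now:int = getTimer();                  // milliseconds since the app started
    tapTimes.push(now);
    if (tapTimes.length > 8) tapTimes.shift(); // keep only the last 8 taps
    if (tapTimes.length < 2) return 0;         // need at least 2 taps to get a tempo

    var span:Number = tapTimes[tapTimes.length - 1] - tapTimes[0];
    var avgInterval:Number = span / (tapTimes.length - 1); // ms per beat
    return 60000 / avgInterval;                // beats per minute
}
```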
That’s all I’m going to say about this thing for now. But feel free to post questions. I’ll be happy to answer them.