This time I wanted to add some actual volume — not just perceived volume — to the form. What you see above are just some initial tests. I’ve got a ton to learn about lighting and optimization and matrix transformations. Robert says I need to use “per-vertex normals”. I say “OKAY!” Hopefully I can start adding some of these things over the next couple of weeks and really use this project as a general learning tool.
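(Mostly as a note to myself: the idea behind per-vertex normals is that each vertex's normal is the normalized average of the normals of the faces touching it, which is what gives you smooth shading instead of faceted shading. A minimal sketch of the math in plain Java — the class and method names are just mine, not from any library:)

```java
// Hypothetical minimal sketch: per-vertex normals computed as the
// normalized sum of adjacent face normals. Plain Java, no Processing.
public class VertexNormals {

    // Face normal of triangle (a, b, c): cross product of (b - a) and (c - a).
    static double[] faceNormal(double[] a, double[] b, double[] c) {
        double ux = b[0] - a[0], uy = b[1] - a[1], uz = b[2] - a[2];
        double vx = c[0] - a[0], vy = c[1] - a[1], vz = c[2] - a[2];
        return new double[]{ uy * vz - uz * vy,
                             uz * vx - ux * vz,
                             ux * vy - uy * vx };
    }

    // For each vertex, sum the normals of every face that touches it,
    // then normalize. Faces are triples of vertex indices.
    static double[][] perVertexNormals(double[][] verts, int[][] faces) {
        double[][] n = new double[verts.length][3];
        for (int[] f : faces) {
            double[] fn = faceNormal(verts[f[0]], verts[f[1]], verts[f[2]]);
            for (int i : f)
                for (int k = 0; k < 3; k++) n[i][k] += fn[k];
        }
        for (double[] v : n) {
            double len = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
            if (len > 0) for (int k = 0; k < 3; k++) v[k] /= len;
        }
        return n;
    }

    public static void main(String[] args) {
        // Two triangles sharing an edge, both lying flat in the z = 0 plane,
        // so every per-vertex normal should come out as (0, 0, 1).
        double[][] verts = {{0,0,0}, {1,0,0}, {0,1,0}, {1,1,0}};
        int[][] faces = {{0,1,2}, {1,3,2}};
        double[][] n = perVertexNormals(verts, faces);
        System.out.println(n[1][0] + " " + n[1][1] + " " + n[1][2]);
    }
}
```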
For now, I’m developing all this in Processing using the PGraphicsOpenGL renderer. I’m not making any actual OpenGL calls directly; I’m only using the Processing API. One little discovery I made yesterday: PeasyCam turns out to be a really sweet little camera library for Processing. It adds mouse control over the camera with a single line of code.
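(For reference, that one line is just the PeasyCam constructor inside `setup()`. Roughly what a Processing sketch looks like with it wired in — this is a fragment that depends on the Processing environment and the PeasyCam library, so treat it as a sketch, not a drop-in file:)

```java
import peasy.PeasyCam;

PeasyCam cam;

void setup() {
  size(800, 600, OPENGL);        // PGraphicsOpenGL renderer
  cam = new PeasyCam(this, 400); // the one line: mouse-driven camera at distance 400
}

void draw() {
  background(0);
  box(100); // PeasyCam now handles rotate / zoom / pan from the mouse
}
```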
The ultimate goal of this is to create two different pieces. One will be a rendered motion piece. It probably won’t have any audio and will function as an ambient looping video. It will be submitted to this call for entries: Flow Interrupted.
The second piece would actually be interactive. I’m going to toy around with camera, audio, and touch as discrete inputs (all separately). This will give me a chance to play with the new HP TouchSmart PC I got. So you could consider the second execution to be more of an interactive installation piece. And that’s going to be submitted to this call for entries: FLUID Interactive Digital Art.
Check out some of the stills: