Experiments with interactivity

2 04 2010

I’ve been trying to figure out the mechanics of how I’m going to make my short video clips interactive. Of course I intend to use the heart rate as a trigger, but I am running into a few roadblocks with the technology (please see the bottom of my Project Summary: Curatorial Notes). As a backup plan, I am thinking of incorporating motion sensors or touch pads (or even mics) in and around the suitcase so that interactivity can be established through motion and sound instead.

In my research into this I’ve been working with Processing. So far I’ve been able to get small video clips to move in relation to cursor control. In theory, if I were to incorporate an Arduino-generated number string into the code (similar to the experiments I did with the potentiometer), this code could be the engine for image generation in relation to the movements of a suitcase. Below is an initial experiment using a clip of me cutting a mango.
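At its core, swapping the cursor for an Arduino reading is just a linear rescale of the incoming number, which is what Processing's map() function does. Here is a minimal sketch of that arithmetic in plain Java, assuming a 0–1023 sensor range like the potentiometer experiments (the 10-second clip duration is a hypothetical example):

```java
public class SensorMap {
    // Equivalent of Processing's map(value, inLo, inHi, outLo, outHi):
    // rescale a reading from one range into another.
    static float map(float value, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * ((value - inLo) / (inHi - inLo));
    }

    public static void main(String[] args) {
        // A potentiometer reading of 512 (roughly mid-range) lands
        // roughly mid-way through a hypothetical 10-second clip.
        float pos = map(512, 0, 1023, 0, 10.0f);
        System.out.println(pos); // roughly 5.0
    }
}
```

Whether the number comes from mouseX or from a serial string sent by the Arduino, the rest of the sketch can stay the same: only the input side of this mapping changes.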

Below is another example using some video I captured of palm trees that are thriving in South London. I like what’s happening here, as one really gets the sense that one can move through the clip. It’s like scrolling through a large picture or document, except that it is a running movie with a sense of inner motion that is independent of one’s own movements.
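The scrolling effect amounts to drawing each movie frame at a negative horizontal offset proportional to the cursor, clamped so the frame's edges never come into view. A small sketch of that offset arithmetic, assuming a movie wider than the display window:

```java
public class PanOffset {
    // Given a cursor x in [0, windowW], return the x at which to draw
    // a frame of width movieW so the visible region pans across it.
    static int drawX(int cursorX, int windowW, int movieW) {
        if (movieW <= windowW) return 0;              // nothing to pan across
        int maxShift = movieW - windowW;              // total hidden width
        int shift = (int) ((long) cursorX * maxShift / windowW);
        if (shift < 0) shift = 0;                     // clamp at the left edge
        if (shift > maxShift) shift = maxShift;       // clamp at the right edge
        return -shift;                                // draw shifted left
    }

    public static void main(String[] args) {
        // Cursor at the far right of a 640-px window, 1280-px-wide movie:
        System.out.println(drawX(640, 640, 1280)); // prints -640
    }
}
```

In a Processing sketch this value would simply be the x argument to image(), recomputed every frame, which is what gives the clip its independent inner motion while you pan.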

In the writing/rewriting of this code, I ran into memory problems, which I was able to fix with a quick adjustment to the cache and a resizing of the video. That said, my project will involve a variety of different video clips, and I will probably run into this problem again. I am aware that I will eventually have to shift away from Processing and probably use Max/MSP or Pure Data (weightier programming environments), but I am loath to give up everything that I have learned with Processing. Maybe there is a way that I could incorporate the two, or possibly simplify the concept so that I would not need as many large video files?
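A rough back-of-envelope calculation shows why resizing helped so much: each decoded RGBA frame costs width × height × 4 bytes in memory, so halving both dimensions quarters the footprint of every frame held in the cache. A quick sketch of that arithmetic (the 1280×720 and 640×360 sizes are hypothetical examples):

```java
public class FrameMemory {
    // Approximate memory for one decoded RGBA frame, in bytes:
    // 4 bytes (red, green, blue, alpha) per pixel.
    static long frameBytes(int w, int h) {
        return (long) w * h * 4;
    }

    public static void main(String[] args) {
        long full = frameBytes(1280, 720); // about 3.7 MB per frame
        long half = frameBytes(640, 360);  // about 0.9 MB per frame
        System.out.println(full / half);   // prints 4: halving both sides quarters the cost
    }
}
```

The same reasoning suggests the other way to stretch the budget without leaving Processing: keep the clips short and small rather than many and large.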