Final programming refinements

17 06 2010

I’ve been working through some final refinements to the Pd patch that I will be using for the installation. My main concern was determining a way to balance the ambient noise against the heartbeat sound in my audio-interactive installation. Basically, I wanted to be sure that the heartbeat sound didn’t dominate the ambient noise, and vice versa, when causing the moving image to pulse. After a couple of intense days with Ed Kelly from the London College of Communication, I now have this final revision of the Pd patch!

Some recent and notable additions to this patch include an ambient sound calibrator and an audio-playing object that allows for tempo and pitch adjustment of the heartbeat. The ‘wiring’ of this patch is now set up a little differently. My previous versions of this program had a direct link between the pulse of the videos and the audio that came in from the microphone. I added a ‘metronome’ and a ‘drunk’ object to a ‘translateXYZ’ object to create a simple zoom effect that zoomed in on the videos depending on the pitch of the sounds the mic picked up. This was effective in a quiet space when all one heard was my looped heartbeat sound, but I knew the installation would be noisy at times, which could cause the video to shake erratically.
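For anyone curious about that combination, the behaviour of Pd’s ‘drunk’ object can be sketched outside the patch. This is a rough Python illustration of the idea only (the step size, value range and zoom mapping are invented for the example, not taken from my patch): a metronome tick drives a bounded random walk, and the walk value becomes the zoom factor.

```python
import random

def drunk_step(value, step_size, maximum):
    """One tick of a bounded random walk, like Pd's 'drunk' object:
    move up or down by at most step_size, clipped to 0..maximum."""
    value += random.randint(-step_size, step_size)
    return max(0, min(maximum, value))

# the metronome would fire this on every tick; here we just simulate 8 ticks
zoom_value = 64
for tick in range(8):
    zoom_value = drunk_step(zoom_value, step_size=8, maximum=127)
    zoom = 1.0 + zoom_value / 127.0  # map 0..127 to a 1.0-2.0 zoom factor
    print(f"tick {tick}: zoom {zoom:.2f}")
```

Because each step is small and clipped, the zoom wanders rather than jumps, which is what gives the image its drifting pulse.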

With some key guidance and advice, the patch now wires a direct number stream from the heartbeat WAV file so that it doesn’t have to pass through the mic (brilliant idea Ed, and no, I don’t think this is cheating). I’ve kept the mic audio input part of the patch but have lessened its influence on the video through a set of objects that auto-calibrate for ambient sound (thank you for all your help with this one too, Ed). Also, the fade between the videos is now linked directly to the heartbeat sound. In effect, the noisier the ambient sound is, the less of video 2 (the text) you will see, that is, until the auto-calibration kicks in. If it is silent, or if noise levels hold at a steady hum, the text will pulse and become more legible. If ambient noise is erratic, all you will see is video 1 (the traffic light themed images) pulsing with the heartbeat and loud sounds in the environment.
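The auto-calibration idea is easier to see in a sketch than in a screenshot of the patch. This is a rough Python illustration, not the actual Pd objects Ed and I wired up, and all the names and constants here are invented for the example: a slow running average tracks the ambient noise floor, and the fade toward video 2 (the text) only opens while the mic level stays close to that floor.

```python
class AmbientCalibrator:
    """Sketch of the auto-calibration idea: a slow exponential
    moving average tracks the ambient noise floor, and the fade
    toward video 2 (the text) opens only while the mic level
    stays near that floor. Constants are illustrative only."""

    def __init__(self, smoothing=0.01, tolerance=0.1):
        self.floor = 0.0          # estimated ambient level
        self.smoothing = smoothing
        self.tolerance = tolerance

    def update(self, mic_level):
        # slowly adapt the ambient floor toward the incoming level
        self.floor += self.smoothing * (mic_level - self.floor)
        # how far the current level sits above the calibrated floor
        excess = max(0.0, mic_level - self.floor)
        # erratic noise -> fade toward video 1; steady hum -> video 2
        return max(0.0, 1.0 - excess / self.tolerance)

cal = AmbientCalibrator()
for level in [0.2] * 200:     # a steady hum...
    fade = cal.update(level)
print(round(fade, 2))         # ...ends up letting the text show through
```

A sudden loud sound sits far above the calibrated floor, so the text fades out instantly, while a constant hum is gradually absorbed into the floor.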

I’ve also added adjustability to the patch’s sensitivity so that when I set up the installation I can digitally fiddle with things until the output is as I would like. My previous plan was to fiddle with the position of the mic and speakers, but this way allows for more fine-tuning. In the end, I can still adjust the hardware side if I so choose 😉

I’ve also worked out how to eliminate the choppiness in the playback of the videos. I changed the compression of the video to the H.263 codec, which suits Macs, and I reduced the file size by opting not to go HD. As a result the video plays better and is more reactive to sound. My previous videos were using 324% of my CPU (when running the program), and I have now reduced that to 36%!

I have also worked out a simple way to show the video fullscreen. I will set up using two screens: 1. my work screen and 2. the projector screen. I’ll link the two screens in Pd, then simply drag the video window onto the projector screen. After that I can unplug my work screen and all that will be left is the playing video. Voila!

Hope this all makes sense. My head is still reeling from the stuff I picked up from the last two days with Ed!





Tweaking the Pd Patch

20 05 2010

I have been working on refining the Pd patch I designed as an image engine/video mixer for the project. So far I am quite happy with how it is looking. As patches go, it’s not overly complicated: mainly a video mixer and an audio feed for a contact mic. The rest is connecting the dots and getting creative with where I put the mic. If anything, I could work on smoothing out the ‘pulsing’ effect. I’ll do this once I’ve settled on the mic/speaker combination I am going to go with and how it all works within the environment of a suitcase.

Pd sketch for Final Piece





Projector Test

18 05 2010

This is a test on the studio projector (800 x 600 resolution). I quite like the look of things in general. The colours came out better than I had expected but I think I will play a bit more with increasing the ‘pulse’ of the video. I will also need to figure out how to lessen the lag between generated sound and video reaction.





Plan C: working through interactivity

17 05 2010

I have moved from an HRMI to an IR motion sensor to both flex and bend sensors, but have not yet been fully satisfied with the results. So now I am on ‘Plan C’.

I have worked through a very interesting pulsing effect with a few ‘objects’ in Pure Data. I have tested this sketch with my latest video and I am really liking the results. The Pd sketch takes an audio input from a mic and translates it into a pulsing of the movie. I am thinking that if I place both a mic and a speaker (playing a pre-recorded heartbeat) within the suitcase, I will be able to achieve a nice pulsing in the video, gain a level of interactivity from the inevitable ambient noise of the audience, and stay within the theme of ‘home is where the heart is’!
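The mapping in that sketch is simple enough to describe outside Pd. A rough Python illustration of the idea follows, with invented function names and numbers (the real patch does this with audio-rate objects): the mic amplitude is clipped and scaled into a video size, so louder sounds make the image swell.

```python
def amplitude_to_pulse(amplitude, base_scale=1.0, depth=0.5):
    """Map a mic amplitude (0.0-1.0) to a video scale factor so
    louder sounds make the image swell. The base_scale and depth
    values are invented for illustration, not from my patch."""
    clipped = min(1.0, max(0.0, amplitude))
    return base_scale + depth * clipped

# silence leaves the image at its base size...
print(amplitude_to_pulse(0.0))
# ...while a finger snap (a short amplitude spike) makes it jump
print(amplitude_to_pulse(0.5))
```

Since the spike decays quickly, the image snaps back to base size almost immediately, which is what reads as a ‘pulse’.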

This is a sample from some of the first tests I did with making the visuals pulse with noise. In the video I snapped my fingers.





(Near) Finalized visual

15 05 2010

This is a fairly finished 5 min film that I intend to loop for the installation. I have maintained the colour theme of traffic signals and have worked with simplified horizontal movements of images and footage. The bars of red, amber and green will float, each taking visual priority from time to time in the film.

Since the symposium, I have tweaked the pacing and the visual precedence each field takes. The red field has footage of foot traffic, which I sped up, while the green has palm trees, which I conversely slowed down for contrast. The fields represent displacement and integration respectively.

The amber field is a photo montage of various parts of ‘Philippine-London’. Since the symposium I have added further images and dimensionality to the previously monotone bar. Amber is the transition between integration and displacement and now consumes most of the time in the clip. Within the amber field I have added splashes of green and red to accentuate it as a place in between displacement and integration.





Working through ‘Plan B’

14 05 2010

After concluding that a heart-rate monitor interface was not going to work, due to the fact that I would not be able to effectively embed the HRMI into the handle of a suitcase, I reluctantly went to work on ‘Plan B’ (the motion sensor). I have always been hesitant to go this route as I didn’t want to lose the heart-rate theme and its associations with home/displacement.

This last week has been spent finalizing the code/sketch for an interactive (sensor-driven) installation in Pure Data (Pd) and refining the visuals the Pd sketch will project. My initial back-up plan, ‘Plan B’, was to work with a motion sensor (in lieu of the HRMI) as a method of creating interactivity between the suitcase, the viewer and the moving image.

PLAN B

Experiments with an infrared (IR) sensor have left me unhappy with its sensitivity. So far I can use it as a basic on/off switch, but I have been really struggling to get the video to play in relation to specific movements. I am realizing that it is either going to take a bit more work to figure out an optimal combination of motion sensors (aimed at the suitcase and participants) to get decent results, or I will have to limit the movement of my audience by directing them to move the suitcase only in certain locations and in specific ways. Furthermore, due to the space constraints of the exhibition, I will most likely have to consider a less elaborate setup that maintains the interactive integrity I’m looking for.

PLAN B (revisited)

This brought me to start experimenting with flex and touch sensors. These essentially work like potentiometers, except that the current changes when the sensor is pressed or bent instead of when a dial is turned. In theory, I could place either the touch sensor below the suitcase or the flex sensor in the suitcase handle to achieve some level of interactivity.
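In that sense the sensor logic is close to the potentiometer code: read an analogue value and scale it into a control signal. A rough Python sketch of the idea follows (the real version runs on the Arduino, and the function name and the 50–900 reading range are my own invented stand-ins for actual calibration):

```python
def sensor_to_control(reading, minimum=50, maximum=900):
    """Map a raw analogue sensor reading (e.g. from a flex or touch
    sensor wired like a potentiometer) to a 0.0-1.0 control value.
    The 50-900 range is an invented stand-in for real calibration."""
    clamped = max(minimum, min(maximum, reading))
    return (clamped - minimum) / (maximum - minimum)

# a flex sensor bending in the handle sweeps the control smoothly...
print(sensor_to_control(475))
# ...while a simple threshold turns it back into an on/off switch
is_lifted = sensor_to_control(475) > 0.25
```

That last line is really the whole dilemma: once I threshold the value, the expensive sensor is doing no more than a power switch would.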

After ordering a variety of sensors from Sparkfun.com and modifying the initial code I hacked out a few months ago with the potentiometer (link here), I finally got something to work with the sensors. However, I am still feeling dissatisfied with the overall feel of things. The idea of picking up a suitcase just to play a video seems a bit gimmicky. Why go through the complication of sensors, Arduino boards and a computer when I could just hook up some kind of on/off power switch to a projector set to play a looped video? How is this any less interactive? I could even place the switch just below the suitcase so that it would turn on and off as one lifts the suitcase. This would certainly leave less room for the unwanted ambient interference that I am experiencing with the IR motion sensors, and would be much less complicated than feeding a touch sensor through an Arduino board.

Touch and Flex Sensors attached to Arduino Board

With my time dwindling I’m coming to a crisis point, and I am questioning the purpose of interactivity and why I feel it isn’t currently working as I want it to. What it comes down to is that initially my intent was to create interactivity based on a heart-rate monitor interface. The HRMI worked because it fit with the theme of home and displacement and fulfilled my project aims of connecting my digital practice to my interest in the Philippine diaspora in a very relevant way. The theme of ‘Home is where the heart is’ nicely encompassed sentiments of the community I was studying and the direction of the research I had done into digital technologies.

Interactivity added a level of immersion to the project. Consequently, the user-directed change of images from an HRMI highlighted the fluid nature of diaspora, which cannot necessarily be pinned down to a looped film. I liked that each viewer would be privy to a unique modification of moving-image montages that could never be repeated again. I liked that this was connected to the suitcase, which linked this fluidity of images to the physical movement of people. I liked that this was connected to home and the heart.

I still do not want to compromise on this even though technically I have run into a few roadblocks… Time will tell. I am going to have to do some hustling.





Symposium 2 work

1 05 2010

Below is a sample piece that I will use for the symposium on Tuesday. In the next post I will include the text and any additional images that I will use in my presentation, but for now I thought it prudent to show the example work that best exemplifies where the project currently stands. Below is the intended transition clip, which will play immediately after the viewer picks up the suitcase.