On Friday, after watching all the kickstarter videos from the groups, we were finally able to set up and test our installation in its full form. I had finished off all the Kinect code, and we had successfully linked up both the TUIO messages driving the visuals and the MIDI messages driving the music.
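For the curious, the plumbing between the touch tracking and the sound is conceptually simple: blob positions arrive as TUIO (OSC) messages and get forwarded on as MIDI control changes for the music side to pick up. Here’s a minimal sketch of that kind of bridge, using python-osc and mido purely as illustrative stand-ins (the port, controller number, and library choice are assumptions for the example, not our actual setup):

```python
# Illustrative TUIO -> MIDI bridge (not our exact code): listen for TUIO
# cursor messages on the default TUIO port 3333 and forward the blob's x
# position as a MIDI control change an effect parameter can be mapped to.
import mido
from pythonosc import dispatcher, osc_server

midi_out = mido.open_output()  # default MIDI output port


def on_cursor(address, *args):
    # TUIO "/tuio/2Dcur set" messages carry a session id plus normalised x/y (0.0-1.0)
    if args and args[0] == "set":
        x = args[2]
        midi_out.send(mido.Message('control_change', control=1, value=int(x * 127)))


disp = dispatcher.Dispatcher()
disp.map("/tuio/2Dcur", on_cursor)
osc_server.BlockingOSCUDPServer(("0.0.0.0", 3333), disp).serve_forever()
```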

It does get kind of annoying putting up and taking down the spandex every time, but I would much prefer it be safe and undamaged, so I guess I’m just going to have to put up with it :( Anyway, we set up our installation in a section of the room and got to work calibrating the Kinect and projector to be at the perfect distance, which is again a time-consuming part of the process. Once we had set everything up, we ran the Kinect and the visuals first, and we all gave a collective sigh of relief when everything was working perfectly. One minor problem was the speed of my laptop: since I have been working on this from home on my much faster desktop, it has always run fast and smooth. My laptop is only really used to take notes and is horrible, so running the Kinect, projector, and visuals at the same time was taking its toll. Whilst it did all function correctly, there was a bit of a delay between the touches on the spandex and the visuals appearing, which almost defeats the purpose of them being interactive. Obviously we had never intended to run the program at the exhibit using my awful little laptop, but until now we had also never bothered to put it on a Mac Mini. So after a bit of hassle we got everything set up and working on the Mac Mini, ran everything, and watched on in adoration as the visuals responded with zero delay.

After that, we set up the music side of things with Ableton, finally giving us a chance to test out the seven new funky tracks Naomi had pumped out over the week. Whilst the music sounded fantastic (after some incidents with the Mac Mini and the speakers…), the x and y position of the blob across the entire spandex surface was being mapped to MIDI values 0 to 127, so you would need to drag the blob from one side of the canvas all the way to the other in order to hear an effect’s full range. Because you lose the blob when you take your hand off or aren’t pushing hard enough, and because the elastic we are using to prevent large blobs gets in the way, that is near impossible to achieve. Whilst we knew effects were being applied, and we could subtly hear them, it is unlikely that a visitor to the installation would have been able to tell.
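To make the problem concrete, the old mapping was essentially the following (a rough sketch, with the helper names and the normalised 0.0–1.0 coordinates being illustrative rather than our exact code):

```python
# Old mapping (sketch): the blob's position over the WHOLE sheet is scaled
# straight to 0-127, so only a drag across the full canvas sweeps an effect's
# entire range.
def to_cc(v: float) -> int:
    """Clamp a normalised 0.0-1.0 value into the MIDI 0-127 range."""
    return max(0, min(127, int(v * 127)))

def whole_surface_to_midi(x_norm: float, y_norm: float) -> tuple[int, int]:
    """Map the blob's position over the entire spandex to two CC values."""
    return to_cc(x_norm), to_cc(y_norm)
```

With a mapping like that, a drag covering only a third of the sheet never pushes a CC value past about 42, which is why the effects were so hard to pick out.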

As a result, I have now mapped the MIDI values into three separate sections, matching where the spandex is divided by the elastic, so you will be able to hear the full range of the effects by moving the blob within a much smaller area. We are hoping to test this out in the session tomorrow, as well as sort out the twitter bot, music, and streaming stuff so we can be super organised for the exhibit. We will also need to take a couple more shots of the installation working for the posters and flyers, and to put into the video.
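Roughly, the new mapping looks like this (again just a sketch, assuming the elastic splits the sheet into three equal columns, which won’t be exact in practice):

```python
# New mapping (sketch): work out which of the three sections the blob is in,
# then scale its position WITHIN that section to the full 0-127 range, so a
# short drag inside one section sweeps the whole effect.
def section_to_midi(x_norm: float, sections: int = 3) -> tuple[int, int]:
    """Return (section index, CC value 0-127) for a blob at normalised x_norm."""
    section = min(int(x_norm * sections), sections - 1)  # which column the blob is in
    local = x_norm * sections - section                  # 0.0-1.0 within that column
    return section, max(0, min(127, int(local * 127)))
```

For example, a blob at x = 0.2 sits in the first section and already reads as CC value 76, which under the old mapping would have needed a drag across more than half the sheet.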