Moving organisms

As mentioned in tutorial 2, I imagined the next sensible step would be to make some kind of dynamic organism, based on the lessons learned from my X-Swings. I made a preliminary sketch and intuitively wrote a key sentence underneath it: it should be a dynamic organism that adapts to our being and expresses ongoing change. Well, these aims will not be achieved lightly, and certainly not within the first experiments, but they give me a driving force and a direction. However, while working on and reflecting on the thing at hand, I feel great interactive potential ahead.

[Preliminary sketch with the key sentence underneath]

One could argue that creating an organism out of mere technical parts is far too anthropomorphic; it might suggest that the thing involved is humanised on the basis of its movement alone. But I would add that the entity has sensors, that chance principles are involved, and that it uses brain power, i.e. computer logic, as well. It thus exhibits some essential characteristics of life, so I dare to use the term organism here.

Experiment 3101 started from the idea of using copper wire to build a flexible, antenna-like artefact, as I had done before but quite differently. That resulted in the piece to be found here. The idea was to use a servomotor and a small camera connected to the Arduino. In this way some form of interaction could be introduced, based also on the Processing programs I made for X-Swings. The following drawing shows the idea.

[Drawing of the idea: servomotor, camera and Arduino]

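To make the intended loop concrete: a minimal Processing sketch along these lines could look as shown below. This is not the code of the experiment itself; the serial port, baud rate, sensitivity threshold and trigger byte are all assumptions, and the Video and Serial libraries are presumed to be installed.

    // Minimal sketch (assumptions, not the actual experiment code):
    // measure overall motion in the camera image and, above a threshold,
    // send a trigger byte to an Arduino that drives the servo.
    import processing.video.*;
    import processing.serial.*;

    Capture cam;
    Serial arduino;
    PImage previous;          // previous frame, for simple frame differencing
    float threshold = 20.0;   // assumed sensitivity; to be tuned by experiment

    void setup() {
      size(640, 480);
      cam = new Capture(this, width, height);
      cam.start();
      arduino = new Serial(this, Serial.list()[0], 9600); // assumed port
      previous = createImage(width, height, RGB);
    }

    void draw() {
      if (cam.available()) {
        cam.read();
        cam.loadPixels();
        previous.loadPixels();

        // average brightness difference between this frame and the last one
        float motion = 0;
        for (int i = 0; i < cam.pixels.length; i++) {
          motion += abs(brightness(cam.pixels[i]) - brightness(previous.pixels[i]));
        }
        motion /= cam.pixels.length;

        image(cam, 0, 0);
        previous.copy(cam, 0, 0, width, height, 0, 0, width, height);

        if (motion > threshold) {
          arduino.write('T'); // assumed trigger byte, handled on the Arduino side
        }
      }
    }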
First, my focus had to be on the mechanics and the questions that came with them: which servo was suitable, which camera would be connected, what frame rates and delay rates to use for triggering, and how long, and thus how flexible, the wire could be while still coping with the forces… One servo broke during the experiment; I opened it up, looked at the cogs and recognised what had happened while the first images were being produced.
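One of those timing questions, how often the servo may be triggered, can be phrased as a simple cooldown; a sketch of that idea, with an assumed interval:

    // Trigger cooldown sketch (the interval is an assumption):
    // ignore new triggers until enough time has passed, to spare the servo.
    int lastTrigger = 0;
    int cooldown = 1500; // assumed minimum pause between triggers, in ms

    void setup() {
      size(200, 200);
    }

    void draw() {
      boolean motionDetected = random(1) < 0.05; // stand-in for real tracking
      if (motionDetected && millis() - lastTrigger > cooldown) {
        lastTrigger = millis();
        println("trigger servo"); // here one would write to the Arduino
      }
    }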

The Processing settings were not adjusted at all, as I just wanted to see how the tracking would respond to me. But soon I realised that ‘it’ reacted mainly to its own sweeping motion and to the light situation in my room. I then looked at what the camera saw by mixing the images simultaneously, and got the blurred imagery shown underneath.

[Blurred imagery from the mixed camera images]

Originally I had not anticipated the images at all. Then I realised that the analogue and the digital were each stating something about the same reality, shown simultaneously in my laptop images. Every second a new, and sometimes really surprising, visual came up. What a fine yield to build upon!
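The blurring itself needs surprisingly little code. In Processing, drawing each new camera frame semi-transparently over what is already on screen lets the movements smear into one another; a minimal sketch, with an assumed opacity value:

    // Minimal blending sketch (the opacity value is an assumption):
    // without clearing the background, each semi-transparent frame
    // mixes with the accumulated image and produces the blurred trails.
    import processing.video.*;

    Capture cam;

    void setup() {
      size(640, 480);
      cam = new Capture(this, width, height);
      cam.start();
    }

    void draw() {
      if (cam.available()) {
        cam.read();
        tint(255, 40);    // assumed opacity: lower values give longer trails
        image(cam, 0, 0); // new frame mixed over the previous ones
      }
    }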

Testing and playing with the available parameters, I could make the system react softly, and also quite ‘scary’ (according to my daughter). By dimming the light, it became less sensitive and the images changed as well, as one can see in the second part of the video of the experiment. I made the motion-tracking lines run from black to white, depending on the z-axis motions. The grey-toned lines felt more suitable than the bright colours from before, as they represented the sensed tracking data in a more neutral way.
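The black-to-white mapping can be illustrated with a small sketch; the motionZ value and its range here are assumptions standing in for the sensed z-axis motion, not values from the experiment:

    // Hypothetical illustration of the black-to-white line mapping;
    // motionZ and its 0..100 range are assumptions, not measured values.
    void setup() {
      size(400, 400);
      background(128);
    }

    void draw() {
      float motionZ = noise(frameCount * 0.01) * 100;  // stand-in for sensed z-motion
      float grey = map(motionZ, 0, 100, 0, 255);       // 0 = black, 255 = white
      stroke(grey);
      line(random(width), random(height), random(width), random(height));
    }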

When reflecting on the loop of events (SCAN > TRACE > MAKE > SEND > TRIGGER > SCAN, and so on), I noticed the resemblance to my experimental scheme, in which feedback also plays an important role. I realised how essential ‘feeding back’ is in my work: it is the way a system, much like an organism, adjusts to the latest information and does its next interesting thing.
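Written out as a code structure, that loop could read as follows; the function names simply mirror the stages named above and are placeholder stubs, not the actual sketch:

    // The loop of events as a Processing skeleton; the function bodies
    // are placeholder stubs, not the code of the actual experiment.
    void setup() {
      size(640, 480);
    }

    void draw() {
      float sensed = scan();        // SCAN: read the camera / sensors
      float traced = trace(sensed); // TRACE: extract tracking data
      make(traced);                 // MAKE: draw the visual response
      send(traced);                 // SEND: pass the value on, e.g. over serial
      // TRIGGER: the servo moves and changes what will be SCANned next,
      // which closes the feedback loop.
    }

    float scan()         { return random(1); } // stub: stand-in for camera input
    float trace(float s) { return s; }         // stub: tracking would happen here
    void make(float t)   { stroke(t * 255); line(0, 0, width, height); } // stub
    void send(float t)   { /* e.g. writing to the Arduino over serial */ } // stub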


A video of Experiment 3101 can be found here.


The next step will be a new experiment to find out how interaction based on one's own movement can be built upon.

I also want to do a test with the acceleration sensor in a Phidget, to see whether that improves the human interaction as well.

As a result of these and earlier experiments, I got a new and playful idea for an interactive, immersive video installation, which I will develop next week. Hopefully it will work out well, so that I can use it for one of the two open calls I am working on this month.