After being introduced to Isadora less than two months ago, I am really pleased with my final product and experience. I set out to create an interactive nature experience about the things people can find in their garden, using the range of resources Isadora offers: media playback, motion tracking, mouse and keyboard input, audio and video input, and more. I have successfully created the experience I proposed, and discovered new ideas and possibilities along the way – many of which never made it into the final project, but will be used in future explorations of Isadora. That said, there are some shortcomings whose resolution I believe would further enhance the experience.
Firstly, while my experience is very visually interesting, I feel the audio could do with some improvement. There is some audio throughout – for example in several of the video clips in the ‘Make a Documentary’ and ‘Watch a Documentary’ experiences, and in the soundtracks – but using audio in more places would have provided a more immersive experience. The home screen, for example, would work really well with some immersive wildlife sound effects. This could have been done using a randomised Isadora patch, much like the one I used to create the randomised documentary experience. Another consideration would be the use of spatial audio. Within Isadora, each Movie Player actor can have a different audio output, meaning each Movie Player or Sound Player could be routed to a different audio device connected to the computer; alternatively, if an external audio interface with multi-channel output is connected, each Movie Player actor could be routed to a corresponding audio channel. This would allow me to simulate a surround-sound experience, with each audio device playing a different sound. If this project were to be exhibited, this would provide a much better experience, as the whole room then becomes immersive, rather than the focus resting solely on the screen. I could have simulated this by having my desktop speakers play one sound and a portable speaker behind me play another.
Another simpler, yet also effective, audio technique for simulating surround sound would be to manipulate the ‘pan’ input on the Movie Player and Sound Player actors. This would allow me to crossfade between the left and right audio channels, so if the user is wearing headphones (as I was in the demonstration) or using a pair of stereo speakers, audio could be panned across the two channels to create a spatial audio effect. This could even be driven by motion tracking on the Intel RealSense depth camera, so that when my head turns one way, the audio pans into the corresponding left or right channel.
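Isadora handles this panning inside its actors, but the underlying head-yaw-to-pan mapping could be sketched as follows. This is a minimal Python illustration, not Isadora code; the function name and the ±45° tracking range are assumptions for the sketch, and an equal-power pan law stands in for whatever curve Isadora applies internally.

```python
import math

def pan_gains(yaw_degrees, max_yaw=45.0):
    """Map head yaw (negative = left, positive = right) to left/right
    channel gains using an equal-power pan law, so perceived loudness
    stays roughly constant as the head turns."""
    # Clamp yaw to the assumed tracked range, then normalise to 0..1
    yaw = max(-max_yaw, min(max_yaw, yaw_degrees))
    position = (yaw + max_yaw) / (2 * max_yaw)  # 0 = full left, 1 = full right
    # Equal-power crossfade between the two channels
    left = math.cos(position * math.pi / 2)
    right = math.sin(position * math.pi / 2)
    return left, right
```

With the head centred (yaw 0), both channels receive equal gain; turned fully left, the right channel falls silent, mirroring the crossfade described above.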
Another experience that could be further improved is the 360 immersive experience. Overall, this was an experiment with Isadora’s 360 Player actor, and with how a 3D object like a sphere can be mapped to a texture to create a 360-degree experience. I decided to implement it in the final project as a brief demonstration of the other immersive experiences that can be made within Isadora. However, it is not really any different from a standard 360 YouTube video, and in many cases is in fact not as good: the controls are more prone to error and the experience is overall not as smooth (i.e., you have to use the X and Y axis control pad, rather than simply panning around the image on the screen). A really interesting improvement would be to use the Intel RealSense depth camera, along with the Skeleton Decoder actor, to control which way the 360 video is pointing. This could be achieved with very slight movements of the head on the X, Y and Z axes, where, with a large enough screen, the user could move their head to change the position of the camera while still seeing the video output. Furthermore, the audio of the 360 experience could be significantly improved. At the moment, the demonstration simply uses the Insta360 One R’s built-in microphone to record the audio, which is then output as normal by a Movie Player actor in Isadora. One improvement would be to create spatial audio. For example, I could attach another actor to the output of the X and Y axis Limit-Scale Value actors, feeding either an ‘Inside Range’ actor or a ‘Comparator’ actor, to implement:
IF [X value = (specified number)] AND [Y value = (specified number)] THEN [play a sound].
This would mean that if the X and Y positions were each a certain value, a sound would be triggered. For example, if the camera was pointed towards a bird, a bird sound would play, and if it was pointed towards a squirrel, crunching sounds would play. This would make the experience far more immersive than it currently is.
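The logic of the ‘Inside Range’ route above can be sketched in plain Python. This is only an illustration of the patch logic, not Isadora code: the region coordinates and sound file names are hypothetical, and ranges are used rather than exact equality, since the X and Y values coming from the Limit-Scale Value actors are continuous and would rarely land on a single exact number.

```python
# Hypothetical regions of interest in the 360 video, expressed as
# (x_min, x_max, y_min, y_max, sound) tuples -- all values illustrative.
REGIONS = [
    (40, 60, 10, 30, "birdsong.wav"),
    (100, 130, -20, 0, "squirrel_crunch.wav"),
]

def sound_for_orientation(x, y):
    """Return the sound to trigger for the current camera orientation,
    or None if the view is not pointing at any region of interest."""
    for x_min, x_max, y_min, y_max, sound in REGIONS:
        # Equivalent of an 'Inside Range' check on each axis, ANDed together
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return sound
    return None
```

Pointing the view inside the first region would trigger the bird sound; anywhere outside all regions, nothing plays.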
As stated, I envision my experience being placed in a museum, gallery or activity area. Generally, I feel I have met the needs of this type of environment. However, confining the experience to a 1920 x 1080, 16:9 stage could be seen as somewhat limiting. I have implemented motion tracking, audio input and output, and spatial awareness, which does, on the whole, fulfil the needs of the project. Yet I feel I could utilise the space around the exhibition in a much more creative way. This could include projection mapping, where I have multiple stages within Isadora, each displaying a different output, making the experience more ‘phygital’ – using technology to bridge the digital and physical worlds in order to provide a unique interactive experience for the user. Technician and lecturer Montgomery Martin spoke about the importance of this, suggesting that ‘it is so important to not lose sight of the space that you are working in, in particular because over the years, I have found that a simple effect created with Isadora, one that shows strong consideration for the space and the bodies within that space, is often more effective, meaningful and more memorable than the most complex bit of programming or generative art I’ve created with Isadora. What you make with Isadora needs to be part of the environment that it inhabits – it needs to feel responsive like it belongs there’ (2020).
Finally, there are some small adjustments I would make to give the experience more polish. In the ‘Make a Documentary’ experience, although the videos play in a random order, each with a random duration, the start position of every video is the same – they all start from the beginning. While the same video will never play twice in a row, there is a possibility of it playing twice within the same 60-second sequence. Even though the two repeated clips will have different durations (one may be 5 seconds and the other 3), they both start from the same point in the video, making it appear as if the same video is playing twice. An improvement would be to also randomise the start position of each video, so that it becomes very unlikely for the exact same part of the exact same video, at the exact same duration, to play twice. I do, however, acknowledge that this would be more complicated, as the random duration could be set longer than what is actually available to play from the chosen start position.
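The complication noted above – a random duration overrunning the footage left after a random start point – can be avoided by clamping the start position to the playable range. A minimal Python sketch of that logic (function name and the 3–5 second duration range are assumptions; in Isadora this would be wired from Random actors into the Movie Player’s position and duration inputs):

```python
import random

def pick_clip_segment(video_length, min_dur=3.0, max_dur=5.0):
    """Choose a random start position and duration for one clip,
    ensuring the segment never runs past the end of the video."""
    duration = random.uniform(min_dur, max_dur)
    # A very short video cannot hold the full chosen duration
    duration = min(duration, video_length)
    # Clamp the start so at least `duration` seconds remain to play
    latest_start = max(0.0, video_length - duration)
    start = random.uniform(0.0, latest_start)
    return start, duration
```

Because the start point is drawn only from the range that leaves enough footage, every segment fits inside the video, so two occurrences of the same clip will almost always show different parts of it.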
Of course, I could simply add another 20 videos into the mix, so that it becomes almost impossible for the 60-second sequence to repeat any videos at all.
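How much adding videos helps can be estimated with the classic ‘birthday problem’ calculation. The sketch below is a simplification of my actual patch – it assumes clips are drawn uniformly at random with replacement and ignores the no-adjacent-repeat rule, and the clip counts are illustrative – but it shows the general shape: repeats stay likely until the pool is much larger than the number of clips per sequence.

```python
def repeat_probability(num_videos, clips_per_sequence):
    """Probability that at least one video appears twice when
    `clips_per_sequence` clips are drawn uniformly at random,
    with replacement, from `num_videos` videos."""
    # Probability that every draw picks a video not yet used
    p_all_distinct = 1.0
    for i in range(clips_per_sequence):
        p_all_distinct *= (num_videos - i) / num_videos
    return 1.0 - p_all_distinct
```

For example, a 60-second sequence of roughly a dozen 3–5 second clips drawn from only 10 videos is essentially guaranteed a repeat, while drawing the same number of clips from 30 videos makes a repeat noticeably less likely, though still far from impossible – which is why randomising start positions remains worthwhile even with a larger pool.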
Overall, I am incredibly pleased with the final project and demonstration. From a personal point of view, I have learned a brand new piece of software which I didn’t know existed two months ago, and am now looking forward to diving deeper into the possibilities of Isadora and the notions of creative and integrative digital media as a whole.
Below is a link to my final video demonstration of the experience, in addition to an overview of the Isadora patches:
troikatronix, 2020. Isadora 101 – #16: Ten Best Practices. [online video] Available at: https://www.youtube.com/watch?v=AQyFYey1NwU [Accessed 17 December 2020].