Immersive Media Lab

Developing a multi-sensory interactive prototype


My interest in immersive and interactive media has grown over the months I have been studying the MA Immersive Media degree. I have developed a range of skills and knowledge surrounding technologies that enable me to build rich experiences. Whilst the technology is only a tool to tell the story, it is important for me to focus on specific areas of the field so I can begin planning my further research and practice. Over the course of this module I have completed three projects that share a common theme of mixed and augmented reality (MR/AR). Why this focus? I mentioned in my post about OrnARment, the AR iPhone app I made in Unity, that I quite like the way pressing a digital button makes something change in real life. It turns out that I like things the other way around too – we can interact with a seemingly ordinary object and get an unexpected sensory response. I’m not so sure why I like it, but I do, so let’s leave it at that.

It’s this sensory response that made me really understand what I would like to do with this medium going forward – a stimulating, sensory experience. Whether this is purely discovery-based or has a narrative, I am not sure yet, but exploring sensory input and output is a way of developing the skills to facilitate the narrative when the time comes. Last year, I created an interactive documentary called ‘Sensing Synaesthesia’, based on the condition synaesthesia, which is:

‘a “union of the senses” whereby two or more of the five main senses that are normally experienced separately are involuntarily and automatically joined together. Some synaesthetes experience colour when they hear sounds or read words. Others experience tastes, smells, shapes or touches in almost any combination. These sensations are automatic and cannot be turned on or off’ (Wannerton, UKSA).

I had always said to myself that I did not want to try to replicate these sensations, because it would be impossible and perhaps wrong to attempt. However, now that the tools are becoming available to me, I want to at least try to develop a prototype that could begin to simulate these sensory responses, or at the very least act as a metaphor for what these sensations could be. I will attempt to explore three types of synaesthesia:

  • sound-colour synaesthesia (chromesthesia)
  • colour-number synaesthesia (grapheme colour)
  • a form of synaesthesia that evokes touch/tactility-based responses, which could relate to most other forms of synaesthesia.

When making my previous interactive documentary, I interviewed my friend who has synaesthesia. She uses her abilities to produce artwork as a response to her synaesthetic relationship between music and colour. This is one of the themes I would like to explore in developing this experience: allowing users to construct visual representations of sounds – they could even play the sounds themselves, almost like they are ‘painting’ a song.

‘Studio Play’ by the Cleveland Museum of Art is an example of what the aesthetics of my prototype could look like. It uses body and hand tracking sensors to allow visitors to paint and draw within their environment. I find this engaging because the users are not interacting with the technology; the technology is interacting with them. There are no buttons or controllers, and it’s this lack of digital control that makes it feel so natural, like an extension of your own senses. Removing this connection is, I think, really important to creating an experience that, much like synaesthesia, feels subliminal.

This was something that my interactive documentary lacked last time – it felt far too strict. I made a JavaScript experience where the user could press buttons on a little synthesiser and shapes would appear. The shapes, however, always appeared in exactly the same position, so the concept of painting a song was pretty pointless, because everyone ended up with the same painting. And god forbid you tried to play two notes at once – it just wouldn’t work.
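In principle, the fix for both problems is simple: place each shape at a varied position rather than a fixed one, and track every currently held note instead of just one. A minimal sketch of that idea (the original was JavaScript; this is Python, and all names here are illustrative, not from the actual project):

```python
import random

class SongPainter:
    """Tracks held notes and places a shape at a varied position per press."""

    def __init__(self, width=800, height=600, seed=None):
        self.width = width
        self.height = height
        self.rng = random.Random(seed)  # a different seed gives a different painting
        self.held_notes = set()         # a set supports chords, not just one note
        self.shapes = []

    def note_on(self, note):
        """Register a pressed note and add a shape at a random position."""
        self.held_notes.add(note)
        x = self.rng.randrange(self.width)
        y = self.rng.randrange(self.height)
        self.shapes.append({"note": note, "x": x, "y": y})

    def note_off(self, note):
        """Release a note; its shape stays in the painting."""
        self.held_notes.discard(note)
```

Because the positions come from a seeded random generator, every session produces a different painting, while two notes played together simply both land in `held_notes`.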

So, I will attempt to use Isadora to develop this prototype. Two things about the platform excite me: I know I can make it a different experience for every user, and it has extensive abilities to enable interactivity from live inputs – a significant requirement for sensory input, and somewhere that platforms such as Unity or Unreal Engine fall short without additional hardware. Furthermore, Isadora’s simple yet powerful visual representations will enable me to create interesting synaesthetic visuals. Hardware such as a basic MIDI piano keyboard, coupled with an audio DAW, could accept user input to create sounds, which could then be projected visually onto the Isadora stage, with each note producing a different shape and colour, along with other varying factors such as position and size.
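To make the note-to-visual mapping concrete, here is a sketch of the logic I have in mind – not Isadora itself (Isadora would do this with actors and a MIDI watcher), just one possible chromesthesia-style mapping, where the hue, shape and size choices are my own invented assumptions:

```python
import colorsys

SHAPES = ["circle", "triangle", "square", "pentagon", "hexagon", "star"]

def note_to_visual(midi_note, velocity=100):
    """Map a MIDI note number (0-127) to a colour, shape and size.

    Pitch class (C, C#, D, ...) picks the hue, the octave picks the
    shape, and velocity scales the size -- one illustrative mapping,
    not a claim about what synaesthetes actually perceive.
    """
    pitch_class = midi_note % 12
    octave = midi_note // 12
    hue = pitch_class / 12.0                      # 12 hues around the colour wheel
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return {
        "rgb": (round(r * 255), round(g * 255), round(b * 255)),
        "shape": SHAPES[octave % len(SHAPES)],
        "size": 10 + velocity,                    # louder notes draw bigger
    }
```

With a mapping like this, middle C and the C an octave above share a colour but differ in shape – a consistency that matters, since synaesthetic associations are stable for a given person.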

For grapheme-colour synaesthesia, the easy option would be to use a regular keyboard, where each letter produces a visual or sound. Honestly, this is boring, and even if people don’t find it boring, I will not learn anything from it. Therefore, I envisage an experience where the user has some letter and number cutouts (think fridge magnets) that react when placed in the view of a camera/sensor. Even better – someone could write something, physically or digitally, and an optical character recognition (OCR) system would turn it into an audio-visual representation, outputting the characters into Isadora’s Keyboard Watcher. This one will be tricky, so let’s see what happens.
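Whatever the input route – magnets under a camera, or OCR’d handwriting – the core of the grapheme-colour step is a consistent character-to-colour mapping: the same letter must always produce the same colour. A hedged sketch of that idea, with the palette derived from a hash (an arbitrary choice of mine; a finished version might let a synaesthete calibrate their own associations):

```python
import colorsys
import hashlib

def grapheme_colour(char):
    """Return a stable RGB colour for a single letter or digit.

    The colour comes from a hash of the character, so it is arbitrary
    but consistent: 'A' always yields the same colour, mirroring how
    grapheme-colour associations stay fixed for a given person.
    """
    digest = hashlib.md5(char.upper().encode("utf-8")).digest()
    hue = digest[0] / 255.0                 # first hash byte picks the hue
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.95)
    return (round(r * 255), round(g * 255), round(b * 255))

def colour_word(text):
    """Map each letter/digit of an OCR'd string to its colour, skipping noise."""
    return [(c, grapheme_colour(c)) for c in text if c.isalnum()]
```

An OCR stage would simply feed its recognised string into `colour_word`, and the resulting colours could then drive the projected visuals.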

Finally, for the touch/tactility responses – I have access to a Leap Motion, a hand tracking device that can interface with Isadora to allow all ten of my fingers to create visuals interactively. I could perhaps use this for specific gestures, but it may not provide the tactile quality I am after. Using the Eyes++ actors or an infrared camera to detect when the user is touching a real-life object may be a more suitable alternative.
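The Eyes++/infrared-camera route boils down to change detection: compare the live camera frame against a reference frame of the untouched object, and treat a large-enough patch of changed pixels as a touch. A toy sketch of that logic on plain grayscale frames (real tracking would use Eyes++ or a library like OpenCV at full resolution; the thresholds here are arbitrary):

```python
def detect_touch(reference, current, diff_threshold=30, min_pixels=4):
    """Return True if enough pixels differ between two grayscale frames.

    `reference` is a frame of the untouched object and `current` the
    live frame; both are 2D lists of 0-255 brightness values. A pixel
    counts as changed when it differs by more than `diff_threshold`,
    and a touch is reported once `min_pixels` pixels have changed.
    """
    changed = 0
    for ref_row, cur_row in zip(reference, current):
        for ref_px, cur_px in zip(ref_row, cur_row):
            if abs(ref_px - cur_px) > diff_threshold:
                changed += 1
    return changed >= min_pixels
```

The same comparison run per region, rather than over the whole frame, would tell the patch which specific object was touched, so each object could trigger its own sensory response.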

In the end – maybe next year – I want this to develop into a mixed reality (MR) physical environment: a room full of nothing but the senses, the cross-wiring of the senses, and objects that are tactile and sensory-enriched. It would be pretty good if people came to see it too, but let’s not get too excited. This prototype will at least give me a foundation of what is possible with my current skillset, and of what I could do to make it even more immersive.

References

Wannerton, J., n.d. UK Synaesthesia Association. [online] UK Synaesthesia Association. Available at: <https://uksynaesthesia.com/> [Accessed 30 November 2021].

Hu, R., 2014. Infographic – Synesthesia. [image] Available at: <https://www.behance.net/gallery/18146345/Infographic-Synesthesia> [Accessed 9 December 2021].