Cymatics: “(physics) the study of wave phenomena, esp. sound, and their visual representations.”
Some of the earliest audio-visualisation came from cymatics, a field pioneered by Hans Jenny. Cymatics is the visual representation of sound: when sound is passed through different materials, the vibrations create a diverse range of visual patterns (as displayed in the Nigel Stanford video above). Liquids, particles, grains or pastes can all be used to create these effects, each with different outcomes. When liquids of varying viscosity are used, they can create almost ‘dancing’ figures:
Cymatics has been described as a way to “enable us to experience sound through vision” (Lewis, S. 2010), which has been a huge influence on my work. I have always been interested in the movement of natural forms, such as particles and liquid, and in the idea of bringing life to stationary objects; doing this through the medium of sound will bring a new dimension to my work. Bringing life to objects was the idea behind my Ferrofluid project, in which I made a black liquid come to life using magnets and sensors.
I also have a passion for music and audio, so the study of cymatics and how objects move in reaction to audio is very interesting to me – which is where my idea of immersive music visualisation has come from. For my project, instead of using tangible objects like those in the image and video above, I want to simulate these effects in After Effects to create the immersive experience.
Lewis, S. (2010) Seeing Sound: Hans Jenny and the Cymatic Atlas. University of Pittsburgh. Available at: http://d-scholarship.pitt.edu/7448/1/StephenLewisBPhil2010.pdf (Accessed: 2 December 2016).
Music visualisation usually consists of visuals generated in real time to synchronise with music/audio; it is often used for music videos or in live music performances. My idea is to combine the live-performance use of music visualisation with an immersive, individual environment. Visuals are used very often in live music performances, and more often than not are synchronised to the various characteristics of the music, whether through lighting and lasers or through projected/screen-based visuals. Many people have explored why synchronised visualisation is used in live music performances; Marco Filipe Ganança Vieira (2012) explains that through music visualisation it is possible to get a “better understanding about the music feelings constructing an intensive atmosphere in the live music performance, which enhances the connection between the live music and the audience through visuals”. In a sense, the visuals anchor the music and help portray the range of emotions it can express. For my project, I intend to create a variety of visuals that correlate to different music genres, and therefore express each song in a different way.
Here are some examples of music visualisation:
Although this moves slightly away from the idea of binaural audio, I still want my project to be immersive for an audience. I have therefore come up with two ideas for how I can create an immersive music-visualisation experience. The first is a cylindrical projection:
This would involve projecting visuals onto a cylindrical surface. The audience would step inside, and the visuals would surround them almost 360˚ (excluding the entrance way). The audience would also listen to the music via headphones; I intend to edit the music so that different parts of the song play through the left and right channels, making the music sound like it surrounds them. The idea is to give the user a live, immersive music experience, much like being at a concert where visuals are displayed alongside the audio, except that it is an individual experience rather than a group/crowd experience.
When I first thought of the idea, I drew some initial (very rough) sketches of what I intend the space to look like:
From this I then made my own mock-up, using a screenshot from the visuals I have previously created.
This image shows the user/audience in the middle of the projection, wearing headphones, and the visuals surrounding them.
A second idea is to put the audio-visualisation into a virtual reality headset, so that the visuals surround the user, creating an immersive audio-visual experience. With this method I would have to map the audio to the 3D environment, so that when the user looks left or right, the audio correlates to where they are looking; this can be done in Unity. The main visuals would sit in the middle (straight ahead); when the user looks to the left or right, they would be presented with the visuals matching the audio coming out of that specific channel. The surrounding space would be filled with much smaller particles that are still affected by the music.
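Unity's built-in spatialiser would handle this mapping for me, but the underlying idea can be sketched in a few lines of Python (purely as an illustration – the function name and angle convention are my own) as constant-power panning driven by head yaw:

```python
import math

def pan_gains(yaw_degrees):
    """Map a head yaw angle (-90 = facing fully left, 0 = centre,
    +90 = facing fully right) to (left, right) channel gains using
    constant-power panning. A simplified illustration of the idea,
    not Unity's actual spatialisation model."""
    # Clamp the yaw to the supported range, then normalise to [0, 1]
    yaw = max(-90.0, min(90.0, yaw_degrees))
    position = (yaw + 90.0) / 180.0       # 0 = left, 1 = right
    angle = position * math.pi / 2.0      # sweep a quarter circle
    # cos/sin keep total power constant: left² + right² = 1
    return math.cos(angle), math.sin(angle)

# Facing straight ahead: both ears at equal power (~0.707 each)
print(pan_gains(0))
# Facing fully right: the right channel dominates
print(pan_gains(90))
```

Because the gains trace a quarter circle, the perceived loudness stays constant as the user turns their head, which is exactly the behaviour I would want from the headset version.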
I have done a mock-up of how this would look inside the VR headset:
The mixed audio channels would be presented in the middle, and when the user looks left or right, they would see the visualisation of that specific channel.
By definition, immersion is the “state of being deeply engaged or involved; absorption” (http://www.dictionary.com/browse/immersion). In today’s digital and gaming technologies, many products are being made to ‘immerse’ an audience or user in a different world or environment. With the rise of virtual reality, new gaming technologies have moved away from the screen and into a whole new 3D world. And although VR can be seen by some as one of the few ways to completely immerse oneself in a game or visual, immersion can be achieved in many different ways. As McMahan states, “immersion is not totally dependent on the physical dimensions of the technology” (McMahan, A., 2011); the state of immersion can therefore be felt in many different ways, VR being only one of them. McMahan goes on to explain that there are three conditions that need to be met to create a sense of immersion:
1. the user’s expectations of the game or environment must match the environment’s conventions fairly closely
2. the user’s actions must have a non-trivial impact on the environment
3. the conventions of the world must be consistent
I will keep these conditions in mind when creating my work, because although my final outcome is not going to be a game – which is what McMahan is referring to – the conditions can be applied to other forms of immersion. I will make sure that the conventions of immersion, such as 360˚ visuals and surround sound, are applied in my own work, to ensure that the audience/user engages completely with the project.
(McMahan. A, 2011, http://www.phil-fak.uni-duesseldorf.de/fileadmin/Redaktion/Institute/Kultur_und_Medien/Medien_und_
Whilst writing my dissertation, I found real enjoyment in exploring the relationship between audio and light, which gave me a new idea for my final project – Immersive Music Visualisation. Live music and the visuals that go hand-in-hand with it have always been of real interest to me, so being able to create my own, but for an individual audience in an immersive environment, is something that really excites me. This is something I will be passionate about, because it is something I love watching and would love to be a part of creating.
I started to explore After Effects and the Trapcode plugin further to create some initial audio-reactive visuals. Using the Trapcode Form plugin, I created the following visuals:
Due to the amount of detail, this takes a while to render, so for demonstration purposes I have only rendered out a small clip. I really like this technique as it is liquid-like, but if the colour were changed to shades of orange it could equally resemble fire. This is what fascinates me so much about After Effects and the Trapcode plugin – the versatility and the range of outcomes that can be created. I will explore other techniques and other elements of Trapcode to display this versatility.
Using Adobe Audition I edited a sound effect of a train passing by, and one of birds tweeting. I used Audition’s channel adjustment tools for the train clip, and 5.1 editing for the bird clip. Listening with headphones, you should hear how the audio differs between each ear.
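Audition performs this channel editing interactively, but the underlying operation is simple to sketch. A rough Python illustration (a hypothetical helper of my own, not Audition's actual processing) that sweeps a mono sound from the left ear to the right over its duration, much like the train pass-by:

```python
def sweep_left_to_right(mono_samples):
    """Pan a mono signal (a list of samples) from the left channel to
    the right over its duration, returning (left, right) sample lists.
    A stand-in for the interactive channel editing done in Audition."""
    n = len(mono_samples)
    left, right = [], []
    for i, sample in enumerate(mono_samples):
        t = i / max(n - 1, 1)             # 0.0 at start, 1.0 at end
        left.append(sample * (1.0 - t))   # fades out of the left ear
        right.append(sample * t)          # fades into the right ear
    return left, right

# A constant test tone of five samples: starts fully left, ends fully right
left, right = sweep_left_to_right([1.0] * 5)
print(left)   # [1.0, 0.75, 0.5, 0.25, 0.0]
print(right)  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

Writing the two lists out as the left and right channels of a stereo file would give the same passing-by effect heard in the train clip.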
If I choose to go down the route of simulating 360˚ audio, rather than recording it live, then this is the method I will use, as it is easy to do and the outcome is very similar to a live recording.