Immersive Music Visualisation
An individual visual experience in a cylindrical projection
To summarise my project: my idea is to create audio-reacting visuals, displayed in a cylinder for an individual audience member. I will base the idea around the theme of cymatics and bringing particles to life, with the theories of immersion and escapism at the forefront of the project. I will experiment with different genres of music to gain insight into the different ways they can be visualised, and I intend to build on my skills in simulating 360˚ audio in Adobe Audition, so that I can edit an entire song to sound as though it surrounds the listener. I will learn as much of the Trapcode suite as I possibly can, so I am aware of all of the possibilities that can be created using After Effects. My research into companies such as Projection Artworks and groups like Analema will continue, to see if they give me further inspiration throughout the project.
As my work will follow an iterative process, I will continually learn and build from any feedback and evaluation on my work, until I reach my final goal of creating an immersive music visualisation experience.
Throughout this unit we have been discussing our ideas within the group in our seminars. Each week everyone was given the chance to offer constructive feedback on the project ideas of others. One of the comments I had on my idea was that I could think about music visualisation for people with hearing impairments. Although I hadn't specifically thought about making the project for people with impaired hearing, the project lends itself to this anyway: I want the visuals to represent the music very closely, so I hope that even without hearing the music, it will be apparent what type of music is playing.
I was also able to give feedback on other people's work, and I like to think my comments have helped. For example, Alex Sisan (BoozeBible) pitched a search function for his app that would show the pubs/bars closest to the user. My feedback was to make the search function more of a 'filter' system, where the user could select the types of venue they wanted to visit and anything irrelevant would be filtered out. I suggested this because the app already had a location-based element, so I did not think two sections needed to include location.
In Semester 2, for the Graduate Project, I plan on collaborating with many different people, predominantly music artists and producers. I intend to contact numerous artists and producers to ask for their permission to use their music in my project. I will also ask whether they would like to contribute to how the visuals look, and how they feel their music should be portrayed visually. It could be very interesting to see how my views differ from theirs: it is their own music, and chances are they will already have an image in their minds of how they see it visually.
Here I did a small experiment on myself – I thought it would be a good idea to see what patterns I draw when I listen to genres of music, as this could help with my design process in regards to what visuals best suit which song/genre.
The first doodle is of a classical, instrumental song. The song felt very flowing, but each change of note or chord made me draw a change of direction or a new 'squiggle'. I can imagine the visuals for this being very smooth: long lines rather than shapes with sharp corners.
The second doodle is of a modern rock song. It started off similar to the classical song, but then a strong beat was introduced, hence the peaks in the drawing. That was just the intro; when the chorus kicked in, the beat became much faster. For this, I can imagine the visuals looking more like an abstract equaliser, to display those fast beats.
The final doodle is of a drum and bass song. It had a very long, slow intro (from the middle outwards), during which I drew a steady spiral. An individual beat was then introduced (the triangle area), until the main drop happened, where the spikes became continuous for a while. For this, I can imagine something pulsing when the beat kicks in, then cutting to something a lot calmer when the song returns to the low-BPM sections.
My project will be exhibited as part of the Graduate Show for the entire course. This means it will gather a specific demographic: audiences that are likely to attend and engage with an exhibition.
Exhibitions and galleries generally attract similar types of audiences. Arts Council England's Arts audiences: insight (2011) report reviews audiences' engagement with the arts. It identifies three types of people who are likely to visit exhibitions: the highly engaged 'Urban arts eclectic', plus the 'Fun, fashion and friends' and 'Mature explorers' segments, who each have only some engagement with the arts; each has a specific demographic profile. What I find interesting, however, is that of the thirteen segments the Arts Council identified, only three would be likely to attend an exhibition. The general demographics of these exhibition-attending types are that they are likely to be well educated, on average or above-average pay in full-time work (or still in higher education), of mixed gender, and mostly white.
Although I find it interesting to look at the demographics of certain audiences, my project does not have a specific intended target audience: it can be used by people of all ages and with varying interests, so the audience of my project will be anyone who attends the exhibition. From attending the previous exhibition, the general audience is fellow students, staff members, friends and family, and prospective employers. However, if my project is displayed elsewhere after the exhibition, the general audience could change again, so I do not have a specific target audience for my project.
Arts Council England (2011) Arts audiences: insight [online]. Arts Council England.
As well as the visuals I created previously, I have continued to explore the Trapcode suite and I have created a further four audio-reacting visuals using Trapcode Particular or Trapcode Form.
This video was made using Trapcode Form. I used the plugin's audio-react function to link the music (Muzzy – Junction Seven) to the visuals. The effect in the middle was created using the spherical field layer, where I adjusted the strength to 'stick out' of the base form. When the new beat comes in, I created a new camera layer and adjusted where the camera was pointing, to get a different angle on the form.
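Outside of After Effects, the kind of mapping that the audio-react function performs can be sketched in plain JavaScript: an amplitude envelope is remapped into a visual parameter range, much like AE's `linear()` expression does with Convert Audio to Keyframes. The function name, the noise floor, and the "field strength" range below are all illustrative assumptions, not Trapcode's actual API:

```javascript
// Clamped linear remap, equivalent in behaviour to After Effects'
// linear(t, tMin, tMax, v1, v2) expression helper.
function linearMap(t, tMin, tMax, v1, v2) {
  if (t <= tMin) return v1;
  if (t >= tMax) return v2;
  return v1 + (t - tMin) * (v2 - v1) / (tMax - tMin);
}

// Per-frame amplitude samples (e.g. from Convert Audio to Keyframes), 0..1.
const amplitude = [0.05, 0.2, 0.9, 0.6, 0.1];

// Map amplitude to a hypothetical field "strength" between 20 and 150,
// ignoring anything below a small noise floor of 0.1.
const fieldStrength = amplitude.map(a => linearMap(a, 0.1, 1.0, 20, 150));

console.log(fieldStrength);
```

The noise floor stops quiet passages from jittering the visual, which is a common trick when driving a parameter directly from an amplitude keyframe.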
This is the second visual I have been working on. It is based on a Northern Lights preset in Trapcode Particular, which I edited to make it more abstract. The movement is randomly generated; the glow is the one element connected to the music. I feel this kind of visual would work best with a slower genre of music, so the flowing movements of the visuals connect with the slower/longer notes and chords in a song.
This is the third visual I have created: a simple spherical particle field, which reacts to the audio by dispersing particles predominantly from the top of the sphere. This is a very basic form, but it has the scope to be far more flexible and to create really interesting visuals.
After uploading this video I discovered the 'Kaleidospace' element of the Trapcode Form plugin, which puts a mirroring effect onto the composition.
I then watched some further tutorials on the Trapcode Form plugin and was able to make this visual, which looks like a distorting floorboard. I really like this idea, as it can be the basis for many different music genres: it can sit at the bottom of the projection while a main visual 'floats' in the middle. Using the camera in After Effects and the depth-of-field setting can drastically change how a composition looks. I think it works very well here, and I can imagine that to someone watching in virtual reality it would look 3D.
I will continue to experiment with different forms of visuals to create different effects, because if I have time, I would like to be able to create a number of different visuals to match different genres of music.
“Escapism is a way of avoiding an unpleasant or boring life, especially by thinking, reading etc. about more exciting but impossible activities.” (http://dictionary.cambridge.org/dictionary/english/escapism)
Although I agree with this definition, I feel that in the digital age it goes a lot further than merely avoiding reality. It is more about making time away from everyday reality to situate oneself somewhere else, such as watching a fictional TV show or playing a video game. As fashion designer Iris van Herpen explains, "escapism is about the addiction of constantly escaping reality by digital entertainment". I largely agree with this statement, but not that it is an addiction: most explanations of escapism describe it as a "cause to 'leave' the reality in which they live" (Vorderer, 1996, p. 311), meaning it is not constant, only triggered. Escapism can be perceived and explored differently by each individual, and some may not even think they are escaping anything through digital entertainment, so no general statements can be made about it. However, I hope the audience of my project could accept it as a means of escapism: by being completely immersed in the music and audio-reacting visuals, they will automatically be taken away from the realities that surround them.
After looking into the likelihood of borrowing an Oculus Rift or HTC Vive for the duration of the project, I have decided to proceed with the cylindrical projection immersion idea instead. Although I would have really enjoyed learning a new plugin to create VR, it is not going to be practical. I also prefer the projection idea, as I like working with tactile objects rather than in a virtual space.
To experiment with projecting onto a curved surface, I practised using a bin. I took the visuals I had already produced and put the video into MadMapper, where I resized and reshaped it to how it would best look projected.
As you can see, the video only covers part of the bin's curved surface; if I made the video any larger, the image would have spilled onto the wall rather than wrapping around the bin. This shows that for a full 360˚ projection around this object, I would need at least four projectors, one for each 'side'.
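The projector count is really just a bit of arithmetic: each projector only usefully covers a slice of the cylinder's circumference, so dividing 360˚ by the usable coverage per projector gives a minimum count. The coverage and overlap figures below are assumed for illustration; in practice they depend on the projector's throw ratio, its distance, and how much edge overlap is needed for blending:

```javascript
// Rough estimate of projectors needed for full cylindrical coverage.
// coveragePerProjector and overlapDeg are assumed values, not measurements.
function projectorsNeeded(totalAngle, coveragePerProjector, overlapDeg = 10) {
  // Some of each projector's throw is lost to edge blending with its neighbour.
  const effective = coveragePerProjector - overlapDeg;
  return Math.ceil(totalAngle / effective);
}

console.log(projectorsNeeded(360, 100)); // → 4
```

Under these assumptions, four projectors at roughly 100˚ of throw each would wrap the full cylinder, which matches the estimate above.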
I also made a small-scale prototype of the cylinder (above) to demonstrate how I imagine the space will look. Even though I am not projecting anything onto the cylinder, the light from my room shows that I will not be able to project from outside the cylinder at all, due to the shadows that appear inside. This proves I will have to use multiple projectors inside the space, to avoid lighting issues and to make sure the entire surface is covered by the image.
The company Projection Artworks recreated a large-scale version of the famous Fabergé eggs for a Harrods window display, using a model and projection mapping. They used 16 projectors and stitched the images together to cover the entire surface of the sculpture. Their task was particularly tricky because the projection had to be viewable in daylight, meaning the image had to be far brighter than it would need to be in darkness.
In relation to my project, this shows the number of projectors needed to project onto a round surface. Although I'm sure I won't need 16, I will need at least three or four to cover the surface area of an almost-360˚ cylinder. My next stage will be to look further into cylindrical projection, and the technicalities of what I will need for the projection to work.
After recently attending a workshop with Oliver Gingrich from Analema Group (https://analemagroup.wordpress.com), I was really inspired by the work the group produces and how it relates to my own project. Connecting with my recent research into cymatics as well as audio-visualisation, the group have a project called KIMA which stems from cymatics research; they describe the works as "Music for your eyes and ears."
Their installations have been displayed at Incloodu Deaf Arts festival, whereby the audience can feel the vibrations of sound through the bench they sit on, and they can see the cymatic visuals displayed via projection.
I can see many similarities between the work Analema Group produce and the ideas I have for my final project. KIMA focuses on sounds and how they are represented visually, and many of their installations are interactive. I will therefore take inspiration from Analema, as I can see how their work has evolved over time through the feedback they gather at events and the use of new technologies.
Ma (2010) explains that prioritising is based on a project's requirements, but most projects do not have unlimited resources, so "stakeholders need to decide which requirements should be implemented. Requirements prioritisation helps the project developers to select the final candidate requirements within resource constraints". For my project I am therefore implementing the MoSCoW method, to prioritise the requirements that are a necessity for the outcome over those that are not. From this, I have created a Gantt chart to organise my time throughout the project, which not only prioritises requirements but also gives a logical structure to make sure everything is done in time.
The MoSCoW method is a prioritisation technique used to set out what a project MUST have, SHOULD have, COULD have and WON'T have. It clearly identifies the elements the project must have to succeed, the elements that would help it succeed but are not entirely necessary, and the things the project definitely will not include. It helps a team or individual to gain a "view of what is essential for launch and what is not" (Waters, 2009).
For my project, my prioritisation will be as follows:
Must have:
- Audience/user immersion
- Music visualisation
- Cylindrical projection
- Music through headphones

Should have:
- Different music genres
- Different visuals to suit different genres of music
- Music edited to play in 360˚ around the listener

Could have:
- A way of users choosing their own song/genre
- Option for complete 360˚ projection, rather than nearly 360˚ (leaving space for the user to enter)

Won't have:
- Space for multiple people – it will be for individual use at this stage
- A screen interface for music choice
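The same prioritisation can also be kept as a small data structure, which makes it easy to pull out the essential items when planning each iteration. The sketch below uses a handful of the requirements above; the tags and names are just an illustration of the method, not a project artefact:

```javascript
// MoSCoW prioritisation as data: each requirement is tagged
// "must" | "should" | "could" | "wont", then filtered per iteration.
const requirements = [
  { name: "Music visualisation",       priority: "must" },
  { name: "Cylindrical projection",    priority: "must" },
  { name: "Different music genres",    priority: "should" },
  { name: "User song/genre selection", priority: "could" },
  { name: "Space for multiple people", priority: "wont" },
];

// The items to tackle first: everything tagged "must".
const mustHaves = requirements
  .filter(r => r.priority === "must")
  .map(r => r.name);

console.log(mustHaves); // → ["Music visualisation", "Cylindrical projection"]
```

Keeping the list in one place like this means the plan and the priorities can't drift apart as the iterations progress.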
I have created a Gantt chart to help me organise my time for the project. It is divided into a week-by-week timetable, with the different strands I will need to work on throughout the project, such as learning software and audio gathering.
As explained in my Iteration Process post, there will be various iterations of my work, which have been accounted for in the Gantt chart: I will create a prototype, test it and gain feedback, then work on the outcomes of this testing-and-feedback cycle until I am completely happy with the final project.
Ma, Q. (2010) The Effectiveness of Requirements Prioritization Techniques for a Medium to Large Number of Requirements: A Systematic Literature Review. Dissertation thesis. Auckland University of Technology. Available at: http://aut.researchgateway.ac.nz/bitstream/handle/10292/833/MaQ.pdf?sequence=3 (Accessed: 3 December 2016).
Waters, K. (2009) Prioritization using MoSCoW. AllAboutAgile. Available at: https://cs.anu.edu.au/courses/comp3120/local_docs/readings/Prioritization_using_MoSCoW_AllAboutAgile.pdf (Accessed: 30 November 2016).