The relationship between music and architecture has been shaped over millennia by complex causal links between composer, musician, instrument, space, and the emotional responses of listeners. The sounds and spaces arising from these links form a continuity over time that has informed both the way we listen to music and the way we design rooms to accommodate it. However, the development of acoustic theory in the early twentieth century gave rise to a series of rigid conventions in acoustic design, affecting both musical and architectural composition and considerably slowing their co-evolution.
Current research into emotional response and music undertaken at Aalto University by Tapio Lokki and Jukka Pätynen uses anechoic recordings of key pieces of Western music – Beethoven, Mozart, Bruckner – to investigate the effect of classical music on the occupants of differing room types. Lokki and Pätynen’s research brings essential clarity to critical historical decisions previously mired in woolly, subjective thinking.
This paper builds on that work by breaking down the concept of ‘music’ into digitally evolved sounds, and ‘space’ into an emergent virtual entity capable of adapting to a user’s emotional response. It follows two strands of investigation: firstly, the study of evolving sound in a fixed space and, secondly, the evolution of space in relation to fixed music.
Using virtual auditory systems, biometric sensing, and evolutionary computation, this research deploys a new toolset with which to optimise the development of our engagement with sound, space, and emotional response. This paper reviews a series of listening tests that aim to accelerate the co-evolution of sound in space, and of space with sound.
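The approach described above amounts to closing an evolutionary loop around the listener: candidate sounds (or spatial configurations) are ranked by a biometric measure of the listener’s response, and the fittest candidates are varied to produce the next generation. The paper does not specify an algorithm, so the following is a minimal, hypothetical sketch of such a loop. Every name here is illustrative: `simulated_arousal` merely stands in for a real biometric signal, and the parameter vectors stand in for synthesis or room parameters.

```python
import random

def simulated_arousal(params, target):
    """Stand-in for a biometric reading: higher when the candidate's
    parameters lie nearer an (unknown to the algorithm) preferred target."""
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def evolve_sounds(target, dims=4, pop_size=20, generations=50, seed=0):
    """Evolve parameter vectors against the simulated listener response."""
    rng = random.Random(seed)
    population = [[rng.uniform(0.0, 1.0) for _ in range(dims)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank candidates by the listener's (simulated) response.
        population.sort(key=lambda p: simulated_arousal(p, target),
                        reverse=True)
        parents = population[: pop_size // 2]
        # Refill the population with Gaussian-mutated copies of the best half.
        children = [[g + rng.gauss(0.0, 0.05) for g in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda p: simulated_arousal(p, target))

best = evolve_sounds(target=[0.2, 0.8, 0.5, 0.3])
```

In the actual experiments a real sensor reading would replace `simulated_arousal`, making the fitness function interactive rather than analytic; the selection-and-mutation structure of the loop is otherwise unchanged.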