Augmented Soundscapes

Sound, together with the other senses, profoundly influences our perception of the environment. The multi-modal nature of perceptual processing shapes our cognitive interpretation of environmental scenes, as well as the semantic and aesthetic judgments we make about them.

In Fall 2008, a series of surveys was conducted on the University of California, Berkeley campus with a group of volunteer participants, all graduate students from the College of Environmental Design. Four locations on campus were pre-selected for the study, and each participant was randomly assigned three of them to visit and evaluate. At two of the three locations, participants wore a mobile audio-augmented reality device and freely explored their surroundings on foot. At the third location, they removed the headphones and interacted with the environment in an unmediated manner. After exploring each environment, participants completed a short survey and an interview.

The mobile audio-augmented reality system was designed to deliver a geo-located, immersive acoustic experience to its wearer. Participants could see, smell, and touch the physical environment and move around freely while wearing the headphones connected to the system, but their acoustic experience was replaced by a virtual 3D soundscape delivered through the headphones. By dynamically tracking each user's geo-location and head orientation, the system presented virtual sound sources that remained stable in space, complete with simulated spatial and directional cues.
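To illustrate the geometry behind such world-anchored sources, the sketch below computes a virtual source's azimuth relative to the listener's head from GPS coordinates and a tracked head yaw, then derives crude directional cues. This is a minimal illustration, not the project's actual spatialization algorithm; the function names, the Woodworth spherical-head ITD model, and the equal-power level pan are all assumptions made for the example.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
HEAD_RADIUS = 0.0875    # m; a typical value in spherical-head ITD models

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the listener to the source, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def relative_azimuth_deg(lat, lon, head_yaw_deg, src_lat, src_lon):
    """Source azimuth relative to head orientation: 0 = straight ahead, positive = right.

    Subtracting the tracked head yaw from the world bearing is what keeps a
    virtual source anchored in place as the listener turns their head.
    """
    world = bearing_deg(lat, lon, src_lat, src_lon)
    return (world - head_yaw_deg + 180.0) % 360.0 - 180.0

def binaural_cues(azimuth_deg):
    """Crude directional cues: Woodworth ITD plus an equal-power level pan."""
    az = math.radians(azimuth_deg)
    lateral = math.asin(math.sin(az))  # fold rear azimuths into [-90, 90] degrees
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (lateral + math.sin(lateral))  # seconds
    pan = (math.degrees(lateral) + 90.0) / 180.0 * (math.pi / 2.0)
    gain_left, gain_right = math.cos(pan), math.sin(pan)
    return itd, gain_left, gain_right
```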

The study examined the effects of the soundscape on environmental perception in situations where visual and acoustic information were combined with varying degrees of congruence. The type and degree of congruence were classified both by the inherent aesthetic characteristics of the sounds themselves and by their semantic and cultural compatibility with the existing places.

A detailed account of this study is included in Gokce Kinayoglu's Ph.D. dissertation in architecture, titled "The Role of Sound in Making Sense of Place in Real, Virtual and Augmented Environments" (UC Berkeley, December 2009). The study was also presented at the eCAADe 2009 conference in Istanbul, Turkey, in September 2009.

The project runs in the Max/MSP real-time signal-processing environment and uses GPS and real-time head-tracking hardware.
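The Max/MSP patch itself is not reproduced here. As a purely hypothetical sketch of the data flow, the loop below polls position and head-yaw readings and refreshes each source's cues on every update, reusing the helpers from the earlier sketch. The callables `read_gps`, `read_head_yaw`, and `apply_cues`, the update rate, and the example coordinates (placed roughly near the Berkeley campus) are all stand-ins for whatever the actual hardware and patch provided.

```python
import time

# Hypothetical geo-located sources: (latitude, longitude, sound identifier)
SOURCES = [
    (37.8719, -122.2585, "fountain"),
    (37.8713, -122.2592, "bells"),
]

def update_loop(read_gps, read_head_yaw, apply_cues, rate_hz=20):
    """Poll the trackers and refresh each source's spatial cues ~20x per second.

    read_gps() -> (lat, lon); read_head_yaw() -> degrees from north;
    apply_cues(sound_id, itd, gain_left, gain_right) hands the cues to the
    audio engine. All three callables are stand-ins for the real sensor and
    DSP hookups; relative_azimuth_deg and binaural_cues come from the
    earlier sketch.
    """
    while True:
        lat, lon = read_gps()
        yaw = read_head_yaw()
        for src_lat, src_lon, sound_id in SOURCES:
            az = relative_azimuth_deg(lat, lon, yaw, src_lat, src_lon)
            apply_cues(sound_id, *binaural_cues(az))
        time.sleep(1.0 / rate_hz)
```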

With special thanks to Prof. David Wessel and Adrian Freed from CNMAT for their help with the development of the interface and audio spatialization algorithm.