Syn(es)thetic Reality

simulating synesthesia for the non-synesthetic

 

We generally perceive and understand our surroundings through our five senses: hearing, sight, taste, touch and smell.

We describe our reality in terms of these senses, perhaps as a summation of them. An object is qualified by the color we see or the texture of the sound we hear; these bits are added together to form a description in our brains.

What, then, would our understanding of the world be if we perceived it not as a sum but as a product of our senses? Could an object be described by the color we hear and the sounds we see?

This is the phenomenon of synesthesia, and Syn(es)thetic Reality aims to create a synthetic experience of it.

‘Syn(es)thetic Reality’ explores a new way of sensing the world: understanding sounds through colors. The project achieves this through a mixed reality web application, run on a mobile phone mounted in a wearable viewer. The web application is accessible at https://vinay-khare.github.io/SynestheticReality/. It shows a camera feed on the display while altering the colors so that they correspond to the sounds captured by the microphone. Each hue is mapped to a specific range of frequencies, and each color saturates towards its original hue depending on the amplitude of the sound.
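The frequency-to-hue mapping can be sketched as a pure function. The details below are assumptions for illustration only: the function names, the logarithmic scale over the audible range (20 Hz to 20 kHz), and the direct mapping of that scale onto 0–360 degrees of hue are not taken from the project's actual implementation.

```javascript
// Hypothetical sketch of the sound-to-color mapping: map a frequency to a
// hue on a logarithmic scale over the audible range, then let amplitude
// drive saturation (silence -> gray, loud -> fully saturated).

// Map a frequency in Hz to a hue in degrees (0-360).
function frequencyToHue(freqHz) {
  const fMin = 20;     // assumed lower bound of the audible range
  const fMax = 20000;  // assumed upper bound of the audible range
  const clamped = Math.min(Math.max(freqHz, fMin), fMax);
  const t = Math.log(clamped / fMin) / Math.log(fMax / fMin); // 0..1
  return Math.round(t * 360);
}

// Combine hue with an amplitude (0..1) into a CSS hsl() color string.
function soundToColor(freqHz, amplitude) {
  const hue = frequencyToHue(freqHz);
  const saturation = Math.round(Math.min(Math.max(amplitude, 0), 1) * 100);
  return `hsl(${hue}, ${saturation}%, 50%)`;
}
```

A logarithmic scale is used here because pitch perception is roughly logarithmic in frequency; a linear scale would compress most musical content into a narrow band of hues.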

 


What is synesthesia?

Synesthesia is a rare condition, present in only 4.4% of the population, in which stimulation of one sense involuntarily triggers another. Chromesthesia, one of 80 types of synesthesia, deals with the interconnection between the visual and auditory senses.

 

The Web App

Electronic devices and sensors such as microphones, speakers, cameras and displays allow the properties of the physical world to be translated into digital signals and vice versa. Using such devices, we can simulate the synesthetic cross-wiring of the senses in a digital manner.

‘Syn(es)thetic Reality’ is a web application intended primarily for mobile devices in XR (mixed reality) mode with Google Cardboard or similar viewers. It shows a camera feed on the display in XR format while altering the colors so that they correspond to the sounds captured by the microphone.

Each hue is mapped to a specific range of frequencies. As sound occurs, each color saturates towards its original level depending on the amplitude of the sound within that range.
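On the image side, this per-pixel recoloring can be sketched as follows. The band layout (12 equal bands over 360 degrees of hue) and the function names are assumptions for illustration, not the project's actual configuration.

```javascript
// Hypothetical sketch of the per-pixel recoloring: each pixel keeps its hue,
// but its displayed saturation is scaled by the current loudness of the
// frequency band that hue is mapped to.
const NUM_BANDS = 12; // assumed number of frequency bands

// Pick the band index (0..NUM_BANDS-1) a given hue belongs to.
function bandForHue(hueDegrees) {
  const h = ((hueDegrees % 360) + 360) % 360; // normalize to 0..359
  return Math.floor(h / (360 / NUM_BANDS));
}

// Given a pixel's hue, its original saturation, and the current amplitudes
// (0..1) of each frequency band, return the saturation to display:
// a silent band renders the pixel gray, a loud band restores its color.
function displaySaturation(hueDegrees, originalSaturation, bandAmplitudes) {
  const amp = bandAmplitudes[bandForHue(hueDegrees)] || 0;
  return originalSaturation * Math.min(Math.max(amp, 0), 1);
}
```

In the browser, the band amplitudes would come from something like the Web Audio API's `AnalyserNode` frequency data, and the per-pixel loop would run on the camera frame; both are omitted here to keep the sketch self-contained.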

 

Headset

Design iteration 1

Design iteration 2

Final iteration (material: metal)

 

Syn(es)thetic Reality is a project of IAAC, the Institute for Advanced Architecture of Catalonia, developed in the Masters in Advanced Interaction in 2019 by:
Student: Vinay Khare
Faculty: Luis Fraguada, Elizabeth Bigger