 

aura.inhouse.0.2
An interior version of the Aura system; for details, see aura.urban.

 

aura.inhouse.0.1 wall

 
 

Technical Scheme (See diagrams and pictures)
Aura comprises a computer surround-sound system (a PowerBook, a multichannel audio interface, and four speakers), a sensor unit (four ultrasonic and two infrared sensors, a servo motor, and an Arduino board), a number of LEDs, a microphone, a video camera (iSight), and a projector.
The four speakers are evenly distributed around a square space (at least 4 x 4 meters), each facing outward toward a wall so that the sound is reflected and feels more immersive.

The sensor unit, microphone, iSight, LEDs, and computer system are built into a single stand placed at the center of the space. Four ultrasonic sensors, facing the four directions, track people's movement at longer range, while two infrared sensors detect people close to the stand and respond to their gestures, as sketched below.
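
A minimal Arduino-side sketch of this sensing stage might look like the following. It assumes HC-SR04-style trigger/echo ultrasonic sensors and analog-output infrared sensors; the exact sensor models, pin assignments, and serial format are illustrative, since the original notes do not specify them.

```cpp
// Hypothetical Arduino sketch: poll four ultrasonic sensors (HC-SR04-style,
// assumed) and two analog infrared sensors, then print one line of readings
// over serial for Max to parse. Pin assignments are illustrative only.

const int TRIG_PINS[4] = {2, 4, 6, 8};   // ultrasonic trigger pins
const int ECHO_PINS[4] = {3, 5, 7, 9};   // ultrasonic echo pins
const int IR_PINS[2]   = {A0, A1};       // analog IR distance sensors

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(TRIG_PINS[i], OUTPUT);
    pinMode(ECHO_PINS[i], INPUT);
  }
}

// Fire one ultrasonic ping and return the distance in centimeters.
long readUltrasonicCm(int trig, int echo) {
  digitalWrite(trig, LOW);  delayMicroseconds(2);
  digitalWrite(trig, HIGH); delayMicroseconds(10);
  digitalWrite(trig, LOW);
  long duration = pulseIn(echo, HIGH, 30000);  // timeout at ~5 m range
  return duration / 58;                        // microseconds -> cm
}

void loop() {
  // One space-separated line per frame: "u0 u1 u2 u3 ir0 ir1"
  for (int i = 0; i < 4; i++) {
    Serial.print(readUltrasonicCm(TRIG_PINS[i], ECHO_PINS[i]));
    Serial.print(' ');
  }
  Serial.print(analogRead(IR_PINS[0]));
  Serial.print(' ');
  Serial.println(analogRead(IR_PINS[1]));
  delay(50);  // roughly 20 sensor frames per second
}
```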
The microphone and the iSight are integrated and mounted on a servo motor, positioned so that people can easily look into the camera or vocalize toward the microphone (singing, screaming... whatever). The microphone both collects environmental sound and receives people's vocal input, which mutates the soundscape according to sonic properties such as pitch, frequency, noisiness, and brightness.
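
In the piece itself this analysis happens inside Max/MSP (with objects of the analyzer~/fiddle~ family, for example). Purely to illustrate what such descriptors compute, here are two of the simplest ones in plain C++: RMS amplitude as a loudness measure, and zero-crossing rate as a rough proxy for noisiness and brightness.

```cpp
// Illustrative sketch, not the actual Max patch: two cheap audio
// descriptors of the kind the vocal-input mapping relies on.
#include <cmath>
#include <cstddef>

// Root-mean-square amplitude of one buffer of samples (loudness).
float rms(const float* buf, size_t n) {
  double sum = 0.0;
  for (size_t i = 0; i < n; i++) sum += buf[i] * buf[i];
  return (float)std::sqrt(sum / n);
}

// Fraction of adjacent sample pairs that change sign: higher for noisy,
// bright signals; lower for dark, tonal ones.
float zeroCrossingRate(const float* buf, size_t n) {
  size_t crossings = 0;
  for (size_t i = 1; i < n; i++)
    if ((buf[i - 1] < 0.0f) != (buf[i] < 0.0f)) crossings++;
  return (float)crossings / (n - 1);
}
```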

When the iSight detects movement in front of it, it is triggered to rotate; this rotation further twists the soundscape, and the captured video is projected as noise-like imagery into another space. Once the iSight is triggered to rotate, the six distance sensors are switched on, and people's movement drives the soundscape to drift slowly around the space. Meanwhile, one white LED indicates the system's power status, while several other white and red LEDs start flashing whenever the iSight is triggered.
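
One way to wire up this trigger behavior is sketched below, under assumed conventions: Max performs the motion detection on the iSight feed and sends a single trigger byte over serial, and the Arduino responds by sweeping the servo and flashing the alert LEDs. The pin numbers and the one-byte protocol are hypothetical.

```cpp
// Hypothetical trigger handler: Max detects motion in the iSight feed and
// sends a 'T' byte over serial; the Arduino then rotates the camera servo
// and flashes the alert LEDs. Pins and protocol are illustrative.
#include <Servo.h>

Servo camServo;
const int SERVO_PIN     = 10;
const int POWER_LED     = 11;            // steady white LED: power status
const int ALERT_LEDS[3] = {12, 13, A2};  // white and red flashing LEDs

void setup() {
  Serial.begin(9600);
  camServo.attach(SERVO_PIN);
  pinMode(POWER_LED, OUTPUT);
  digitalWrite(POWER_LED, HIGH);         // lit whenever the system is on
  for (int i = 0; i < 3; i++) pinMode(ALERT_LEDS[i], OUTPUT);
}

void loop() {
  if (Serial.available() && Serial.read() == 'T') {
    // Sweep the servo so the iSight pans across the space,
    // toggling the alert LEDs at each step.
    for (int angle = 0; angle <= 180; angle += 5) {
      camServo.write(angle);
      for (int i = 0; i < 3; i++)
        digitalWrite(ALERT_LEDS[i], (angle / 5) % 2);
      delay(30);
    }
    camServo.write(90);  // return to center
  }
}
```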

In short, all the variables from the input devices (sensors, microphone, and iSight) are interrelated and affect both the audio and the video. Sensor data are read by the Arduino program, which communicates back and forth with Max over serial. Sound and video are captured directly in Max/MSP/Jitter, which in turn triggers the sound textures in Logic Pro.
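
The last hop, from Max to Logic Pro, is most naturally a MIDI connection (in Max, a midiout object routed to a virtual bus such as the macOS IAC driver). As a standalone illustration of that idea only, the following C++ snippet uses the RtMidi library to fire the kind of note message that would trigger a mapped sound texture in Logic; the port index, channel, and note number are assumptions.

```cpp
// Sketch of the Max -> Logic link: send one note-on/note-off pair
// to the first available MIDI output (e.g., an IAC virtual bus).
#include <RtMidi.h>
#include <chrono>
#include <thread>
#include <vector>

int main() {
  RtMidiOut midiOut;
  if (midiOut.getPortCount() == 0) return 1;  // no MIDI destination found
  midiOut.openPort(0);                        // assumed: the IAC bus

  // Note-on, channel 1: status 0x90, middle C, velocity 100.
  std::vector<unsigned char> noteOn  = {0x90, 60, 100};
  std::vector<unsigned char> noteOff = {0x80, 60, 0};
  midiOut.sendMessage(&noteOn);
  std::this_thread::sleep_for(std::chrono::seconds(2));
  midiOut.sendMessage(&noteOff);
  return 0;
}
```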

Feedback
In testing, the overall physical model, the sound output, and the interaction proved quite satisfying; people were very engaged in bringing their own agency to the piece.

Due to the limitations of space and materials, one issue with Aura is the relationship between its audio and its visuals. The content and placement of the projection need further consideration so that the two read as integrated. One scenario is a more immersive projection of the visuals, whether in the same space as the soundscape or in an entirely different one.

The other problem also relates to the interior space: there is too much reflection in the limited volume, so the actual movement of the overall soundscape is not very perceptible. One solution is to increase the number and speed of the movement trajectories assigned to the sounds, as well as the sensitivity of the distance sensors. In fact, testing showed that this problem largely disappears once Aura is placed outdoors, where the much larger space and reduced reflection let the sound move around in two or even three dimensions.
The form of the stand, especially its support, should also be more integrated to make the interaction more user-friendly.