You might imagine sound in space, or dream up gestures that traverse unexplored sonic territory. But actually building it is another matter. Kinect – following a long line of computer vision applications and spatial sensors – lets movement and gestures produce sound. The challenge of such instruments has long been that learning to play them is tough without tactile feedback. Thereminists learn their instrument through its extremely precise sensing and sonic feedback.

In AHNE (Audio-Haptic Navigation Environment), sonic feedback is essential, but so, too, is feel. Haptic vibration lets you know as you approach sounds — essential, as they’re invisible. Created by Finland-based DJ/VJ Matti Niinimäki, aka MÅNSTERI (“Mons-te-ri”), the project is part of research undertaken by the SOPI Research Group at Media Lab Helsinki. Like some sort of sound sorcerer, the user is entirely dependent on movement, feel, and sound as they move unseen sound sources through space. (More technical details below.)

It’s labeled, as always, “proof of concept.” The creator promises more videos to come; we’ll be watching as this evolves, as it looks terribly promising.

Below, “Tension” is a fair bit simpler: users walk through a space and control synth parameters as they move. (“You are the knob,” one might say, though I don’t suggest shouting that at someone you don’t know. They could take it the wrong way.)

More descriptions:

AHNE

This is a demonstration video of AHNE – Audio-Haptic Navigation Environment.

It is an audio-haptic user interface that allows the user to locate and manipulate sound objects in 3D space with the help of audio-haptic feedback.

The user is tracked with a Kinect sensor using the OpenNI framework and OSCeleton (github.com/Sensebloom/OSCeleton).

The user wears a glove that is embedded with sensors and a small vibration motor for the haptic feedback.

This is just the first proof-of-concept demo. More videos coming soon.

HEI Project 2011
SOPI Research Group
sopi.media.taik.fi/

Aalto University School of Art and Design

AHNE – Sound and Physical Interaction
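For the technically curious, the tracking-to-haptics loop described above can be sketched roughly like this. OSCeleton broadcasts skeleton joints as OSC messages carrying a joint name, user ID, and x/y/z position; everything else here — the handler, the `vibration_intensity` mapping, the object positions, and the 0.5 m falloff radius — is a hypothetical illustration, not the actual AHNE code:

```python
import math

# Hypothetical sketch of an AHNE-style pipeline, not the project's actual code.
# OSCeleton sends one OSC message per tracked joint; a handler registered with
# an OSC library (e.g. python-osc) would receive: joint name, user id, x, y, z.

# Invisible sound-object positions in meters (made-up values for illustration)
SOUND_OBJECTS = [(0.2, 1.1, 1.8), (-0.4, 0.9, 2.2)]

def vibration_intensity(hand, objects, radius=0.5):
    """Ramp the glove's motor from 0 to 1 as the hand nears the closest object."""
    if not objects:
        return 0.0
    nearest = min(math.dist(hand, o) for o in objects)
    # 1.0 at the object, fading to 0.0 at the edge of the radius
    return max(0.0, 1.0 - nearest / radius)

def on_joint(name, user_id, x, y, z):
    """Called per joint message; drives haptics from the right-hand position."""
    if name == "r_hand":
        return vibration_intensity((x, y, z), SOUND_OBJECTS)
    return None
```

In practice you would map an OSC address to a wrapper around `on_joint` and write the returned intensity out to the glove's vibration motor (e.g. as a PWM duty cycle on a microcontroller).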

Tension

A brief video showing Tension, an interactive spatial sound installation for multiple users.

A person enters the space and a generative sound is assigned to that person. The sound pans around in the 6-channel speaker system following the user in the space.

Up to 5 users can use the installation at the same time. Each person modifies the other sounds based on the distance to the other users. The closer you are to other people, the more the tension in the sound increases.

Tension – Sound and Physical Interaction
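The two behaviors Tension describes — panning a sound around a speaker ring to follow a user, and raising "tension" as people converge — could be approximated with mappings like these. This is a hedged sketch, not the installation's code: the function names, equal-power crossfade, 0–1 tension scale, and 3 m falloff are all my own assumptions:

```python
import math

def ring_gains(x, y, n_speakers=6):
    """Equal-power pan across a speaker ring, from a user's floor position."""
    angle = math.atan2(y, x) % (2 * math.pi)
    step = 2 * math.pi / n_speakers
    i = int(angle // step)            # speaker just counter-clockwise of the user
    frac = (angle - i * step) / step  # how far toward the next speaker
    gains = [0.0] * n_speakers
    # crossfade between the two nearest speakers so total power stays constant
    gains[i] = math.cos(frac * math.pi / 2)
    gains[(i + 1) % n_speakers] = math.sin(frac * math.pi / 2)
    return gains

def tension(user, others, falloff=3.0):
    """0.0 when alone; approaches 1.0 as the nearest other user closes in."""
    if not others:
        return 0.0
    nearest = min(math.dist(user, o) for o in others)
    return max(0.0, min(1.0, 1.0 - nearest / falloff))
```

The tension value could then modulate any synth parameter — filter cutoff, detune, distortion — per user, which matches the "you are the knob" idea above.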

Side note: watching these two videos makes me want to consult with someone on non-verbal expression, posture, and stage presence. That criticism is aimed at myself – I could use it. Perhaps we need an all-physical, unplugged music event for laptopists, controllerists, and electronic musicians. And I can at least say I’ve had some experience in this, working in the dance program at my undergraduate alma mater, Sarah Lawrence. Anyone game? (Sounds like something we could do while CDM is in Berlin in the fall.)

For their part, the Finnish researchers are working with dancers, along with Nokia Research Center. (Sadly, I can’t find documentation.) But I think interesting things happen when we non-dancers learn movement technique, too.

  • http://www.reverbnation.com/mayasky maya sky

    I'm game!!!!

  • http://www.distantdrummers.com Robin Koek

    What a great approach to spatialisation. Would love to work in a composition environment based on these principles of physical space and realtime control. It would be great if it were also combined with advanced gesture recognition for phrasing the sounds through space (so departing from the "drag and drop" interaction, which is also nice, but in combination with more expressive gestures it would be heaven, I believe). Looking forward to seeing it evolve and move beyond the technical concept into musical applications.

  • http://www.ahsquared.com Andre Hayter

    This reminds me a lot of a work I did a while back (http://www.ahsquared.com/sd/media.html – excuse the crappy site design). I used IR and camera tracking – Kinect wasn't on the radar back then – to track multiple people in the space and assign each a sound (a vocal note). As they moved relative to a spot in space, the sound changed from staccato to legato. The notes all added up to a big harmonious chord if everyone found their respective spots (which turned out to be very hard to do). I was playing with the idea of moving people in space with sound alone (they needed some minimal visual feedback, hence the black tracking squares). I think the work above is a great idea – I'm fascinated by how bad we are at using sound alone to navigate the world; it's a virtually untapped sense, because of our predilection for visual stimulus. Looking forward to more about this.

  • http://vargasz.tumblr.com varasz

    hey matti is one of my fav digital artists!

    i love this work too!!