Microsoft’s Kinect has made a compelling case for gestural control of music. But it could be just the beginning of mass-market gestural sensing technologies. The Leap Motion, like the Kinect, promises to be affordable gear. Unlike the Kinect, its hardware is even more unobtrusive, and its gestural control is more precise and responsive. Given the latency limitations of the Kinect, that’s a huge deal for music.

And better expression could inspire new musical ideas. We’ve spoken many times before about the limitations of touchless control – Theremins are spectacular but not the easiest instruments to play, and waving your hands in the air probably isn’t the best way to play the piano. (Though it is fun – see below.) With a more responsive system, these proofs of concept could eventually yield something genuinely different.

Leap Motion units are just now arriving in developers’ hands – I have one on the way, so you’ll be sure to hear about it. But even this first video, at top, demonstrates lower latency and finer-grained control, each suggesting something more musical (whether or not an AirHarp is what you personally want). Developer Adam Somers writes, in the video’s description:

AirHarp is the result of a weekend hacking session with a Leap Motion dev board. Leap Motion is a highly precise and responsive motion tracking device, making it a perfect tool for expressive musical interactions. AirHarp is being developed in C++ using my audio processing toolkit, MusKit. The source code for both projects is available at

Special thanks to the Leap Motion team for making these dev boards available and for the great response to the developer community.

For more information visit:

Developer Creation: AirHarp [Leap Developer blog]

The Leap crew also put together this cute holiday video:

There’s definitely some potential here. I’ll be watching for my box. Since we have some hackers out there, let us know if you’re getting one, too.

  • White Noise Audio

    I’m supposed to be receiving one too. If anyone has any requests for apps they’d like to see, post em here!

  • Dean Taylor

    I too am awaiting a Leap. Check out aka.leapmotion for a Max external 😉

  • Newgreyarea

    Is it neat? I guess. The one thing that irks me about these types of interfaces is that they tend to require sight. If you are blind, you can play a guitar or piano or harp, etc., but you probably can’t play this or an iPad. So all these “future” things are missing a very core sense, touch, and focusing on sight instead, which is probably the sense I use the least when it comes to music. I actually close my eyes most of the time to focus on the sound.

    • Pen

      You could play them without sight, particularly if you are interested in exploring the gestural space rather than using it for precise and repeatable actions (although you could do that too, really). Most of the time when I’ve used the Kinect, the person interacting with it is a dancer who doesn’t see any screens at all.

    • foljs

      “””particularly if you are interested in exploring the gestural space “””

      So basically in making pretentious BS non music?

    • Peter Kirn

      Actually, I find most humans use a combination of small motions (like fingering) and larger ones (like walking through a room). The pretentiousness or BS-ness or non-musicness, whatever any of those things mean to you, would then be up to how you map the output.

    • Peter Kirn

      Right, I’m with newgreyarea here. They shouldn’t require sight – not at all – because a blind person can still move. In fact, if they do, I would say the failure is:

      a) The design of the interaction. (For instance, providing feedback that you need to see in order to use the interface.)
      b) Requiring calibration with an arbitrary point of reference (that is, failing to give auditory feedback on position in space, or finding some way to work with relative gestures)
      c) Sloppy sensing — an inability to respond well to precise movements.
      d) A combination of the above.

      (c) is, of course, the deal killer – no matter how good the interaction design is, sloppy sensing would make the interface useless. And latency can also kill the sense of connection to the interface. Leap Motion seems at least like a step forward.

      I’m strongly suspicious that a lot of the visual cues are crutches to make up for bad sensing and interaction design.

      Ironically, a system that worked for a non-seeing person would likely also work better for a seeing person.
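      Point (b) above – working with relative gestures instead of calibrating to an arbitrary origin – can be sketched in a few lines. This is a hypothetical illustration, not Leap Motion SDK code: the class, parameter names, and smoothing constant are all invented. The idea is to report hand positions as offsets from an anchor point that slowly drifts toward the hand, so the interface behaves the same wherever the hand happens to start.

      ```python
      class RelativeGesture:
          """Tracks hand motion relative to a slowly drifting anchor,
          so no absolute calibration point is ever needed."""

          def __init__(self, smoothing=0.05):
              self.reference = None       # drifting anchor (x, y, z)
              self.smoothing = smoothing  # how fast the anchor follows the hand

          def update(self, position):
              """Return the hand's offset from the anchor, then let the
              anchor drift toward the hand. Slow repositioning is absorbed;
              quick gestures register as movement."""
              if self.reference is None:
                  self.reference = list(position)
              delta = [p - r for p, r in zip(position, self.reference)]
              self.reference = [r + self.smoothing * (p - r)
                                for p, r in zip(position, self.reference)]
              return delta
      ```

      Because only deltas ever reach the mapping layer, a blind performer (or a sighted one) can work anywhere in the sensor’s field without a visual calibration step.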

    • ramin

      Agreed. While this is a great tech demo, it offers nothing new musically.

      You would think that being able to use all ten fingers would inspire developers to do more than making a harp sim… which was done in the ’80s by Jean-Michel Jarre (more or less).
      I think we should focus on using this technology to explore new ways of interacting with music and sound instead of mimicking existing instruments in inferior ways. Sure, it looks neat, but what else?

      I think the next step for touch interfaces is to actually implement “touch” as in sensing it in your fingers instead of just waving your fingers and seeing stuff happen on the screen or swiping on a flat shiny surface.

  • mat

    Looks like a nice additional tool!
    I would prefer Control Change messages rather than playing notes… (if I get it right, each finger is tracked?). Given three-dimensional space, that means a lot of CC controlling :) And I like the appearance and positioning of the tool – on your desktop… no need to stand up and wave around (Kinect controlling must be very physical ;)
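    As a rough illustration of the idea above – one CC stream per finger per axis – here is a minimal, hypothetical mapping sketch. The ±200 mm axis range, the CC numbering starting at 20, and the function names are all assumptions; a real implementation would read positions from the Leap SDK and send the resulting messages out a MIDI port.

    ```python
    def position_to_cc(coord, lo=-200.0, hi=200.0):
        """Scale one coordinate (assumed range lo..hi, in mm) to 0-127."""
        clamped = max(lo, min(hi, coord))
        return round((clamped - lo) / (hi - lo) * 127)

    def fingers_to_cc_messages(fingers, base_cc=20):
        """Give each finger three consecutive CC numbers, one per axis.
        `fingers` is a list of (x, y, z) tuples; returns (cc, value) pairs."""
        messages = []
        for i, (x, y, z) in enumerate(fingers):
            for axis, coord in enumerate((x, y, z)):
                messages.append((base_cc + 3 * i + axis, position_to_cc(coord)))
        return messages
    ```

    Five fingers times three axes would already give fifteen simultaneous CC streams – exactly the “lot of CC controlling” described above.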

  • 64BitsPerMinute

    I am on the developer list and should be receiving a Leap shortly. I’m really excited about its potential uses! Also, the SDK looks like it has some pretty awesome methods. I can’t wait to try it out!

  • Tom Minch

    Make it for the concert harp, and every harp player will want one to practice on when they can’t have their 90-pound, six-foot-by-four-foot harp with them!