Image courtesy Machine Orchestra.

Ed.: From modern electronica to South Asian Classical music, and from machines to humans, the Machine Orchestra is doing fascinating things with electrically powered, digitally manipulated, physically robotic music. Here’s more about what makes the ensemble tick.

It’s been nearly three months since I had the opportunity to guest blog here on CDM about a project I am involved in called the Machine Orchestra. In Pt. 1 you were introduced to the directors behind the ensemble, Dr. Ajay Kapur and Michael Darling. Today, however, we look at the Machine Orchestra from the inside out, and explore a few of the interfaces, artists, and technologies that make the show a reality.

From the very beginning, a primary goal of the Machine Orchestra has been to explore novel human-machine interaction: how could we exploit the strengths of our computers and robotic musicians (i.e., taking advantage of their extremely accurate metronomic precision) while, at the same time, performing with a high level of musical expression? As we attempted to answer these questions, we made several discoveries that helped us fulfill our desire to musically interact with both our robotic counterparts and our computers.

KarmetiK Machine Orchestra Live at REDCAT from KarmetiK on Vimeo.

The video above gives you a glimpse of the evening, which, to throw names around loosely, combined musical elements ranging from Glitch to IDM, traditional North Indian Classical to Balinese Gamelan, and post-rock to new music. Oh yeah, let’s not forget the human-interacting machines!

The Speakers.

In addition to exploring new ways to interact with our machines, and taking inspiration from the laptop ensembles that preceded us, we spent a great deal of time researching ways to reproduce our electronic sounds on stage, as well as experimenting with mains sound reinforcement. At every point in the show, we tried to communicate a strong connection between the individual musicians themselves and the sounds they were creating. To achieve this, each musician had a hemispherical speaker system, and/or a big-ass JBL subwoofer for those musicians requiring extended low-frequency response. Additionally, a 5.1 mains mix was used to reinforce each musician’s location on stage and provide a cohesive house mix for the audience.

The Interfaces.

The diversity of the Machine Orchestra allowed for many types of novel physical interaction. The ensemble included the following custom interfaces and instruments: the Arduinome, SqueezeVox, ESitar (sitar hyper-instrument), MLGI (laser controller), Helio (touch-strip controller), EDilruba (dilruba hyper-instrument), and DigitalDoo. These interfaces were used to control software instruments on each musician’s computer, and also to remotely control the three robots via an OSC/MIDI network designed specifically for the Orchestra.

Interaction and Sync.

During our work with the musical robots, interesting challenges emerged that called for creative use of our controllers and technology. One of the most difficult challenges we faced was maintaining stable “sync” between musicians, computers, and the robots. As we’ve briefly discussed in other articles and threads here on CDM, and recently at the CDM-mediated NAMM After-Hours Party panel discussion, stable sync between machines is an extremely complex issue, both in terms of technological implementation and its actual uses. When controlling multiple mechanical instruments on stage, and communicating between ten electronic musicians, clock is much more than a way to make up for inaccurate timing: it serves as the essential foundation for fast and accurate communication between robots and performers. We needed a system that allowed complex MIDI routing over a network, sent clock to all performers so that tempo changes could be made dynamically and on the fly, and let performers exit or enter the sync stream at any time. We came up with the following solution.

In the Machine Orchestra, all electronic musicians (clients) receive sync from a hub/switch connected to a dedicated server machine via Ethernet. The server runs a custom application we developed in ChucK, building off the framework developed for PLOrk. Our additions implement a few extra features for interfacing with the robots, as well as addressing some of our stability concerns, e.g., in case a musician loses sync in the middle of the performance.
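
To make the architecture concrete, here is a minimal sketch of what the server side of such a clock-forwarding loop could look like in ChucK. This is an illustration rather than our actual PLOrk-derived application; the client addresses, the port, and the OSC path are placeholders:

```
// server.ck : forward MIDI clock bytes from the master musician
// to every client over OSC (addresses and ports are hypothetical)
MidiIn min;
MidiMsg msg;
if( !min.open( 0 ) ) { <<< "no MIDI input on port 0" >>>; me.exit(); }

// in the real system the client list is built dynamically
[ "192.168.1.101", "192.168.1.102" ] @=> string clients[];
9001 => int port;

OscSend out[ clients.size() ];
for( 0 => int i; i < clients.size(); i++ )
    out[i].setHost( clients[i], port );

while( true )
{
    min => now;                        // wait for MIDI data
    while( min.recv( msg ) )
    {
        // pass along clock (0xF8), start (0xFA), and stop (0xFC)
        if( msg.data1 == 0xF8 || msg.data1 == 0xFA || msg.data1 == 0xFC )
        {
            for( 0 => int i; i < clients.size(); i++ )
            {
                out[i].startMsg( "/clock", "i" );
                msg.data1 => out[i].addInt;
            }
        }
    }
}
```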

We discovered that ChucK implements MIDI using the RtMidi library, which ignores MIDI clock messages by default. To enable MIDI sync in ChucK, the server and client applications are bundled with a custom ChucK binary compiled with MIDI clock enabled. Additionally, a MIDI sync client application should configure itself automatically (assigning an IP address, etc.) and connect to the MIDI server; to facilitate this, we wrote a custom script that dynamically resolves a local IP for the client ChucK applications. Finally, one musician is set as the master clock, sending clock to the server, and all other clients are then slaved to this clock.
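
On the client side, the flow might look something like the sketch below. Again, this is a hedged reconstruction rather than the Orchestra’s actual application: the server address, the ports, the OSC paths, and the convention of the launch script passing the resolved IP in as a command-line argument are all assumptions.

```
// client.ck -- run by a launch script that resolves the local IP:
//   chuck client.ck:192.168.1.101
me.arg(0) => string myIP;
9001 => int listenPort;

// announce ourselves to the (hypothetical) server
OscSend xmit;
xmit.setHost( "192.168.1.1", 8000 );
xmit.startMsg( "/register", "s i" );
myIP => xmit.addString;
listenPort => xmit.addInt;

// then listen for the clock bytes the server forwards (see above)
OscRecv orec;
listenPort => orec.port;
orec.listen();
orec.event( "/clock, i" ) @=> OscEvent clockEvt;

0 => int ticks;
while( true )
{
    clockEvt => now;
    while( clockEvt.nextMsg() != 0 )
    {
        clockEvt.getInt() => int b;
        if( b == 0xF8 )               // MIDI clock: 24 ticks per beat
        {
            ticks++;
            if( ticks % 24 == 0 ) <<< "beat", ticks / 24 >>>;
        }
    }
}
```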

Typically, if a computer loses sync, the master clock needs to stop and restart in order to transmit the initial MIDI clock start byte and allow that machine back into the sync stream. In practice, this would mean that each time a musician or instrument dropped (or exited) sync, all musicians would have to be stopped and restarted by the master clock just to get the one machine back in sync. Given the number of musicians and robots receiving clock during the show, this simply was not an acceptable solution. Instead, we implemented a keyboard command (‘G’ for “Go!”) that each client could press manually upon losing sync. Although not a very complicated solution (it simply forces a stop and start message from the client), it was very effective in allowing a performer to jump back into the sync stream.
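
The rejoin logic amounts to very little code. Here is a minimal sketch of the idea in ChucK; the choice of MIDI port, and exactly where the stop/start bytes get routed, are assumptions:

```
// "G for Go!": force a stop (0xFC) then start (0xFA) from this client
// so its software re-latches onto the incoming clock stream
MidiOut mout;
MidiMsg msg;
if( !mout.open( 0 ) ) me.exit();

KBHit kb;
while( true )
{
    kb => now;                        // wait for a keystroke
    while( kb.more() )
    {
        kb.getchar() => int c;
        if( c == 103 || c == 71 )     // ASCII 'g' or 'G'
        {
            0xFC => msg.data1; 0 => msg.data2; 0 => msg.data3;
            mout.send( msg );         // stop...
            0xFA => msg.data1;
            mout.send( msg );         // ...and start again
            <<< "Go! rejoining the sync stream" >>>;
        }
    }
}
```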

With stable sync, and clock communication between all musicians and machines, we were finally ready to explore the different ways to use our custom controllers.

In the piece Voices, various controllers were used to explore vocal synthesis techniques and granular control of vocal sounds. Meason Wiley used his Multi-Laser Gestural Interface (MLGI) to drive a custom Reaktor ensemble with in-air gestures, while Jim Murphy used his new touch-strip controller, the Helio (akin to a vertically oriented Stribe), to control a custom Reaktor granular synthesis instrument he developed with Charlie Burgin. Similarly, Ajay Kapur controlled a granular ChucK patch using his ESitar’s extensive array of sensors (triple-axis accelerometer, thumb-pressure sensor, and fret sensors). Interestingly, each interface’s design imposed a very different use of the granular patch that Charlie, Jim, and Ajay were all using, resulting in dramatically different effects.
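
To give a flavor of what granular control means in practice, here is a toy ChucK granulator along the lines of those patches. It is only a sketch: the sound file, the jitter amount, and the hard-coded position value (which sensor data would drive in performance) are all assumptions.

```
// toy granular patch: spray short enveloped grains from a sound file,
// with the grain position standing in for a sensor-mapped parameter
0.5 => float position;        // 0..1; in the show, driven by sensors
50::ms => dur grainSize;

fun void grain()
{
    // reloading the file per grain is wasteful, but keeps the sketch short
    SndBuf g => Envelope e => dac;
    "voice.wav" => g.read;
    // jitter the start point around the controlled position
    position + Math.random2f( -0.05, 0.05 ) => float p;
    ( p * g.samples() ) $ int => g.pos;
    grainSize / 2 => e.duration;
    e.keyOn();  grainSize / 2 => now;
    e.keyOff(); grainSize / 2 => now;
}

while( true )
{
    spork ~ grain();
    10::ms => now;            // grain density
}
```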

Other (personal) highlights included being able to work with the visionary electronic music and interface pioneers Perry Cook and Curtis Bahn. The vast assortment of interfaces (SqueezeVox, DigitalDoo, EDilruba, etc.) and experience they brought to the show was invaluable. In Voices, Perry used the SqueezeVox to control synthesis models (written in ChucK) via an assortment of controls, including tilt/acceleration sensors, air-pressure sensors (replacing the reeds of an accordion), force sensors, and linear/rotary potentiometers: forty-one buttons of pure vocal-synthesizing chaos. Throughout the performance, Curtis’ use of the EDilruba beautifully translated human gesture into musical control via accelerometers and pressure sensors on the instrument and bow.

Due to its strength as a reconfigurable device, the Arduinome proved to be a particularly well-suited interface for the Machine Orchestra. One of the ways we used our Arduinomes, for a robot-centric piece called Mechanique, was by setting up 64 MIDI clips in Ableton and MIDI-learning them to individual buttons on our Arduinomes (we MIDI-mapped the Arduinomes using a Reaktor patch we made called nomeState). Each MIDI clip was scored with various sequences/patterns, complete with velocities. Additionally, each clip was paired with MIDI clips sending data back to ArduinomeSerial for light animations on the Arduinomes. Columns on the Arduinomes represented patterns designated for individual arms and beaters of the three robots. By combining different patterns, it was possible to play the robotic instruments in real time, from simple one-shot triggers to complex synced patterns. Completely human-controlled, the robots could accurately respond with extremely difficult and complex rhythms, while the clock provided them with fine synchronized precision. The robots provided not only traditional drum sounds but also effects that would be extremely hard for even the best human musicians to achieve, e.g., extremely tight (and fast!) rolls, polyrhythms, and syncopation.
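
As an illustration of that press-to-clip path (the real mapping lived in nomeState and ArduinomeSerial), a bare-bones ChucK version might look like the following. The monome-style OSC prefix and port, and the base note the clips are MIDI-learned to, are assumptions:

```
// turn monome-style press messages into MIDI notes that Ableton's
// clip slots have been MIDI-learned to (prefix/port are hypothetical)
OscRecv recv;
8000 => recv.port;
recv.listen();
recv.event( "/40h/press, i i i" ) @=> OscEvent press;

MidiOut mout;
MidiMsg msg;
if( !mout.open( 0 ) ) me.exit();  // a virtual port routed into Ableton

while( true )
{
    press => now;
    while( press.nextMsg() != 0 )
    {
        press.getInt() => int x;      // column: robot arm/beater pattern
        press.getInt() => int y;      // row: pattern variation
        press.getInt() => int state;  // 1 = press, 0 = release

        if( state == 1 )
        {
            0x90 => msg.data1;              // note-on, channel 1
            y * 8 + x + 36 => msg.data2;    // one note per grid button
            127 => msg.data3;
            mout.send( msg );
        }
    }
}
```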

The Arduinomes were also used in many other ways: for example, mapping the buttons through Ableton’s Scale MIDI effect and using the Arduinome as a pitch-based controller for playing soft synths live. The matrix layout allowed for interesting cross-relationships between the intervallic layouts of the different scales.
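
In our rig, Ableton’s Scale effect handled that mapping, but the underlying idea is easy to sketch directly. Below is a hypothetical grid-to-pitch function in ChucK in which each row is offset by three scale degrees (a fourth, in a major scale), so the same physical shape produces related intervals anywhere on the grid; the scale, root note, and row offset are arbitrary choices:

```
// map an 8x8 grid (x = column, y = row) onto a scale, with each row
// offset by a fixed number of scale degrees
[ 0, 2, 4, 5, 7, 9, 11 ] @=> int major[];   // major-scale degrees
48 => int root;                              // MIDI note C3
3 => int rowOffset;                          // rows stacked in fourths

fun int gridToNote( int x, int y )
{
    y * rowOffset + x => int degree;
    degree / major.size() => int octave;
    degree % major.size() => int step;
    return root + 12 * octave + major[step];
}

// button (3,0) and button (0,1) land on the same pitch: 53 (F3)
<<< gridToNote( 0, 0 ), gridToNote( 3, 0 ), gridToNote( 0, 1 ) >>>;
```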

Each piece in the show called for extremely different methods of interaction between musician and machine. It would be impossible for me to detail every way the instruments were used to control the musical robotics live, or to cover all the software involved (e.g., Ableton Live, ChucK, Reaktor, and Max/MSP). We would, however, like to use this opportunity to open up discussion on the future of laptop ensembles, and to promote the sharing of insights gained from performing with other laptop musicians, interfaces, and/or musical robotics. We gratefully thank everyone who came out to support the Machine Orchestra, making it a sell-out debut, as well as those who shared links and spread the word via Twitter, Facebook, email, and word of mouth. For those of you who were unable to make it out, never fear: the Machine will come to you soon!

Group Shot

The KarmetiK Machine Orchestra is:
Music Director, Co-Creator: Ajay Kapur
Production Director, Co-Creator: Michael Darling
Guest Electronic Artists: Curtis Bahn & Perry Cook
World Music Performers: Ustad Aashish Khan, Pak Djoko Walujo, & I Nyoman Wenten
Multimedia Performer-Composers: Charlie Burgin, Dimitri Diakopoulos, Jordan Hochenbaum, Jim Murphy, Owen Vallis, Meason Wiley, and Tyler Yamin
Visual Design: Jeremiah Thies
Lighting Design: Tiffany Williams
Dancers: Raakhi Sinha, Kieran Heralall
Sound Design: John Baffa
Production: Lauren Pratt

  • Todd Fletcher

    Slightly off-topic, but I saw the Pat Metheny Orchestrion show last Saturday. Bottom line: artistically, he pulled it off. Most interesting to me was the audience response: they were wildly enthusiastic, with six standing ovations. What this tells me is that whether you use an acoustic guitar alone, a synthesizer, or a stage full of robots, you have to pull it all together into a great show.

    I think electronic musicians could learn a lot from Metheny in terms of how to get a crowd jazzed up while using computers on stage.

  • gregorz

    "Walt Disney Cnecert Hall" – a concert hall named after a well-known racist and Nazi-supporter. Awesome:D

  • tekcor

    genius :)

  • salamanderanagram

    @gregorz, ??

    the only evidence i have ever seen to back up this extremely off topic assertion is this video http://www.metacafe.com/watch/2195704/walt_disney

    which if you watch until the end is clearly making fun of hitler.

  • neutral

    An incessant titillation is available to us now through technologies realizing our own abstracted distraction from self and being.

    This is not a bad thing. It is merely a symptom of our time and place. We are in love with enacting our self-displacement to rigorous extremes.

    In short, losing the human element in music somehow seems and feels more real than ever truly gaining it.

    All the more so through the infinitization of technology found in the fantasy of robotics.

    Here is the recipe: take your own being, and project it away from yourself, into the mechanical construct of choice. Then, sense the split involved.

    How far can one take this before fatigue asserts itself as the inevitable result?

    This is not philosophy, but merely a description of the enactment of robotics itself: of our innate ontology walking backwards, away from us.

    How can this ever sustain our attention beyond the microsecond of our self-distraction?

    It can't. And it won't.

    Its purpose is to titillate, merely, to the point of our replacement.

  • Damon

    That's like Peter Gabriel's best dream ever.

  • http://www.myspace.com/nayseven nay-seven

    Fantastic one!

    really inspiring!

    thanks peter

  • rhowaldt

    @neutral: goddamn dude. i made a real attempt at following your argument, but i lost the coherence somewhere within the bloat of words. this may be due to me not being a native speaker of the English language, but still, goddamn.

    @salamanderanagram: (cool name) – i think another piece of evidence would be that it took Disney (the company, that is) until the year of 2010 to release a movie featuring a black (coloured/african american) protagonist.

    but really, i could not care less if Walt was, in fact, a racist, i just care about his awesome drawing and animation skills and the great cartoons he (and his team) created.

  • salamanderanagram

    sorry for the extreme off topic again, but disney died in 1966. personally i can't think of any movies with a black protagonist from before that time period. not saying that they didn't exist but it's not like it was common. as for what the company has done in the near 50 years since, that can't exactly be blamed on a dead man. just to be clear, i could care less one way or the other about walt disney, but i think to say that he was a "well known racist" is hyperbolic at best.

  • electronic_face

    I think Walt Disney did attend some Nazi conventions back in the day (before the war), to get in good with the Germans, in order to get distribution in Germany. It was business. Didn't IBM service computers within Holocaust camps? Are they Nazis, too? With that logic, if you've ever owned an IBM computer, you're a Nazi sympathizer. *sigh*

    I don't know if Walt Disney was a Nazi, or racist, and neither do you (for a fact). What I do know for sure is that he was a creative genius.

    … and those are some damn sweet robots. : )

  • Todd Fletcher

    Stop feeding the troll please.

  • http://www.flipmu.com Owen Vallis

    Hey guys,

    All the great info about Walt Disney aside, we were wondering if anybody had thoughts or questions about the concert?

    Cheers,

    Owen

  • http://turntablepoetry.com/blog dj professor ben

    Disney may himself have been of questionable character, but the sound in the Disney concert hall is exquisite.

  • Pingback: CDM – Machine Orchestra Post (Part II) « Cycles Per Second

  • Darrell

    Interesting how similar in look, and even in some of the tech, it is to the Metheny concert – and yet how there is 10 times more actual musical content in the Metheny music. This is your basic vamp-based computer music, which makes up what seems to be 90% of what all "laptop" musicians do. A backbeat and one or two chords. The reason Metheny's thing is so amazing and gets such a strong response is because the MUSIC is incredible. (And not a vamp or backbeat to be found.) I have seen the concert 3 times now (including the aforementioned Disney one) and it was different and excellent all 3 times. He is dealing with "digital music" on a level that no one else has ever approached. The fact that he is one of the best improvisers in the world figures in, in a big way, too.

  • http://www.flipmu.com Jordan Hochenbaum

    @Darrell – Indeed, the work Eric Singer and Pat Metheny have done is quite inspiring; not to mention, our mentor and leader of the Machine Orchestra, Ajay Kapur, has worked very closely with Eric in the past. On a few levels, there are places where the Machine Orchestra and Pat Metheny's show overlap: for example, the way in which the robots play directly along with Pat's guitar is very similar to the way Ajay sometimes chooses to directly trigger specific notes/robotic hits from his ESitar.

    However, I am a bit confused about what you mean by the Machine Orchestra mimicking "90% of what all 'laptop' musicians do. A backbeat and one or two chords." You are correct in your assumption that, in comparison to Metheny's show, the music is very different; however, the Machine Orchestra and Metheny's show were going for completely different musical outcomes.

    Perhaps the trailer doesn't quite capture it, but as stated in the article, the concert consciously covered material ranging from highly complex rhythmic and pitch-based material (crossing IDM, North Indian Classical, and Balinese Gamelan) to more simple, steady-beat-oriented electronic and rock-based compositions. Throughout the entire show, however, only in the couple of rock-based compositions was there more idiomatic vamping, and possibly backbeat components to the music, as you would expect of the genre. Without copying and pasting most of the article here: in other pieces, for example Voices and Mechanique, the music comprises completely improvised granular synthesis and robot playing, merely following basic structural arcs discussed by the musicians. Additionally, the North Indian and Gamelan pieces, as you would expect, follow the forms prescribed by their respective genres.

    If you read the article, we discuss at length the various interfaces and performance tools we created to make all of the music live, including many different ways to interface with the robotics: direct triggering/playing from various interfaces, playing via musical instruments like the ESitar, sequence triggering, call-and-response improvisation, etc.

    I think it's important to open up this discussion and see what works and what doesn't, citing other related work being done, but I also think we need to be conscious of our assumptions and sonic aesthetics. That being said, I'd love to discuss this more and get to the bottom of this whole vamp/backbeat thing!

    Cheers
