Take a powerful game engine (for animation, 2D and 3D graphics, physics, and on-screen interaction). Add the flexibility of a visual development environment built on virtual patch cords, with rich sonic and musical capabilities and easy interaction with data and input. That’s the idea behind combining something like Unity 3D with Max/MSP. In the example from earlier today, the solution simply routed basic data from a Unity-based game to a responsive music engine in Max.

In the case of [myu] – the Max Unity Interoperability Toolkit – that integration goes further still. Developed at DISIS (the Digital Interactive Sound & Intermedia Studio) at Virginia Tech, [myu] allows bidirectional integration of the Unity engine with Max or Pd. Under the hood, netsend/netreceive objects pass data over TCP, gluing the two environments together.
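To give a feel for what that TCP link looks like on the wire: Pd’s netsend/netreceive speak the simple FUDI protocol, where each message is a line of space-separated atoms terminated by a semicolon. As a minimal sketch (not [myu]’s actual API; the host and port below are placeholders), a game script could push data to a listening patch like this:

```python
import socket

def fudi_encode(*atoms):
    """Encode atoms as a FUDI message: space-separated, semicolon-terminated."""
    return (" ".join(str(a) for a in atoms) + ";\n").encode("ascii")

def send_to_patch(host, port, *atoms):
    """Open a TCP connection (as netreceive expects) and send one message."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(fudi_encode(*atoms))

# Example: tell a patch on localhost port 9001 that the player moved.
# send_to_patch("127.0.0.1", 9001, "position", 1.5, -0.25)
```

On the patch side, a netreceive object listening on that port would output `position 1.5 -0.25`, ready to route into your synthesis logic.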

Visualists using Jitter can even exchange texture data, which opens up some mind-blowing powers for live visuals.

Download at Virginia Tech — as a bonus, the site includes an extension of the aka.wiiremote object so you can use the lovely Wii Fit controller, among various other projects
Discussion on the Unity Community Forums
Discussion on the Cycling ’74 forum
Virginia Tech DISIS

As an interactive prototyping tool, this should have a lot of potential for lovers of patch-style programming.

Thanks to Dr. Ivica Ico Bukvic, DISIS Director and researcher, for sending in his project. I’ll be curious to see what other people might do with this.
