Hidden beneath a detailed interface, there’s a lot that a computer might do. Amidst a growing variety of touch tools, musicians are surfing those capabilities with custom cockpits, fingers dancing across glowing rectangles.
Production and performance tool Ableton Live is really designed around mice and keyboards. For touch controls, that means turning to remote controls – and for now, a tablet (most often iPad). Developer Liine was an early adopter of the notion. Their classic, minimal Griid and Griid Pro were elegant and simple – but they’ve also failed to keep pace with ongoing controller evolution. They just can’t control all the things a Live user might like. At last, Liine has completed work on LiveControl 2, the “next generation” of this controller utility, with full support both for Ableton Live 8 and this year’s Ableton Live 9.
And it’s quite a release. With modules that keep access to settings like channel strips handy, and the ability to play notes and make your own MIDI clips, it almost reads like the feature list came straight from “Dear Santa” letters written by Ableton Live users.
The upgrade takes a different approach to the problem than other apps (notably, the very thorough Touchable controller app). It uses four modules which can be used together.
Jon Hopkins, smiling as he jams to his new music in Switzerland.
If anyone might chart a course for the future of ambient dance music – contradiction in terms as that might seem – it’s the UK’s Jon Hopkins. Spacious sounds and free-flowing gestures seem to flow effortlessly in his music, but that same texture can be honed into hard-hitting grooves or set against forward-propelled rhythms. It is, simply, beautiful music you can dance to.
In the new full-length “Immunity,” Hopkins is once again in top form. To me, he’s reached a new level of clarity and coherence. There’s an almost narrative thread through “Immunity,” as though it’s a score for a kind of film we don’t know how to make yet. A good substitute, while we work that out, is the short film made by artist/biochemist Linden Gledhill and art director Craig Ward. Images of microscopic realms, butterfly wings and crystals in motion, seem perfect, organic and dynamic as the music.
The sonic world of “Immunity,” with its thick, sharply-defined bass, adds a greater degree of consistency than past Hopkins outings. Here, he has pared down materials to a mature record, confident in all the best stuff.
Jon spoke to CDM three years ago about his techniques in the studio and live onstage. He talked about keeping keys constantly at the ready – a piano a chair swivel away, and a trusted Korg Trinity his one and only keyboard. He also spoke a bit about economy in composition, which I think is relevant to what you can hear on “Immunity.”
Acer’s P3 convertible Ultrabook sits astride a Serato Scratch rig (running on a conventional laptop, actually). The software is a new touch-enabled version of VirtualDJ, made for Acer and currently available free with their touch range. Photo from the Acer event in Taipei. (And yes, the iPad has something to say about this, as well.)
“Where are my touch laptops?”
It’s becoming the “where are my flying cars?” of the laptop music age.
And so it is that I’m here in Taipei, Taiwan, having spent today hanging out with Acer as they talk about what they’re doing with touch on their computers (laptops and tablets). The touch laptops are here in force – not a couple of netbooks or tablet PC oddities, but with the full-blown force of the PC industry behind them. The question now is whether we actually want them.
2012 was a little early to ask that question for the music audience; now the mature products – with Windows 8 behind them – are in the 2013 generation. I have some specific information to share, but I want to back up and consider some of the broader questions first. (If you just want to look at hardware, check back later this week.)
It’s been nearly a decade since electronic musicians first started seeing touch in the wild. At the time, the power was immediately evident: you had the ability to imagine new ways of interfacing with music without the limitations of hardware knobs and faders. It was Star Trek: The Next Generation-style power, finally appearing in the real world. And that was a natural fit to musicians suddenly facing computer capabilities that lacked obvious form – sounds unfettered by the laws of acoustics and physical instruments. So it was also immediately apparent that eventually, you might want these touch interfaces to merge with your computer.
But since that first epiphany, the marriage of touch with conventional computers has been surprisingly slow in coming. Apple showed the way with iPhone and iPad, in their own categories. But laptops, with their hinged clamshell design, are another animal. Conventional software written for the mouse and keyboard can be simply awful when you start jabbing with your fat fingers, and the hinged design of a laptop leads to the dreaded “gorilla arm”: using a vertically-oriented display feels uncomfortable and makes your arms go numb. (On behalf of the gorillas of the world, I have no idea why this is called gorilla arm; maybe gorillas were unfairly subjected to usability testing in an early computer lab.)
So, why would you want a laptop to be touch-enabled, anyway, instead of a dedicated tablet running touch-centric software? Apple, for their part, has drawn a line in the sand and decided you don’t. Their MacBook line eschews touch beyond the trackpad, and focuses on conventional (still very powerful) software. The iPad is the platform for touch. Even years into a supposed “post-PC” age, software on the two remains very different – and the OS X software is far closer to its Windows brethren than to iOS. Whatever the rampant speculation about the two fusing, with the MacBook and iPad leading their respective sales categories, there doesn’t seem to be a logical motivation to merge them – least of all when Microsoft’s strategy of treating the two categories as blurred has initially fallen flat.
Yes, this looks like an ordinary stompbox, but it is reprogrammable. Can I put this massive “prototype” disclaimer over any photos of me tagged on Facebook? No? Photo courtesy the OWL folks.
There are stompboxes. They are — for lack of a better word — foot worthy. You can step on them, in a way that is less possible with a computer. (Well, sure, somewhere amidst an endless spinning color pinwheel you may have wanted to step on your MacBook Air, but then thought better of it – financial investment and whatnot.)
Then, there are computers. They can do everything. That stompbox is one particular distortion effect. And it is always just that one distortion.
But what if you could have both?
As embedded technology continues its march toward greater user friendliness, lower cost, and greater sonic powers, it seems the time is right for hardware that combines the durability of dedicated sound gear with the open-ended potential of computers. That is, it’s not really clear where the computer ends and the stompbox begins.
OWL isn’t the first project to take on this dream, but it’s looking more practical than those that came before.
The project promises open source hardware, with open code, that can be reprogrammed into new sound effects simply by uploading new code. As with a new generation of low-power tablets and phones and the like, there’s an ARM chip at its heart. (The ARM Cortex M4, to be exact.)
If you’re a guitarist who writes your own C++ code – yes, there’s actually a sizable group of those – you can have a ball making your own DSP routines. If you’re not, OWL promises a library of patches, presumably growing with more contributions from the open source community.
There’s not a whole lot to look at at this point – while they’ve got a GitHub repository going, it includes only a little bit of sample code. But in the video, the results look impressive, perhaps enough – given an experienced team – for some to go ahead and take the leap of supporting the crowd-funded Kickstarter project.
And they say computer technology for music is “disposable.” Csound has a direct lineage to the very first digital audio synthesis ever to run on computers, counting decades of history. It remains an elegant way to make any instrument, event, or musical creation you can imagine, all with a free tool. And now, a Csound file can be baked right into an app for iOS, if you so desire.
Whether or not you’re ready to tinker with code, that means more musical goodies for your sonic amusement. And the next in line is something called csSpectral. Boulanger Labs has been hard at work on this one, and it looks like it will open up some insane sonic frontiers.
The new Csound-based iOS app by Boulanger Labs, csSpectral. Deepak Gopinath (Lead iOS Developer) uses csSpectral to play back a simple beat and transform the rhythms into a unique percussion track that morphs beyond glitch. This aspect of the app is well suited to many applications, ranging from advanced sound design for film to a mashup of your favorite track.
In other words, it makes crazy noises. Or, in marketing speak, it’s well suited to serenading a future mate, providing a futuristic science fiction atmosphere for your next meal, playing to babies in their cradles to turn them superintelligent and get them into the best afterschool programs later on, or entering higher states of astral awareness.
Another video below.
More good news: @csoundcommunity tells us via Twitter, “Just like last time, the Csound .csd will be available to investigate and learn from. Takahiko Tsuchiya takes it to another level!” And for Max fans, “there will be a Max patch avail on release w/MIDI learn. Controllers programmed on the fly, even the APC!”
If you’re ready to make your own app powered by Csound, we’ve got good news for you: there’s a free tutorial to get you started. Download the PDF and, provided you’ve got the Apple SDKs configured for building apps, you’re all set to turn your Csound files into apps.
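For the curious, the Csound file you’d be wrapping into an app is a single self-contained .csd document. Here’s a minimal example – a plain sine-wave beep, nothing to do with csSpectral itself – showing the basic shape: options, orchestra, and score in one file:

```csound
<CsoundSynthesizer>
<CsOptions>
-odac          ; send audio to the default output
</CsOptions>
<CsInstruments>
sr     = 44100
ksmps  = 64
nchnls = 2
0dbfs  = 1

instr 1
  ; simple sine oscillator: amplitude 0.5, frequency from the score (p4)
  asig oscili 0.5, p4
  outs asig, asig
endin
</CsInstruments>
<CsScore>
i 1 0 2 440    ; play instrument 1 at time 0 for 2 seconds, at 440 Hz
</CsScore>
</CsoundSynthesizer>
```

Everything the synthesizer needs travels in that one file – which is what makes baking a .csd into an iOS app so practical.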
Put your hands together – and your phones, too – and get a nifty ensemble of Giorgio Moroder goodness scrolling across your screens, like a palm-top animated album cover. Photo courtesy Google.
Imagine the browser window – on a desktop, a phone, or a tablet – as another canvas for musicians. Hearing Web nerds talk about the latest browser tech, it may not be immediately clear how that connects to this musical future. But with the addition of features like 3D and network sockets, suddenly you begin seeing dynamic music toys and tools that work without downloading apps.
Google has become part R&D lab, part arts patron, with its Chrome Experiments. In the latest, Giorgio Moroder’s music is the soundtrack to a “race” of abstract, colored geometries as they track between devices.
All you need is some iOS and Android gizmos running Chrome, and you can make it happen.
Playing with laptops can become performative in conventional ways, just by adding instruments – voice, guitar, live drums, ukulele, or whatever it is you play. But it becomes more mysterious in the hybrid performance media that emerge from “playing” the arrangement directly, manipulating the larger bits of a track in the form of stems and samples. That can be really boring – the “press play” approach – or it can begin to embody an artist’s musical imagination, as the artist improvises with the composition.
You’ll want to make sure you don’t tune out early in this video with Four Tet, shot recently at Red Bull Music Academy in New York, or you’ll miss the good stuff. The UK artist begins in a conventional-enough way: he has his pre-made tracks divided into stems and triggers them in Ableton Live. If you’re a singer or instrumentalist, that would work fine as backing tracks; it’s when it becomes a way of playing tracks back verbatim, with nothing else, that the “live set” can become bland.
But then things get a lot more interesting. Using some simple techniques for sampling loops, Four Tet uses external hardware to extend and transform the arrangement as it plays. There’s an unusual little loop sampler: a Cycloops / Red Sound C-Looper. He makes heavy use of a BOSS Dr. Sample SP-303 (not the newer Roland SP-404 that I mistakenly saw originally); the SP-303 sees a button creatively misused as a gate. And finally, he also spends time clicking around the UI of an old Windows laptop running CoolEdit. Nothing is terribly complicated, but these simple techniques make all the sounds more malleable, and it’s the way Four Tet plays them that makes them distinctively his.
If you keep watching, what may make Four Tet fans crack a smile is that the results become almost magically the sound of his productions – only improvised in live form.
A little secret, Mr. Tet: no, none of this stuff is actually hidden deep in Ableton Live, even if you do know how to use it. So there’s no need to apologize for not looking deeper into Live. Yes, you could make something with Max for Live and the like, and actually, I imagine some Max patchers may be inspired by this setup. I’m not certain that matters either way, though; what does matter a lot is being able to have physical controls that externalize each of these techniques, whether that’s in the form of a controller or a netbook or a stomp pedal or a patch running on a Raspberry Pi.
But I imagine a lot of people will be inspired watching this video to try their own experimentation. And toward the end, he gets to why this matters: he needs this flexibility to respond to a crowd. That detail is what will always make performance human, whether the artist responds as a DJ, as a traditional musician, or as the new hybrid of composer, conductor, DJ, remix artist, and performer that computers can allow.