V Motion Project – Part I: The Instrument

Overview

The Motion Project was a collaboration between a lot of clever, creative people to build a machine that turns motion into music. The client for the project, Frucor (makers of V energy drink), together with their agency Colenso BBDO, kitted out a warehouse space for the project to grow in and gathered a group of talented people from a number of creative fields.

Producer Joel Little (Kids of 88, Goodnight Nurse) created the music, musician/tech wiz James Hayday broke the track down and wrestled it into Ableton, and Paul Sanderson of Fugitive built the tech to control the music with the help of Mike Delucchi. I also helped with the music side of the tech and built the visuals software with motion graphics warlocks Matt von Trott and Jonny Kofoed of Assembly. Assembly also produced and directed the music video. Hip-hop/tap dancer Josh Cesan was the ‘Motion Artist’ tasked with playing the machine and Zoe Macintosh of Thick As Thieves documented the process. Heaps of other people were involved along the way. It was a true collaboration, with everyone contributing ideas and energy to the process.

The Instrument

The final design of the machine was the result of months of experimenting… it all makes perfect sense in hindsight, but in the beginning, none of us were really sure what we were building. The starting point was the idea of using the Kinect camera to detect the movements of the musician. Paul spent months playing with the Kinect and looking at what other people had done with it to create music. He used Ben Kuper’s brilliant BiKinect project as a starting point and heavily modified it into something even more powerful. This software allows a musician to control the powerful audio software Ableton Live using only the movements of their body.

Unfortunately, this control and flexibility comes at a price… the system has a significant lag. This lag doesn’t affect many of the things you might want to do with the instrument, such as tweaking a filter setting or triggering a loop to start on the next measure. But the lag does make it nearly impossible to use the system as a drum pad or keyboard. When there’s a delay between hitting a key and hearing a sound, it’s really, really hard to play a melody and even harder to play a drum beat in time.

Cracking the problem of playing notes in ‘real-time’ was one of the first major hurdles we overcame. In my previous work with the Kinect, I hadn’t experienced the lag that we were seeing with the BiKinect solution. I realized that the lag was happening because the OpenNI drivers it uses do a heap of maths to process image data and calculate the position of each joint of a person’s skeleton, 30 times a second. I’d been working with raw depth data straight out of the camera without the skeleton processing, so there were no intermediaries to slow things down.
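For a sense of what “raw depth straight out of the camera” looks like in code, here’s a minimal openFrameworks sketch using the ofxKinect addon (which wraps the freenect drivers). It’s an illustration of the approach rather than the project’s actual code, and the exact API varies a little between openFrameworks versions.

```cpp
// A minimal raw-depth grab with the ofxKinect addon (illustrative, not the project's code).
// No skeleton fitting happens here, so each new frame is usable almost immediately.
#include "ofMain.h"
#include "ofxKinect.h"

class RawDepthApp : public ofBaseApp {
public:
    ofxKinect kinect;

    void setup() {
        kinect.init();
        kinect.open();                  // start streaming from the sensor
    }

    void update() {
        kinect.update();
        if (kinect.isFrameNew()) {
            // Raw depth values straight from the camera, one per pixel.
            ofShortPixels& depth = kinect.getRawDepthPixels();
            // ... threshold / segment these values directly, no skeleton maths ...
        }
    }
};
```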

To create a system that could give us the control and flexibility of the skeleton drivers, and the real-time speed of the raw-depth method, we decided to use two Kinects running different software on two different computers. A good idea, but it brought with it a whole new set of problems.

Two Hearts Beat as One

Below is my highly technical drawing of the instrument. The Motion Artist is on the left, and the Kinects are pointed at him from straight ahead. One Kinect is using OpenNI drivers to calculate his skeleton position. The other Kinect uses freenect drivers to access the raw depth data from the infrared sensor. The top computer is Paul’s music system, which is built in Processing on a Windows PC. The bottom computer is the Mac Pro running my visuals system written in C++ with openFrameworks. Mac and PC happily working together, hand in hand… could world peace be far behind?

The two computers talk to each other over UDP, sending simple JSON objects back and forth on each frame. The music system sends data about its current state (the user’s skeleton position, what sound the user is manipulating, etc.) to the visuals system. The visuals system sends MIDI info back to the music system when the user is pressing an “air keyboard” key, as well as timing info to help the motion artist nail the song transitions [something we programmed in on the morning of the performance :) ].
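As a rough sketch of that exchange from the visuals side, here’s the kind of thing that happens each frame using openFrameworks’ ofxNetwork addon. The IP address, ports and JSON field names are made up for the example; the real message schema isn’t spelled out in this post.

```cpp
// Sketch of the per-frame UDP exchange (address, ports and field names are hypothetical).
#include "ofMain.h"
#include "ofxNetwork.h"   // ofxUDPManager

class MessageLink {
public:
    ofxUDPManager sender, receiver;

    void setup() {
        sender.Create();
        sender.Connect("192.168.1.10", 11999);    // music PC's address (example only)
        sender.SetNonBlocking(true);

        receiver.Create();
        receiver.Bind(12000);                      // listen for the music system's state
        receiver.SetNonBlocking(true);
    }

    // Called when the image analysis decides an air-keyboard key has gone down.
    void sendKeyDown(int midiNote) {
        string msg = "{\"type\":\"keyDown\",\"note\":" + ofToString(midiNote) + "}";
        sender.Send(msg.c_str(), msg.length());
    }

    // Poll for the latest state packet from the music system.
    string receiveState() {
        char buf[1024] = {0};
        receiver.Receive(buf, sizeof(buf) - 1);
        return string(buf);   // JSON: skeleton joints, active sound, timing info, etc.
    }
};
```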

The visuals computer outputs to three displays. One goes to a normal monitor for my debugging/monitoring UI, another is split with a Vista Spyder across two 20K projectors to cover a 30 x 11.5 metre wall, and the final output goes to another 20K projector aimed at a second building.

The music system outputs to an M-Audio external sound card hooked into a large speaker setup. The sound card also has a line-out that plugs into the visuals machine, which does FFT audio analysis for some simple sound visualization.
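That sound visualization boils down to reading the line-in through an audio callback and running an FFT over each buffer. Here’s the general shape of it as a sketch using Kyle McDonald’s ofxFft addon for the analysis; treat the addon choice, buffer size and smoothing as illustrative rather than exactly what runs in the show.

```cpp
// Rough sketch of line-in FFT analysis (addon choice and numbers are assumptions).
#include "ofMain.h"
#include "ofxFft.h"

class AudioAnalyser : public ofBaseApp {
public:
    ofSoundStream stream;
    ofxFft* fft;
    vector<float> bins;            // smoothed magnitude per frequency band

    void setup() {
        int bufferSize = 512;
        fft = ofxFft::create(bufferSize);
        bins.assign(fft->getBinSize(), 0.0f);
        stream.setup(this, 0, 1, 44100, bufferSize, 4);   // mono line-in from the sound card
    }

    void audioIn(float* input, int bufferSize, int nChannels) {
        fft->setSignal(input);                             // run the FFT on this buffer
        float* amp = fft->getAmplitude();
        for (int i = 0; i < fft->getBinSize(); i++) {
            bins[i] = 0.9f * bins[i] + 0.1f * amp[i];      // simple smoothing for the visuals
        }
    }
};
```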

Don’t Cross the Streams

You’re not supposed to aim Kinects at each other. If you do the proton fields reverse and the universe implodes on itself… or at least that’s what we’d heard. If you give it a try, the world doesn’t end, but the cameras definitely don’t like it. The depth data you get is full of interference and glitchy noise. Luckily, there’s an incredibly simple solution to fix it.

Matt Tizard found a white paper and video that explained an ingenious solution: wiggle the cameras. That’s it! Normally, the Kinect projects a pattern of infrared dots into space. An infrared sensor looks to see how this pattern has been distorted, and thus the shape of any objects in front of it. When you’ve got two cameras, they get confused when they see each other’s dots. If you wiggle one of the cameras, it sees its own dots as normal but the other camera’s dots are blurred streaks it can ignore. Paul built a little battery operated wiggling device from a model car kit, and then our Kinects were the best of friends.

Controlling the Music

The music system works by connecting the Kinect camera to Ableton Live, music sequencing software often used by DJs and musicians during live performances. Below is a screen capture of our Ableton setup. The interface is full of dials, knobs, switches and buttons. Normally, a musician would use a physical control panel covered with knobs, dials and switches to control Ableton’s virtual ones. Paul’s music system works by letting us map body movements to Ableton’s controls instead. For example, touching your head with your left hand could start a certain loop, or the distance between your hands could control a dry/wet filter. This ability to map physical motion to actions in Ableton is enormously powerful.


Video: The Ableton Live setup for the Motion Project


Video: The skeleton based audio control system. In this example, the distance between the hands turns one dial in Ableton, and the angle of rotation controls another one.
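To make the mapping idea concrete, here’s a small sketch of how a hands-distance-to-dial mapping could be sent to Ableton as a MIDI control change using the ofxMidi addon. The channel, CC number and distance range are made up for the example, and Paul’s actual mappings live inside his Processing system, so this is just the gist.

```cpp
// Illustrative hand-distance -> Ableton dial mapping via MIDI CC (values are made up).
#include "ofMain.h"
#include "ofxMidi.h"

class HandDialMapper {
public:
    ofxMidiOut midiOut;

    void setup() {
        midiOut.openPort(0);                        // a port routed into Ableton
    }

    // leftHand / rightHand come from the skeleton tracker, in metres.
    void update(ofVec3f leftHand, ofVec3f rightHand) {
        float dist = leftHand.distance(rightHand);  // distance between the hands
        // Map roughly 0.1m..1.5m of separation onto the 0..127 MIDI range.
        int value = ofClamp(ofMap(dist, 0.1f, 1.5f, 0, 127), 0, 127);
        midiOut.sendControlChange(1, 20, value);    // channel 1, CC 20 -> a dial in Ableton
    }
};
```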

The Air Keyboard

Getting the “air keyboard” to work alongside the gesture-based system was a big breakthrough, and gave us the speed we needed to play notes in real-time. The video below shows the evolution of the idea. In my first test, I made the keyboard work as if it were a giant button in front of the player. When you pushed your hand into the air in front of you, the key would trigger. Next I tried breaking the space in front of the player into a series of boxes that you could play quickly, like a xylophone or harp.

This ‘push forward’ technique worked well and was fun to play, but it was hard to play a specific melody or an intricate beat. There was a bit of lag introduced by the image detection I was doing to see the ‘key’ presses, but the main difficulty was the lack of feedback to the player. When you play a real keyboard or drum, your hand strikes the surface and stops. When you play an arbitrary square of air in front of you, it’s hard to intuitively know when you’re close to the target, have hit it or are totally off. We displayed visual cues on screen, but it was always a bit like groping in the dark.

The big breakthrough came when I moved the keys to the side of the player. When you fully extend your arm to play a note, there’s the physical feedback of your joints extending to their limits, so playing the instrument at speed becomes much more natural. Also, to detect a “hit” or “miss”, all I need to do in code is look at the silhouette of the player and then check each rectangular key. If the key has more than 20 pixels of silhouette in it, then it’s down. Otherwise, it’s up. It’s super fast and responsive.
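In code it really is about that simple. Here’s a stripped-down version of the idea, where the silhouette is a binary image of the player and the 20-pixel threshold is the one mentioned above (the structure and helper names are simplified for illustration, not lifted from the project):

```cpp
// Key hit-test against the player's silhouette (helper names are illustrative).
#include "ofMain.h"

struct AirKey {
    ofRectangle rect;     // key's rectangle in depth-image coordinates
    int midiNote;         // note to trigger when the key goes down
    bool isDown = false;
};

// silhouette: a binary image where white pixels belong to the player.
bool keyIsDown(const ofPixels& silhouette, const AirKey& key) {
    int count = 0;
    for (int y = key.rect.getTop(); y < key.rect.getBottom(); y++) {
        for (int x = key.rect.getLeft(); x < key.rect.getRight(); x++) {
            if (silhouette.getColor(x, y).getBrightness() > 128) {
                count++;
                if (count > 20) return true;   // more than 20 silhouette pixels -> key is down
            }
        }
    }
    return false;
}
```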

The final piece of the puzzle was to define what the keyboards look like and how they behave. We defined the layout of each keyboard in a JSON file that lives on the PC. Paul wrote a little Air app to let us tweak positioning and define which MIDI notes each key should trigger. This was really helpful as we worked with Josh to create a keyboard to fit the way he wanted to move. So the end result is a giant MIDI “air keyboard” with keys that we can create and position anywhere in space.
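The layout file looks something along these lines. The exact field names here are a guess for illustration, not the real schema:

```json
{
  "keyboards": [
    {
      "name": "Bass",
      "keys": [
        { "x": 40,  "y": 120, "width": 60, "height": 200, "midiNote": 36 },
        { "x": 110, "y": 120, "width": 60, "height": 200, "midiNote": 38 },
        { "x": 180, "y": 120, "width": 60, "height": 200, "midiNote": 41 }
      ]
    }
  ]
}
```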

Putting it All Together

This video demonstrates the instruments available to the motion artist. “Vox” and “Bass” are keyboards. “LFO” controls the low-frequency oscillation (that distinctive dubstep ‘wobble’). “Dough” uses two filters, one controlled by the distance between his hands and the other by the rotation of the ‘ball of dough’ he’s creating. “Drums Filtered” is a drum keyboard, but the sounds become more filtered as his chest gets closer to the ground. There’s one element of the performance that’s not in this test: the moment where Josh looks like he is stretching out a giant triangle. Here, the distance between his two hands and the ground provides the control. As he pulls the sound up it gets louder, and as he pulls his hands apart the dampen filter decreases, causing the sound to “open up”.

We were just playing around during this test, so his rhythm is a bit off at times :)

Another nice thing about dividing the system into two symbiotic parts was that, once we had nailed the structure of the system and the communication method we’d use to pass data back and forth, Paul and I were able to split off and work in different directions. Paul worked at the warehouse with the musicians and performer to get the flow of the song nailed down and to tune the instrument so Josh could actually play the thing (he makes it look easy in the video, but it’s definitely not). I moved on to the Assembly office to work with Matt and Jonny on adding a bit of sparkle and spectacle to the visual side of things.

Up next… Part II: The Visuals

[Update: Paul has posted an article detailing the project from his point of view. He has lots of good insight into the music tech and describes how the instrument evolved over the months. Definitely check it out!]

[Update 7/20: For more 'next level' Kinect misuse, check out the interactive music video we just released for Neil Finn's (of Crowded House fame) latest band. You watch and control the video in 3D as it plays in your web browser! Some behind-the-scenes details here.]

170 comments

  1. Jayisgames

    Congrats, Jeff (and team)! This is really cool, very impressive work. Coming from a DJ background as well as games, I can appreciate how exciting this is. :)

    • Jeff Nusz

      Thanks Jay! It’s been fun to see this bounce around the internet. I haven’t been in the middle of such a big viral hoohah since “Sprout”!

      I’m hoping to have some time to enter your game design contest again one of these days.. I miss making games!

  2. Hybrid3rdGen

    So beyong the edge… Cyberpunk meets pure music techno madness! I would go to a show like this… do it! The Overmind demands it!

  3. Alan Martell

    This is one of the most innovative projects I have seen in my life time, how I wish I could be a part of such a brilliant team. Keep it up and start getting your projects even bigger and pitch it to big commercial companies for promo. Best of luck to all of you.

  4. Scott A

    This should be entered into the America’s Got Talent web/YouTube competition as a competitor. If not then as one of their professional acts

  5. John Script

    Wow, I’ve never thought you can use kinekt technology at this level. Astonishing performance also. This reminds me of Jean Michel Jarre’s live performances. I love the project, so I’ve share it with my readers.

  7. gab_lecup

    amazing. been working on an “abstract” and open version from the same idea, with contemporary dancers in improvisation context. i love the way you resolved the problem of not having feedback for the dancer to know when he’s hit a key. in my project i come around it using direction changes (brutal stop after acceleration). i’d love to contact you guys to change ideas, please send me a mail if you think we could have a skype chat or something. and check my old project here: http://www.gabriellecup.com/consequencer
