For our first piece, Kevin and I extended the work we did in April and May of 2011, when we used a Kinect sensor, projection mapping, sound and video to animate a collection of fictional letters through readers’ interactions with an antique rolltop desk. This time we wanted to work on a smaller scale, so we projected video into a Kosta Boda snowball candleholder designed by Ann Warff. We hoped the candleholder’s rippled glass would diffuse the video imagery into the kind of flickering light one might find on a table set for a romantic dinner, as in the poem at the core of this piece.
How to Set Up Konkreet Performer with Ableton Live via Osculator Using OSC:
1. Create a Network on the Mac
In order for the iPad and the computer to communicate, they both need to be on the same network. The simplest way to achieve this is to create an ‘ad hoc’ network, where the computer acts as a Wi-Fi hub that your iPad can connect to.
2. Open iPad Settings and Connect iPad to the Newly Created Network
Next, go into the Wi-Fi settings on the iPad and make sure you are connected to the network you just created on the computer.
3. Open Konkreet Performer on the iPad and Specify the OSC Output
Change the OUT > to the name of your computer (you can find and/or change the name of your computer in System Preferences/Sharing). Make a note of the OUT port for use in Osculator. Make any other changes to Performer while you are here, e.g. how many nodes you want, etc.
4. Open Osculator on the Computer and Change the Incoming Port Number
Set the incoming port in Osculator to match the OUT port you noted in Performer. Once you enter the correct port number, move some of the nodes around in Performer on the iPad and all the variables should appear.
We are most interested in the variable from Performer that sends each node’s distance from the center point, although, as you can see, there are a number of variables Performer can send, such as each node’s distance from its adjacent nodes and its absolute position on the screen. Here, though, we just want the distance, which arrives in the format “/1/n1/l”.
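If you are curious what the raw OSC data looks like before it reaches Osculator, here is a minimal Python sketch using the python-osc library that prints each node’s distance-from-center value as it arrives. The port number (8000) and the “/1/n{n}/l” address pattern are assumptions here; match them to the OUT port and node addresses you actually see in Performer.

```python
# Minimal listener for Konkreet Performer's OSC output (a sketch, not part
# of the Osculator/Live chain). Assumes python-osc is installed and that
# Performer is sending to this machine on port 8000.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_distance(address, *args):
    # Each node's distance-from-center arrives as a single float.
    print(f"{address}: {args[0]:.3f}")

dispatcher = Dispatcher()
# Map the assumed distance-from-center address for the first four nodes.
for node in range(1, 5):
    dispatcher.map(f"/1/n{node}/l", print_distance)

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
print("Listening for Performer OSC on port 8000...")
server.serve_forever()
```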
5. Assign Each Variable a Data Type and Specify the Output Value
In order for Ableton Live to see the data coming from Osculator, the event type will need to be “MIDI CC”. In essence, Osculator here is changing OSC data into MIDI data. MIDI CC means it is a MIDI event with continuous values, like a knob or slider. You will need to assign the Event Type for each of the variables from Performer that you would like to use in Live. Do this in the Event Type column in Osculator.
Next, you also need to give each MIDI output a unique “Value” in Osculator so that Live can tell which variable is which. Do this in the Value column in Osculator. These Value parameters are arbitrary, so it’s probably simplest to assign node 1 to value 1, node 2 to value 2, and so on.
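To make the translation concrete, here is a rough Python sketch of what Osculator is doing under the hood: each node’s 0.0–1.0 float becomes a 0–127 value on its own CC number. This is an illustration under assumptions, not Osculator’s actual implementation, and the MIDI port name is hypothetical (it depends on your setup, e.g. the macOS IAC Driver).

```python
# Sketch of the OSC-to-MIDI-CC translation described above, using mido.
# Not Osculator's code; just the same idea in miniature.
import mido

def send_node_as_cc(node_number, distance, port):
    """Send a node's 0.0-1.0 distance value as a MIDI CC message."""
    cc_value = max(0, min(127, int(round(distance * 127))))
    msg = mido.Message('control_change',
                       channel=0,            # MIDI channel 1
                       control=node_number,  # unique CC number per node (step 5)
                       value=cc_value)
    port.send(msg)

# Example: node 1, halfway between the center and the edge.
out = mido.open_output('IAC Driver Bus 1')  # hypothetical port name
send_node_as_cc(1, 0.5, out)
```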
6. Test the Data in Osculator
Once you have the variable MIDI outputs set, you can test the data by clicking on the message you want to view and pressing the space bar; this will bring up the data view window. Move the node around in Performer on the iPad and you will see the data change.
7. Scale the Data in Osculator, If Needed
This step is optional because you can also scale the data in Ableton Live when we get to that point. But if you like, you can enter the Scaling Page in Osculator to change the incoming and outgoing scale of the data. Do this from the View menu or press Command+F on the keyboard. The reason you may want to scale the data in Osculator is to decide whether a node’s distance-from-center value increases or decreases as the node approaches the center. The default is that it decreases, but for our purposes we want it to increase, since we will be controlling track volumes based on how close each node is to the center. To do this, flip either the incoming or the outgoing range (1 to 0, or 127 to 0, for example). Side note: OSC data is a floating-point decimal number, while MIDI is an integer value that goes from 0 to 127.
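For a concrete picture of the flip, here is a small Python sketch, assuming (as above) that Performer’s distance value runs from 0.0 at the center to 1.0 at the edge: the value is inverted and scaled to the 0–127 MIDI range so it grows as a node approaches the center.

```python
# Sketch of inverting and scaling a distance-from-center value, as the
# Osculator Scaling Page does. Assumes the incoming float runs 0.0-1.0.
def distance_to_cc(distance, invert=True):
    x = max(0.0, min(1.0, distance))  # clamp the incoming float
    if invert:
        x = 1.0 - x                   # closer to center -> larger value
    return int(round(x * 127))

print(distance_to_cc(0.0))  # node at the center -> 127 (loudest)
print(distance_to_cc(1.0))  # node at the edge   -> 0 (silent)
```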
8. Open Ableton Live and Change the Preferences to Receive Data from Osculator
Next, you will want to open Live and enter the Preferences screen. Here, go to the MIDI tab on the left and find the MIDI Ports. “Osculator Out” is what we want; it is the data coming out of Osculator into Live. Now turn on all three buttons: Track, Sync and Remote. Don’t worry about the MIDI Out to Osculator.
9. Create Tracks and/or Groups of Tracks in Live to Control with the Incoming MIDI from Osculator
Next, you will want to create individual tracks for each of the sounds you intend to use, and group them together if necessary. Create tracks from the menu or press Command+T. To group tracks together and control their collective volume with one volume control, Shift-select all the tracks you want to group and press Command+G. Now the group has one main volume control. You can change the tracks’ relative volumes within the group, but their overall volume is controlled via the Group track volume. This is the parameter in Live we will control with the nodes on the iPad.
10. Assign the Incoming MIDI to the Track/Group Volume Controls
Click the MIDI button in Live at the top right of the screen. The entire interface in Live should now turn blue. Each element that is shaded blue can be controlled via the incoming MIDI data from Osculator/Performer.
Select a volume control for one of the Groups or Tracks, whatever it is you want to control. Once it is selected, move the node in Performer on the iPad and that data will be assigned to that parameter.
You will know it’s mapped correctly when you see a box above the parameter that displays the MIDI channel (which should just be 1) and the unique Value of the node that you specified in Osculator previously.
Repeat the same process for each track/group volume you would like to control.
Once you are done mapping the nodes to volume controls, make sure you hit the MIDI button again in Live to exit mapping mode. This will ensure you don’t accidentally reassign the parameters.
11. Scale the MIDI Data in Live if Necessary
To scale the data or make it control a specific range, re-enter MIDI Mapping Mode in Live by hitting the MIDI button again; in the MIDI Mappings window on the left you should see each of the parameters you mapped. It’s important to set the low and high values to -inf dB and 0 dB so the volume doesn’t get too loud. You can also flip the data here if you didn’t do that in Osculator.
12. Control the Volumes!
Now, once you click out of MIDI mode again, you should be able to move the nodes around in Performer and see the volumes of the assigned tracks/groups move. If you don’t see them move right away, keep moving the nodes back and forth toward and away from the center so that the volume control in Live can “pick up” the controls; the yellow text at the bottom of the screen will tell you when Live is picking up the data.
At this point, you should be rockin’ and rollin’!
Here are the links to software downloads for dynamic sound mixing and composition via iPad. All tools, except the Konkreet Performer iPad app, can be downloaded in a trial version for experimentation.
OSCulator (middleware, allows iPad to talk to your computer)
Konkreet Performer from Konkreet Labs (iPad app for sound composition)
Ableton Live Free Trial (Sound composition software for your computer)
As part of another iteration of Found Letters, and in conjunction with the class Video Sculpture, we set out to project a video poem into the underside of a snowball-shaped glass candlestick holder. The challenge here was to create a very small video projection through a clear piece of glass.
First, the logo on the bottom of the globe would be visible when projecting through the glass. Additionally, because the globe is clear glass, there was not enough diffusion to project a video onto it. The first step, therefore, was to wet-sand the bottom of the globe with some very fine sandpaper. In doing so we were able to remove the logo from the bottom (which turned out to be painted on rather than etched), and the sanding also created a diffuse surface on the bottom of the globe, allowing a projection to appear through the hole where the candle goes.
Next, we found that the sanding didn’t create enough of a diffuse surface, so we decided to add a bit of tissue wrapping paper to the underside. Once this was in place, a rear projection looked perfect.
The next challenge was the size of the projection. Luckily we have at our disposal a very small Pico Projector Development Kit from Texas Instruments, which itself is about 2 in × 3 in, the perfect size for a small video sculpture.
Of course, putting all of this together in a stable and aesthetically pleasing installation was the final challenge. With the projector being so small and the distance from the surface needing to be very exact (about 7 inches), we had some difficulty finding the right housing for the piece. It was ultimately a simple solution: we used a shipping box, cut a hole in the top, and used some of the packaging from the projector’s box to hold the projector in place.
Finally, the video content was edited both with a circular mask and without, to fit the globe’s shape. It turned out that the video without the mask gave the outer edges of the glass a certain glow that was aesthetically pleasing. Ultimately, the piece was a simple, intimate iteration in the Found Letters series of prototypes.
Konkreet Performer for iPad by Konkreet Labs ~ $24.99 ~ iTunes App Store. To dynamically compose sound on the fly, we add tracks to Ableton Live, then use Konkreet Performer to send Open Sound Control messages to Live, allowing us to create a new soundscape from a predetermined set of tracks.
Working in Ableton Live gives us the flexibility we need to transform vocal recordings and sound effects into an immersive soundscape.