đź”” The Morph is discontinued. Click to read the announcement. đź”” The Morph is discontinued. Click to read the announcement.
Maxing the Morph.

We’re really excited to announce a new way of working with the Morph API: the sensel object in Cycling ’74's Max visual programming environment. Until now, only programmers comfortable pulling the API from a git repository and working in a scripting or compiled text-based language have had access to all the rich contact data from the Morph. Using C, C#, or Python and finding all the necessary libraries and hooks to connect a contact on the Morph to video, audio, or simply logging data for research can take a lot of time and cause a lot of frustration.
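
For reference, this is roughly what that text-based route looks like: a minimal Python sketch that polls contacts, with names following the examples in Sensel's sensel-api repository (treat it as illustrative and check the calls against your installed bindings).

    # Minimal sketch of polling Morph contacts with the sensel-api Python
    # bindings (names follow the library's published examples; verify
    # against your installed version).
    import sensel

    def open_morph():
        # Enumerate attached Sensel devices and open the first one found
        (error, device_list) = sensel.getDeviceList()
        if device_list.num_devices == 0:
            return None
        (error, handle) = sensel.openDeviceByID(device_list.devices[0].idx)
        return handle

    def log_contacts(handle):
        # Ask for contact data, allocate a frame buffer, and start scanning
        sensel.setFrameContent(handle, sensel.FRAME_CONTENT_CONTACTS_MASK)
        (error, frame) = sensel.allocateFrameData(handle)
        sensel.startScanning(handle)
        try:
            while True:
                sensel.readSensor(handle)
                (error, num_frames) = sensel.getNumAvailableFrames(handle)
                for _ in range(num_frames):
                    sensel.getFrame(handle, frame)
                    for i in range(frame.n_contacts):
                        c = frame.contacts[i]
                        print("id %d  x %.1f  y %.1f  force %.1f" %
                              (c.id, c.x_pos, c.y_pos, c.total_force))
        finally:
            sensel.stopScanning(handle)
            sensel.freeFrameData(handle, frame)
            sensel.close(handle)

    if __name__ == "__main__":
        h = open_morph()
        if h is not None:
            log_contacts(h)

That is a fair amount of plumbing just to print contact positions, before any video or audio enters the picture.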

With Max, a lot of those media and network hooks are built in and exceptionally powerful. And when we say “visual programming environment”, we mean you can now focus on process rather than syntax: contacts can literally be connected by a patch cord to some other process: video, audio, robotics, machine learning, lighting, the internet, and more. There are high-level functions, such as synthesizers and video manipulation, and low-level functions, such as DSP programming, regular expressions, and mathematical operations.

The great thing about Max is that it is always running in real time, so experimentation is rewarded with results, and those results can be instantly accepted, rejected, or modified. Design thinking, in the palm of your hand! In all seriousness, Max has a long history of creative manipulation of digital data, beginning as a humble MIDI modifier in Paris in the ’80s and continuing to today, where it is embedded in Ableton Live as an essential part of one of the biggest names in music software.

Check out the example below. While I am very experienced in Max patching, I’ve been away from it for a few years, so I was pretty rusty. I dug through a couple of tutorials to get re-familiarized with some of the OpenGL objects and cooked up a novel control interface for abstract 3D visual art in about a day. Take a look at how this works:

The Morph is divided in half: spatial controls and color controls. The first challenge was to figure out how to rotate the object in three dimensions. I use vertical and horizontal swipes on the Morph to rotate on X and Y, and pressure to rotate on Z. However, pressure is only active when I put a second finger down; this ensures that it only rotates on Z when I want it to. Adding a third finger changes the interface to affect the X, Y, and Z scaling of the object, allowing me to resize it.
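
In the patch this is all Max objects and patch cords, but the dispatch logic boils down to something like the following hypothetical Python sketch (the names, keys, and scaling factors are invented for illustration, not taken from the actual patch):

    # Hypothetical sketch of the left-half spatial mapping: contact count
    # chooses the mode, swipes drive X/Y rotation, pressure drives Z,
    # and a third finger switches the same gestures over to scaling.
    def spatial_control(contacts, rotation, scale):
        n = len(contacts)
        if n == 0:
            return rotation, scale
        primary = contacts[0]
        if n <= 2:
            # One or two fingers: swipes rotate on X and Y
            rotation[0] += primary["delta_y"] * 0.5
            rotation[1] += primary["delta_x"] * 0.5
            if n == 2:
                # Second finger down: pressure now rotates on Z
                rotation[2] += primary["delta_force"] * 0.01
        else:
            # Three fingers: the same gestures resize the object instead
            scale[0] += primary["delta_x"] * 0.01
            scale[1] += primary["delta_y"] * 0.01
            scale[2] += primary["delta_force"] * 0.001
        return rotation, scale
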
Controlling Video with Morph and Cycling '74 Max
I divided the right side into top and bottom areas. Rather than making “sliders” where my fingers would have to land in a fairly specific spot, I used the "contact orientation" property to act as a “knob” for the red, green, and blue components. By twisting my finger, I can add to or subtract from any one of those color components. The number of fingers I use determines which of the RGB channels I am mixing: one finger for red, two for green, three for blue, four for opacity. I use this same scheme on the bottom half, but for the “erase color” property of the OpenGL context, allowing me to add erasing trails to make things extra trippy.
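
The color mapping follows the same pattern; again, a hypothetical Python sketch rather than the actual Max patch (field names and step sizes are made up for illustration):

    # Hypothetical sketch of the right-half color mapping: the number of
    # fingers selects the channel (1=R, 2=G, 3=B, 4=opacity) and twisting
    # a contact (its change in orientation) acts as a knob for that channel.
    def color_control(contacts, rgba):
        n = len(contacts)
        if n == 0:
            return rgba
        channel = min(n, 4) - 1
        twist = contacts[0]["delta_orientation"]  # degrees since last frame
        rgba[channel] = min(1.0, max(0.0, rgba[channel] + twist * 0.005))
        return rgba

The same function, fed from contacts on the bottom half, would update the erase color instead of the draw color.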

I’m more than comfortable saying that my experiment was only partially successful. I really like the idea that I can put my hand anywhere in a large area and still have specific control over parameters using contact counting. But it can be a bit awkward to use, and dialing in a specific value is nearly impossible. There’s a lot of promise in this idea, but it clearly needs a lot of iteration to refine my mixing controls so I can dial in the exact color I want. Good thing Max makes it easy to keep trying!

Get the object in the Max Package Manager (open Max, then File->Show Package Manager). Get the patch at Blend, or download it from our server.
