(max) fft patcher dump

It’s the end of the year, and there’s not a lot of time before the final project is due, but here are the FFT patchers; they may still be a helpful reference.

example patchers

All patchers are in an fft folder. You may just want to download that folder to have all of the parent patchers and pfft~ subpatchers.

Individual files will be listed with each example description.

fft pass through

  • <FFT-Pass.maxpat> and <justPassingThrough~.maxpat>

<FFT-Pass.maxpat> demonstrates the basics of using the [pfft~] object, the [fftin~] and [fftout~] objects as inlets and outlets, inlet/outlet numbering, and how [pfft~] handles other kinds of messages. Pay particular attention to how inlets/outlets differ when converting audio to the frequency domain and back to the time domain, and how they are numbered.

In this example, the [pfft~] object contains a spectral processing sub-patcher (<justPassingThrough~.maxpat>), and has arguments specifying the FFT size (2048) and number of overlaps (2). [pfft~] works similarly to [poly~], in that as many instances of the sub-patcher are loaded as the number of overlaps specifies. The sub-patcher merely converts the audio to the frequency domain and back to the time domain, while also allowing two number messages to pass through unchanged.
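The round trip the sub-patcher performs (time domain to frequency domain and back, with nothing done in between) can be sketched in Python. This is a naive DFT standing in for the FFT that [fftin~]/[fftout~] actually compute, and the signal values are made up for illustration:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (conceptually, what fftin~ does)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT (conceptually, what fftout~ does)."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
spectrum = dft(signal)       # time -> frequency domain
roundtrip = idft(spectrum)   # frequency -> time domain, nothing changed

assert all(abs(a - b) < 1e-9 for a, b in zip(signal, roundtrip))
```

A real FFT is far faster than this O(N²) version; the point here is only that the transform pair is lossless, which is why the pass-through patcher sounds identical to its input.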

cross synthesis

  • <FFT-CrossSynth1.maxpat> and <crossSynth1~.maxpat>

The implementation here is simple cross synthesis: the real and imaginary components of two signals converted to the frequency domain are multiplied together, and the result is converted back to the time domain. Since the signals are multiplied in the frequency domain, some consider this process to be convolution. However, this isn’t true convolution, as neither signal is converted from Cartesian (x, y) coordinates to polar (amp, phase) values. That will come next.

Remember, since multiplication is taking place, both signals need to have amplitude present for any output to be heard.
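As a numerical sketch of why that is: assuming the patcher multiplies the real outputs together and the imaginary outputs together, bin by bin (the frame values below are made up), any bin that is silent in either input is silent in the output:

```python
# two hypothetical frequency-domain frames, as lists of (real, imag) bins
frame_a = [(0.5, 0.2), (0.0, 0.0), (0.8, -0.3)]
frame_b = [(0.4, 0.1), (0.9, 0.5), (0.0, 0.0)]

# bin-by-bin multiply: real * real, imag * imag
out = [(ra * rb, ia * ib) for (ra, ia), (rb, ib) in zip(frame_a, frame_b)]

# a bin with zero amplitude in either signal produces zero output
assert out[1] == (0.0, 0.0)
assert out[2] == (0.0, 0.0)
assert out[0] == (0.5 * 0.4, 0.2 * 0.1)
```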


  • <FFT-Convolution.maxpat>, <convolution1~.maxpat> and <convolution2~.maxpat>

Two implementations of convolution are demonstrated. The first is less CPU intensive.

<convolution1~.maxpat> converts the Cartesian (x, y) representation of the frequency domain signal of the second audio input to polar values (amp, phase) using [cartopol~]. Since amplitude information is in both the real and imaginary values of the first signal, only one [cartopol~] is needed, and no subsequent conversion back with [poltocar~] is needed.

<convolution2~.maxpat> converts both frequency domain signals to polar values, multiplies the amplitudes and adds the phases as true convolution should do, then converts the resulting amp/phase values back to Cartesian with [poltocar~]. You can compare the audio results of each process, which sound extremely similar.
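The identity the second patcher relies on (multiplying amplitudes and adding phases is the same operation as multiplying the complex bin values directly) can be verified numerically with a couple of made-up bin values:

```python
import cmath

# two hypothetical frequency-domain bin values
a = complex(0.6, 0.8)
b = complex(-0.3, 0.4)

# convolution2~ style: cartopol~, multiply amps, add phases, poltocar~
amp = abs(a) * abs(b)
phase = cmath.phase(a) + cmath.phase(b)
back = cmath.rect(amp, phase)

# identical (to rounding error) to straight complex multiplication
assert abs(back - a * b) < 1e-12
```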



(max) final project – 2015

final project performance patcher

  • Due: Tuesday, April 28, at noon. (our final exam meeting time)


Create a Max patcher and a two-minute performance using audio and video. Your final project should draw from your previous projects, as well as from the topics we investigated at the end of the semester (audio buffers, video, etc.).


  1. This project will build upon the synthesis techniques, live audio input, soundfile manipulation, and video topics that we have discussed in class. You may incorporate any configuration of sounds, synths, and videos that you like. Your objective should be to create a performance patch that features real-time control of 8 different parameters. You may consider building synths that work in conjunction with the data storage objects (tables, histograms, coll, umenu, etc.) and control of the basic MIDI parameters that we discussed earlier in class. You may choose to expand your previous projects, or simply start from scratch. You should make every effort to be creative with this project; as always, aesthetics are important. Make music that is interesting to you. Draw upon your knowledge of experimental art music forms and unique beat-based compositions.
  2. You should complete a composition to perform that is TWO MINUTES in length. Some elements to consider include:
    1. Your patch should make use of the Presentation Mode feature in Max. Include any objects that are necessary to your performance interface.
    2. Your project should have at least semi-neatly routed patch cables, encapsulated sub-patchers, and a general sense of organized chaos. Patches that are messy, poorly documented, contain no presentation mode, and/or look sloppy are unacceptable. Your instructor or colleagues should be able to open your patch and make sense of it. If they can’t, it is too messy.
    3. Your project should be of sufficient complexity to stand as a final project for this class. Think of ways to apply filtering, buffer playback, synths, and possibly video manipulation to create a convincing musical texture. The parameters you control should manipulate higher-order processing – don’t just control volume and playback speeds of sounds. You should be able to control a variety of parameters that make sense musically over time.
  3. Write a brief report (two pages, typed, double spaced, 12-point font, 1 inch margins) describing the sounds you created, any use of real-time control of your sounds, and significant DSP processes that you used. In your report, also briefly summarize the central organizing idea of the composition, and describe the formal structure. Make any other comments you feel are relevant.

project specifications

  1. At least 8 controllable parameters. These parameters should include input data, filters, buffer playback, effects, and video controls.
  2. Your project should be at least 2 minutes in duration.
  3. Your project should make use of layering as a means of creating a complex sonic texture.
  4. Your project must include AT LEAST one of the following: live audio input or soundfile manipulation.
  5. Your project MUST use MIDI performance controls. You may map your control parameters to the keyboards in the lab, or feel free to bring in your own devices.
  6. Your project must have a clean, well-organized performance interface. Projects that are messy and have poorly routed patch cables will be heavily penalized.
  7. You must have at least five separate sounds/synths for this project. If you plan to include video, it may substitute for one of your required sounds.
  8. You must include at least one effect/processor on your sounds/synths. Effects and processors can include filters, reverb, degrade, delays, and other topics we’ve covered in class. Feel free to explore the Max documentation for ideas.

file organization

  1. Your Max patch should be labeled “yourLastName_yourFirstName_FINALmax”.
  2. Your studio report should be labeled “yourLastName_FINALreport”.
  3. All of these materials should be placed in a folder labeled “yourLastName_FINAL_15”.
  4. All necessary external subpatchers, audio, and video files should be placed into “yourLastName_FINAL_15”.
  5. “yourLastName_FINAL_15” should be zipped into a folder.

procedure for turning in project

  1. Upload your zip file to a file sharing site (Google Drive, Box, Dropbox, etc.).
  2. Email me the link to download the file.

grading criteria

  1. Creativity of patcher and performance (60 pts).
  2. Neatness and documentation via comments (and/or send/receive as appropriate) (30 pts).
  3. Meeting the 2-minute length of performance and having the required 8 parameters under patcher control/variation (15 pts).
  4. Following turn-in procedure (5 pts).

110 points total


(max) synchronizing loops

example patcher

synchronizing playback of looped audio files without using the transport – two methods

The example patcher demonstrates how you can synchronize playback of two audio files of arbitrary lengths. One method is to use [wave~] objects, which play back according to a phase signal (typically supplied by a [phasor~]). The same [phasor~] can drive multiple [wave~] objects, keeping them synchronized. The other method is to use [groove~] for playback of one file, and then use the sync output of [groove~] to drive other [wave~] objects.


This section discusses the top half of the patcher and the section outlined in purple.

The [wave~] object plays back according to a phase signal. Using a [phasor~] with a positive frequency will cause the audio loaded into the buffer to play forward. [phasor~] outputs a continuous ramp signal from 0. to 1. If you want to express playback speed as a ratio, you need to know the length of the audio file, which is expressed in ms. The [info~] object reports all sorts of useful information about an audio file loaded into a [buffer~], including its length. It needs a bang to report information, which can be supplied from the right outlet of [buffer~] upon completion of an audio file load. Since [phasor~]’s frequency is in cycles per second and the file length is in milliseconds, divide 1000 by the length of the audio file to get the frequency that plays the audio back at unity speed.
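The arithmetic is worth pinning down with a quick sketch (the 2-second file length is hypothetical):

```python
length_ms = 2000.0             # hypothetical length reported by [info~]
unity_hz = 1000.0 / length_ms  # phasor~ frequency for unity-speed playback
assert unity_hz == 0.5         # one full 0.-to-1. ramp every 2 seconds

# multiplying the unity frequency by a playback ratio gives the new frequency
ratio = 0.5                    # half-speed playback
assert ratio * unity_hz == 0.25
```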

The upper right portion of the patcher allows speed to be expressed as a ratio by multiplying the desired playback ratio by the unity frequency. The new playback ratio becomes the argument of the multiply object, and a bang then sends the unity frequency (stored in the floating-point number box) through to complete the multiplication. I’ve included a floating-point number box in the purple part of the patcher for controlling the playback ratio. It is sent to the upper right portion of the patcher to compute the [phasor~] playback frequency.

The second sound is synchronized to the playback of the first sound by connecting it to the same [phasor~] for playback. The addition of the [rate~] object allows the speed of the second sound to be expressed as a multiplier (ratio) of the first. [rate~ 1. lock] sets an initial frequency multiplier of 1 for the incoming phasor signal, and locks the output phasor to the incoming phasor at the sample level. You can change the multiplier with a floating-point input to the right inlet of [rate~].

The two [snapshot~] and [slider] objects graphically show the phase of the two signals, from the [phasor~] and from the output of [rate~].

groove~ + wave~

The section enclosed in green shows another method for synchronizing two audio files. The [groove~] object sends a sync signal out its right-most outlet. The sync signal is the equivalent of a [phasor~] output, traversing from 0. to 1. over the course of the looping section. That sync signal can be sent to a [rate~] object feeding a [wave~] object for synchronized playback.

The main differences between this method and the first are that a) you do not need to compute the unity playback frequency of the first audio file, since [groove~] is driven by a playback ratio (not a phase signal), and b) the output sync signal adjusts to changes in the start and end points of the loop specified to [groove~]. If you change the loop points, the sync signal will match its 0–1 output to the new loop points.

You can follow phase outputs with the [snapshot~]/[slider] combinations included in this portion of the patcher.


(max) buffer shuffler

example patcher

manipulating playback location

Up until now, all of our [groove~] playback has been linear, progressing through the buffer, or a looped portion, in one direction. We can send location values, however, to cause [groove~] to jump to different points within the buffer or a looped portion.

Look inside of [p loopShuffle], in the orange border. Inside of it, open up [p loopSegment]. [p loopSegment] takes the selection start and end points (being used as the loop points), calculates the loop length, and divides the loop length by 8 (or some number sent to the patcher) to get a segment length.

The segment length is used in [p loopShuffle]. The [random] object outputs a number that gets multiplied by the segment length and offset by the loop start time. Each time a random number is output, playback jumps to a new start time in the loop. The loop segment length also sets the speed of the [metro].
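The segment math can be sketched in Python (the loop points and division count are made up; like Max's [random n], randrange outputs 0 to n-1):

```python
import random

loop_start, loop_end = 500.0, 2500.0  # hypothetical selection, in ms
divisions = 8                         # as sent to [p loopSegment]
segment = (loop_end - loop_start) / divisions  # also the [metro] interval

random.seed(1)  # deterministic here, just for illustration
jump = loop_start + random.randrange(divisions) * segment

# every jump lands on a segment boundary inside the loop
assert loop_start <= jump < loop_end
assert (jump - loop_start) % segment == 0
assert segment == 250.0
```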

You have to turn on shuffling in the parent patcher, and you can also turn on pitch shifting in half steps and set the half step transposition range so that each segment is transposed by a new amount.



(max) mapping midi to groove~ controls

example patcher

mapping continuous controllers to groove~ parameters

The example patcher shares the vast bulk of its functionality with the [waveform~] examples in a previous post. The only differences are two subpatchers, [p midiControls] and [p kbControl], located in the blue border in the upper right portion of the patcher.

[p midiControls] has CCs assigned to control the playback ratio (pbRatio), half steps converted to cents (centShift), and loop start and end points (loopStartDisp and loopEndDisp). pbRatio and centShift are the result of simple scaling operations. For the loop start and end, the [scale] objects must know the size of the file in the [buffer~] in order to know the full range of possible time values. The [buffer~] size is reported whenever a new file is loaded. When a file finishes loading, [buffer~] sends a bang out its right outlet. That bang causes the [info~] object to report information about the file, including its length in milliseconds. That length is sent to the [scale] objects in [p midiControls].

It’s important to note that mapping a CC to a large buffer size will not give you very fine control over your loop points.
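A sketch of the mapping (and of the resolution problem) in Python, with a hypothetical 4-second file and a [scale 0 127 0. length]-style linear mapping:

```python
def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linear mapping, like Max's [scale] object with float outputs."""
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

file_length_ms = 4000.0  # hypothetical length reported by [info~]

assert scale(0, 0, 127, 0.0, file_length_ms) == 0.0
assert scale(127, 0, 127, 0.0, file_length_ms) == 4000.0

# a 7-bit CC has only 128 positions, so each step covers length/127 ms
step_ms = file_length_ms / 127
assert step_ms > 31  # over 31 ms per step here; coarser for longer buffers
```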

controlling playback pitch/ratio

[p kbControl] shows two ways to handle pitch shifting with a MIDI keyboard, sending back either pitch shift as a ratio, or pitch shift as half steps (for shifting by cents in this patcher). Both methods assume a unity key (middle C, 60, unless a different key is chosen). The ratio method subtracts the unity key from the MIDI key pressed and sends that to an [expr] object. The [expr] object allows you to use a C-like math expression. In this case we use pow, which takes two arguments (x, y) and returns x to the power of y. Our expression uses 2 to the y/12 power, which yields equal-tempered pitch ratios. Negative y values (half steps down from unity) produce ratios less than 1., for lower-than-unity pitch playback. This ratio method is the only one that works if the timestretch attribute is off.

The other method only works with timestretch on, because it uses the pitchshiftcent method. With this method, you can take the half-step distance from unity, multiply it by 100. (in the parent patcher), and send via pitchshiftcent to [groove~].
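Both mappings are easy to check numerically. A sketch, assuming a unity key of middle C (60):

```python
unity_key = 60  # middle C

def ratio_for(key):
    """Ratio method: 2 to the (halfsteps/12) power; works with timestretch off."""
    return 2.0 ** ((key - unity_key) / 12.0)

def cents_for(key):
    """pitchshiftcent method: half steps times 100; requires timestretch on."""
    return (key - unity_key) * 100

assert ratio_for(72) == 2.0   # octave up doubles the playback ratio
assert ratio_for(48) == 0.5   # octave down halves it
assert ratio_for(60) == 1.0   # the unity key plays at unity
assert cents_for(67) == 700   # a fifth up is 700 cents
```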


(max) record and play

example patcher

recording to a buffer~

Recording to a [buffer~] object is pretty straightforward. You need an audio input (I’m using [ezadc~] in the example) connected to a [record~] object. A non-zero number to [record~] starts recording; zero turns recording off. A [toggle] object provides easy control, but for remote control purposes it might make more sense to use something like a sustain pedal to start and stop recording ([ctlin 64]). It’s handy to have a signal level meter to make sure your input audio is at an appropriate level (such as [meter~] or [levelmeter~]). You need to make sure your [buffer~] object has enough space (time specified in ms) to record whatever you might need. I err on the large side when specifying buffers for recording.

playing back, with looping immediately after recording

If you want to emulate the functionality of a looping pedal, you can do that as well; you just need to keep track of time. A looping pedal is a device that allows you to record audio for a specified or user-controlled period of time and then immediately plays the recorded material back as a loop.

You can use [timer] to keep track of the actual length of your recorded audio segment. A bang in the left inlet of [timer] clears and starts the interval to be timed. A bang in the right inlet causes the current time to be reported. The example patcher has a [select 0] object connected to the output of the [toggle]. I’m looking for “off” and anything else, rather than specific “on” and “off” messages, because of the different number values that a toggle can report. If I click on the toggle, it outputs a 1. If the toggle is turned on by some other message, such as a sustain pedal on message, it receives and outputs something different from 1. Most sustain pedals send a value of 127 when pressed, and 0 when released. I can’t always predict what will turn on the toggle, but I can be sure of what will turn it off. So I look for off messages as 0, and consider anything else from the toggle to be an on message.
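That [select 0] logic reduces to a tiny function; a sketch:

```python
def on_off(value):
    """Treat 0 as 'off' and anything else as 'on', like [select 0]."""
    return "off" if value == 0 else "on"

assert on_off(0) == "off"    # toggle turned off, however that happened
assert on_off(1) == "on"     # toggle clicked by hand
assert on_off(127) == "on"   # sustain pedal pressed (typical CC value)
```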

The [timer] reports the time of recording when recording stops, and it sends that information to [waveform~] as the selection times. Selections in [waveform~] are then sent as loop points to [groove~].

The full control process for recording on/off and timing:

  • start recording with either the toggle or the sustain pedal.
  • starting a recording sends a message to clear the [buffer~] and start the [timer].
  • stopping recording triggers the following events/messages:
    • sends recordOff message to [timer]
    • sends recordOff message to set [groove~] playback ratio/speed to 1.
    • sends recordOff message to set [groove~] location to 0. (to start at the beginning of the loop)
    • [timer] outputs the duration of the recording to [t f b].
    • [t f b] sends a bang to set the start of the selection in [waveform~] to 0., and then sends a float to [waveform~] as the selection end time.
    • [waveform~] sends its selection start and end times to [groove~] as the loop start and end points.

(max) graphic audio display, waveform~

example patchers


The [waveform~] object is a graphic editor for buffered audio. The easiest way to make use of [waveform~] in a patcher is to copy and paste from the help file. Copying from the help patcher gives you the pict slider (with tool picts) and the subpatcher that changes the tool for [waveform~]. Unlock the patcher to see all of the connections. A short list of things to pay attention to:

  • the [waveform~] object has to be linked to a named buffer. The loadbang to the message accomplishes this link. The message is formatted as name buffername channel# (of buffer).
  • The four tools on the left are part of a pictslider object. The pictslider lets you load pictures that serve as the background for a selectable slider. The index of the tool selection is sent to the subpatcher [wfkeys]. The tools are: select, loop, move/zoom, and draw.
  • The move tool lets you change the zoom of the display by clicking and dragging up or down.
  • You can select loop sections by entering time messages in the number boxes above the waveform, or by using the select tool in the waveform.
  • The [set $1] messages update the number boxes above the waveform when you select in the waveform, without creating a feedback loop.
  • Start and End points of the loop selection are sent to the groove~ object as start and end points for looping.

stereo waveform~ view

You can link two [waveform~] objects to view and edit stereo audio files. Each [waveform~] has to be linked to a channel. Tool changes are sent to both. To set it so that selecting a portion of one channel selects both channels, you need to connect the link outlet of each [waveform~] to the link inlet of each [waveform~]. Link connections are on the far right. Otherwise, the operation is the same as when working in mono.


(max) playing audio from memory, buffer~, groove~

example video and patcher


To play sound from memory (RAM) instead of the hard drive, you have to create a space for it in memory. You use the [buffer~] object for that purpose, defining a name for the buffer, length, and number of channels. Other objects that access the buffer~ for playback or other manipulations will use the buffer name to make a logical connection.


[groove~] allows for ratio-based playback of audio from a buffer~, with looping and continuously variable speed control. Some things to keep in mind:

  • playback ratio has to be sent as a signal to [groove~]. The usual method is either a float sent to [sig~], or a [number~] in signal output mode. Unity playback is 1. You can also reverse audio with negative ratios.
  • The default mode is that playback speed (ratio) affects pitch.
  • if no loop points are sent to groove~, the entire buffer will be looped.
  • a float sent to the left inlet sets the time of the playback head. You have to set the playback head before initial playback, as well as after reaching the end of the buffer if looping is not enabled.

new features in version 7

Like playback from disk, version 7 of Max allows you to change playback speed independently of pitch, and vice versa. All of these messages/attributes work the same as they do with [sfplay~].


(max) soundfile playback with playlist~

example patcher


playlist~ is one of the new media players available in Max 7. You can load one or more soundfiles for playback, complete with play and loop controls, graphic waveform display, and the ability to select portions of the file for playback and/or looping.

You can insert a playlist~ object into a patcher simply by dragging a soundfile from the media bar (on the left side of the patcher) or the Finder into blank space in the patcher. Dragging a single file into the patcher creates a playlist~ with that file. Dragging multiple files creates a single playlist~ with all the files loaded.

You can drag and insert more files into a playlist~, remove files, and rearrange the order of files. When you drag in a new file, pay attention to the red insertion/replace indicator. If you see a red line, the file will insert between clips at that point. If you see a red rectangle, then the new file will replace the enclosed clip in the playlist~.

controlling playlist~

You can control and modify soundfile playback in playlist~ in all the same ways you can with [sfplay~]. However, since you have to target specific clips with control messages, you cannot use [attrui]. You have to use messages that specify the parameter being changed, the clip number it applies to, and the parameter value. The example patcher shows a number of these controls. Clip numbering starts with 1.

Sending a number to playlist~ causes that clip to start playing. Only one clip can play at a time, so sending another number while any clip is playing will stop the current clip and start the indicated clip. Sending a 0 stops any clip from playing.

Playback notifications likewise include clip-specific information. While [sfplay~] outputs a bang when a soundfile stops playback, playlist~ sends start and done messages. The format is start (or done) clip# filename. You could create random or algorithmic controls for cueing clips based on which clip has finished playing. You wouldn’t need to process the filename in any way. Instead, you can route messages using start or done and process according to the clip number.


(max) sfplay~ > soundfile playback

example patchers


[sfplay~] plays soundfiles from the hard disk. You can define cue regions, loops, adjust speed of playback, and with version 7, independently stretch/compress time and shift pitch.

sfplay1.maxpat shows some basic controls. The patcher is well-commented, but highlights include the open message to open the standard Mac/Windows file open dialog for file selection. A toggle can be used to play/stop a file, but the toggle doesn’t update itself when a file stops playing by reaching its end. [sfplay~] sends a bang when it stops playing, either by reaching the end of the file, or by being stopped. Using the gate, you can control the sending of the bang back to the toggle, so that it only happens when the file reaches its end.


[dropfile] creates a rectangle that you can use as a target to drag and drop files from the desktop into a Max patcher. The type of file dropped is output via the right outlet. The full path name of the file is sent out the left outlet, and with the [prepend open] object it can be sent to the [sfplay~] object to load that file.


The [playbar] is a QuickTime-style control bar. You can connect it to [sfplay~] to control playback and looping via a familiar tool.

controlling sfplay~

sfplay2.maxpat shows a number of ways that you can control playback parameters of [sfplay~]. You can change most parameters via messages (demonstrated with loop and speed) or with attribute inspectors [attrui] (demonstrated with timestretch, speed, pitchshift, and loop).

There are a few reasons why you would pick one or the other. Messages can receive input from other objects in a patcher, so they can be controlled by changing conditions in the patcher. [attrui] objects come with a label and UI control all in one, so they are more compact in a patcher and easier to create. However, [attrui] objects must be connected directly to the object they control through a patch cord; they cannot use send/receive objects. An [attrui] object attached to a send object becomes an attribute inspector for that send object.