max max-lecturenotes

(max) incrementalism – developing an arpeggiator


Applied to coding, incrementalism is more a philosophy than a tangible thing. It’s the idea that you break a large project into small steps rather than trying to do too much at once. Incrementalism encourages you to develop a small part of the project, test it, and then expand it. We’ve been using incrementalism in the development of an arpeggiator patcher. First we got a repeated note to play, then we made it so that playing a MIDI key didn’t upset the rhythm of what was playing, and so on. It’s good for us to be aware of process.



The first improvement to make is to be able to start and stop the repeating note with the MIDI keyboard, with a key press sending pitch to the repeater and turning the metro on, and a key release turning the metro off.

The <toggle> object has an inlet, which can be used to turn the toggle state on or off. Any non-zero number turns the <toggle> on; a zero turns the <toggle> off.

Key velocity from <notein> can be used for <toggle> control. You need to take the key velocity directly from <notein>, not from <stripnote>, since zero velocities never make it through <stripnote>.
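The on/off logic can be sketched outside of Max. Here is a minimal Python model (an illustration, not Max code) of how nonzero and zero velocities flip the toggle state:

```python
# Minimal model of <toggle> control from <notein> velocity (not Max code).
# Any nonzero velocity turns the toggle (and thus the metro) on;
# a zero velocity (key release) turns it off.
class Toggle:
    def __init__(self):
        self.on = False

    def receive(self, velocity):
        self.on = velocity != 0

toggle = Toggle()
toggle.receive(100)   # key press, velocity 100: metro on
toggle.receive(0)     # key release, velocity 0: metro off
```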

other pitches


An arpeggiator that only repeats one note is really an unfulfilled arpeggiator. To play other pitches we need to think about how to generate intervals in half steps that can be added to the base pitch. The simplest way to do this is with the <random> object. <random> takes an argument for how many different numbers to choose from. If the argument is 4 then the range of possible values will be 4. The only confusion is that computers start counting at zero. So a range of 4 equals 0 – 3 (0, 1, 2, 3). The maximum number will always be the range – 1.

It can be confusing, so I’ll say it again:

computers start counting at zero
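Python’s random module counts the same way, so it can serve as a quick sanity check (an illustration, not Max code):

```python
import random

# randrange(4), like <random> with an argument of 4, chooses from
# 4 possible values: 0, 1, 2, 3. The maximum is always the range - 1.
values = [random.randrange(4) for _ in range(1000)]
print(min(values), max(values))   # always stays within 0 and 3
```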

mapping values


Just playing random intervals above a base pitch gets old quickly. A better way would be to play specified intervals above a base pitch. To do that, we need an <itable> (or <table>) object. An <itable> is an (x,y) graph that stores values (y) at locations (x). To retrieve a value, send a location in the form of a number to the <itable>. Taking one set of values and using it to select from another set of values is called mapping.

You can set the size (# of x values) and range (# of y values) for an <itable>. I highly recommend that you reduce both properties in the <itable> inspector so that you have a useful/visible space to edit. You can change values in the <itable> by clicking or click-dragging within the <itable> in a locked patcher. My example patchers use a pentatonic scale.

The TableRandom version of the patcher uses the <random> object to generate numbers that are read as index locations by <itable>. The TableCount version uses a <counter> that will always play the table in order. By looking for 0 velocities (noteoff messages), we can reset the <counter> so that the next time a note is played it will start at the beginning of the pattern. The <select> object looks for 0. When a 0 is found, a bang is sent out the left outlet of <select>. Routing the bang to the middle inlet of the <counter> tells counter to reset to the minimum value and output that value on the next count. You could also reset the <counter> and immediately output a value (but then you would hear a note when releasing the key), and you can reset the <counter> to any value within the counting range.
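The logic of both versions can be sketched in Python (an illustration, not Max code). The interval table here is an assumption: a major pentatonic scale in half steps above the base pitch, standing in for whatever values you store in the <itable>.

```python
import random

# Assumed interval table: major pentatonic, in half steps above the base pitch.
table = [0, 2, 4, 7, 9]

def table_random(base_pitch):
    # TableRandom logic: a random index location reads a value from the table
    return base_pitch + table[random.randrange(len(table))]

class TableCount:
    # TableCount logic: a counter steps through the table in order;
    # reset() models banging the counter's reset inlet on a 0 velocity (noteoff)
    def __init__(self):
        self.index = 0

    def next_pitch(self, base_pitch):
        pitch = base_pitch + table[self.index]
        self.index = (self.index + 1) % len(table)
        return pitch

    def reset(self):
        self.index = 0

counter = TableCount()
print([counter.next_pitch(60) for _ in range(5)])   # [60, 62, 64, 67, 69]
counter.reset()   # releasing the key restarts the pattern at the beginning
```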


(max) object inventory, tutorials

Current students should be working their way through the first seven Max tutorials (“Hello” through “Numerical User Interfaces”).

objects used so far

  • <notein>
  • <noteout>
  • <makenote>
  • <number> (number box, integer)
  • <message> with “set” as a special message
  • <stripnote>
  • <button> (bang!)
  • <toggle>
  • <metro>
  • <comment>

objects to check out (we’ll use them soon)

  • <random>
  • <pgmout>
  • <table> and <itable>
  • <counter>
  • <slider>
  • <dial>
  • <%> (Modulo operator)
  • <select>
  • <key>

(max) beginnings of an arpeggiator patcher

We’re starting our arpeggiator patcher, learning some basics of how Max works, and how to translate what we want to do into Max code.


Receive a midi note input from a MIDI keyboard to set a pitch, and play that pitch at a regular, repeating time interval.


We can translate parts of that common language description of the task into Max objects.

  • Receive a MIDI note input from a MIDI keyboard: <notein>
  • regular, repeating time interval: <metro> (which usually includes a <toggle> to start and stop)
  • play that pitch: play a note, <makenote>



This first patcher attempt, as noted in the filename, does a lot of the task but still presents a few issues. MIDI pitch from a <notein> object is sent to a <makenote> object, which is played at a repeating time interval by way of a <metro> object.

The patcher also does some things we didn’t ask it to do. It plays a note whenever I press a MIDI key, and whenever I release a MIDI key. It does that because MIDI pitch is sent directly to <makenote>, and MIDI pitch is generated every time I press a MIDI key (a noteon message), and every time I release a MIDI key (a noteoff message).



So what is needed to prevent MIDI keyboard presses from being directly echoed out of Max? The second patcher implements two additional features. The first addition is a <stripnote> object. <stripnote> looks at all noteon messages and only passes the pitch through if the velocity is above zero. It prevents noteoff messages from passing through. Remember that you have to connect both pitch and velocity from a <notein> object to <stripnote>. 

The second addition is a set message. Using a <message> object, the keyword “set” along with the variable $1 tells the subsequent <number> object to set its value to the input without sending it as output. The $1 is a variable placeholder argument. It means, “use the first item in a list.” We aren’t sending a list to the <message> object, but a single message can be considered the first item in a list.

So now, only pitch data from noteon messages are being sent, and when they are sent, the <number> object updates its data but does not pass the data on to <makenote>. The bang message from the <metro> tells the <number> object to output its current value.
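The store-then-output behavior can be modeled outside of Max. Here is a small Python sketch (not Max code) of a <number> box that updates on “set $1” but only outputs on a bang:

```python
# Python sketch (not Max code) of the <number> box behavior:
# "set $1" stores a value without output; a metro bang outputs it.
class NumberBox:
    def __init__(self):
        self.value = 0
        self.sent = []           # stands in for the patch cord to <makenote>

    def set(self, v):            # the [set $1] message: update silently
        self.value = v

    def bang(self):              # the <metro> bang: output the stored value
        self.sent.append(self.value)

box = NumberBox()
box.set(60)       # pressing a MIDI key updates the pitch...
print(box.sent)   # ...but nothing has been sent yet: []
box.bang()        # the metro bang sends the stored pitch
print(box.sent)   # [60]
```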

playing multiple notes at a time


This final patcher shows that we do not need multiple <makenote> objects to play multiple notes at the same time. We can simply send two pitches to <makenote> and it will play them both. The pitch from the <number> object is sent both to <makenote> and to an addition object <+>, where an interval is added to it and the result sent to <makenote>. You can change the interval by typing a new number into the <number> object that is connected to the right inlet of <+>.

Intro to Digital Music

(must115) digital audio data

This post is both a summary and an expansion of topics in chapter 6 of An Introduction to Music Technology, by Hosken.

comparing analog and digital signals

To understand the difference between analog and digital signals, start with the idea that analog signals are continuous fluctuations, while digital signals are made up of discrete values. Hosken illustrates the difference as that between a light dimmer switch and a regular light switch. The dimmer is continuously variable between a minimum and maximum value; the switch is either on or off. The dimmer represents an analog system, and the switch a digital system.

digital recording: sampling and quantizing

To make a digital recording, you sample a fluctuating signal at regular intervals and store those samples as a series of numbers. How often you take a sample is called the sampling rate, and it is measured as a frequency: samples per second, in Hertz. Assigning a numerical value to each sample involves quantizing: fitting the measured value to the nearest available discrete number. How many values a digital system can represent between minimum and maximum is the sample resolution.

For CD-quality audio, the sampling rate is 44,100 Hz (44.1 kHz), and the sample resolution is 16 bits.

sampling rate and the Nyquist Frequency

To get an accurate representation of the frequency of a signal, you must sample the signal at least twice per cycle to capture both the positive fluctuation and the negative fluctuation. If you do not sample at least twice per cycle, the resulting signal heard on playback will be aliased. An aliased signal is heard at a different frequency than the original signal. For accurate recordings, you need a sampling rate at least twice the highest frequency you want to capture. Another way of looking at this situation is that any frequency above 1/2 of the sampling rate will be aliased.

We refer to 1/2 the sampling rate as the Nyquist Frequency. For CD-quality audio, that frequency is 44.1 kHz/2 = 22.05 kHz. Since normal human hearing goes up to 20 kHz, this frequency is generally adequate for digital recording. You can predict the aliased frequency of a signal above the Nyquist by finding the difference between the original frequency and the Nyquist Frequency, and then subtracting that difference from the Nyquist Frequency.

Original frequency – Nyquist = frequency difference
Nyquist – frequency difference = recorded/heard frequency

For example:

SR = 10,000
NF = 10,000/2 = 5000
Original frequency = 7000
7000 – 5000 = 2000
5000 – 2000 = 3000 Hz
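The two-step prediction translates into a small Python function. One caveat: this simple formula only covers frequencies between the Nyquist Frequency and the sampling rate.

```python
def aliased_frequency(freq, sample_rate):
    """Predict the heard frequency for a tone between Nyquist and the SR."""
    nyquist = sample_rate / 2
    if freq <= nyquist:
        return freq               # at or below Nyquist: recorded accurately
    difference = freq - nyquist   # original frequency - Nyquist
    return nyquist - difference   # Nyquist - frequency difference

print(aliased_frequency(7000, 10000))   # the worked example: 3000.0
```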

Aliasing only applies to frequencies above the Nyquist Frequency, 1/2 of the sampling rate. To avoid aliasing in digital recordings, a lowpass filter is inserted before the ADC to remove all frequencies above the Nyquist from the signal. A related filtering step happens on playback. Since the recorded signal is a discrete series of values, the resulting waveform is stair-stepped, or jagged; if played back in that form, it would introduce many high frequencies into the signal that weren’t recorded. Therefore, a lowpass filter (sometimes called a smoothing filter) is applied to the signal after the DAC on playback to remove frequencies above the Nyquist, since they were not part of the original recording.

quantizing and sample resolution

Each sample is stored as a discrete number. Any incoming amplitude value that falls between available values in a digital system has to be rounded or truncated to match an available value. The easiest way to think about this process is to compare floating point numbers (numbers with fractional components) and integer numbers. The value 37.3 would be represented as 37 if I could only use integers. 37.8 might be represented as 38 or 37, depending on whether you rounded the number (changed it to the nearest integer) or truncated it (removed the fractional value). In any case, the process of changing the incoming signal to a discrete value is called quantizing. In a CD-quality system, you have 16 bits of sample resolution, leading to 65,536 values. 24 bits provides 16,777,216 values.
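The rounding/truncating distinction, and the value counts for 16- and 24-bit audio, check out directly in Python:

```python
# Rounding vs. truncating the 37.8 example from above.
value = 37.8
print(round(value))   # rounded to the nearest integer: 38
print(int(value))     # truncated (fractional part removed): 37

# Sample resolution: a bit depth of n gives 2^n discrete values.
print(2 ** 16)        # 65,536 values for 16-bit audio
print(2 ** 24)        # 16,777,216 values for 24-bit audio
```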

The difference between the actual value and the quantized value will be heard as audio noise. Literally, the digital value is a mistake, and mistakes equal noise in the system. How much noise, or how much error, is called the signal-to-error or signal-to-noise ratio, and signifies the dynamic range of the system. More on that below.

binary numbers

Computers represent values as binary numbers, made up of binary digits – bits. A bit is either 0 or 1 (two choices, hence binary). Each bit added to your system doubles the range of possible values you can store. 1 bit = 2 choices (0 or 1). 2 bits = 4 choices (00, 01, 10, 11). 3 bits = 8 choices (000, 001, 010, 011, 100, 101, 110, 111).

Another way to think of binary numbers is that they represent a base 2 counting system. We normally use base 10 counting. Each place in base 10 represents a digit times a power of 10. For example:

125 in base 10 equals
5 x 10-to-the-zero-power (1) = 5
2 x 10-to-the-first-power (10) = 20
1 x 10-to-the-second-power (100) = 100

Base 2 counting is similar, with each place representing a power of 2.

101 in base 2 equals
1 x 2-to-the-zero-power (1) = 1
0 x 2-to-the-first-power (2) = 0
1 x 2-to-the-second-power (4) = 4

101 in base 2 (binary) equals 5 in base 10 counting.
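Python’s built-in base conversions confirm the arithmetic:

```python
# int(string, 2) interprets a string as a base 2 number;
# bin() goes the other direction.
print(int("101", 2))      # 1x4 + 0x2 + 1x1 = 5
print(bin(5))             # 0b101
print(int("1111101", 2))  # the base 10 example, 125, written in binary
```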

8 bits = 1 byte, which, combined with a base 2 counting system, impacts how we talk about file size. Since a computer represents data in binary, a kilobyte (KB) is not 1000 bytes. A KB equals 1024 bytes. A megabyte (MB) equals 1024 kilobytes, and so on (gigabyte, terabyte…).

more on signal-to-error ratio (signal-to-noise)

Since each added bit in a digital system provides twice the available values, each added bit provides 6 dB to the signal-to-noise ratio. The signal-to-noise ratio represents the range between the highest amplitude that can be recorded and the noise floor that results from quantization error. In a 16-bit system, 16×6 = 96 dB signal-to-noise ratio. In a 24-bit system, 24 x 6 = 144 dB signal-to-noise ratio.
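The 6 dB figure is an approximation: each doubling of values adds 20 × log10(2) ≈ 6.02 dB, which a short calculation confirms.

```python
import math

# Each added bit doubles the number of values; in decibels that is
# 20 * log10(2) ~= 6.02 dB, the "6 dB per bit" rule of thumb.
def snr_db(bits):
    return 20 * math.log10(2 ** bits)

print(round(snr_db(16), 1))   # ~96.3 dB for 16-bit audio
print(round(snr_db(24), 1))   # ~144.5 dB for 24-bit audio
```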

high resolution digital recording and file size

You can easily ascertain the file size of a digital recording of a certain length by multiplying the Sampling Rate x Sample Resolution x Length (seconds) x Channels (number). For example, 2 seconds of stereo CD audio:

44,100 x 16 x 2 x 2 = 2,822,400 bits
2,822,400 bits / 8 (1 byte) = 352,800 bytes / 1024 (1 KB) = 344.53 KB

Using the formula, 1 minute of stereo CD audio equals approximately 10 MB.
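The formula translates directly into a small Python function (a sketch for uncompressed PCM audio; compressed formats are a different story):

```python
def file_size_bytes(sample_rate, bit_depth, seconds, channels):
    """Uncompressed size: SR x bits x seconds x channels, converted to bytes."""
    return sample_rate * bit_depth * seconds * channels // 8

# The 2-second stereo CD example from above:
size = file_size_bytes(44_100, 16, 2, 2)
print(size)                     # 352800 bytes
print(round(size / 1024, 2))    # 344.53 KB

# And roughly 10 MB for 1 minute of stereo CD audio:
minute = file_size_bytes(44_100, 16, 60, 2)
print(round(minute / 1024 / 1024, 2))
```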

Doubling the sampling rate would double the file size, but knowing what we know about pitch-to-frequency relationships, it would add only 1 octave to the available pitch range of the recording. (Not much!) However, adding 8 more bits of sample resolution, taking the bit depth to 24, only adds 50% more data to the file but multiplies the amplitude resolution by 2-to-the-eighth power. That additional resolution is 256 times better than 16-bit audio.

The key point here is that better bit resolution leads to a much more noticeable improvement in digital recording quality than increasing the sampling rate.


(must115) setting up a simple recording in pro tools

To complete the first project, you will need to record yourself speaking a poem. Doing so will require at least one track of audio and performing some simple setup steps.

external hardware – audio interface

We use Native Instruments Komplete Audio 6 for audio input and output in the lab. It’s a USB-powered interface with two mic inputs on the front.

If the interface is plugged in and the computer is on, you will see a green light on the top right of the interface by the label “USB.” The top of the interface also has a large MAIN VOLUME 1/2 knob, which controls overall audio output gain for listening.

The front of the interface has two microphone inputs with individual gain controls, and a headphone jack with its own volume control. The Komplete supports the use of two separate headphone mixes out the same physical jack, selectable with a button next to the jack. You can see which headphone mix is selected on the top of the interface.

connecting the microphone 

The lab has three Røde NT1 microphones for use on the project. Each has a shock mount and pop filter. You should use both accessories. I’ll go over the setup in class.

The Røde NT1 is a condenser microphone, which means it requires power to operate. The Komplete supplies that power in the form of 48 volts – usually called phantom power. The 48V power button is on the back of the interface, just above the USB connector. When on, there is a 48V indicator next to the USB power indicator on top of the interface.


When connecting the microphone to the interface, phantom power (48V) must be turned off.

While you might not always damage a microphone when connecting with the phantom power on, you CAN damage a microphone that way. And we don’t want to damage microphones.

setting up pro tools to receive audio input

To record in Pro Tools, you need to have a record-enabled track, and that track has to have the proper input setting. If you haven’t done so already, create a mono audio track to record on. Set the input as shown in the picture (top menu).

You can then record-enable the track. If you have set your input properly and record-enabled the track, you should be able to hear audio from the mic. You can adjust the input gain for the microphone on the interface to get a good audio input level. Make sure that the line/inst button below the gain is set to line.


With a record-enabled track you can hear audio input (monitor it). To record, you need to follow a two-step process. First, click on the record button in the transport (red circle/dot). It will start to flash. Then you need to click on the play button (green triangle) to actually start the transport and the recording of audio input.

volume versus gain

Sometimes controls on the interface will be listed as “gain” controls, and other times listed as “volume” controls. In the most basic of terms, volume is gain. However, the Komplete interface uses gain for controlling input audio levels, and volume for controlling output audio levels.



(must115) setting up pro tools 12

pro tools and DAWs

Pro Tools is an example of a Digital Audio Workstation (DAW). Pro Tools relies (mostly) on non-destructive processing and mixing. The program allows for multiple sounds to be used at once by reading from the multiple sound files, applying gain changes as indicated by mix commands, and applying processing through plugins. When you have completed a mix (or at any stage along the way), you “bounce” your project to stereo, which mixes and applies all processing to the individual tracks.

Since a DAW project has a more complex organization of files than a stereo audio editor, and a more complex set of preferences and setup, it is important to understand setup and file organization for a DAW project.

The following tips and screen shots are from Pro Tools 12.

first step – launch screen

By default, Pro Tools opens a launch window on startup. It allows you to create a new file or open a recent one. For your first project, you are creating a new file. With creating a new file comes the need to set up options properly, name your project, and store it in a place you can find later.

Our first project specifies the following options: BWF .WAV file type, 24-bit bit depth, 48kHz sampling rate and interleaved audio files. Take a look at the marked up screen for guidance.


Name your file according to the instructions (at the top of the window). Be sure to select “Prompt for location” for saving your project. Pro Tools will create a folder for your project with the name given. Store that folder on the Desktop to help you find it more easily (for backing up when your work session is complete).


Once you have finished creating a project with the proper options, open File | Preferences…, and select the Processing section.


Most of these preferences apply to projects with imported audio, but we will be creating our audio within the program. Still, it’s good to get in the habit of setting these preferences. You may end up using Elastic Audio, so those settings matter for this project.

setup options

Once you’ve set your preferences, you should check to make sure Pro Tools is using the correct hardware and input/output (I/O) setup. In the lab, the settings should be correct for you, but people always seem to make changes.

From the Setup menu, choose Hardware…. The Komplete Audio 6 should be the hardware selected for use.

From the Setup menu, choose Playback Engine…. The top choice should match your selection in the Hardware setup. I prefer to not ignore errors during playback/record, as I don’t want errors to be recorded as part of my audio files, or mixed into my final project.



From the Setup menu, choose I/O Setup…. It should look like the window below. Note that built-in output is the Komplete Audio 6 – not the built-in audio output of the computer.




Pro Tools has two main windows: the edit window and the mix window. You may also see a floating transport window. The edit window has your tracks (nothing showing at first) showing horizontally like an audio editor, with waveform representations for each track. The edit window also has a Clips pane to the right that shows the audio files that have been imported into the project and any clips made from those files.

You can close the transport window and incorporate the transport tools into the top of the edit window by clicking the top-right triangle button and choosing transport from the drop-down menu.

The Mix window shows a virtual mixer, with individual channel strips for each track. You can switch back and forth between the edit and mix windows with CMD-= (command equal).

All of the audio files you record will show up in the Clips pane of the edit window. As you edit those files, you will notice that additional clips are created as sub-units of the parent audio file. We’ll talk more about clips and editing later.


(must115) tell me who you are

To help me learn who you are, I want each of you to fill out an “electronic note card.”


  • Preferred name (no DJ names yet)
  • Year in school (incl. transfer, non-traditional, post-baccalaureate)
  • Instrument
  • Home town
  • Fun fact about yourself
  • photo (attachment)

Due Wednesday, August 26


(must115) opening day welcome

faculty staff in the program

  • Dr. Jason Bolte, Director of Music Technology, Assistant Professor, Academic Advisor
  • Dr. Linda Antas, Assistant Professor of Music Technology, Academic Advisor
  • Dr. Keith Kothman, Director of the School of Music, Professor of Music Technology
  • Josh McRae, Computer Support Specialist

other staff

  • Kim Eggemeyer, Administrative Office. Right now Kim is the only staff member we have in the office. Be nice to her!

registration for required courses

  • MUSI 195 or MUSI 160/260 (Applied Lessons or Guitar Class/Lessons)
  • MUSI 105, 135, 140 (Theory 1, Keyboard Skills 1, Aural Perception 1) 
  • MUSI 103RA Section 3
  • Scholarship students must be in a large ensemble (recommended for all)
  • 6 – 9 University Core classes recommended

concert attendance/seminar

  • You must attend 120 concerts to graduate (15 per semester).
  • You must swipe in before the concert starts AND after it ends to receive credit.
  • If the card swipe isn’t working, you can manually log in with your GID at the computers in the lobby of Reynolds.
  • Even if the system says there is no concert, swipe your ID or login before and after to receive credit.
  • Non-MSU School of Music concerts must be approved by your advisor and you must bring a program.
  • Music Major Seminar meets Thursday, 11:00 – 11:50 in Reynolds. These seminars are required and count towards your 120 concerts.

grade requirements

  • A C or better is required for any MUSI, MUST, MUSE, EELE 217, or FILM 259 course to count.
  • A C- can count for any core or non-music class.
  • A D only counts towards your 10 free electives.


syllabus

It’s in D2L, and also here: MUST115Syllabus2015sec3


  • D2L (Desire 2 Learn) is your academic portal.
  • Assignments and grades will be posted and entered on D2L.
  • Login to D2L with your NetID.

recording assistance

  • All MUST students will assist in the recording of two concerts per semester.
  • Your recording assistance will be part of your MUST 115 grade.
  • If you are late or miss your assigned concerts your grade will suffer.

multimedia series concerts

  • Your attendance is required.
  • You can also use them for general concert credit.
  • Dates TBA

howard 127 lab

  • 127 is only available to students taking Music Technology classes.
  • The lab has all the software and hardware you need.
  • Staff (and faculty if staff unavailable) can let you in.
  • Open lab hours will be posted by the second week of class.




(must625) links: synthesis, retro synth, listening

It’s a grab bag of a post, with links to content rather than new posts.

A (very) brief introduction to synthesis: synthesis

How to find your saved Retro Synth instruments: saving-retro-synth-programs-and-finding-them-later

A few “concert music” synthesis selections:

Some electronica/ambient/idm synthesis selections: midi-and-synthesis-listening


(must625) project 2: musique concrète, midi, and virtual instruments

Due Thursday, June 4, at beginning of class.


Compose a short work (75 – 90 seconds, 1’15 – 1’30) utilizing musique concrète techniques. You will process audio as before in Audacity and Audition, and assemble in Logic Pro X, with the addition of using a virtual sampler (EXS24) to further process audio samples. You are encouraged to build upon your first project.

Form again does not have to be much of a concern. Gestural or continual variation should dominate your project, whereby anything material that is presented can be subject to immediate variation. Your use of the EXS24 sampler should not be as an organ/keyboard, but as an additional way of expanding your developmental possibilities. Your EXS24 instruments must make use of interesting modulation techniques (envelopes and LFOs) and have external MIDI CC realtime controls to allow for interesting development of the sound over time.


  • Duration: 1’15 – 1’30 (10)
  • The inclusion of at least three virtual instruments using the EXS24. (10)
  • The project must rely on gesture as a primary component of the work. (10)

Additional factors you will be graded on

  • Creativity: are your edited sounds interesting, and used in interesting ways in the project? Do your EXS24 instruments make use of interesting modulation and realtime controls? (40)
  • Quality of edits and finished audio: you should not have audible clicks at beginnings or ends of edited audio, and your audio should not distort (it should not go over the maximum amplitude). (20)
  • Organization of files and following turn-in procedure: is there a finished, mixed audio file? Can I open your project file and play back all the tracks contained? Did you include your original source files and your processed files? (10)