2023 Retrospective

misc

Another year, another look back at another year.

Last year about this time I made some goals. Let’s see how I did on those…

  1. Create my own interpreted language. Nope. Didn’t do anything with that this past year. I’ve done some work around this in the past, but never made anything final. I’d still like to do this someday, but it’s gonna be on hold for a while. Don’t expect I’ll do much with this in 2024.
  2. I want to finally do something with music. I did!!! I got into SuperCollider and did a pretty deep dive. I even published some things on SoundCloud. https://soundcloud.com/bit101. This is by far the most I’ve ever done with music and audio. I wound up taking a break from it towards the end of the year, but I do plan on getting back into it. More on that later.
  3. I’d love to do another side project creating graphics for something. Yes, I did this too. I finished a new design for a wine bottle label for Anarchist Wine recently. It’s not out yet, but most of what you see on that page are designs I did in the past. Sometimes they just find something I posted somewhere and ask if they can use it (and pay for it). Other times it’s a more collaborative effort where I come up with a concept and we iterate on it. The fun ones are data driven. We gather some numbers on a given subject or theme and work them into the design. It’s somewhat like data visualization, but usually way more abstracted. Beyond the one I finished recently, I’m at work on another one as I write this. And there are more planned in the coming year. So that’s fun! But I’d still love to do more projects like this for other companies or brands. So if you need work like this, or know someone who does, hit me up.
  4. Of course, I’ll finish the Coding Curves project. And I did finish. That was really fun and I like how it all worked out. I mentioned that I’d like to maybe publish it as a book at some point. I’d still like to. I had another project in mind, Coding Color, and I put an initial post up about that. But didn’t get too far. I got a bit bogged down in how to do it in a cross-platform/language/api way, since every library seems to treat colors a bit differently. But it’s definitely something I’d like to tackle more at some point. I guess that’s one goal for 2024.

Music

I want to talk a bit more about music, as this was the biggest, most interesting thing I got into in 2023. I learned so much – about music, but also about myself and what I want to be doing. For ages I felt bad that I knew nothing about music and had no abilities in the area. This exploration got me over that feeling. I have created some music now and I have enough tools and knowledge to do more if I want to. But as I got more into it, I started questioning what exactly my goal was with it. I realized that I have no goal at all of becoming an electronic (or any other kind of) musician. I don’t want to create traditional “songs” per se, or “drop tracks and albums”. I don’t want to sell music or in any way support myself through that.

In short, I don’t really want to do music for music’s sake. I loved the hell out of learning what I learned and want to learn more. My initial goal was to be able to make music, or audio soundscapes for animations and videos that I code. I feel like I am kind of at square one of being able to do that now.

So another goal for 2024 is to actually make some longer form (2-3 minutes or longer) generative videos with my own soundtracks.

Another thing I learned in my foray into music was how much I love coding graphics and animations, how good I am at it (yeah, a little brag there), and how comfortable I am sharing that stuff. Coding music is still a bit of a struggle for me. It doesn’t come easily and I’m still really hesitant to share what I make. But I’m getting better at it.

Part of the hesitancy in sharing is that I feel like asking someone to listen to a song is a much bigger ask than asking them to look at an image or a short animation. You post an image and it’s there and they see it and that’s that. Listening to a song is a small investment of time and attention.

One thing I know will kick me into gear musically is this book I pre-ordered that will arrive in January: SuperCollider for the Creative Musician by Eli Fieldsteel. His YouTube channel is amazing and one of the resources I hit up over and over and over when I was learning about SuperCollider (and music/audio concepts in general).

ABC – Always Be Creating

In general, I just want to keep on creating cool and interesting stuff and sharing it. Whether that’s images, animations, tutorials, or music, it’s all good. I’ll be looking at more community-driven projects to participate in, like Genuary. I’m going to shoot for 100% participation in Genuary. I’ll be posting on Mastodon at https://mstdn.social/@bit101.

I’ll be looking for more stuff like this throughout the year, either actual projects like https://7daysofcode.art/ or just hashtag-based projects. Seems like there are more and more things like this popping up, which is great.

SuperCollider 7 – Scales and Degrees

audio, supercollider

Day 7 of 30 Days of SuperCollider

I guess by now it’s obvious that this whole 30 day plan isn’t going well. It’s not that I’m not into SuperCollider anymore. On the contrary, I’m super deep into it, as well as a lot of other related topics – synths, music theory, midi hardware, etc. I’m learning a whole lot and really enjoying it. But writing about it is the last thing on my mind. So I’m sure there will be more posts about SuperCollider, but I think this will be the last numbered one.

Describing the Problem (slowly)

But I did run into something recently that I thought was worth sharing. It’s got to do with scales and degrees and patterns. SC has a rich library of pattern classes/functions that give you powerful ways of composing sounds into … well, patterns. Pbind is often used for creating an overall composition of a single synth, like so:

(
Pbind(
  \freq, Prand([100, 200, 300, 400, 500], inf),
  \dur, 0.5,
).play;
)  

This will play a random note with one of those five frequencies, and choose another one 0.5 beats later (by default 0.5 seconds). And it will do this infinitely. (The audio samples here have been abridged to something less than infinite length to save on storage costs and upload time.)

The problem with this is that those frequency values are nice round numbers, but from a harmonic viewpoint, they are kind of random. It might be better if we could use the frequencies of the tones in, say, the C Major scale. My first attempt at this was brute force, looking up the frequencies and plugging them in.

(
Pbind(
  \freq, Prand([261.626, 293.665, 329.628, 349.228, 391.995, 440, 493.883], inf),
  \dur, 0.5,
).play;
)

Those are the frequency values for middle C through D, E, F, G, A, and B. This sounds better, but there’s no way I’m typing all those numbers very often. Luckily, that’s not necessary. As long as the synth you’re using has a \freq argument, you can use other types of notation and SC will convert it all to frequency. So this is equivalent to what we just did above:

(
Pbind(
  \midinote, Prand([60, 62, 64, 65, 67, 69, 71], inf),
  \dur, 0.5,
).play;
)

In fact, this is even more accurate because the frequencies are calculated out to something like ten decimal places. Here, 60 is the midi number for middle C. 62 is D, 64 is E, etc.
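
If you’re curious where those frequencies come from, a midi note n maps to 440 * 2^((n - 69) / 12) hz, and SC’s midicps and cpsmidi methods do that conversion for you. A quick way to check the numbers above:

```supercollider
// midicps converts midi note numbers to frequencies:
// freq = 440 * (2 ** ((note - 69) / 12))
60.midicps.postln;  // 261.6255653006 - middle C
[60, 62, 64, 65, 67, 69, 71].midicps.postln; // the C Major frequencies above
440.cpsmidi.postln; // 69.0 - and back the other way
```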

But what if I wanted to shift it down an octave? I could go through all those numbers and subtract 12. Or, with SC’s amazingly fluent array operators, I can just subtract 12 from the whole array:

(
Pbind(
  \midinote, Prand([60, 62, 64, 65, 67, 69, 71] - 12, inf),
  \dur, 0.5,
).play;
)

I could even put it into the key of D by adding 2:

(
Pbind(
  \midinote, Prand([60, 62, 64, 65, 67, 69, 71] - 12 + 2, inf),
  \dur, 0.5,
).play;
)

All that works great, but the code itself is not super expressive. I’d like to just be able to put in the octave I want directly. Lo and behold, there is an \octave parameter! But it doesn’t affect midi notes because those are absolute. Midi note 60 is middle C, so it doesn’t make sense to say “middle C in octave 2”. But what it does affect is \degree. A degree is a more generic specification for a note that can be transposed to any octave. For example, both of the below Pbinds will do the same thing (play a middle C over and over).

(
Pbind(
  \midinote, 60,
  \dur, 0.5,
).play;
)

(
Pbind(
  \degree, 0,
  \octave, 5,
  \dur, 0.5,
).play;
)

Setting \octave to another value will move the C to that octave. Now I was all set, I thought, and did this:

(
Pbind(
  \degree, Prand([0, 2, 4, 5, 7, 9, 11], inf),
  \octave, 5,
  \dur, 0.5,
).play;
)

This assumption was validated by digging through SuperCollider’s Scale class. You can post the value for Scale.major.degrees and it gives you [0, 2, 4, 5, 7, 9, 11]. Exactly what I was using. These are both “degrees”, so it should be correct, right?

But the last sample doesn’t sound quite right, and winds up going much higher than I’d expect. Then I was playing a simple melody on the keyboard and trying to encode that into a sequence and the notes were all wrong. While trying to figure out what was going wrong, I changed the Prand to Pseq in the above example…

(
Pbind(
  \degree, Pseq([0, 2, 4, 5, 7, 9, 11, 12], inf),
  \octave, 5,
  \dur, 0.5,
).play;
)

This plays the notes in order, rather than randomly, so should give you the scale from C to B. (I threw on a 12 to finish out the scale the way we’re used to hearing it.) But it’s all wrong! Skipping notes and going way too high!

Looking around the net for examples, I finally realized that this works:

(
Pbind(
  \degree, Pseq([0, 1, 2, 3, 4, 5, 6, 7], inf),
  \octave, 5,
  \dur, 0.5,
).play;
)

Here the scale actually goes from 0 to 6 for C, D, E, F, G, A, B. I added on the 7 to finish it out on the next C.

The Aha Moment

OK, so it seems like what’s happening is that Pbind‘s \degree values are like indexes into the degrees defined by Scale.major.degrees. Yup, they’re using the same term for two related but different concepts. Sigh. In other words, \degree sets the position of the note in the scale you are using. By default, that seems to be the C Major scale. This is validated by looking at the source code for pitchEvent:

pitchEvent: (
  mtranspose: 0,
  gtranspose: 0.0,
  ctranspose: 0.0,

  octave: 5.0,
  root: 0.0,                      // root of the scale
  degree: 0,
  scale: #[0, 2, 4, 5, 7, 9, 11], // diatonic major scale
...

Here, I discovered that \scale is another Pbind parameter. So if I want to set the scale explicitly, I can say:

(
Pbind(
  \scale, Scale.major.degrees,
  // or...
  // \scale, [0, 2, 4, 5, 7, 9, 11],
  \degree, Pseq([0, 1, 2, 3, 4, 5, 6, 7], inf),
  \octave, 5,
  \dur, 0.5,
).play;
)

Or, you could use a minor scale, which is defined as [0, 2, 3, 5, 7, 8, 10]:

(
Pbind(
  \scale, Scale.minor.degrees,
  \degree, Pseq([0, 1, 2, 3, 4, 5, 6, 7], inf),
  \octave, 5,
  \dur, 0.5,
).play;
)

Or you could use one of the more than one hundred scales built into Scale, or create your own. Here’s a small sample of what’s included:
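
If you’d rather browse them in the post window than in the docs, Scale can list its whole catalog, and you can inspect any one of them:

```supercollider
Scale.directory; // posts the name and description of every named scale
Scale.minorPentatonic.degrees.postln; // [ 0, 3, 5, 7, 10 ]
```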

Looking at the pitchEvent code again, I realized that it’s also possible to set the key of the scale you want to use, with the \root parameter. For example, to use a D Major scale, set \root to 2. This might be a little confusing. You might think we would set it to 1, since we just established that a degree of 1 is D. But \root operates at the scale definition level, so it works in semitones, not in degrees of the final scale.

(
Pbind(
  \scale, Scale.major.degrees,
  \degree, Pseq([0, 1, 2, 3, 4, 5, 6, 7], inf),
  \root, 2,
  \octave, 5,
  \dur, 0.5,
).play;
)

The last thing I’ll say is that if you really, really wanted to use semitones in the \degree parameter, you could set up your \scale to include all the degrees in an octave:

(
Pbind(
  \scale, [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11],
  \degree, Pseq([0, 2, 4, 5, 7, 9, 11, 12], inf),
  \octave, 5,
  \dur, 0.5,
).play;
)

It works, but I don’t think this is the normal way of doing things.
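
Putting the pieces together: for plain integer degrees, the final midi note works out to a scale lookup, plus the root, plus twelve semitones per octave. Here’s a rough sketch of that arithmetic (ignoring how SC handles fractional and out-of-range degrees):

```supercollider
(
var scale = [0, 2, 4, 5, 7, 9, 11]; // major
var degree = 2, root = 0, octave = 5;
var midinote = scale[degree] + root + (12 * octave);
midinote.postln; // 64 - an E, the third note of the C Major scale
)
```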

Also, none of this takes into account custom stuff like more or less than 12 notes per octave, detuning, transposing, etc. But if you’re already into that, this post is probably telling you stuff you already know.

The Result

All this is to say that I can now do stuff like this:

(
Pbind(
  \scale, Scale.major.degrees,
  \degree, Prand([0, 1, 2, 3, 4, 5, 6, 7], inf),
  \octave, Pxrand([Pn(4, 8), Pn(5, 8), Pn(6, 8)], inf),
  \dur, 0.5,
).play;
)

Here I’m using a major scale and a random degree. Then I’m playing eight notes in one of the octaves 4, 5, or 6, randomly chosen, never repeating the same octave twice in a row (Pxrand), infinitely. Not amazing in itself, but it opens some creative possibilities.

SuperCollider 6 – Envelopes

audio, supercollider

Day Six of 30 Days of SuperCollider.

Envelopes control how a single aspect of a sound changes over time. Traditionally, this meant the volume of a sound. When you strike a bell, for example, there’s an initial fast peak of volume, which then slowly fades over a potentially long period, as the bell continues to resonate. If you graphed out that volume level, it might look like this:

Of course different bells will have different curves. A meditation bell sound seems to go on and on, whereas a cowbell fades out pretty quickly.

A flute would have a totally different curve. It will probably reach peak volume more slowly than a bell. Then it will maintain a steady volume for as long as the flutist continues to hold that note and fade out rather quickly, though the flutist could make that fade longer if they wanted to. That might look more like this:

These curves are known as envelopes and they are usually broken down into standard sections:

  • Attack. How long it takes for the sound to ramp up to its initial full peak volume.
  • Decay. Often that peak volume subsides a bit before it gets to the next phase.
  • Sustain. For sounds that can be held for a period of time, this is the volume and length of time they are held at.
  • Release. When the sound generation is stopped, how long does it take for the volume to get to zero?

Here are all those parts labeled:

Not all envelopes have all those parts. You can break down envelopes into sustaining and non-sustaining envelopes. Most bells, for example, do not have a sustain section. You strike them, they peak and then they release. The same with most drum sounds. So this kind of envelope is often called a percussive envelope. It just has an attack and a release. Since it starts its release after it reaches the peak, there’s no real decay either.

Sustaining envelopes may or may not have a decay section. So usually you’ll see “adsr” and “asr” envelopes.

In SuperCollider there is an envelope class, Env, that you can use to construct all these kinds of envelopes and more. For example, there are Env.perc, Env.asr, and Env.adsr, as well as others. To create envelopes, you need a number of volume parameters and a number of time parameters.

Volumes:

  • The start volume – where the volume starts – usually 0, but doesn’t have to be.
  • The peak volume at the end of the attack.
  • The sustain volume – where the volume goes down to after the decay. If there is no decay, this is the same as the peak.
  • The release volume – where the volume ends. Again, usually 0, but doesn’t have to be.

Depending on which type of envelope you are using, you may not need all of these.

Times:

  • Attack time.
  • Decay time.
  • Sustain time.
  • Release time.

As with volumes, not all envelopes use all of these. Also, most of SuperCollider’s envelopes do not have a parameter for sustain time. The end of the sustain period is usually triggered by something else, such as the release of a key, or some other signal. We’ll see examples of this later.
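
Before building any of these into a synth, it helps to just look at the shapes. Each of the envelope factory methods mentioned above returns an Env, and calling plot on it draws the curve in a window (evaluate one line at a time):

```supercollider
Env.perc.plot; // non-sustaining: attack and release only
Env.asr.plot;  // sustaining: attack, sustain, release
Env.adsr.plot; // sustaining: attack, decay, sustain, release
```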

Percussive Envelope

But let’s get started actually creating an envelope and applying it to a sound. I’m going to start with a function that has two vars, sig for the signal, and env for the envelope. And I’ll create a pulse (square) wave for the signal.

(
f = {
    var sig, env;
    sig = Pulse.ar(400);
};
)
f.play

If you evaluate the part in parentheses, it will store that function in the global variable, f. Then you can evaluate the next line that plays f. You’ll hear a noise that will just go on forever. Press Cmd/Control + period to stop it. Next we’ll create a percussive envelope with Env.perc. This will need to be wrapped in an EnvGen, which is an envelope generator. Finally, we multiply the two together, and the result gets returned by the function.

(
f = {
    var sig, env;
    sig = Pulse.ar(400);
    env = EnvGen.ar(Env.perc);
    sig = sig * env;
};
)

When you play this, it sounds a bit more bell-like.

But before we do anything more to the envelope itself, open up the Node Tree window. If you’ve played this function a few times, you’ll see a bunch of items hanging around. These are sounds that are still technically playing, but the envelope brought their sound down to zero so you can’t hear them.

To fix this, add a doneAction to the EnvGen. Setting this to 2 will cause the sound to be removed when the envelope completes.

(
f = {
    var sig, env;
    sig = Pulse.ar(400);
    env = EnvGen.ar(Env.perc, doneAction: 2);
    sig = sig * env;
};
)

Now we can start playing with the parameters to Env.perc to change the envelope. As mentioned, this kind of envelope just goes quickly to a peak (attack) and then slowly fades out (release). The arguments to the function are:

attackTime = 0.01, releaseTime = 1, mul = 1, curve = -4

Try changing attackTime and releaseTime to get an idea how that changes the sound. Another neat trick is to plot the envelope, which you can do just by adding .plot to the end of the call, like so:

Env.perc(0.1, 0.3).plot;

Which gives you this:

Notice that the two lines are curved. You might guess that you can change that curve with the curve parameter. And you’d be right. A curve of 0 creates linear changes to the volume, and straight lines in the plot.

Negative numbers curve one way, positive numbers the other. The default curve for this envelope is -4, so you have already seen how that looks. Positive 4 looks like this:

Higher or lower numbers change the shape of the curve. Try different curves to see and hear the changes they make.
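
To make that concrete, here are a few curve values plotted with the same attack and release times (arbitrary numbers; tweak them as you like):

```supercollider
Env.perc(0.1, 0.3, curve: 0).plot;  // straight lines
Env.perc(0.1, 0.3, curve: 4).plot;  // bows the opposite way from the default
Env.perc(0.1, 0.3, curve: -8).plot; // a steeper version of the default -4
```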

Sustain

A percussive envelope is non-sustaining. It plays and it finishes on its own. Sustaining envelopes will play indefinitely until something ends the sustain. Let’s start with the simplest of these, the ASR envelope, which is created with Env.asr(attackTime = 0.01, sustainLevel = 1, releaseTime = 1, curve = -4)

Note that there is only a sustain level, not a sustain time. Let’s just replace the percussive envelope with a default ASR one and see what happens.

(
f = {
    var sig, env;
    sig = Pulse.ar(400);
    env = EnvGen.ar(Env.asr, doneAction: 2);
    sig = sig * env;
};
)

Play that and it pretty much sounds like there is no envelope at all. It goes quickly to full level and just stays there. We need some way of telling the envelope to move on to the next step. This is known as a gate and is an argument to the EnvGen. The gate is a value that is evaluated as true if it is a positive number, and false if it is zero or negative. If the gate is positive, the note will play to its sustain level and stay there. When the gate goes to zero or below, the sustain ends and the release phase begins.

In order to change the gate at run time, first you’ll need to add a gate argument to the function. Then you need to save a reference to the object created when you call play on the function. Then, on that object, you can call set to change the gate. Here’s what that all looks like:

(
f = {
    arg gate = 1;
    var sig, env;
    sig = Pulse.ar(400);
    env = EnvGen.ar(Env.asr, gate: gate, doneAction: 2);
    sig = sig * env;
};
)

a = f.play;
a.set(\gate, 0);

Evaluate the function, then evaluate the play line to start the sound playing. Finally evaluate the final line to end the sustain, and you’ll hear the release.

Often, these actions would be triggered by a midi controller key press. Pressing the key would trigger the sound to start, and it would play as long as the key was held down. Releasing the key would trigger the code to set the gate to zero, which would then let the note release.

Alternatively, you can programmatically trigger the gate. One way is to use some other UGen that switches between zero and positive values over time. Here’s an example of this.

(
f = {
    var sig, env;
    sig = Pulse.ar(400);
    env = EnvGen.ar(Env.asr, gate: LFPulse.kr(0.5));
    sig = sig * env;
};
)

Here, the gate argument is set directly to a low frequency pulse UGen running at a frequency of 0.5, meaning it will complete a full cycle every two seconds. LFPulse is unipolar, so it outputs 1 for the first second, playing and sustaining the note. Then it drops to 0, ending the sustain and letting the note release. Then back to 1 after a second. You can change the width value of the LFPulse to change how long the note plays, while still maintaining the rate at which notes occur.
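
For example, here’s the same function with an explicit width. The 0.25 is just a number I picked; it holds the gate open for a quarter of each two-second cycle:

```supercollider
(
f = {
    var sig, env;
    sig = Pulse.ar(400);
    // LFPulse outputs 1 for the first quarter of each cycle, 0 for the rest
    env = EnvGen.ar(Env.asr, gate: LFPulse.kr(0.5, width: 0.25));
    sig = sig * env;
};
)
f.play;
```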

You should also try to get the Env.adsr envelope working. This works almost exactly the same as the ASR envelope, but has a decay phase and a decayTime parameter to control the length of that.
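
As a starting point, here’s one way that might look. The times and sustain level here are made up; the structure is the same as the ASR version, with a decay added:

```supercollider
(
f = {
    arg gate = 1;
    var sig, env;
    sig = Pulse.ar(400);
    // attackTime, decayTime, sustainLevel, releaseTime
    env = EnvGen.ar(Env.adsr(0.01, 0.3, 0.5, 1), gate: gate, doneAction: 2);
    sig = sig * env;
};
)

a = f.play;
a.set(\gate, 0); // end the sustain and hear the release
```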

Beyond all that…

Earlier I said that envelopes have been traditionally used for volume or amplitude, but they are really just functions that return a stream of values over time, so they can be used to control any parameter of pretty much anything. It’s common to use envelopes on filters for example, to change how the filter is applied to the sound over time.

There are also other envelopes to explore. Or you can just use Env.new to create a completely custom envelope. You can pass in an array of levels, times and curves for each stage of the envelope, and set which stage is the release. And you can make as many stages as you want and even set up looping envelopes. All very powerful.
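
For example, here’s a hand-rolled four-segment envelope (all the numbers are arbitrary). The releaseNode argument marks which node the envelope sustains at until the gate closes:

```supercollider
(
Env.new(
    [0, 1, 0.3, 0.8, 0],  // levels
    [0.05, 0.2, 0.1, 1],  // times for each segment between levels
    [-4, 0, 0, -4],       // curves for each segment
    releaseNode: 3        // hold at level 0.8 until the gate closes
).plot;
)
```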

SuperCollider 5 – Unit Generators

audio, supercollider

Day Five of 30 Days of SuperCollider.

I could write hundreds of pages about UGens. Other people have. I’ll let you read their stuff instead and just give some of the basics.

Unit Generators, or UGens for short, are one of the key building blocks for creating sound in SuperCollider. If I understand it correctly, UGens create the signals that are used within Synths to describe sounds that get created in the server. Even if you create a UGen without a Synth, a default Synth is used behind the scenes to wrap that UGen and create the sound. That’s what happens when you call play on a function.

A sound could be a single UGen and Synth, or it could be made of multiple UGens hooked up to each other, with envelopes and filters and all kinds of other things in there shaping the final sound. For this post, we’ll just look at a few basic UGens wrapped in functions. Nothing complex.

UGens are defined in classes. And class names start with capital letters. So you’ll see things like SinOsc and LFTri and Saw. UGens have three key methods that can be called, ar, kr, ir. Which one you use is determined by what you’re using it for. Mostly you’ll be using ar, which stands for “audio rate” or kr which is “control rate”. ir is for “initial rate”, but we won’t be getting into that here.

You use audio rate when you are generating actual sound data. This creates signals at a default rate of 44,100 samples per second. Control rate generates samples at a much slower rate. It is used for many different things, but one of the more common use cases is to control some aspect of an audio rate UGen. So yeah, you can have one UGen nested inside another UGen. The inner one will usually (but not always) use kr to control some aspect of the outer one, which is actually making sound with ar.

But let’s create a sound. Type in this code and evaluate it:

{ SinOsc.ar(400) }.play;

This creates a sine wave oscillator UGen that generates an audio rate signal at 400 hz – or 400 cps (cycles per second) if you prefer. You should hear a sound, assuming your server is booted and sound is on, etc. Note that the sound will probably only come out of the left speaker / headphone. That’ll be the case for all the sounds in this post. We’ll cover stereo later.

Change the 400 to something between 20 and 20,000 and you can hear other tones.

But there are other arguments. In full, it’s SinOsc.ar(freq, phase, mul, add)

We already saw freq. The phase argument shifts the sine wave one way or the other. This is useful for creating two of the same waves, but getting them to sound separated. Otherwise, not too useful on a sine wave. mul controls the amplitude of the sine wave (multiplies it). Its default is 1.0. You can think of this as amplitude, or the volume level of the resulting sound. Since you’ll often have multiple sounds playing together, and their volumes add up, you often want to set this down around 0.2 or 0.3 or even lower so that the sum of all your sounds stays at 1 or below. Finally, add adds some amount to the wave, defaulting to 0. This is often more useful in kr than ar, as we’ll see soon.

You can take the defaults for phase and add and just change freq and mul, so you’ll often see something like:

SinOsc.ar(400, mul: 0.5)

Try some of these other UGens:

{ Pulse.ar(400) }.play; // a square wave
{ LFTri.ar(400) }.play; // a triangle wave
{ Saw.ar(400) }.play;   // a sawtooth wave.

Most of the arguments to the ar methods for these UGens are similar to SinOsc, but there will be some differences.

If you ever want to know what your sounds looks like, use plot instead of play:

{ Saw.ar(400) }.plot;   // a sawtooth wave.

We’ll probably get into some other UGens later in this series. But I just want to show a few examples of kr with a UGen within a UGen.

First, let’s make a SinOsc UGen using kr with a freq of 1 and a mul of 100, and plot it.

{ SinOsc.kr(1, mul: 100)}.plot;

This gives you the following:

Not too useful. Because our frequency is 1, it’s going to take a full second to complete a full sine wave. And the plot is only showing 0.01 seconds. We can fix that by telling plot to plot 1 second:

{ SinOsc.kr(1, mul: 100)}.plot(1);

Note that the amplitude now goes from -100 to +100, every second.

So the cool thing about UGens is that you can do math with them just like they were single values, even though they are in fact objects that produce a stream of values. So we can create a sawtooth wave and just add the above sine wave to its frequency argument.

{ Saw.ar(400 + SinOsc.kr(1, mul: 100)) }.play;

You should now hear a siren type sound. The sawtooth wave has a base frequency of 400 hz, but that’s going to go up and down by 100 (from 300 to 500) once per second. Try playing with those numbers and getting different values. This is known as frequency modulation (yup, just like FM radio), because you are modulating the frequency.
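
The siren effect comes from the slow, deep modulation. Speed the modulator up and shrink its depth and you get something closer to a vibrato – the same idea, different numbers:

```supercollider
{ Saw.ar(400 + SinOsc.kr(6, mul: 5)) }.play; // 6 hz wobble, 5 hz deep
```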

You can do the same thing with amplitude modulation (AM radio) by using the sine wave to change the mul value of the sawtooth wave. But we probably want it to go from 0 to 1 over and over. The math for this is that we set the mul of the sine wave to 0.5 (which makes it go from -0.5 to +0.5) and then add 0.5 to that, to make it go from 0 to 1. That looks like this:

{ SinOsc.kr(1, mul: 0.5, add: 0.5)}.plot(1);

That math can become a pain though. A shortcut is to leave it all out and call range at the end, passing in the min and max values you want the wave to hit.

{ SinOsc.kr(1).range(0, 1) }.plot(1);

Now you use that as the mul argument to the sawtooth wave.

{ Saw.ar(400, mul: SinOsc.kr(1).range(0, 1)) }.play;

Again, try different numbers here, but avoid going much over 1 (or -1) for that final mul value. If you’re not sure what you’re doing, it’s always safe to plot a wave before subjecting your ears to it.

The last thing I’ll show is additive synthesis. This is simply adding two UGens together. Seriously, it’s that simple.

{ SinOsc.ar(400, mul: 0.5) + Saw.ar(770, mul: 0.5) }.play;

Here I’m using a 400 hz sine wave and a 770 hz sawtooth wave. I set mul to 0.5 on both so it wouldn’t be too loud.

Plotting this at plot(0.05) gives us:

Quite a complex wave, for a complex sound.
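
Additive synthesis gets more interesting as you stack more partials. As a sketch, summing odd harmonics at 1/n amplitude approximates a square wave – the classic Fourier series demo:

```supercollider
(
{
    var harmonics = [1, 3, 5, 7, 9];
    var freqs = harmonics * 200;           // odd multiples of 200 hz
    var amps = harmonics.reciprocal * 0.2; // 1/n amplitude, scaled down
    SinOsc.ar(freqs, mul: amps).sum;       // mix the array of oscillators
}.play;
)
```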

SuperCollider 4 – Variables, etc.

audio, supercollider

Day Four of 30 Days of SuperCollider

Variables in SuperCollider, not surprisingly, are rather special compared to many other languages. I can count four rather distinct types of things that will hold a value:

  • Regular vars
  • Arguments
  • Single-letter variables
  • Environmental variables

Regular Variables

Let’s start with regular variables. These aren’t much different than variables you’d find in most other languages. You declare them with the var keyword and the name of the variable. Names should really be more than one character long (more on that below) and, as far as I know, follow the usual rules for legal identifiers. They should also start with a lowercase letter, as identifiers starting with capital letters indicate a class name.

You can optionally assign the var a value when you create it, or you can do it later. Unassigned vars have a value of nil. Vars are not typed, so you can reassign them with data of another type if you want.

(
var foo;
var bar = "hello";
foo.postln; // nil

foo = "hello";
foo.postln; // hello

foo = 42;
foo.postln; // 42

bar.postln; // hello
)

You must use the var keyword before assigning a value to a variable. i.e. you can’t do something like the following. It will throw an error that foo does not exist.

(
foo = 99;
)

Vars have scope, as you might expect. A var inside a function is scoped to that function and will have a different value than a var of the same name outside the function, as the following shows:

(
var foo = "apple";
var func = {
	var foo = "banana";
	postln("in function: " + foo); // banana
};

func.value;
postln("outside function: " + foo); // apple
)

Also, vars declared in one region are scoped to that region and will not be available in other regions. Example:

(
var name = "keith";
name.postln; // keith
)

(
name.postln; // error, name is not defined.
)

Finally, vars must be declared before any statements are executed in a given function or region. This will fail:

(
"hello".postln;
var foo = "hi";
)

But if you switch the order of the two lines, it will be fine.

Arguments

We already looked at arguments when we covered functions. As far as I know, they have all the same rules as regular vars, but they need to be declared before vars or any other code in a function. Oddly, you can declare args outside of functions and they seem to work pretty much as regular variables. So most likely they are pretty much the same under the hood.

(
arg age = 90;
age.postln; // 90
)

Single-Letter Variables

Earlier I said that regular variables should be more than one character. The reason for that is that single-letter variables are known as global variables. Global variables a to z already exist and can be used without the var keyword. And as their name suggests, they are available across regions, functions, any scope.

(
f = {
	a = "hello world";
}
)

(
a.postln; // nil
f.value;
a.postln; // hello world
)

Evaluating the first region will assign a function to global variable f. Inside that function, global variable a is assigned a value.

Evaluating the second region calls postln on global variable a, which should not have a value yet, so it shows nil. It then calls value on the function stored in f. Although that function was declared in another region, it is still available here because f is global.

After the function is run, we postln the variable a again. Now it has the value assigned to it in the function.

This globality even works across files. If you open one file and write to a single-letter global variable, you can open a new file and read from it.

Now we’ve all been taught that global variables are bad. But in most cases, when you’re coding in Supercollider, you’re not doing hard computer science. You’re just being creative. So I think it’s OK to relax a bit. Since you’ll often be defining functions in one region and using them in another region, global vars become kind of necessary in many cases.

Of course, if you are making a plugin or some kind of reusable code library, avoiding globals is still a very smart idea.

One more vital warning here. You should avoid using the global variable s in your own code. This has been assigned as the current server. So you can do things like s.boot, s.reboot, s.stop. There’s nothing special about s other than it’s already being used. If you really, really think you need to use s, then at least reassign the server to some other variable.

(
z = s;
s = "foo";

z.boot;
)

Environmental Variables

Environmental variables are similar to global variables in functionality, but can be even more useful because you aren't limited to a single character, so your variable names can actually be descriptive.

Environmental… ok, I’m just going to call them env vars. Env vars always begin with a tilde and do not need the var keyword. Otherwise they work pretty much like vars and global vars. You can use them in any scope. This is the same example we saw before, redone to use env vars.

(
~magic = {
	~message = "hello world";
}
)

(
~message.postln; // nil
~magic.value;
~message.postln; // hello world
)

Technically though, env vars are different from global vars. They are scoped to the current environment. In fact, the code ~foo = "hello"; is an alias for currentEnvironment.put(\foo, "hello");. I'm not going to go too deep into environments, but they are basically namespaces. You can create new environments, push them onto and pop them off a stack of other environments, etc. But until you're actually doing things at that level, env vars are probably safe to consider essentially global. If there's an edge case, I'm sure someone will bring it up.
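You can see that equivalence for yourself. This little sketch mixes the tilde syntax with direct calls to currentEnvironment; both read and write the same values:

(
~foo = "hello";
currentEnvironment.at(\foo).postln; // hello

currentEnvironment.put(\bar, "world");
~bar.postln; // world
)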

Supercollider 3 – More Function Stuff

audio, supercollider

Day Three of 30 Days of Supercollider

This will be a short one.

There is some more weirdness with functions in SC that I didn’t think of yesterday. This one is actually a pretty cool language feature. Just something you don’t see in most languages. It has to do with the way methods are called, or I guess I should say the way messages are sent to objects.

Yesterday I was using syntax like this to play a unit generator:

{ SinOsc.ar(400) }.play;

And that’s fine. But there’s an alternate syntax that does the same thing:

play({ SinOsc.ar(400) });

In other words, you can send the play message to a function instance, or you can call play as a function, passing in the function to play. Both are equivalent, and probably one converts to the other behind the scenes.

Similarly, you can send a value message to a function, or pass a function to the value method.

{ 42 }.value;

value({ 42 });

This goes way beyond functions though. In Supercollider, if you want to send a message to the Post window, you use the postln method. You can do it like this:

postln("hello");

Or you can send the postln message to whatever you want to print.

"hello".postln;

This can be really useful for debugging, because in addition to posting the value to the Post window, it will return the value that it just posted, so tacking on postln to something is completely transparent to the logic of your code. For example…

(
f = {
	a = rand(10);
	b = rand(10);
	a * b;
};
f.value;
)

This function chooses two random numbers and returns their product. Now say that instead of rand, you were calling some function that returned an important value, but your code wasn't doing what you expected, so you wanted to trace out the values of a and b. In many languages, you'd need to add extra lines for the postln calls:

(
f = {
	a = rand(10);
	postln(a);
	b = rand(10);
	postln(b);
	a * b;
};
f.value;
)

But in Supercollider, you can just do this:

(
f = {
	a = rand(10).postln;
	b = rand(10).postln;
	a * b;
};
f.value;
)

The values get posted and the code continues to work as expected, with no side effects caused by the post.

Some other examples:

All the array functions.

reverse([1, 2, 3, 4, 5]);

// or...

[1, 2, 3, 4, 5].reverse;

Numbers:

squared(4);

// or

4.squared;

This can look a bit confusing at first when applied to floating point numbers.

0.7.cos;

1.0.cubed;

But you get used to it. The hardest part is that when you’re learning and looking at other people’s examples, some will use one form of the syntax, and some will use the other form, sometimes even mixing them. So it’s best to get used to both ways.
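As a contrived sketch of what you'll run into in the wild, here are the same operations written in receiver style, function-call style, and a mix of the two. All three lines do the same thing:

[1, 2, 3].reverse.postln;    // receiver syntax
postln(reverse([1, 2, 3]));  // function-call syntax
reverse([1, 2, 3]).postln;   // mixed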

30 Days of Supercollider Series

misc

This will be an index of the articles I post about Supercollider.

Warning: I don’t know a lot about Supercollider yet. This will be a journal of my discoveries as much as anything else. I probably know less about music in general. Trying to learn something before my last trip around the sun. Anyway, this shouldn’t be taken as a step-by-step tutorial on learning Supercollider. Just a random collection of stuff.

The Days:

  1. The IDE
  2. Functions
  3. More Function Stuff
  4. Variables, etc.
  5. Unit Generators
  6. Envelopes

On the off chance you might be interested in the actual sounds I'm creating, you can find them here: https://soundcloud.com/bit101

Supercollider 02 – Functions

audio, supercollider

Day Two of 30 Days of Supercollider

A word of warning about this series as a whole: this should not be taken as a comprehensive, step-by-step tutorial on how to use Supercollider. There are better resources out there for that. This will be a loose collection of deep, or not-so-deep, dives into different topics. A lot of it is just documenting stuff for myself. Teaching is the best way to learn.

One of the first things you’ll learn about in Supercollider is functions, because the most common way demonstrated to play sounds at first is to wrap them in a function. But it took me quite a while to wrap my head around functions in Supercollider. Like how code is evaluated in the IDE, Supercollider goes way off the beaten track with functions.

Functions are defined by code inside curly brackets. The last value in the function is the return value. Here is an empty function:

{ }

If you evaluate that line, you’ll see -> a Function in the Post Window, telling you that it is a function.

Fairly often you’ll want to assign a function to a variable. You can do that like this:

f = {};

I’m just going to use the single letter f throughout this post. You can use other names but there are some rules around all that which I’m not going to get into here. Another day. For now just use f, or another single letter.

Say you want a function that returns a value, like 42, you can do this:

f = { 42; };

// or...

(
f = {
    42;
};
)

Note the parentheses around the second version. They’re not necessary, but as described in the first post in this series, it makes it so you can evaluate the entire function by putting your cursor in that region and hitting Ctrl-Enter/Cmd-Enter.

By the way, semicolons are not always required at the end of lines, but more often than not, if you leave one off you'll wind up with an error that can be really tough to debug. The parser will just run the two lines together and try to parse them as one statement. You can get away with it if it's the last line of code in a block, or if you're only evaluating a single line. Otherwise, best to use them.
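For example, something like this will fail with a confusing parse error, because the first two lines get read as a single statement (the exact error message will depend on the surrounding code):

(
var a = 1   // missing semicolon here...
var b = 2;  // ...so the parser runs these two lines together
a + b;
)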

Now, what do you do with functions? You call them. So you’d probably guess to do something like this:

f();

But that will give you an error:

ERROR: syntax error, unexpected ';', expecting BEGINCLOSEDFUNC or '{'
  in interpreted text
  line 1 char 4:

  f(); 
     ^
-----------------------------------
ERROR: Command line parse failed
-> nil

Sometimes you’ll see advice to do something like this:

f.();

And sure enough, that works, outputting what you’d expect:

-> 42

Now, that just looks like some funky syntax decision, but what's actually going on is a lot deeper. I eventually came to the realization that functions in Supercollider are not really like functions in other languages, which are directly callable. I find it easier to think of them as special objects that have a few methods that can be called. You probably shouldn't talk about them in those terms, because nobody else does, but if you're coming from another language, that framing may help you make sense of them.

Supercollider docs actually say:

A Function is an expression which defines operations to be performed when it is sent the value message.

So, yeah… value. That starts to look more normal:

f.value();

And that works! It turns out that f.() is really just an alias for f.value(). Better, better. Also, most of the time, you don’t need the parentheses. This works too:

f.value;

You will need to use parentheses when you pass arguments to functions though. So let’s cover arguments next.

Arguments are defined at the top of a function, before any other code. Use the keyword arg followed by the argument name.

(
f = {
    arg foo;
    foo * 2;
};
)

When you evaluate it, you can now call it with value, passing the argument inside parentheses:

f.value(8);

As expected, this will print -> 16 in the Post Window.

Multiple arguments can be added to the same arg line with commas:

(
f = {
    arg foo, bar;
    foo * bar;
};
)

And now you can call it, passing in two args:

f.value(8, 3);

And this should print -> 24.

Arguments can also use default values, just set them in the arg line:

(
f = {
    arg foo = 10, bar = 3;
    foo * bar;
};
)

Now you can call this with 0, 1 or both arguments.

f.value();     // 30 - using both defaults
f.value(7);    // 21 - using only the second default
f.value(7, 2); // 14 - using no defaults

Like some other languages, such as Python, you can also use named arguments, or a mix of named and unnamed.

f.value(7, 3);           // unnamed
f.value(7, bar: 3);      // unnamed and named
f.value(foo: 7, bar: 3); // both named

With named args you can order them however you want and even skip arguments, assuming they have defaults. But once you use one named argument, all the rest must be named.

f.value(bar: 3, foo: 7); // opposite order
f.value(bar: 3);        // skip the first arg
f.value(foo: 7, 3);     // illegal! will throw an error

Lastly, there’s an alternate way to specify arguments. Rather than the arg keyword, you can enclose the arguments line in a pair of pipe | characters.

(
f = {
    | foo = 10, bar = 3 |
    foo * bar;
};
)

Which one you use is just a matter of preference.

One last thing I want to go over on functions: functions that play sound. This one confused me for quite a while. Without going into too much detail just yet, there are various classes called Unit Generators that are mostly used to generate sound. SinOsc is a common one. It generates a sine wave oscillator – a really basic sound. Normally you call the ar method of that class to generate a sound at a particular frequency, such as SinOsc.ar(400) to generate a 400 Hz tone.

But that line of code does not play the sound. You’ll most often see something like this:

{ SinOsc.ar(400); }.play;

You can type that in and evaluate it and hear the sound. We’ll go more into unit generators later.

But I could not wrap my head around this one for a while. So SinOsc.ar(400) creates the unit generator. I’d expect that you’d play it like so:

SinOsc.ar(400).play;

But that gives you an error that the play message is not understood. But you have a function… the last line of the function returns that generator, and then you call play on that generator. How is that different from just calling it directly? I finally understood it though. It turns out that like value, play is a special message that you can send to a function. The details get a bit deep, but the bottom line is that when you call play on a function, it tries to evaluate the return value of that function as something that it can send to the server and play as a sound. Just calling play on the generator itself doesn’t work because play is a message you send to a function, not a generator. It’s a bit more complex than that, and I understand a good bit of what’s actually going on there, but I’m not going to try to explain it in this post. Enough for one day.

30 Days of Supercollider – Day 1 – The IDE

audio, supercollider

Day One of 30 Days of Supercollider

Years ago, when Flash was on its way out, I started looking more and more into JavaScript and HTML's Canvas as a replacement. I started a series on my earlier blog called 30 Days of JavaScript. It was popular, and moreover I learned a lot, since I had to dig into some new aspect of the language and graphics API each day.

Now that I’m taking a deep dive into Supercollider, I decided to try that same trick again. So, I plan to make 30 posts in the next 30 days (bear with me if I miss a few days here and there) tackling some aspect of Supercollider.

Caveat: these won’t be super in-depth most likely. Some of them will seem very basic and naive to anyone who knows this stuff more than I do. I can’t even guarantee that everything I write will be correct. But, as a wise person once said (paraphrased), the best way to learn the correct way to do something is to post the wrong way; someone will instantly show up to correct you. 🙂

Supercollider IDE

So let’s dive right in. Supercollider is actually a suite of several bits of technology that come together to form an environment for programming, playing, and recording synthesized sounds and music.

The parts are:

  • The Supercollider language, called sclang. There will probably be many posts on the language itself, as it’s quite different from most languages I’ve worked in, and I suspect it will be the same for other programmers out there.
  • The Supercollider interpreter. This is what reads the sclang code that you write, interprets it and sends it to …
  • The Supercollider server. Known as scsynth. This receives the interpreted commands from the interpreter and translates them into sound and music. And other things that aren’t necessarily audible, like timings, routings, etc.
  • The Supercollider IDE. This is scide. And is what we’ll be covering lightly today.

Here’s what the IDE looks like currently on my Linux laptop:

Pretty standard stuff here. On the left is a place to write your code and on the right you have some docks – the Help Browser and Post Window. The Post Window shows the output of the commands you run: successes, errors, and other messages. You can also log your own messages here, which is about as much debugging capability as you’ll get.

The editor is decent. It has color coding with different available color schemes. I found and installed a gruvbox theme, which makes me feel at home (code and instructions here: https://github.com/brunoro/base16-scide). It has pretty good code completion and hints for method parameters. You can do Ctrl-D or Cmd-D on a keyword and see the documentation for that item in the Help Browser.

On the bottom right is a status bar that shows what’s going on with the interpreter and server. When everything is green there, you know the server and interpreter are running and ready to convert your code into music… or something noisy anyway.

There are also a number of helper panels you can open up to visualize what’s actually going on with your compositions.

Seen here are the Node Tree, which shows the active objects; the Server Meter, showing the input/output levels across channels; the Stethoscope, showing a waveform for any channels you choose; and the Frequency Analyzer. The last is particularly useful for seeing visually what different filters are doing to your sound.

One thing that took a lot of getting used to is the way that Supercollider interprets the code you write. In every other language I’ve ever worked in, you write code in a file, save it, and do something that builds that code – either interpreting or compiling it, possibly importing and/or linking other code files in with it – and then execute the result.

This is not even remotely how Supercollider works. Part of the reason for this is that Supercollider was originally conceived of as a tool for musical performance. So you wouldn’t be just sitting down and spending a long time creating this perfect program and then running it. Instead, you’d code a little bit, run that, add a bit here, run that. Stop that bit, change it a little and re-run it. Then code a few more pieces and add them to the mix, maybe removing some of the earlier bits as you go.

So generally, the way things work is you’re evaluating one specific block of code at a time. This can be a single line by default, or you can select multiple lines, or even a portion of a line and evaluate that. Whatever you evaluate gets instantly interpreted and sent to the server and if that code creates a sound, you’ll hear that sound.

The most common shortcut you’ll use is Ctrl-Enter or Cmd-Enter on Mac. This evaluates what’s under the cursor. If nothing is selected, it will evaluate that whole line. If a part of a line or multiple lines are selected, it will evaluate the entire selection.

But say you have some code like this:

var freq;
freq = 300;
{ SinOsc.ar(freq) }.play;

In order for anything meaningful to happen, you need to select all three lines and then evaluate them. And that’s a very minor example. As you can imagine, you might have a chunk of code that is dozens of lines long that needs to all be evaluated together. For this, we have regions. Creating a region just means putting a pair of parentheses around the code you want to be evaluated as one large unit. Like so:

(
var freq;
freq = 300;
{ SinOsc.ar(freq) }.play;
)

Now you can put your cursor anywhere inside the parentheses, or even on one of the lines with a parenthesis, and hit your shortcut, and the whole thing will be evaluated and sent to the server. Also, if you are in a region but only want to evaluate a single line of code, you can hit Shift-Enter and only that current line will be evaluated.

There’s also a menu item to evaluate the whole file, but there’s no shortcut by default for that. Coming from other “normal” programming languages, that seemed absurd to me and I immediately set up a shortcut for that. Eventually I figured out that you almost never want to evaluate a whole file and removed that shortcut.

The other important (very important) shortcut is Ctrl-. or Cmd-. (Control or Command + the period key). This stops all sounds from playing. You’ll work that into muscle memory quickly, especially after having a few random and unexpectedly loud noises blasting in your headphones.

Linux Audio

Just a note for you Linux nerds like me. The first week or so using Supercollider, I had to do it on my Macbook Air because the Supercollider server would not start on Linux. I knew I could fix it, but wanted to focus on learning a bit more about Supercollider itself before delving into Linux audio configuration. Eventually though, I put in the effort to figure it out.

The problem is that on Linux, Supercollider needs to use the Jack audio system. But most Linux systems right now use PulseAudio. Both of these interface to your sound card using ALSA. Jack is apparently superior and used by most serious audio software on Linux, but for some reason is not the default.

I was able to get Jack started by installing QjackCtl, which gives you a nice little panel to turn Jack on and off. That got Supercollider working just fine, but it killed everything else that was running via PulseAudio on my computer. Once I turned Jack off, PulseAudio and the rest of my apps worked again, but the two were mutually exclusive. I finally found the solution here:

https://wiki.archlinux.org/title/PulseAudio/Examples#PulseAudio_through_JACK

This was a little fiddly and took a couple of reboots. Possibly because Jack was still running in the background via QjackCtl. But once it started working it was fine. I just open up the Cadence app and start Jack. Now all my computer’s audio is routed through Jack and everything works as expected, including Supercollider. If I turn Jack off, everything reverts to using PulseAudio instantaneously. So I’m very happy with that. Since I’m messing with Supercollider on a regular basis, I tend to just leave Jack running all the time now.

[Update] – I already had a comment on Jack vs PulseAudio saying that PipeWire should resolve a lot of this. I checked, and I do have PipeWire installed on my system, but it doesn’t seem to be in use. I’ll be digging into this more and will update with any fun findings.

Summary

So there’s Day One. Not too exciting, but I’ll be prepping a list of other topics and as I go I will create an index to all the articles. Hopefully some of them will be useful, if not to you, at least to me.