SuperCollider

compiling SuperCollider from source on Ubuntu in 2021

Even though I really want to move away from WordPress, I need to write this down here.

The other day I did a heavy update of my ‘workstation’ machine. I jumped several Ubuntu releases, and surprisingly most things ended up working the way they had before. SuperCollider was unfortunately not one of them, so I decided to recompile from source, using the latest release from the SuperCollider download page (3.12.1) and my notes for compilation on Ubuntu from 2019.


Installing SuperCollider on Linux in 2020

These are short notes on installing SuperCollider from source on a debian-based Linux distro … in 2020.

Download sources

Download the source from an official release on GitHub: https://github.com/supercollider/supercollider/releases – grab the Source-linux.tar.bz2 one. Also grab the corresponding SC3-plugins release at https://github.com/supercollider/sc3-plugins/releases


SuperCollider in 2020

I’ve been working more and more in SuperCollider this year. What follows is a couple of things I’ve done lately and some interesting resources I came across recently.

I tried in October with #looptober, but one way or another I couldn’t find the time; then came #noisevember, and this works – to a degree. It’s a commitment to make some sort of sound-noise-music recording every day and post it to your social network (where tags work), but if it’s not every day, that’s alright too, and if it’s a blog, that’s fine too. No hard rules here. So I managed to make a number of SuperCollider patches and posted video recordings and code. Here’s the #noisevember 2020 thread on my mastodon: sonomu.club/@luka/105145827210709919. Each toot has a link to the source code on my git. See more: gup.pe/u/noisevember, mastodon.social/tags/noisevember.

At the end of October I played an online AV version of my Rhizosphere project/composition for an online event (screened in Tel Aviv as well); the video is here: YouTube, PeerTube share.tube, pretok.tv

The Discourse forum for the SuperCollider community at scsynth.org is a very friendly and great place. I’ve had a number of questions answered by kind people there and learned a lot in the process. And there’s a community section at supercollider.github.io.

remote live stream at dawn

This morning I noticed that about an hour before sunrise the stars are already starting to fade and light is appearing in the east. It comes as a slight disappointment, today. Looking into this feeling, I realize I’m drawn to the idea of that particular time of darkest night and the moment of the very first appearance of light, and then, furthermore, the whole process, the whole phase, up until the first rays of the sun hit the surface of your face. Only once in my life have I experienced that kind of strike, when the sun was rising from behind a mountain in New Zealand – I was cycling around the South Island with nothing but a tent – and I had just climbed out into the fresh morning, looking and anticipating the appearance of the sun, and there it was: it shot its first ray into my eye, suddenly, like a switch. This stands in opposition to the slow blur of the dark night, the slow fade of the stars, a much less sudden transition from the darkest night to something only slightly brighter that starts to flicker in the east.


SCDAWNREM20 general rehearsal

A week ago I made a similar trip to Volavlje, the easternmost part of the Ljubljana municipality, but during the sunrise. That meant I was able to see the surrounding landscape on the way: the valley that starts at the end of the basin after crossing a steep hill, and then the final ascent to the rise, through weirdly positioned and maintained houses – some obviously the weekend-holiday type, some rustic ones without insulation. A peculiar aesthetic experience.

Today I got up at four in the morning, quickly made some non-coffee (rye, chicory), and off I was into the night, onto the motorway, the ring around Ljubljana, until veering perpendicular from it eastward. It was a good thirty-five-minute drive, and when I arrived at my spot it was not completely dark anymore. The birds were already loud. There was a light breeze, but nothing substantial. Good conditions for my microphone, which I managed to attach to the top edge of the car’s front windshield. The USB cable then came through only a small opening of the side window and into my laptop.


Jit.log #190904

Trying to review what has been done and researched since the last log entry.

About 8 days ago I realised I need to understand the underlying concepts of patterns in SuperCollider in order to be able to imagine first, and then know where and how to start laying the first prototype/s for Jitakami. I’ve been at this point of starting for a very long time, and it’s quite frustrating to be constantly conceptualizing while nothing comes out of it.

I worked through the whole Streams-Patterns-Events SuperCollider tutorial. It did open up and solidify some concepts, but I’m painfully aware of how important it is to keep coding stuff in SC on a daily basis. And on a side note – I actually think I should stick with one programming language for a while, master it, write the whole sound_engine+GUI in it first, and only then start expanding it to visuals.

I was writing and testing and trying hard to move past the dead point for the last two days. The current question is: what data should an ‘agent’ actually output – send and execute as an algorithm? How to shape ‘agents’, what kind of information do they have, and what do they output? I started to write down what kinds of libraries we (should) work with (an inventory): libraries of instruments, libraries of duration patterns… what other patterns are important, how to classify them, and how to treat them, so that I can create a simple specification.

I guess there could be a small library of basic instruments, like drum machines (hihats, snares, kicks, claps, percussion, glitches), bass generators/synths, pads, leads. This is classification according to the function of an instrument in a song. Let’s start simple.

The other ‘sort of’ library should be duration/trigger patterns – 8 or 16 bars of beat patterns, about 10 variations that could be applied to a beat machine, for example. And then melodic patterns for basslines, melodic patterns for leads, and the same for pads.
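As a sketch – the names here are entirely hypothetical, just to make the inventory idea concrete – such a trigger-pattern library could start as a plain Event of patterns in SC:

```supercollider
// hypothetical sketch of a trigger-pattern library: step masks, 1 = hit
(
~beatLib = (
    four_floor: Pseq([1, 0, 0, 0], inf),
    offbeat:    Pseq([0, 0, 1, 0], inf),
    backbeat:   Pseq([0, 0, 0, 0, 1, 0, 0, 0], inf)
);
// pull the first 8 steps of one variation to check it:
~beatLib[\offbeat].asStream.nextN(8);  // -> [ 0, 0, 1, 0, 0, 0, 1, 0 ]
)
```

An agent could then pick a key from such a library and feed the mask into a Pbind as amplitudes or rests.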

This needs to be laid down in a simple specification, and that’s our blueprint version 0.1 (or 2019-9).


Tomorrow I’m leaving for Linz by train, with a bike. The program is packed, and the one thing I know I’m looking forward to is Ryoichi Kurokawa – especially his live a/v performance. He has done a new one in 2019. He also has an installation – a silent one – at Ars Electronica, and there are two things I’m wondering about: first, what and how did he incorporate A.I. into his work to be featured in the show, and second, why is his a/v language so dear to me? It is extremely powerful to me and something I “understand” (in an affective sense) very well. His artistic language feels very clear and domestic to me. When I watch his work it is … I don’t know how to describe it.

Jit.log #190827

Short report on today’s work:

A four-hour session started as a quick 20-30 minute regrouping – that’s my synonym for checking on the plan so far and readjusting it. I haven’t looked into all the stuff I have laid out in Asana (all the online courses and reading I’m behind on); I wanted to refresh my memory and plans for the Jitakami instrument project. That is why I didn’t fiddle too much with the study line.

Let me first go over what else was done today, and then come back to an evaluation of the past two months.

I worked a bit on the basic general concept that the Jitakami engine prototype should follow. The idea is that, basically, there’s some kind of top conductor process that oversees the composition. The conductor is controlled by the user via a touchscreen GUI. The conductor is able to intelligently launch and control agents/workers/operators/sequencers – possibly separate algorithms or functions that in turn create actual sequences of events – probably SuperCollider’s Patterns (or Events or Streams?) – and deploy them to the concrete timeline.

I arrived at the question “can a function generate a pattern somehow” and off I was into the SuperCollider help system, where I started working on the “Understanding Streams, Patterns and Events” tutorial; it took me a few hours to get through the first part (out of 7). I learned some basic stuff about Streams, but the feeling is I don’t know enough yet, so I think I should be working on this tutorial for a few hours daily.
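The gist of that first part, for my own reference (standard tutorial material, not my code): a Pattern is a template, and asStream turns it into a Stream you can pull values from one at a time.

```supercollider
p = Pseq([1, 2, 3]);   // the pattern: a template describing a sequence
q = p.asStream;        // the stream: a stateful reader of that template
q.next;  // -> 1
q.next;  // -> 2
q.next;  // -> 3
q.next;  // -> nil (the stream has ended)
```

Several independent streams can be made from the same pattern, which is exactly what a conductor handing patterns to agents would need.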

To go back and reflect on the last month or two, especially with regard to the installation at Kapelica and A.I.:

At the beginning of the month (August) I managed to work on a pretty difficult rephrasing of the project – refocusing, trying to find a narrower path to produce material and then rework the audio-visuals of the installation. I came up with some interesting material in the area of rare earth elements and did a lot of research, but then felt unsure how to proceed and kind of left it there. So the installation was not upgraded into another version I would be satisfied with, and today, just a few days before the closing of the exhibition, I feel like I failed to make something I liked, something that said something articulate (in art’s own way) about Artificial Intelligence.

Next week I’m leaving for Linz to be at the Ars Electronica festival, especially its AI x Music part, but I must say I’m very frustrated by the term. It’s just too loaded with hype, and it encompasses too huge a range of disciplines and approaches to actually mean anything but a hyper-bloated phantasm.

Installing SuperCollider on Linux in 2019

These are short notes on installing SuperCollider from source on a debian-based Linux distro … in 2019.

Download sources

Download the source from an official release on GitHub: https://github.com/supercollider/supercollider/releases – grab the Source-linux.tar.bz2 one.

un-tar-bzip it with:
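Assuming the standard release-asset naming (here with the 3.10.0 version used later in these notes):

```shell
# extract the bzip2-compressed tarball and enter the source folder
tar xjf SuperCollider-3.10.0-Source-linux.tar.bz2
cd SuperCollider-3.10.0-Source
```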

Prepare system for compilation

Install all the dependencies for your system by following the instructions on how to build SC from source on Linux here: https://github.com/supercollider/supercollider/blob/develop/README_LINUX.md (the trickiest part is the Qt version – for recent distro releases the Qt version is high enough to install through apt):

Check if the Qt in your distro’s repos is high enough for SC (5.7 or later!):

If so, install the Qt dev packages:

I’m using checkinstall to create a deb package, so install that as well:
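Something like the following (the exact Qt package names are from README_LINUX.md of that era – verify them against your release):

```shell
# check that the Qt candidate in the repos is high enough (>= 5.7)
qtver=$(apt-cache policy qtbase5-dev 2>/dev/null | awk '/Candidate/ {print $2}')
echo "qtbase5-dev candidate: ${qtver:-none}"
dpkg --compare-versions "${qtver:-0}" ge 5.7 && echo "Qt is new enough"

# if so, install the Qt dev packages, plus checkinstall for packaging
sudo apt install qtbase5-dev qttools5-dev qttools5-dev-tools \
    qtwebengine5-dev libqt5svg5-dev checkinstall
```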

Building/compiling

Go to the folder with the source, create a build folder, and cd into it:

Configure the build (cmake):

Start the compilation (lots of warnings, but if you don’t end up with ‘Build failed’, the build was successful):

Now create a deb with checkinstall:

Installing

Install the final deb:

or
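Putting the steps above together, the whole sequence looks roughly like this (the version string and `0your_name` suffix are placeholders, as in the 2018 notes below; adjust to your release):

```shell
# from inside the source folder:
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..          # configure
make -j"$(nproc)"                            # compile on all cores
sudo checkinstall -D -t=debian --install=no \
    --pkgname=supercollider --pkgversion=3.10.0-0your_name make install

# install the resulting deb either directly:
sudo dpkg -i supercollider_3.10.0-0your_name-1_amd64.deb
# or with gdebi (which pulls in runtime dependencies):
sudo gdebi supercollider_3.10.0-0your_name-1_amd64.deb
```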

supercollider: deb package from git with checkinstall

Follow the instructions on how to build and install SuperCollider from git here: https://github.com/supercollider/supercollider/wiki/Installing-SuperCollider-from-source-on-Ubuntu, but instead of the final make install, use checkinstall (apt install checkinstall):

$ sudo checkinstall -D -t=debian --install=no --pkgname=supercollider --pkgversion=3.10.0-0your_name make install

You’ll end up with a .deb file in that current build folder. Install it with gdebi:

$ sudo gdebi supercollider_3.10.0-0lallafa-1_amd64.deb

Prevent apt from ever upgrading your package by creating a file called “preferences” in the /etc/apt folder, and put in:

Package: /supercollider/
Pin: release *
Pin-Priority: -1

EDIT 20200311:

My latest iteration of this is (change the pkgversion appropriately!):

sudo checkinstall -D --pkgname=supercollider --pkgversion=1:3.10.3-deviant200109 --backup=no --install=no --nodoc -y --replaces=supercollider,supercollider-common,supercollider-ide,supercollider-language,supercollider-server,supercollider-supernova

And if you also compile sc3-plugins, you can use it similarly:

sudo checkinstall -D --pkgname=sc3-plugins --pkgversion=1:3.10.3-deviant200109-git --backup=no --install=no --nodoc -y

Daily Beat: a simple 808 beat with an immature bass sound

Today I started to work on a simple project that I hope becomes an almost daily routine. I delved into Patterns in SuperCollider again and hacked together a simple beat with 808 samples; with the very little time left today I added a very simple broken bassline/synth line.

There are two important aspects of this little endeavor, for now: a) making *something* every day, and b) making open source music – well, libre open-source code that generates music.

Making (composing) something every day is an important practice for every artist. I’m not sure if this is going to work for me, as I frequently start something and then abandon it, but nothing will change if one doesn’t try. I want to play with something every day, even if it’s a short melody line in Renoise or something new in SuperCollider; I just want to spend a minimum of half an hour on it. Not everything will necessarily be SuperCollider (although I have a fantasy of switching completely to SC and composing everything there, including heavy club tracks!), and there will be days when I won’t manage.

So, today’s project is obviously a beginning. The full code is below, with a link to a zip with the scd file and samples, and also on GitLab (https://gitlab.com/lukap/DailyBeats2018/tree/master/180805). Personally, it feels like this was an important little refresher lesson: how to use Patterns, what Pbind does, and how to put patterns running in parallel together with Ppar.
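In miniature, the Ppar part works like this (a sketch, not the actual track code; the \kick808 and \hat808 SynthDefs are placeholders for sample players you’d define yourself):

```supercollider
(
// two Pbinds running in parallel under one Ppar, on a shared clock
var kick = Pbind(\instrument, \kick808, \dur, 1);
var hats = Pbind(\instrument, \hat808, \dur, 0.25, \amp, 0.1);
Ppar([kick, hats]).play(TempoClock(126/60));  // 126 bpm
)
```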


Learning SuperCollider patterns: legato, sustain and using pairs

Yesterday I had problems understanding why a SynthDef and its Synth instance complain when the envelope duration is shorter than the duration \dur. The following code will produce FAILURE IN SERVER /n_set Node 1067 not found complaints in the post window.

The sound is generated fine, but the problem is that the doneAction and sustain value in Env.linen result in the Synth instance being freed before Pbind frees it. So when Pbind tries to free it, it’s already gone; it doesn’t exist anymore.

I asked the nice SC people in the SC FB group, and the answer led me to understand that there’s a good use of the \sustain parameter in Pbind, which deals with exactly that. So to properly control the sustain time, an example shows how it can be done:
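Roughly like this (my reconstruction of the idea, not the original example; the \blip SynthDef name is mine):

```supercollider
(
// the SynthDef takes a sustain argument and uses it inside the envelope
SynthDef(\blip, { |out = 0, freq = 440, amp = 0.2, sustain = 1|
    var env = EnvGen.kr(Env.linen(0.01, sustain, 0.1), doneAction: 2);
    Out.ar(out, SinOsc.ar(freq) * env * amp);
}).add;
)

(
Pbind(
    \instrument, \blip,
    // note and duration given as pairs: [degree, dur]
    [\degree, \dur], Pseq([[0, 0.5], [2, 0.5], [4, 1]], inf),
    \legato, 0.8    // Pbind computes \sustain = \dur * \legato for the synth
).play;
)
```

Because the envelope now lasts exactly as long as the \sustain Pbind passes in, the synth frees itself at the right time and the /n_set complaints disappear.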

It seems the most important part is that the SynthDef has a sustain argument. Pbind will calculate \sustain from \dur, so it doesn’t even have to be defined in the Pbind.

There’s a nice thing used above which I had been looking for for a while – how to define a note and its duration in pairs instead of on separate lines for each parameter: [\degree, \dur], followed by a list of pairs.