Announcing Pink and Score: libraries for music systems design and composition, written in Clojure

Hi All,

I’d like to announce two Clojure libraries for music systems design and composition called Pink[1] and Score[2].

Pink is a library for music system design. It currently contains code for an audio engine, event processing, and signal processing. It is heavily influenced by Music-N systems (e.g. Csound, SuperCollider), but explores some novel areas of music system design, such as using audio-rate functions as arguments to instruments.  This allows an instrument to be reused whether it has a fixed pitch, a glissando, randomized frequencies, etc., depending on what the user decides to pass in as an argument (rather than encoding that within the instrument design).  Events are also generalized as delayed function applications, which gives the user great flexibility to decide what will happen when an event is processed.
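To illustrate the audio-rate-argument idea with a self-contained sketch (this is not Pink's actual API; here an "audio function" is simply a function of time in seconds returning a value, and the instrument integrates phase sample by sample so a time-varying frequency works correctly):

;; Sketch only -- const-sig, line-sig, and sine-instr are hypothetical.
(defn const-sig [v] (fn [_t] v))

(defn line-sig [start end dur]
  (fn [t] (+ start (* (- end start) (min 1.0 (/ t dur))))))

(defn sine-instr
  "Renders n samples at sample rate sr; freq-fn decides the pitch contour."
  [amp freq-fn sr n]
  (let [dt (/ 1.0 (double sr))]
    (loop [i 0, phase 0.0, out []]
      (if (< i n)
        (recur (inc i)
               (+ phase (* 2.0 Math/PI (freq-fn (* i dt)) dt))
               (conj out (* amp (Math/sin phase))))
        out))))

;; Fixed pitch vs. glissando: only the argument changes, not the instrument.
(sine-instr 0.5 (const-sig 440.0) 44100 64)
(sine-instr 0.5 (line-sig 440.0 880.0 2.0) 44100 64)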

Score is a higher-level library for music score generation.  This library contains two primary score generation models–one based on CMask[3], the other on SuperCollider’s Patterns[4]–each of which addresses different score generation use cases. Score also contains useful functions for converting units to frequencies and for working with non-standard scales whose scale degrees differ from the Western standard of 12-tone equal temperament. The library is designed to stand alone and work with various other systems.  I currently use it in my integrated music environment Blue[5] to generate Csound scores, and I am starting to use it with Pink as well.
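As a small illustration of the scale math (my own sketch; degree->freq is a hypothetical name, not necessarily what the library calls it), converting a scale degree in an n-tone equal-tempered scale to a frequency:

;; Equal-temperament conversion: each octave is split into
;; degrees-per-octave equal steps.
(defn degree->freq
  [base-freq degrees-per-octave degree]
  (* base-freq (Math/pow 2.0 (/ degree (double degrees-per-octave)))))

(degree->freq 440.0 19 0)   ; => 440.0
(degree->freq 440.0 19 19)  ; => 880.0, one octave up in 19-TET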

I’ve also added a music-examples[6] project that I’m using to explore using Pink and Score together.  There is also an example there that uses Incanter to plot the audio signals generated from a French Horn model, as well as the output of an envelope generator. (This was useful for debugging some instrument code that went awry. :) )

Some notes:

* In the Clojure world, most people working on music probably use Overtone. I think Overtone is an excellent project and will probably handle many people’s use cases. However, I am interested in use cases which are–as far as I understand–not possible with Overtone, particularly crossing the event boundary with audio-rate functions as arguments.  (I believe that is a limitation of SuperCollider rather than Overtone; if that situation has changed since I last looked into it, please let me know.)  I am also very much interested in encapsulating projects as fully as possible, with the goal of better preserving musical works.  This made me curious to explore a pure-JVM solution.

* Most of the experimentation so far has been done using the REPL and Vim.  You can check out the code in the demo folders in Pink/src and Score/src for some hints.  I hope to get around to documentation and tutorials soon.

* I have not yet fully benchmarked Pink, though it has so far been adequate for the various test musical ideas I have run.  I don’t expect it to run as quickly as C/C++-based systems such as Csound and SuperCollider, though I do expect to push things as fast as they can go on the JVM.  To that end, if you look at the code of Pink, you’ll find lots of typehints, as well as a design that reuses arrays between function calls. This is done to maximize performance and minimize memory usage (see the sketch after these notes). I’ll be continuing to explore optimizations; any suggestions would be very welcome!

* For those who might know me from my work on Csound: I am very much planning to continue my work there.  Working on Pink has helped me experiment with engine design ideas that would be more difficult to try within Csound’s code base. I hope to bring some of those architecture and design ideas back to Csound when I have a chance.

* These projects are not yet mature, but I feel they have reached a point where I can invite others to take a look.  At this point, I have some short-term plans (i.e. working with audio samples, engine code for writing to disk), but the longer term is still a bit nebulous.  As it is, the libraries are not yet in shape to submit to Clojars. If you are interested in experimenting with them, you can check them out with Git and run ‘lein install’, then add the dependencies to your own project (see the music-examples project.clj for an example).
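As a rough illustration of the typehinting and array-reuse style mentioned in the benchmarking note above (my own sketch, not code from Pink): an audio function can typehint its primitive arguments and fill one preallocated double-array on every call instead of allocating per block.

;; Sketch only: a sine generator that reuses a single buffer across calls.
(defn sine-gen
  [^double freq ^double sample-rate ^long buffer-size]
  (let [buffer (double-array buffer-size)
        phase  (double-array 1)]   ; mutable phase state kept between calls
    (fn []
      (let [incr (/ (* 2.0 Math/PI freq) sample-rate)]
        (loop [i 0, ph (aget phase 0)]
          (if (< i buffer-size)
            (do (aset buffer i (Math/sin ph))
                (recur (inc i) (+ ph incr)))
            (do (aset phase 0 ph)
                buffer)))))))

;; Each call fills and returns the same array:
(def gen (sine-gen 440.0 44100.0 64))
(identical? (gen) (gen))  ; => true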

Thanks!
steven

[1] – https://github.com/kunstmusik/pink
[2] – https://github.com/kunstmusik/score
[3] – http://www.bartetzki.de/en/software.html
[4] – http://doc.sccode.org/Tutorials/A-Practical-Guide/PG_01_Introduction.html
[5] – http://blue.kunstmusik.com/
[6] – https://github.com/kunstmusik/music-examples

NetBeans RCP: LazyPlugin and LazyPluginFactory

In working with the NetBeans Rich Client Platform to build Blue, I have created a number of plugin types for the program.  Following a standard NetBeans convention, I have implemented plugin registration by having plugins register themselves with the System Filesystem. This is done either through XML (layer.xml) or through annotations I have created for each plugin type.

This works very well, but over time, I found two things I wanted to address:

  1. I was using a lot of boilerplate code for loading plugins and wanted to encapsulate that into an API to simplify things.
  2. I needed a way to lazily load the plugins, such that each is instantiated only when it is requested for use.

To those ends, I wanted to share a couple of classes I have recently worked on: LazyPlugin and LazyPluginFactory. The code for the two classes is as follows:

LazyPlugin:

package blue.ui.nbutilities.lazyplugin;

import java.util.HashMap;
import java.util.Map;
import org.openide.filesystems.FileObject;
import org.openide.filesystems.FileUtil;

/**
 * Lazy Plugin class that wraps a NB FileObject.  
 * 
 * @author stevenyi
 */
public class LazyPlugin<T> {
    private final String displayName;
    private final String path;
    private final Class clazz;
    private final Map<String, Object> metaData;

    
    public LazyPlugin(FileObject fObj, Class c) {
        this.displayName = (String) fObj.getAttribute("displayName");
        this.path = fObj.getPath();
        this.clazz = c;
        metaData = new HashMap<>();
    }

    public String getDisplayName() {
        return displayName;
    }
    
    public T getInstance() {
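        // Instantiation is deferred until this call: FileUtil.getConfigObject
        // loads and instantiates the class registered at 'path' on demand.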
        return (T) FileUtil.getConfigObject(path, clazz);
    }
   
    public Object getMetaData(String key) {
        return metaData.get(key);
    }

    public void setMetaData(String key, Object value) {
        metaData.put(key, value);
    }
}

LazyPluginFactory:

package blue.ui.nbutilities.lazyplugin;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.openide.filesystems.FileObject;
import org.openide.filesystems.FileUtil;

/**
 *
 * @author stevenyi
 */
public class LazyPluginFactory {

    private LazyPluginFactory() {
    }

    public static <T> List<LazyPlugin<T>> loadPlugins(String folder, Class z) {
        return loadPlugins(folder, z, null, null);
    }

    public static <T> List<LazyPlugin<T>> loadPlugins(String folder, Class z, MetaDataProcessor processor) {
        return loadPlugins(folder, z, processor, null);
    }


    public static <T> List<LazyPlugin<T>> loadPlugins(String folder, Class z, Filter filter) {
        return loadPlugins(folder, z, null, filter);
    }
    
    public static <T> List<LazyPlugin<T>> loadPlugins(String folder, Class pluginClass,
            MetaDataProcessor processor, Filter f) {
        List<LazyPlugin<T>> plugins = new ArrayList<>();

        FileObject[] files = FileUtil.getConfigFile(folder).getChildren();

        List<FileObject> orderedFiles = FileUtil.getOrder(
                    Arrays.asList(files), true);

        for (FileObject fObj : orderedFiles) {

            if(fObj.isFolder()) {
                continue;
            }

            if(f != null && !f.accept(fObj)) {
                continue;
            }
            
            LazyPlugin<T> plugin = new LazyPlugin<>(fObj, pluginClass);

            if (processor != null) {
                processor.process(fObj, plugin);
            }

            plugins.add(plugin);
        }

        return plugins;
    }

    public interface Filter {
        boolean accept(FileObject fObj);
    }

    public interface MetaDataProcessor {
        void process(FileObject fObj, LazyPlugin plugin);
    }
}

LazyPlugin encapsulates the plugin and defers loading until the plugin user calls getInstance().  This allows the plugin’s displayName to be used for display to the end user (e.g. “Add GenericScore” for a GenericScore plugin).  There is also a metadata map that is useful for storing additional properties.  Note that the class does not itself hold a reference to the FileObject.

The LazyPluginFactory is used by giving it a folder name to load plugins from, as well as the type of plugin to load.  It returns a List<LazyPlugin<T>> of the plugins found.  The loadPlugins() method can also take a Filter and a MetaDataProcessor.  A Filter inspects a FileObject and decides whether to skip it.  (I have a pre-made AttributeFilter class that checks whether a given attribute name is set to true. I also have a pre-made ClassAssociationProcessor that reads an attribute string assumed to be a fully qualified class name, loads the class, and adds it to the LazyPlugin’s metadata under the key “association”.)
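For example, loading and using plugins might look like the following (a sketch only: the folder path, the Instrument type, and the combo box are placeholders, not actual Blue code):

// Hypothetical usage of the factory and wrapper classes above.
List<LazyPlugin<Instrument>> plugins =
        LazyPluginFactory.loadPlugins("blue/instruments", Instrument.class);

for (LazyPlugin<Instrument> plugin : plugins) {
    // Cheap: only the displayName attribute is read; no plugin class loads.
    comboBox.addItem(plugin.getDisplayName());
}

// The plugin class is instantiated only when actually requested:
Instrument instr = plugins.get(0).getInstance();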

Having LazyPlugin as a class has–in my opinion–made my code a little clearer.  It’s easy for me to look at the code and say, “Oh, I’m dealing with plugins here, and I’m lazily loading them.”  The classes are also generic enough that I can handle two scenarios that appear in my program:

  1. I need to lazily load plugins that are model classes.
  2. I need to lazily load plugins that are view/controller classes and need to have an association with the model class (e.g. this is an editor for class X).

The following are two fragments of the generated XML, one for a model class, the other for a view/controller class:

Model:

            <file name="blue-orchestra-BlueSynthBuilder.instance">
                <!--blue.orchestra.BlueSynthBuilder-->
                <attr intvalue="50" name="position"/>
                <attr name="displayName" stringvalue="BlueSynthBuilder"/>
            </file>

View/Controller:

<file name="blue-ui-core-orchestra-editor-BlueSynthBuilderEditor.instance">
                <!--blue.ui.core.orchestra.editor.BlueSynthBuilderEditor-->
                <attr name="instrumentType" stringvalue="blue.orchestra.BlueSynthBuilder"/>
            </file>

When these are loaded within the program, I get a LazyPlugin<Instrument> and a LazyPlugin<InstrumentEditor>.  The former is used when adding an instance of an Instrument to the project’s main model class; the latter is used when the Instrument is selected. Note that the LazyPlugin’s metadata holds an association class, loaded via a ClassAssociationProcessor that reads the “instrumentType” attribute. That gives the application enough information to set up its cache of editors for instruments without having to load the actual editor until it is requested.

So far I have been happy with this design, and it has served all of the use cases I currently have.  I imagine these classes may get further refined as other use cases come up.  I hope others may find them useful in their NetBeans RCP programs.

Announce: New Score Library in Clojure

Hi All,

I’d like to announce a score generation library written in Clojure called “score”:

https://github.com/kunstmusik/score

This library is currently a work in progress. I am planning to put all general composition functions that I use or plan to explore within this library.

Some notes:

The library currently offers two styles of score generation. One is styled after SuperCollider’s Patterns. Patterns in SC generate values without context and map directly to standard Clojure sequences. gen-notes and gen-score in src/score/core.clj are functions for use with this style of score generation. With this it is simple enough to emulate any feature of SC Patterns using standard Clojure sequence functions.
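For instance (my own sketch using only core Clojure, not functions from the library), some common Pattern behaviors fall out of ordinary sequence expressions:

;; Rough analogues of some SC Patterns in plain Clojure:
(take 6 (cycle [0 2 4]))                     ; ~ Pseq([0, 2, 4], inf)
(take 6 (repeatedly #(+ 60 (rand-int 12))))  ; ~ Pwhite(60, 71, inf)
(take 6 (iterate #(+ % 3) 0))                ; ~ Pseries(0, 3, inf)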

The other score generation style is CMask-based. In CMask, rather than sequences, generator functions are used that operate within a context of time. (The start time of the current event being generated is passed in as an argument.) Having time as an argument makes it possible to express things like time-varying masks, frequencies, etc. So far, I have finished porting all of the features of CMask and have done light testing.
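A small sketch of that idea (hypothetical code, not the library's API): a generator is a function of the event's start time, so its output can vary over the course of the score.

;; time-varying-freq is a made-up helper for illustration.
(defn time-varying-freq
  "Linearly interpolates between 100 Hz and 400 Hz over total-dur seconds."
  [total-dur]
  (fn [start-time]
    (+ 100.0 (* 300.0 (/ start-time total-dur)))))

(let [gen (time-varying-freq 10.0)]
  (map gen [0.0 2.5 5.0 10.0]))  ; => (100.0 175.0 250.0 400.0)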

As for the future of this library, I will be using it in my pieces moving forward and expect to maintain it, adding features as required. I would warn that the library is still a little volatile, so functions may move between namespaces and users may need to update code between these early versions. I hope to clean up and stabilize the API soon so that backwards compatibility can be maintained. (The library is version 0.1.0 at the moment; it will be bumped to 1.0.0 when the API is stable.)

Also to note, the library is purposely designed to be generic. I am targeting Csound score generation at the moment, but the core of the library simply generates lists of lists (see core.clj, and note the difference between gen-notes and gen-score, or gen-notes2 and gen-score2). This allows the library to be used beyond Csound; for example, you could write a formatting function to send the notes as MIDI, OSC, etc. (I have some plans to do some interesting event exploration using score with a Clojure music system I’m working on.)
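As a sketch of that idea (format-note and format-score are my own hypothetical names, not functions from the library), a Csound formatter over lists of lists might look like this, and a MIDI or OSC sender could replace format-note without touching the generation code:

(require '[clojure.string :as str])

(defn format-note [note]
  (str "i" (str/join " " note)))

(defn format-score [notes]
  (str/join "\n" (map format-note notes)))

(format-score [[1 0.0 1 440] [1 1.0 1 660]])
;; => "i1 0.0 1 440\ni1 1.0 1 660"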

For examples, I have some demo clj files that I used while developing within a REPL (https://github.com/kunstmusik/score/tree/master/src/score/demo). They show a bit of what using the library looks like.

Comments and contributions would be very welcome.

Thanks!
steven

Clojure and Blue/Csound Example

I have started work on a new composition and wanted to move from Python to Clojure as the scripting language for my music.  I was experimenting yesterday and was pleased to be able to write a score generation function that was fairly flexible, allowing me to easily use hardcoded values as well as arbitrary sequences for filling in p-fields of the generated Csound score.

From my work session I came up with some fairly condensed code I was happy with:

(require '[clojure.string :refer [join]])

(defn pch-add [bpch interval]
  (let [scale-degrees 12
        new-val (+ (* scale-degrees (first bpch))
                   (second bpch) interval)]
    [(quot new-val scale-degrees)
     (rem new-val scale-degrees)]))

(defn pch->sco [[a b]]
  (format "%d.%02d" a b))

(defn pch-interval-seq [pch & intervals]
  (reduce (fn [a b] (conj a (pch-add (last a) b))) [pch] intervals))

(defn pch-interval-sco [pch & intervals]
  (map pch->sco (apply pch-interval-seq pch intervals)))

(defn score-arg [a]
  (if (number? a)
    (repeat a)
    a))

(defn gen-score [& fields]
  (let [pfields (map score-arg fields)]
    (join "\n"
          (apply map (fn [& args] (join " " args)) (repeat "i") pfields))))

;; EXAMPLE CODE

(def score
  (gen-score 1 0 1
             (pch-interval-sco [6 0] 12 8 6 2 6)
             (pch-interval-sco [6 0] 12 8 6 2 6)
             (range -10 -100 -1) 0 1))

(print score)

The output from the print statement is:

i1    0.0    1    6.00    6.00    -10    0    1
i1    0.0    1    7.00    7.00    -11    0    1
i1    0.0    1    7.08    7.08    -12    0    1
i1    0.0    1    8.02    8.02    -13    0    1
i1    0.0    1    8.04    8.04    -14    0    1
i1    0.0    1    8.10    8.10    -15    0    1

The key part is the gen-score function.  It can take either a number or a sequence for each argument.  If a number is given, it will be repeated for each note; sequences can be infinite or finite. The only requirement when using gen-score is that at least one of the arguments is a finite sequence.

This is fairly similar to SuperCollider’s Pattern system, though it uses the standard abstractions found in Clojure. To me, it is a bit simpler to think in sequences to generate events than to think in the Pattern library’s object-oriented abstractions, but that is just my own preference.  Also, the Pattern system in SC is designed for real-time scheduling and has an option for a delta-time generator.  I think the delta-time aspect could be added fairly easily using an optional keyword argument to gen-score.

As for making this work in real time, gen-score would have to be rewritten to return lists of p-field values instead of formatted strings. A priority queue could then be used to check the time values of the notes, pausing and generating new notes as the scheduling time demands.
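Here is a rough sketch of that approach (entirely hypothetical code; a pre-sorted list stands in for a real priority queue, and fire! is whatever sends a note onward):

;; Sketch: notes are [start-time & pfields] vectors; fire! consumes pfields.
(defn play-notes!
  [notes fire!]
  (let [queue (sort-by first notes)               ; stand-in priority queue
        t0    (/ (System/nanoTime) 1e9)]
    (doseq [[start & pfields] queue]
      (let [now  (- (/ (System/nanoTime) 1e9) t0)
            wait (- start now)]
        (when (pos? wait)
          (Thread/sleep (long (* 1000 wait))))    ; sleep until start time
        (fire! pfields)))))

;; e.g. (play-notes! [[0.0 440] [0.5 660]] println)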

Ultimately, I was very pleased with the short amount of time required to write this code, as well as its succinctness.  One thing that has been on my mind is whether to use a CMask/PMask kind of approach for the p-field sequence generators.  In those systems, what would be comparable to the sequence generators–I think they’re just called Fields there–actually take in a time argument.  That gives the generators some context about what to generate next and enables masking values over time.  I am fairly certain I will need to update gen-score or create an alternative that accepts generator functions.  I’ll have to consider whether to use a Clojure protocol, but I may be able to get away with just testing whether each argument to gen-score is a function, sequence, or number and acting appropriately.  (Something to look at next work session. :) )
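That dispatch might look something like the following sketch (score-arg* is a hypothetical extension of score-arg from the code above, with each generator function receiving the start time of the note being generated):

(defn score-arg* [a start-times]
  (cond
    (fn? a)     (map a start-times)  ; CMask-style, time-aware generator
    (number? a) (repeat a)           ; constant repeated for every note
    :else       a))                  ; assume an ordinary (maybe infinite) seq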

How Csound Works – Presentation from the 2nd International Csound Conference

I recently gave a talk at the 2nd International Csound Conference, held at the Berklee College of Music in Boston, entitled “How Csound Works”.  In the talk, I went through high-level design, key data structures, the Orchestra compiler, the event and runtime system, and some other features.  I have placed a copy of the presentation slides as a zip file here:

Download Slides

Additionally, the slides can be viewed online here.

Regarding the slides, I used Hakim El Hattab’s wonderful JavaScript slide framework, Reveal.js.

Note: I believe the presentation was recorded.  When that is made available, I will update this entry with a link to the video.

2nd International Csound Conference 2013

The 2nd International Csound Conference took place this past weekend at the Berklee College of Music in Boston.  I had a fantastic time there getting to see Jean-Claude Risset, John Chowning, and Barry Vercoe all give keynotes.  There was a really nice tribute session for Max Mathews, with people like Tom Oberheim, David Zicarelli, and Max’s family there sharing some beautiful stories about Max.

Beyond that, it was great to see all the latest going on in the Csound community. I thoroughly enjoyed Rory Walsh’s presentation on Cabbage, as well as Andres Cabrera’s presentation on CsoundQt. They’ve both made some great developments in their software!  I also really enjoyed Oeyvind Brandtsegg’s presentation “Sonification with Csound – Quasar Correlations”, discussing an upcoming installation work.

There were many presentations given on the second day, and as they ran in parallel tracks, one simply couldn’t attend everything.  I ended up giving one presentation on the first day and two on the second, one of which spanned two parts.  Because of that I certainly felt like I missed out on some presentations, but I believe everything was being filmed, so I am looking forward to watching those recordings when they are out.

Regarding the presentations I gave, I think I did just a so-so job on the Blue presentation, but was happy with how the “How Csound Works” and “How to use the Csound API” talks went. Given that there was a lot to prepare, I was happy in the end with how it all turned out.

The concerts offered a very nice variety of pieces in different aesthetics. I was happy to be listening to music on very nice speakers, and especially enjoyed being in the company of many friends while doing so.  I also enjoyed meeting a number of new people and finally putting faces to names I had long known from the mailing list.

Overall, I had a great time in Boston. I think Dr. Boulanger and the Berklee College of Music did a wonderful job in organizing and creating a very special and memorable event. I’m already looking forward to the next Csound Conference!

Julian Parker Ring Modulator

I wanted to share an implementation of Julian Parker’s digital model of a ring modulator. His paper from DAFx 2011 [1][2] was also used by the BBC Radiophonic Workshop in “Recreating the sounds of the BBC Radiophonic Workshop using the Web Audio API” [3].

I’ve implemented the ring modulator as a UDO, available here:

Blue Project and CSD (ringmod.zip) 

To run the CSD from the command line, you can use:

csound -i adc -o dac -b 128 -B 512 ringmod.csd

The Blue project has knobs you can use to adjust the carrier’s amplitude and frequency. In the generated CSD, you can adjust gk_blue_auto0 for amplitude and gk_blue_auto1 for frequency, or just modify the poscil line in instr 1.

Note: the paper suggests using a high amount of oversampling (32x, with 8x or 16x being reasonable when using a sinusoidal carrier). This implementation does not do oversampling, which I believe the BBC version does not do either.

As far as I’ve checked, the implementation matches the BBC one, with the exception of using a limiter instead of a compressor. Also, I made one optimization to the wavetable generation, extracting out a constant in the branch where v > vl, but this is minimal since the wavetable generation is only done once anyway.

[1] – http://www.acoustics.hut.fi/publications/papers/dafx11-ringmod/
[2] – http://recherche.ircam.fr/pub/dafx11/Papers/66_e.pdf
[3] – http://webaudio.prototyping.bbc.co.uk/ring-modulator/

TimeSphere

Completed: 2013.07.27
Duration: 7:03
Ensemble: Electronic (blue, Csound)

MP3: Click Here
OGG: Click Here
FLAC: Click Here
Project Files – Click here (.blue, .csd)

“TimeSphere” is inspired by the idea that time is not infinite but bounded, like a sphere, and that there are infinite possible projections through time within this sphere. (I don’t remember the exact origin of this thought, but I believe I may have derived it from Stephen Hawking’s idea of a closed universe in “A Brief History of Time”.)

I’ve always found the world to be filled with many strata of time. Things move together, alone, and somewhere in between, shifting from one time flow to another. The idea of a sphere of time in which the world moves was an inspiration for this work, not something interpreted literally. While composing this piece, I was very aware of the interplay between rational development and the exploration of where intuition guided me.

waveseq – Wave Sequencing User-Defined Opcode for Csound

Lately I’ve been interested in a number of hardware synthesizers that came out in the late ’80s and early ’90s, as I’ve found their synthesis methods rather curious and inventive.  One of them, the Korg Wavestation, has a very interesting synthesis system using a combination of Vector Synthesis and Wave Sequencing. Vector Synthesis is easy enough to implement using cross-fading between different oscillators or sound generators, but I was curious to try implementing Wave Sequencing in Csound code.

To implement this, I used information obtained online, information in the manuals, time experimenting on a hardware Korg Wavestation, as well as time with the Korg Legacy Wavestation Software (I ended up purchasing the whole Legacy Collection). Here is an example of the waveseq User-Defined Opcode (UDO) using f-tables generated by GEN10:

Example 1:

As well as f-tables using sampled drum sounds:

Example 2:

The UDO is implemented such that it takes in an f-table that describes the entire wave sequence.  Therefore, most of the work in using this opcode is in creating the set of f-tables to sequence through.  I did implement the following features:

  • Tempo: a duration of 24 equals a quarter note; if tempo is non-zero, it is used to set the duration of the quarter note; if 0, the tempo defaults to about 105 BPM
  • Wave Sequence: start wave, looping type (0 = forwards, 1 = forwards and backwards), start wave for the loop, end wave
  • Wave Tables: single-cycle wave/single-shot wave/looped wave (determined by whether the sample rate given in the waveseq table is 0, positive, or negative), amplitude adjustment, cross-fade time, duration of table to play

About the design, a wave sequence table holds information about how many tables are in the sequence and how to play them. For example, in Example 2, the wave sequence table used is:

itab_bass ftgenonce 0, 0, -9512, 1, "BDRUM11.WAV", 0, 0, 0
itab_tom ftgenonce 0, 0, -17600, 1, "TOM5.WAV", 0, 0, 0
itab_snare ftgenonce 0, 0, -10947, 1, "SNARE11.WAV", 0, 0, 0

iwaveseqtab ftgenonce 0, 0, -32, -2, 3, 1, 0, 0, 2,  
	itab_bass, ftsr(itab_bass), 1, 1, ixfade, iwavedur,  
	itab_tom, ftsr(itab_tom), 2, 1, ixfade, iwavedur,  
	itab_snare, ftsr(itab_snare), 2, 1, ixfade, iwavedur

The iwaveseqtab has a size of 32 (it just needs to be big enough to hold the information for the other tables), and its first line describes:
  • 3 tables are in this wave sequence
  • 1 is used to denote backwards and forwards playing through the sequence
  • 0 is the index of the start wave
  • 0 is the index of the loop start
  • 2 is the index of the loop end

After that come the tables to be used.  For example, the entry that starts with itab_bass says:
  • the f-table to play (itab_bass)
  • the sample rate of the table (positive here, so it plays as a single shot)
  • an amplitude adjustment of 1 (the amplitude is multiplied by this factor)
  • a pitch adjustment of 1 (not currently implemented)
  • a crossfade of 0 (ixfade = 0 earlier in the code, not listed above)
  • a duration of 6 (iwavedur = 6 earlier in the code, not listed above), which is equivalent to a 16th note

The waveseq UDO uses the tablexkt opcode, manually increments phasor variables, uses linear amplitude adjustments when cross-fading, and contains a good deal of code for reading from the wave sequence table and configuring things. The code still requires some cleanup work, but I wanted to go ahead and make this initial, mostly complete implementation available.  I plan to implement some further features for the waveseq opcode, then create either a full Blue instrument plugin or a BlueSynthBuilder version of this instrument that will allow easier creation and organization of f-tables into wave sequences. I am also thinking about adding Vector Synthesis as well (using four waveseq instances).

Overall, it was quite an enjoyable experience to study the Wavestation and learn to implement wave sequencing in Csound code.  In the end, I’m still looking at where I might use this opcode in my own work, but it’s nice to know it’s available should I find a use for it.

Download the Examples and MP3’s here: waveseq – example CSD’s and MP3’s

 
