Moritz v3



Introduction to version 3, January 2015
Top level user interface
Background
The future of music notation
A 21st century time paradigm
Summary
History (archived documentation)
Acknowledgments


Introduction to version 3, January 2015

Moritz was born in 2005, and was named after Max’s terrible twin (see Max and Moritz).
Max is a program which specializes in controlling information (sounds) at the MIDI event level and below. Moritz deals with the MIDI event level and above (musical form). MIDI events are the common interface (at the notehead symbol level) between these levels of information.

Moritz is an open source Visual Studio solution, written in C#, and hosted at GitHub.
Anyone is welcome to look at the code, fork the solution or its underlying ideas, and use them in any way they like. Any help and/or cooperation would be much appreciated, of course.

Moritz currently contains two Windows desktop applications: the Assistant Composer and the Krystals Editor.

The Assistant Performer has split off to become a separate program, written in HTML and JavaScript, that runs in browsers on the web.

The Assistant Performer became a Web MIDI application for the following reason:
The Assistant Composer creates scores in an SVG format which will display in browsers, but the desktop Assistant Performer could not interact with the browser's display. It was impossible to set the playback start and end positions, disable tracks etc., and then play back subsections of the score. Also, of course, I wanted to reach more users by not requiring them to install any software. The use of a plug-in, or of the Chrome Web-MIDI-API flag, is, I hope, a temporary measure.

The code for the Assistant Composer was thoroughly overhauled during the autumn of 2014, and should now be much easier to understand. Many optimisations are still possible, but at least the worst of the spaghetti has disappeared. (My coding style is a bit pedestrian by present-day C# standards, but maybe that's not such a bad thing.)

The biggest change, apart from cleaning up the code, is that scores can now contain both input and output chords. This enables much greater control over what happens when MIDI input information arrives during a live performance: Parallel processing can be used to enable a non-blocking, "advanced prepared piano" scenario. Single key presses can trigger either simple events or complex sequences of events, depending on how the links inside the score are organized. An example score can be viewed (but not yet played) here.
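
To make the input/output linking concrete, here is a minimal sketch in C#. All of the type and member names are hypothetical (Moritz's real input and output chord classes are richer, and named differently); the sketch only illustrates the principle that each input key is linked, inside the score, to the output events it triggers:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical types, for illustration only.
public sealed class OutputEvent
{
    public int MidiPitch;   // MIDI key number of the event to be played
    public int MsDuration;  // duration in milliseconds
}

public sealed class InputKeyMap
{
    // Each input key (a notehead in an input chord) is linked to the
    // sequence of output events that it should trigger.
    private readonly Dictionary<int, List<OutputEvent>> _links =
        new Dictionary<int, List<OutputEvent>>();

    public void Link(int inputKey, List<OutputEvent> outputSequence)
    {
        _links[inputKey] = outputSequence;
    }

    // Called when a MIDI note-on arrives during a live performance.
    // A single key press triggers either one simple event or a whole
    // sequence, depending on how the links were set up in the score.
    public IReadOnlyList<OutputEvent> OnNoteOn(int inputKey)
    {
        List<OutputEvent> sequence;
        return _links.TryGetValue(inputKey, out sequence)
            ? (IReadOnlyList<OutputEvent>)sequence
            : Array.Empty<OutputEvent>();
    }
}
```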

Another important change is in the way ornaments are composed in palettes. Ornament value strings are no longer related to krystals, but are now entered directly instead. This removes an unnecessary awkwardness, and should help create more interesting ornaments and ornament relations in future.


Top level user interface

Moritz’ opening window is a simple entry point, from where the Assistant Composer or the Krystals Editor can be started:

[image: 1aMoritz3Main.png]

Moritz: preferences and about Moritz open Moritz’ Preferences and About Moritz dialogs respectively.
Assistant Composer: load settings opens the Assistant Composer with an existing score settings file.
(See the Assistant Composer documentation for how to create new score settings files.)
Krystals Editor: open opens the Krystals Editor.
quit quits Moritz.

Preferences

[image: 1cMoritzPreferences.png]

Preferred MIDI output device: When Moritz starts, it scans the system for active MIDI output devices, storing them internally. When the Preferences dialog opens, their names are added to this selector. (A sketch of such a device scan appears after this list.) The device selected here is used to play events that are being defined in the Assistant Composer's Palettes Editor. In addition to my preferred device (the CoolSoft VirtualMIDISynth), this menu also contains the "Microsoft GS Wavetable Synth" that is part of the Windows operating system. I usually use the VirtualMIDISynth set up with the Arachno SoundFont, or something similar. See Remarks about the recordings at this site.
Local files and folders: The locations of files and folders on the user's computer. The values printed in blue are just for reference.
Moritz Preferences file: The preferences file is now kept in the user's AppData\Roaming\Moritz folder. If the preferences file does not exist, Moritz creates one at this location when it starts. (A sketch of this path resolution appears after this list.)
User's Moritz documents folder location: The drive or folder in which the user's Moritz folder is kept. The user's Moritz folder contains the set of standard folders that Moritz needs to read, and the folder in which scores are created. On my main computer, I keep such folders on my D: drive. On my laptop they are in C:\Documents. Changing this location automatically updates the following five locations:
audio folder: Contains audio files that are used as models in the Assistant Composer's Palette Editor.
krystals folder: Used by both the Assistant Composer and the Krystals Editor.
expansion fields folder: Used by the Krystals Editor.
modulation operators folder: Used by the Krystals Editor.
score creation folder: The folder containing individual score folders. Each score has its own folder, where the Assistant Composer finds the score's settings file, and where it saves the score it has created. Scores are kept at this particular location so that they can be uploaded directly to the corresponding on-line location — where they can be accessed by the Assistant Performer.
Online folders: Currently just the location of the Online XML Schemas folder. This is for my own reference...
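
The device scan mentioned above can be done on Windows with the Win32 multimedia API. The following is a minimal sketch of such a scan, not Moritz's actual code; it just shows where names like "Microsoft GS Wavetable Synth" come from:

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;

public static class MidiOutDevices
{
    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Auto)]
    private struct MIDIOUTCAPS
    {
        public ushort wMid;
        public ushort wPid;
        public uint vDriverVersion;
        [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
        public string szPname;  // the device name shown in the selector
        public ushort wTechnology;
        public ushort wVoices;
        public ushort wNotes;
        public ushort wChannelMask;
        public uint dwSupport;
    }

    [DllImport("winmm.dll")]
    private static extern uint midiOutGetNumDevs();

    [DllImport("winmm.dll", CharSet = CharSet.Auto)]
    private static extern uint midiOutGetDevCaps(
        UIntPtr uDeviceID, ref MIDIOUTCAPS caps, uint cbCaps);

    // Returns the names of the system's active MIDI output devices,
    // e.g. "Microsoft GS Wavetable Synth".
    public static List<string> GetDeviceNames()
    {
        var names = new List<string>();
        uint deviceCount = midiOutGetNumDevs();
        for (uint i = 0; i < deviceCount; i++)
        {
            var caps = new MIDIOUTCAPS();
            uint size = (uint)Marshal.SizeOf(typeof(MIDIOUTCAPS));
            if (midiOutGetDevCaps((UIntPtr)i, ref caps, size) == 0) // MMSYSERR_NOERROR
            {
                names.Add(caps.szPname);
            }
        }
        return names;
    }
}
```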
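
Similarly, here is a sketch of how the file and folder locations described above can be resolved in C#. The AppData\Roaming\Moritz location and the derived folders follow the description above, but the concrete file and folder names used here are assumptions for illustration:

```csharp
using System;
using System.IO;

public static class MoritzPaths
{
    // e.g. C:\Users\<name>\AppData\Roaming\Moritz
    public static string PreferencesFolder
    {
        get
        {
            string roaming = Environment.GetFolderPath(
                Environment.SpecialFolder.ApplicationData);
            return Path.Combine(roaming, "Moritz");
        }
    }

    // Changing the user's Moritz documents folder automatically updates
    // the dependent locations, which can simply be derived from it.
    // (The folder names here are hypothetical.)
    public static string AudioFolder(string moritzDocs)
    {
        return Path.Combine(moritzDocs, "audio");
    }

    public static string KrystalsFolder(string moritzDocs)
    {
        return Path.Combine(moritzDocs, "krystals");
    }

    public static string ScoreCreationFolder(string moritzDocs)
    {
        return Path.Combine(moritzDocs, "scores");
    }

    // If the preferences file does not exist, create one when Moritz starts.
    // (The file name "Preferences.xml" is an assumption.)
    public static string EnsurePreferencesFile()
    {
        Directory.CreateDirectory(PreferencesFolder); // no-op if it already exists
        string path = Path.Combine(PreferencesFolder, "Preferences.xml");
        if (!File.Exists(path))
        {
            File.WriteAllText(path, "<preferences />"); // a real file has content
        }
        return path;
    }
}
```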

About

[image: 1bMoritzAbout]


Background

Moritz' origins go back to the early 1980s, when I had been trying for a decade or so (as a professional copyist of Avant-Garde music) to find some answers to the conceptual mess that had been plaguing music notation since the beginning of the century. I suddenly understood that much of the confusion could be avoided by keeping the spatial and temporal domains of music notation strictly separate in one's mind. I realized that all music notation (including standard 20th century music notation) is timeless writing — ink smears on paper. It has no necessary relation to clock time. This meant that there was no reason for symbols to have to add up (in clock time) in bars, and that tuplet symbols are unnecessary. I published these ideas, backed up by further arguments, in the essay The Notation of Time in 1985.
Tuplets are, in fact, a 19th century mistake. Earlier music doesn't use them, so why should we?

This solved the conceptual problems in notation being faced by the Avant-Garde at the time, but did not solve the parallel, temporal problems of performance practice and rehearsal time. The Early Music revival, then in full swing, used recordings to develop traditions of performance practice, but this option was not available to composers trying to develop their own, specific performance practices.

Having a default performance embedded in a score is a possible answer to this problem:
The SVG-MIDI score format clearly differentiates between spatial and temporal responsibilities by containing symbols whose appearance (graphics) is independent of their meaning (embedded temporal information). The temporal information in such scores defines a default performance — a performance that can be experienced and imitated.
It is simply a mistake to try to define a (temporal) default performance using (spatial) graphics (tuplets, metronome marks etc.). The whole point of music notation is that it should be legible in real time, and be a reminder of something that has been experienced beforehand. A symbol does not have to be an exhaustive description of its own meaning.
SVG-MIDI scores can contain a composer's original default performance, but a different score having the same graphics could just as well contain a recording of a later performance, created by a real performer.
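
As a conceptual sketch of this separation, the following C# (using LINQ to XML) builds a chord symbol whose graphics and embedded temporal information sit side by side in one SVG element. The "score" namespace and all element and attribute names here are hypothetical, not the actual SVG-MIDI schema; the point is only that the two domains are stored independently, so the same graphics could carry a different default performance:

```csharp
using System.Xml.Linq;

public static class SvgMidiSketch
{
    static readonly XNamespace svg = "http://www.w3.org/2000/svg";
    // A hypothetical namespace for the embedded temporal information.
    static readonly XNamespace score = "https://example.org/score";

    public static XElement ChordSymbol()
    {
        return new XElement(svg + "g",
            new XAttribute("class", "chord"),
            // Spatial domain: the timeless ink on the page.
            new XElement(svg + "ellipse",
                new XAttribute("cx", 100), new XAttribute("cy", 50),
                new XAttribute("rx", 4), new XAttribute("ry", 3)),
            // Temporal domain: the embedded default performance.
            // A score with identical graphics could instead embed a
            // recording of a later performance by a real performer.
            new XElement(score + "midi",
                new XAttribute("msDuration", 450),
                new XElement(score + "noteOn",
                    new XAttribute("key", 60),
                    new XAttribute("velocity", 64))));
    }
}
```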

I have been trying very hard to get other people to listen to these ideas since 1985. This has, however, proved to be a very difficult undertaking: Unfortunately it is not enough, even on the web, just to provide good, logical arguments showing that standard 20th century music notation contains mistakes and is unnecessarily complicated.

Karlheinz Stockhausen, whose principal copyist I had been since 1974, said (referring to a source that I have forgotten) that if you have an idea that is really new, then it takes a generation for the synapses in people's brains to grow together to catch up. He was himself unable to take these ideas on board, and simply ignored them. Less excusably, the academic world is, after about 30 years, still looking the other way.

"Getting inside the machine" to develop software libraries is a way to build on previous work, and to make progress, even without much help from anyone else. Feedback is very important for the development of any ideas, even if it is self-reflexive feedback. Talking to machines at least means that ideas have to be formulated precisely, and that there is no way to cheat. Also, formulating ideas very precisely often suggests ways to develop them.

Re-usable software libraries instantiate and express generalizations, but they need to be tested using concrete instances. For Moritz, the test cases are currently just my compositions (which are experimental music). If other composers want to write for the Assistant Performer, they will either have to use Moritz' classes, fork the code and develop some of their own, or write their own SVG-MIDI score generators from scratch. The file format is itself style-neutral.

I'm interested in both the generalizations and the test cases, but for me the generalizations are actually more important — that is the area that is especially dependent on cooperation. That's where agreement needs to be found. Where culture is. The days of Romantic heroes, pretending that they have all the answers, are long gone.


The future of music notation

The Assistant Composer/Performer project is part of an effort to re-enable writing as a means of composing large-scale polyphonic processes. Music is not the only application.

Chord symbols, staves and beams evolved in music notation over several hundred years to communicate a high density of polyphonic information on the page. Needing to be readable in real time, they evolved to be as legible as possible, making them good candidates for use in graphic user interfaces on computers. Because the symbols communicate a much higher information density than the space=time notations currently used everywhere in today’s music software, they ought to become a useful alternative in a large number of music applications.

But there are uses in any area that could benefit from being able to program multiple parallel threads. I can even imagine music notation being used as the basis for new kinds of programming language. (The symbols don't necessarily have to contain MIDI information.)

Future uses of SVG-MIDI scores are not limited to simple music-minus-one played in private (think of Mozart’s Clarinet Concerto). The scores can describe any kind of temporal process, including both passive recordings and processes which require live control. Performances can be cooperative, with several live performers on different computers. Output devices can include ordinary synthesizers, robot orchestras (or other teams of robots), lighting systems etc.

The Assistant Composer currently writes only a few of the more important annotations — score metadata (title, author etc.), bar numbers, staff names — but there is no reason why other annotations (such as verbal instructions, tuplets etc.) should not be added in a non-functional layer in the SVG. If tuplet annotations were present, many 19th and 20th century scores could be presented with familiar graphics, making it easier for performers to continue their existing traditions of performance practice.


A 21st century time paradigm

The simple, 19th century common-sense time paradigm was that people are like expressive metronomes. This dualism led to the mistaken development of tuplet symbols in standard music notation, and also meant that 19th century music notation theory took no account of performance practice traditions. These failings eventually led to the collapse of written music at the beginning of the 20th century, so I think it would be a good idea to try to describe a more modern version in order to avoid such problems for as long as possible in future.

Working on Moritz, and reading the lay scientific literature, leads me to think of time as follows:
The arrow of time is part of a brain strategy for dealing with complexity. It is the prerequisite for perception (consciousness), because without it we would suffer from information overload.
In the 21st century, the mathematical equations that describe space-time seem not to have a time-arrow. The arrow seems to be more closely related to consciousness than to bed-rock physical reality.

Whatever else they are, brains are a part of the space-time manifold where the universe is looking at itself, so they can't see the whole picture. And they also need to simplify the information that they are perceiving (the percepts). I think of brains as being little bits of space-time that look at the universe by grabbing other little bits of space-time, separating them into space and time, calling them "here" and "now", and then using chunking to develop higher level structures for coping with the percepts.

Brains use chunking both in perceived space and perceived time. They also use it to create other, quasi-dimensions: We don't perceive the frequencies of colours or sounds, just their effect (greenness, blueness, stringy timbre, windy timbre etc.). Space, time, greenness, blueness, stringy timbre, windy timbre etc. don't exist at all at the physical, space-time level. The physical level is not perceived at all, and can only be described using mathematics. But, using the idea of chunking, the percepts are compatible with the equations.

The use of chunks is very much like the use of subroutines in computing. We have known, since their discovery in the early 1950s, that subroutines can create hermetically separated levels of information. This was an idea quite inaccessible to that generation of Avant-Garde composers.
Remember Stockhausen's fascination with the smooth transition from frequency to pulse in Kontakte... Remember, too, that the digital revolution came too late to affect that generation's thinking (ca. 1970). The first MIDI standard was not published until 1983 — while I was working on The Notation of Time.

For whatever reason, there is a smallest duration that can be perceived.
It used to be thought, rather vaguely, that brains are analog devices that scan a time-continuum containing arbitrarily small instants. But is our perceived "now" really infinitesimally short? Does perceived time just appear continuous because we can't perceive the joins between the chunks of time we are perceiving? At the physical level, there is no perceived time at all, only space-time. For me, brains deal with chunked time, and the chunks are not infinitesimally small. Moritz, following the Web MIDI API, measures time in increments of 1 millisecond (the DOMTimeStamp).

The precise value for the smallest perceptible duration is not important here, but it seems to be around 2 milliseconds, depending on who is being tested and how the value is measured.
See also the units of measurement in Wikipedia: Orders of magnitude (time).

Once brains have begun to create time chunks from the space-time manifold, these are further grouped into higher level symbols so that they are more easily comprehensible.
This can be seen happening in the words we use for higher levels of musical structure. We have words for timbres, notes, phrases, bars, movements, symphonies and so on, even though these are impossible to see in the flat space=time diagrams used in current sound editing/playing software. These high level musical structures, especially the smaller ones, seem to be somehow accessible to us as if they were there all at once. Time chunking seems to work in a very similar way to the spatial chunking that happens in music notation (noteheads, chord symbols, beams, bars, systems, staves etc.).

So perceived time does not have the simple, direct relation to a physical time-continuum that standard tuplets pretend it has. Musical time is not related to an ether-like, absolute, continuous, divisible time.
Note, too, that tempi are at a much higher information level than timbre. Tempi in music, where they exist at all, are related to the sizes of dancing human bodies. Tempi are fundamental to standard 20th century music notation, but they are not fundamental to musical time. There is no perceived tempo at the lowest level of time perception.

Perceived time is related to perceived events and memory. I think of short term memory as being closely related to the direct reading of a region in the space-time manifold, and longer term memory as working differently. However the different levels of memory work, I think that no memory of any kind would be possible without the brain's ability to create higher level symbols from more basic ones.

Performers learn to control perceived time in the context of a real, living tradition of performance practice (which we perceive as being stored somehow in human memory). Max is about our interaction with physical machines; Moritz is about linking those machines to human memory.
But musicians need only be concerned with how their memories work in practice. Physicists and brain specialists can be left to discover how they work under the hood.


Summary

The 19th century time paradigm collapsed at the beginning of the 20th century. Avant-Garde composers wrestled hard with the inherited problems during the 1950s and 60s, but finally gave up (for sociological, philosophical and technical reasons) around 1970, leading to the collapse of written New Music. We have seen enormous progress since then in the low level, physical control of sounds, but the development of high level musical grammars has stopped.
Nowadays, physical space (screen pixels, music notation) and time (milliseconds, recordings) can be stored separately, but linked, inside the same timeless computer file. And timeless writing is the key to any cultural development...
The Assistant Composer/Performer software exists to allow both

History (archived documentation)

Archived documentation for Moritz v1
Archived documentation for Moritz v2

Acknowledgments

