Distributed Editing, Responsibility and Quality

Recently a concern about distributed editing concepts was brought to my attention. Since it adds to other reservations I have been aware of for a while, and since I will coincidentally be talking about exactly this matter at a “Winterschool” conference in a few weeks, this is a good opportunity for a post about responsibilities in distributed edition workflows. I am convinced that any reservations about compromising the quality of musical editions by giving up established workflows or by incorporating the work of multiple contributors are completely unfounded, or rather based solely on fear of the unknown and fear of change. In fact, distributed work with text-based tools and version control gives a lot of flexibility and exciting new possibilities, and it adds multiple layers of safety nets rather than adding risks.

Traditional Editing Chains

Workflows in publishing houses or academic edition institutes are characterized by a clear separation of concerns and responsibilities. Of course there are differences, but in one way or another they are variations of a document flow like this:

  • Existing copy
    (Editor works on an arbitrary pre-existing score)
  • Review/engraver’s copy
    (Check editor’s work)
  • Typesetting
    (Enter the music in a new score document)
  • Proof-reading/editing loop
    (Check typesetting against engraver’s copy, re-check editor’s work)
  • Engraving fine-tuning
    (Bringing the score to publication quality; hopefully there won’t be any content changes required after that step)
  • Compilation of volume/prepress

with a number of possible reiteration loops. While individual persons may be assigned to more than one of these stages the point is that the stages are cleanly separated and responsibility for each stage is clearly attributed.

Passing Files Around

One major problem with this traditional toolchain is the need to constantly pass around files and copies of files. In an earlier post I outlined the serious problems that arise from that and how working with LilyPond and a version control system like Git simply makes them vanish. These advantages alone are probably sufficient to decide to switch to using a version control system.

Working on a Common Data Repository

At the core of a distributed workflow there is a common data repository which is controlled by a version control system like Git and hosted on a central server. Of course there are many exciting aspects to this, but for today I’ll only mention one: as everybody has parallel access to all project files and all tools are freely available, technically each team member can perform any task whenever it is necessary or whenever they feel like it. An extreme manifestation of this would be a project where all responsibility is fully shared by all team members, leaving the actual process to self-organization.
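
As a minimal sketch of what this looks like in practice (the server address and file names are placeholders, not an actual project’s), every team member simply works with an ordinary local copy of the shared repository:

    # Get a local working copy of the shared project repository
    # (the server address is a placeholder)
    git clone git@example.org:edition-project.git
    cd edition-project

    # Bring in what the other team members have done in the meantime
    git pull

    # ... edit the LilyPond sources with any text editor, then share the work:
    git add movement-2/violin-1.ily
    git commit -m "Enter violin 1, second movement, measures 1-24"
    git push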

This prospect seems frightening to people who are used to traditional editing workflows, and there are two reservations commonly expressed with regard to such a concept. Some people worry about quality control when access to the data isn’t restricted by a hierarchical division of labor, and some simply do not want their responsibilities changed and weakened (fearing that might open the door to anarchy and chaos).

Traditional Workflow With Version Control

The first thing that has to be said here is that version-controlled set-ups do not require you to go all the way. Even with version control it is possible to model a completely traditional approach to the editing toolchain. One person may enter the music while another proof-reads it; then the main editor does his critical review, after which the edition is proof-read again and the engraving is beautified by a professional engraver. Finally a graphic designer could combine the score with textual elements and do the pre-press work to submit the final compiled volume to the printer. Responsibilities can be tailored exactly as with other toolchains if this is desired. This alone would already be an improvement, especially in terms of quality control.
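
For illustration, a hand-off from typesetter to proof-reader then no longer involves sending anything around; the proof-reader simply updates their working copy and inspects the latest changes (file name and commit message are invented for the example):

    # Proof-reader: pick up the current official state of the edition
    git pull

    # See exactly what the typesetter changed in the last step
    git diff HEAD~1 -- sonata/mvt-1.ily

    # Record the corrections as a new, clearly attributed step
    git commit -am "Proof-reading: correct slurs and dynamics in mvt. 1"
    git push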

At a fundamental level the basic difference to traditional toolchains is that in version-controlled environments documents don’t have to be passed around through shared drives or by email. Through this alone, all the hassles and potential issues that arise from creating digital copies of documents become obsolete. It is, for example, impossible to mess up a document by having two people independently edit different copies of it. It is also more or less impossible for changes to a document to go unnoticed just because the last editor failed to document them. Put the other way round: it is not necessary to accompany a modified document with an email listing all the modifications made to it, because the person working on it next can simply check the commit to see what has been done:

A “commit” reveals the detailed changes to a file. Click to see the full commit online.

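The same information is always available from the command line as well; for example (the commit hash is a placeholder):

    # A compact overview of who changed what, and when
    git log --stat

    # The full details of one particular commit
    git show 4f2a9c1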

Already at this level it should become clear that using collaborative tools actually increases the level of quality control rather than giving way to poor standards or compromise.

Experiencing the Benefits of Version Control

Add to that the further virtues of version control by stepping back from the strictly sequential workflow of the olden days and by loosening the fixed distribution of responsibilities. If contributors are allowed to perform different tasks based on their skill set and current availability, they waste much less time waiting for appropriate work to flow in. If there is currently no music to be proof-read, an engraver can instead spend his time entering new music or working on the overall appearance of the engraving. (Basically this is an opportunity to implement the Kanban methodology from software development in musical edition processes.)

But more importantly, version control provides an additional safety net through the possibility of working in isolated sessions (or branches). Work on a given topic (for example “the critical review of the second movement” or “entering the fingerings from the composer’s copy”) can be encapsulated in such a branch, and only when this task has been completed will the work be integrated (we say “merged”) into the main line or “master” branch. That master branch – which can be understood as representing the official state of the edition – remains unaffected up to the point of merging and proceeds directly from one consistent state to another. This ensures that different people can work on different tasks in parallel, without any risk of causing confusion or messing up the documents. It is also possible to add another layer of quality control by deciding who is eligible to actually perform that merge step.
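
A minimal sketch of such a topic branch, with branch name and commit messages chosen purely for illustration:

    # Start an isolated session for one clearly defined task
    git checkout -b critical-review-mvt-2

    # ... work, in as many commits as needed ...
    git commit -am "Review dynamics in mvt. 2, measures 1-40"
    git commit -am "Add critical remarks for mvt. 2, measures 41-80"

    # The master branch is untouched until the task is finished and merged
    git checkout master
    git merge critical-review-mvt-2

On typical hosting platforms the final merge is usually performed through a merge or pull request, which is also the natural place to enforce who is allowed to carry it out.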

So collaborative work does not cause more confusion – quite the contrary.

Constant Peer Review

Version-controlled collaborative workflows not only provide a more robust editing environment, they actually allow assigning tasks to an arbitrary number of contributors and managing them reliably, which makes it possible to organize projects in completely new ways – without compromising quality.

About two years ago I wrote a number of posts on this blog documenting a “crowd engraving” project where we successfully experimented with exciting workflow techniques. What I found particularly intriguing was the extent to which contributors of wildly varying qualification could produce high-quality material, given an appropriate project set-up. Our workflow was arranged around splitting the huge project (the end result was 50 minutes of full orchestra with choirs and soloists, densely printed on 100 pages of A3 paper) into small chunks. Every little contribution was done in a separate branch, and the agreement was that whenever someone had finished entering some music, someone else had to review it before merging it back to the master branch. This approach – which can of course equally be applied to the stage of scholarly review – had several important implications which I’d like to sum up with the term constant peer review. The most obvious consequence was that every single measure of music integrated into the “official” score had already been proof-read once, that is, seen by at least two pairs of eyes. So we didn’t permanently live under the pressure of “someday” having to do the proof-reading.
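
As an illustration of the reviewer’s side of this agreement (branch and file names are again invented), checking a finished contribution before it reaches master might look like this:

    # Get the contributor's finished topic branch from the server
    git fetch origin
    git checkout enter-mvt-3-woodwinds

    # Compare the contribution against the current official state
    git diff master

    # If everything is fine, integrate it into the official score
    git checkout master
    git merge enter-mvt-3-woodwinds
    git push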

Not as obvious, but at least equally important, is the fact that such short-term peer review encourages direct communication between contributors. While this doesn’t necessarily sound dramatic, it actually boosts both creativity and scholarly scrutiny, as I’ve described in an earlier short post. In our project we made use of the scholarLY library to maintain annotations within the score document. And these things together had mind-blowing consequences. Contributors had the possibility to add “musical issue”-type annotations pointing to problematic spots in the score. Knowing that someone else would look at the annotation before merging it (either commenting on it, changing it into a proper “critical remark” or even discarding it) significantly lowered the bar for people to spell out their observations. It was truly inspiring to see that the quality of these observations was very much independent of the formal qualification of the contributor. In other words: when entering music, the bank accountant and hobby musician noticed issues with the manuscript just as I did, and knowing that the musicologist would take over responsibility, he didn’t hesitate to document them.

Full Documentation

As a closing remark I’ll comment on the feature that may provide the most fundamental safety net among all the bells and whistles of version control: full and automatic project documentation. I won’t go into detail here (maybe look at some of our posts tagged with version control), but documenting every modification and attributing it to its author, together with the possibility to selectively edit or revert any such change at any later time, are invaluable tools that massively increase the safety and eventually also the quality of the editorial results. As a second aspect, this fully documents each team member’s contributions, making it possible to credit the actual work in a pretty fine-grained manner.
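
These are, for example, the standard Git commands behind exactly that kind of documentation (file name and commit hash are placeholders):

    # Who changed which line of a file, and in which commit?
    git blame sonata/mvt-1.ily

    # The complete history of the project, or of a single file
    git log
    git log -- sonata/mvt-1.ily

    # Selectively undo one earlier change, documented as a new commit
    git revert 4f2a9c1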

The point of this story is that versioned workflows give projects a level of control that traditional approaches cannot come close to. There is absolutely nothing to be afraid of: neither loosening the attribution of responsibilities nor including arbitrary numbers of contributors of different qualification poses any risk of weakening the quality standards of the resulting edition. Quite the contrary: properly applied strategies from software development can significantly boost the creativity, scholarly scrutiny and overall efficiency and quality of any music edition project.

One thought on “Distributed Editing, Responsibility and Quality”

  1. Joram

    In the engineering industry there is the same fear: companies are reluctant to use agile methods and stick to the waterfall approach, even though the efficiency and quality of products can be improved by a more flexible approach in which all contributors/employees work towards the same high-quality goal. If the quality of rockets, planes and cars can be assured that way, I am pretty sure it is also possible for scores.

    So this reluctance is widespread, but many companies are in the process of change, and I think music publishing will also realize this at some point.
