FAST members present their work at MEC

FAST Report from the Music Encoding Conference, 17-20 May 2016, Montreal, Canada
(by David Weigl)

FAST project members David Weigl and Kevin Page (Oxford e-Research Centre) recently presented their work on Semantic Dynamic Notation at the Music Encoding Conference (MEC) in Montreal, Canada. The MEC is an important annual meeting of academics and industry professionals working on the next generation of digital music notation. The presented work builds on the Music Encoding Initiative (MEI) format for encoding musical documents in a machine-readable structure.

Semantic Dynamic Notation augments MEI using semantic technologies including RDF, JSON-LD, SPARQL, and the Open Annotation data model, enabling the fine-grained incorporation of musical notation within a web of Linked Data. This fusing of music and semantics affords the creation of rich Digital Music Objects supporting contemporary music consumption, performance, and musicological research.
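As a rough illustration of the approach (the URIs, fragment identifiers, and resource names below are hypothetical, and the sketch uses the W3C Web Annotation JSON-LD context, the standardised successor to the Open Annotation model), an annotation of this kind might link a single measure of an MEI encoding to an external Linked Data resource:

import json

# Minimal sketch, not the project's actual data: an annotation expressed as
# JSON-LD that links a measure in an MEI document to a Linked Data resource
# describing part of the piece's structure. All URIs are illustrative.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "linking",
    "target": "http://example.org/scores/tune.mei#measure-23",
    "body": "http://example.org/resources/tune-structure#chorus",
}

print(json.dumps(annotation, indent=2))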

The use case served by the presented demonstrator draws inspiration from an informal, late-night FAST project ‘jam session’ at the Oxford e-Research Centre. It enables musicians to annotate and manipulate musical notation in real time during a performance, applying ‘call-outs’ that shape the structural elements of the performance or signal to the other players a transition to a new piece. Each performer’s rendered digital score synchronises with each shared action, making score adaptations immediately available to everyone in the session and supporting the collaborative performance. The supported actions go beyond the symbolic representation of the music being played, referencing semantic context that can be captured or supplemented by metadata from related material (e.g., about the artist, a particular style, or the structure of the music).
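To give a flavour of the mechanics (a sketch under assumed endpoint URLs and score identifiers, not the demonstrator's actual implementation), a performer's client could query a shared annotation store to pick up call-outs made by the other players and apply them to its rendered score:

from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint holding the session's shared annotations.
sparql = SPARQLWrapper("http://example.org/session/42/sparql")
sparql.setReturnFormat(JSON)

# Fetch annotations targeting the section of the score currently being rendered.
sparql.setQuery("""
    PREFIX oa: <http://www.w3.org/ns/oa#>
    SELECT ?annotation ?body WHERE {
        ?annotation a oa:Annotation ;
                    oa:hasTarget <http://example.org/scores/current.mei#section-bridge> ;
                    oa:hasBody ?body .
    }
""")

for result in sparql.query().convert()["results"]["bindings"]:
    # Each binding would drive an update to this performer's rendered score.
    print(result["annotation"]["value"], "->", result["body"]["value"])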

As well as augmenting musical notation and enabling its dynamic manipulation in a performance context, the system captures provenance information, providing insight into the temporal evolution of the performance in terms of the interactions with the musical score. This demonstrates how notation can be combined with semantic context within a Digital Music Object to provide rich additional interpretation, available to musicians in real time during a performance, and to consumers as a performance outcome.
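As an illustration of what such provenance could look like (the URIs and the exact modelling below are assumptions; the W3C PROV-O vocabulary is one natural fit), each score interaction might be recorded as a timestamped activity attributed to a performer:

from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/")  # hypothetical namespace

g = Graph()
g.bind("prov", PROV)

# Record one score interaction as a PROV activity (illustrative modelling only).
callout = EX["annotations/callout-42"]
action = EX["activities/callout-42-creation"]
performer = EX["performers/guitarist"]

g.add((callout, RDF.type, PROV.Entity))
g.add((action, RDF.type, PROV.Activity))
g.add((callout, PROV.wasGeneratedBy, action))
g.add((callout, PROV.wasAttributedTo, performer))
g.add((action, PROV.startedAtTime,
       Literal("2016-05-18T22:41:03Z", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))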

[Image: meld-screenshot, a screenshot of the demonstrator]