Oxford FAST team members, researchers at the University of Oxford’s e-Research Centre, have announced version 1.0 of their Music Encoding and Linked Data (MELD) framework, a flexible software platform for research that combines digital representations of music – such as audio and notation – with contextual and interpretive knowledge in the Semantic Web.
The release of MELD represents a significant milestone in the Centre’s activities in the £5m EPSRC-funded Fusing Audio and Semantic Technologies (FAST) project, a collaboration with Queen Mary University of London and the University of Nottingham.
“MELD brings an innovative new model for combining multimedia music resources, moving beyond milliseconds and simple labels to capture meaningful associations derived from music theory,” explains Senior Researcher Kevin Page, who leads the FAST Music Flows activity and MELD research within the Centre. Dr Page, a member of the group which produced the W3C Linked Data Platform (LDP) specification, adds: “By extending standards including LDP, Web Annotations, and the Music Encoding Initiative (MEI), MELD provides a flexible, scalable core while simultaneously enabling the detailed application-specific customisations researchers and industry find valuable.”
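To give a flavour of the standards Dr Page mentions, the sketch below shows the general shape of a W3C Web Annotation, expressed as JSON-LD, that links a fragment of an MEI-encoded score to a passage in an audio recording. This is an illustrative example only, not MELD's actual API; the URIs, element identifier, and timing values are hypothetical.

```python
import json

# A minimal, hypothetical Web Annotation in JSON-LD: it connects a
# measure of an MEI score (the target) with the corresponding segment
# of a recording (the body), using a W3C Media Fragment time range.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "linking",
    # Target: a specific element in an MEI encoding, addressed by its xml:id
    "target": "http://example.org/scores/piece.mei#measure-42",
    # Body: the matching passage in an audio file, as a #t= media fragment
    "body": "http://example.org/audio/performance.mp3#t=12.5,18.0",
}

# Serialise for publication, e.g. to a Linked Data Platform container
print(json.dumps(annotation, indent=2))
```

Because such annotations are plain Linked Data, they can be stored and retrieved through LDP containers and enriched with further semantic context, which is the kind of flexibility the MELD approach builds on.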
Dr David Weigl, principal developer of the MELD framework, recounts that “what’s been fascinating and rewarding is the variety of research we’ve worked on. We’ve effectively created a new instrument to perform a contemporary piece of music, analysed how musicians rehearse and perform, and are now building an interface to explore historical catalogues in the British Library. It really highlights the adaptability of our approach”.
“Climb!”, a non-linear composition for Disklavier piano and electronics by Maria Kallionpää (pictured), is one example of such an application. The performance environment for Climb! was built by Nottingham’s Mixed Reality Lab in collaboration with the Oxford FAST team, and combines their Muzicodes software with MELD. Climb! will receive its next performance during our FAST Industry Day at the world-famous Abbey Road Studios, where Oxford researchers will be on hand to demonstrate and explain their innovations to artists, journalists and industry professionals.
In collaboration with colleagues from the Faculty of Music, MELD will also play a supporting role at the forthcoming Digital Delius: Unlocking Digitised Music Manuscripts event at the British Library. It is the technical foundation for an experimental digital exhibition presenting scores and sketches, early recordings, photographs, and concert programmes showcasing the music of British-born composer Frederick Delius (1862–1934). The materials are complemented by expert commentary and an interactive MELD application which situates the items within the creative process.
This was also the theme of a workshop earlier in the year, when Centre researcher David Lewis (pictured above) used MELD to record the performance adaptations made by a student ensemble under the tutelage of the Villiers Quartet.
Professor David De Roure, Oxford FAST Investigator, summarises: “In MELD, we’ve created an implementation of Digital Music Objects, or DMOs. This next generation technology realises the FAST research vision for end-to-end semantics across all stages of the music lifecycle, bringing fantastic creative opportunities to artists, the music industry and consumers.”
MELD is open source and available from GitHub.
News Source: http://www.oerc.ox.ac.uk/news/centre-releases-music-encoding-software (13 Sept 2018)