Demonstrators
1: Moodplay Interactive Experience
Moodplay is a system that allows users to collectively control music and lighting effects to express desired emotions.
2: The Carolan Guitar
Scanning the Carolan guitar's different patterns takes you to a range of different information: the history of how it was made, details of who has played it, videos of their performances...
3: DMO Prototyping with Annalist: The Carolan Guitar
Annalist-created experimental Digital Music Object (DMO) structures and instances, supporting rapid evolution of semantic web prototype DMOs for FAST demonstrators.
4: Dynamic Music Objects (DMOs) & Semantic Player
A smart and creative mobile music player allowing interaction with dynamic music objects based on signal processing algorithms, semantic audio and various kinds of sensor information and user inputs.
5: Music Transcription in the Studio
PT-adir: Automatic methods for piano transcription in the studio, exploiting available information about the recording conditions within an alternating-directions framework.
6: Semantic Alignment and Linking Tool for Music and Audio
SALT facilitates the rapid combination of datasets lacking common identifiers, enabling union views of the data.
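As a toy illustration of the idea behind SALT (assumed, not its actual implementation): two catalogues that lack a shared identifier can be aligned by fuzzy-matching a common text field, here track titles, using Python's standard-library difflib. All names and data below are invented.

```python
# Sketch: build a union view of two datasets with no common identifier
# by fuzzy-matching track titles. Illustrative only; not SALT's API.
from difflib import SequenceMatcher

def best_match(title, candidates, threshold=0.8):
    """Return the candidate most similar to `title`, or None below threshold."""
    scored = [(SequenceMatcher(None, title.lower(), c.lower()).ratio(), c)
              for c in candidates]
    score, match = max(scored)
    return match if score >= threshold else None

catalogue_a = {"Dark Star (Live 1969)": {"duration": 1383}}
catalogue_b = {"Dark Star - Live, 1969": {"venue": "Fillmore West"}}

# Union view keyed by catalogue A's titles, enriched from catalogue B.
union = {}
for title, meta in catalogue_a.items():
    merged = dict(meta)
    other = best_match(title, catalogue_b.keys())
    if other:
        merged.update(catalogue_b[other])
    union[title] = merged
print(union)
```

In practice an alignment tool would score many candidate fields and let the user confirm matches; this sketch only shows the core idea of joining without a shared key.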
7: Open Multitrack Testbed
The multitrack testbed project provides users with audio tracks and metadata about the tracks and the recording and mixing process.
8: Open Symphony: Supporting Audience-Performer Interaction
Open Symphony reimagines the music experience for a digital age, fostering alliances between performer and audience and our digital selves.
9: Notes and Chords
The Harmonic Visualiser is a high-resolution audio analysis tool. CView performs automatic recognition of chords in music and allows reauthoring of the transcriptions.
10: Multiple-instrument Automatic Music Transcription
This demonstrator project aims to create a general-purpose system for automatic music transcription.
11: Signal Processing Methods for Source Separation in Music Production
Sound source separation applied to the removal of interfering sound sources from studio and live recordings.
12: Mobile Music
This demonstrator has developed 'geotracks' and 'geolists', music tracks and playlists of existing music that are aligned and adapted to specific journeys.
13: Semantic Dynamic Notation
This demonstrator explores the use of DMOs as inputs to music performance.
14: MusicLynx
A web application for music discovery that enables users to explore an artist similarity graph constructed by linking together various open public data sources and the connections between them.
15: Semantic Audio Feature Extraction (SAFE)
SAFE is a novel data collection architecture for the extraction and retrieval of semantic descriptions of musical timbre, deployed within the digital audio workstation.
16: Semantic Audio Effects
Novel audio effects that exploit Semantic Web technologies and content analysis at the point of content creation.
17: High-level Musical Feature Learning using Neural Networks
High-level musical features such as mood tags can be learned using deep convolutional neural networks.
18: Ada Lovelace "Numbers into Notes"
In a series of experiments under the banner of "Numbers into Notes", we have explored the use of algorithmically generated music fragments co-created with a human performer or audience.
19: Muzicodes
Muzicodes are user defined musical fragments (e.g. motif, rhythm, melody) that when performed and recognised by the system act as triggers for other media, such as an accompaniment or visuals during a performance.
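The trigger mechanism described above can be sketched in miniature: a user-defined motif is stored as a pitch-interval pattern (so it matches under transposition), and when that pattern appears in the performed note stream, the code fires. This is a hypothetical illustration; the class and method names are invented, not the Muzicodes API.

```python
# Minimal sketch of the Muzicodes idea: a user-defined musical fragment
# acts as a trigger when recognised in a performed note stream.
# Pitches are MIDI note numbers; matching is on intervals, so a
# transposed performance of the motif still triggers.

def intervals(pitches):
    """Convert absolute pitches to successive intervals."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

class MuzicodeMatcher:
    def __init__(self):
        self.codes = {}  # code name -> interval pattern

    def add_code(self, name, motif_pitches):
        self.codes[name] = intervals(motif_pitches)

    def match(self, performed_pitches):
        """Return names of all codes whose pattern occurs in the stream."""
        stream = intervals(performed_pitches)
        hits = []
        for name, pattern in self.codes.items():
            n = len(pattern)
            if any(stream[i:i + n] == pattern
                   for i in range(len(stream) - n + 1)):
                hits.append(name)
        return hits

matcher = MuzicodeMatcher()
matcher.add_code("cue-visuals", [60, 62, 64, 60])   # C D E C motif
print(matcher.match([55, 57, 59, 55, 62]))          # transposed motif
```

A real system would also handle rhythm, approximate matching, and continuous audio input; the sketch keeps only the fragment-recognition-as-trigger core.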
20: CALMA and the Grateful Dead project
Enables innovative live-music applications in music informatics and music information retrieval, develops novel ways to navigate big-data music collections for an improved user experience, and supports scholarship in popular musicology.
21: Rendering Decomposed Recordings of Musical Performances
Using ontologies, signal processing, and semantic rendering techniques to provide malleable educational and analytical user experiences based on the original audio rather than synthesized material.
22: MixRights: Fair Trade Music Ecosystem
MixRights aims to enable a fair and transparent music royalties ecosystem, based on emerging MPEG standards for interoperability: the IM AF format for interactive music services (remixing, karaoke and collaborative music creation) and the MVCO ontology for IP rights tracking.
23: #Scanners
Novel technology for interacting with music in a passive affect loop (how you respond to the music, and how the music responds to you).
24: Music Encoding and Linked Data (MELD)
MELD - Music Encoding and Linked Data - is a framework that retrieves, distributes, and processes information addressing semantically distinguishable music elements.
25: Production Workflow in the Studio
This demonstrator focusses on music labelling in the Digital Audio Workstation (DAW) to support navigation and editing of multitrack audio.
26: "Take it to the bridge"
A platform for multi-way communication between collaborating musicians through the dynamic, distributed annotation of digital music scores, employing the Music Encoding and Linked Data (MELD) framework.
27: Sonic Visualiser
Sonic Visualiser is an application for viewing and analysing the contents of music audio files.
28: Intelligent Studio Production
Real-time tools for assistive studio production.
29: Deep Pedal
Deep Pedal reveals the secrets of expressive piano playing through analysis and modelling of the pedalling gestures and techniques.
30: The Hexaphonic Guitar
The Hexaphonic Guitar demonstrates the new creative opportunities that become available once the strings of a guitar can be processed individually.
31: Jam with Jamendo
“Jam with Jamendo” brings music learners and unsigned artists together by recommending suitable songs as new and varied practice material.
32: FXive: Procedural Sound Effect Generation
FXive is a real-time online sound effect generation service.
33: Climb! Performance Archive
Climb! is a non-linear composition for Disklavier piano and electronics.
34: SOFA
The SOFA Ontological Fragment Assembler enables the combination of musical fragments into compositions, using semantic annotations to suggest compatible choices.
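The suggestion step described above can be illustrated with a toy model (assumed, not SOFA's real data model): each fragment carries a set of semantic annotations, and the overlap between annotation sets ranks which fragments are compatible candidates for combination. All fragment names and tags below are invented.

```python
# Toy sketch of annotation-driven compatibility suggestion.
# Each fragment has a set of semantic tags; fragments sharing enough
# tags with the selected one are suggested as compatible.

fragments = {
    "intro-riff":  {"key:Am", "tempo:120", "mood:dark"},
    "verse-pad":   {"key:Am", "tempo:120", "mood:calm"},
    "outro-brass": {"key:C",  "tempo:90",  "mood:bright"},
}

def suggest_compatible(name, fragments, min_shared=2):
    """Rank other fragments by the number of annotations shared with `name`."""
    target = fragments[name]
    scored = sorted(
        ((len(target & tags), other)
         for other, tags in fragments.items() if other != name),
        reverse=True,
    )
    return [other for shared, other in scored if shared >= min_shared]

print(suggest_compatible("intro-riff", fragments))  # ['verse-pad']
```

A semantic-web implementation would draw these annotations from an ontology and could reason over tag relationships (e.g. relative keys) rather than exact matches; the sketch shows only the overlap-scoring core.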
35: rCALMA Environment for Live Music Data Science
rCALMA is a data science environment providing an analytical workflow for investigating large corpora of live music.
36: FAST DJ
FAST DJ is a web-based automatic DJ system and module that can be embedded into any website.
37: The Grateful Dead Concert Explorer
A web application for the exploration of Grateful Dead concerts through digitised artefacts and audio recordings.