FAST live demonstrations at the annual QM 'Ideas Unwrapped' event, 26 April, Queen Mary
FAST team members successfully presented some of the project's most exciting research at the annual QM event 'Ideas Unwrapped', which took place on Thursday 26 April. The FAST event also served as preparation for the FAST Industry Day at Abbey Road Studios on 25 October.
The two-hour session, held in a relaxed and enjoyable atmosphere, was attended by staff and students from Queen Mary. Photos from the FAST event can be viewed here. The session was preceded by a talk entitled 'FAST forward' by Prof. Mark Sandler introducing the project. An audio recording of the talk can be accessed here.
The following live demonstrations were on show:
1) FXive: A Web Platform for Procedural Sound Synthesis
Parham Bahadoran and Adan Benito (Centre for Digital Music, EECS)
FXive demonstrates a new way to perform sound design. High-quality, artistic sound effects can be achieved using lightweight and versatile sound synthesis models. Such models do not rely on stored samples and provide a rich range of sounds that can be shaped at the point of creation.
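The idea behind sample-free, parameter-driven sound design can be illustrated with a minimal sketch (a hypothetical example, not the FXive implementation): the effect is computed entirely from a few controls, so its character can be reshaped at the point of creation rather than fixed in a recording.

```python
import math
import random

def procedural_whoosh(duration=1.0, sample_rate=44100, brightness=0.5, seed=42):
    """Synthesise a simple 'whoosh' effect from filtered noise.

    No stored samples are used: the sound is generated from parameters,
    so duration and brightness can be varied freely at creation time.
    (Illustrative sketch only, not FXive's actual models.)
    """
    rng = random.Random(seed)
    n = int(duration * sample_rate)
    # One-pole low-pass filter: higher 'brightness' lets more noise through.
    alpha = 0.01 + 0.5 * brightness
    y = 0.0
    out = []
    for i in range(n):
        x = rng.uniform(-1.0, 1.0)        # white-noise source
        y += alpha * (x - y)              # smooth into coloured noise
        env = math.sin(math.pi * i / n)   # half-sine amplitude envelope
        out.append(y * env)
    return out

samples = procedural_whoosh(duration=0.5)
```

Changing `brightness` or `duration` yields a different effect instantly, with no sample library involved.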
2) Signal Processing Methods for Source Separation in Music Production
Delia Fano Yela (Centre for Digital Music, EECS)
3) Audio Commons: An Ecosystem for bringing Creative Commons content to the creative industries
George Fazekas (Centre for Digital Music, EECS)
Audio Commons addresses barriers to using Creative Commons content in the creative industries presented by uncertain licensing, insufficient metadata, or variation in quality. Two demonstrators were shown, developed for music production and game sound design use cases:
AudioTexture is a plugin prototype for sound texture synthesis developed by AudioGaming.
SampleSurfer provides an audio search engine that integrates instant listening capabilities, editing tools, and transparent Creative Commons (CC) licensing processes.
4) MusicLynx
Rishi Shukla (Centre for Digital Music, EECS)
MusicLynx is a web platform for music discovery that collects information and reveals connections between artists from a range of online sources. The information is used to build a network that users can explore to discover new artists and how they are linked together.
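The core idea of merging artist links from several online sources into one explorable network can be sketched as follows (a hypothetical illustration with made-up link data, not the MusicLynx codebase):

```python
from collections import defaultdict

def build_artist_graph(link_lists):
    """Merge (artist, artist) link pairs from several sources
    into one undirected graph, stored as an adjacency map."""
    graph = defaultdict(set)
    for links in link_lists:
        for a, b in links:
            graph[a].add(b)
            graph[b].add(a)
    return graph

# Hypothetical links gathered from two different sources.
wiki_links = [("Miles Davis", "John Coltrane")]
label_links = [("John Coltrane", "Alice Coltrane")]

graph = build_artist_graph([wiki_links, label_links])
# Artists one hop from John Coltrane:
sorted(graph["John Coltrane"])  # ['Alice Coltrane', 'Miles Davis']
```

A user exploring the graph moves from node to node along these merged connections, discovering artists that no single source would link on its own.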
5) Fast DJ
Florian Thalmann (Centre for Digital Music, EECS)
A minimal DJ web app that analyses music files, makes automatic transitions, and learns from the user's taste.
6) Grateful Dead Live
Thomas Wilmering (Centre for Digital Music, EECS)
A website for the exploration of recordings of Grateful Dead concerts, drawing its information from various sources on the web.
7) Numbers-into-Notes Semantic Remixer
John Pybus (Oxford e-Research Centre, University of Oxford)
8) The Prism audience perception app
Mat Willcoxson (Oxford e-Research Centre, University of Oxford)
A customisable app developed to obtain audience feedback during live performances. It has been used in live public experiments in Manchester and Oxford to investigate human perception of musical features; as well as supporting research, these experiments have served as public engagement events, introducing audiences to music, maths, and particular composers.
(The PRiSM app has been developed in collaboration with colleagues in Oxford and the Royal Northern College of Music, and relates to other apps in FAST such as Mood Conductor.)
9) The Climb! performance and score archive – MELD and Muzicodes
Kevin Page (Oxford e-Research Centre, University of Oxford)
“Climb!” is a non-linear musical work for Disklavier and electronics in which the pianist’s progression through the piece is not predetermined, but dynamically chosen according to scored challenges and choices. The challenges are implemented using two FAST technologies — Muzicodes and MELD (Music Encoding and Linked Data) — which are also used to create an interactive archive through which recorded performances of Climb! can be explored.
10) 6-Channel Guitar Dataset
Johan Pauwels (Centre for Digital Music, EECS)
A demonstration of the procedure used to collect a dataset of six-channel guitar recordings, a prerequisite for future research.