FAST

Fusing Audio and Semantic Technologies
For Intelligent Music Production and Consumption

Dedicated to the life and works of Alan Blumlein


Demonstrators

The demonstrators are grouped into four categories: Studio Production, Live Performance, Music Discovery, and Experience.

1: Moodplay Interactive Experience

Moodplay is a system that allows users to collectively control music and lighting effects to express desired emotions.
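
As an illustration of the underlying idea only, the Python sketch below (not taken from the Moodplay codebase) averages hypothetical listener votes in a two-dimensional valence/arousal mood space and picks the track annotated closest to the group mood; all track names and coordinates are invented.

```python
# Illustrative sketch: several listeners each submit a point in a hypothetical
# valence/arousal mood space; the group position is the centroid, and the track
# whose mood annotation lies closest to it is selected.
from math import dist  # Python 3.8+

# Hypothetical mood annotations: track title -> (valence, arousal), both in [-1, 1]
TRACKS = {
    "Calm Horizon": (0.6, -0.5),
    "Night Drive": (-0.2, 0.3),
    "Festival": (0.8, 0.9),
    "Grey Morning": (-0.7, -0.4),
}

def group_mood(votes):
    """Average the (valence, arousal) votes of all connected users."""
    xs, ys = zip(*votes)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def choose_track(votes):
    """Pick the track annotated closest to the group's collective mood."""
    target = group_mood(votes)
    return min(TRACKS, key=lambda title: dist(TRACKS[title], target))

if __name__ == "__main__":
    votes = [(0.7, 0.8), (0.5, 0.9), (0.9, 0.6)]   # three excited listeners
    print(choose_track(votes))                      # -> "Festival"
```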

2: The Carolan Guitar

Scanning the Carolan guitar's different patterns takes you to a range of information: the history of how it was made, details of who has played it, videos of their performances...

3: DMO Prototyping with Annalist: The Carolan Guitar

Annalist is used to create experimental Digital Music Object (DMO) structures and instances, supporting rapid evolution of Semantic Web prototype DMOs for FAST demonstrators.

4: Dynamic Music Objects (DMOs) & Semantic Player

A smart and creative mobile music player that allows interaction with Dynamic Music Objects, drawing on signal processing algorithms, semantic audio, and various kinds of sensor information and user input.

5: Music Transcription in the Studio

PT-adir: automatic methods for piano transcription in the studio that exploit available information about the recording conditions within an alternating-directions framework.

6: Semantic Alignment and Linking Tool for Music and Audio

SALT facilitates the rapid combination of datasets lacking common identifiers, enabling union views of the data.
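
The sketch below illustrates the general record-linkage idea in Python: two hypothetical catalogues with no shared identifier are joined by fuzzy-matching their title strings. The field names, data, and similarity threshold are assumptions for illustration, not SALT's actual matching strategy.

```python
# Illustrative sketch: link two catalogues that share no common identifier by
# fuzzy-matching their title strings, producing a "union view" row per match.
from difflib import SequenceMatcher

catalogue_a = [{"id": "a1", "title": "Dark Star (Live 1972)"},
               {"id": "a2", "title": "Ripple"}]
catalogue_b = [{"ref": "b9", "name": "Dark Star - live, 1972", "venue": "Veneta"},
               {"ref": "b7", "name": "Box of Rain", "venue": "Fillmore"}]

def similarity(x, y):
    """Normalised string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, x.lower(), y.lower()).ratio()

def union_view(rows_a, rows_b, threshold=0.7):
    """Pair each row of A with its best-matching row of B above the threshold."""
    linked = []
    for a in rows_a:
        best = max(rows_b, key=lambda b: similarity(a["title"], b["name"]))
        if similarity(a["title"], best["name"]) >= threshold:
            linked.append({**a, **best})     # merged record from both datasets
    return linked

print(union_view(catalogue_a, catalogue_b))
```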

7: Open Multitrack Testbed

The multitrack testbed project provides users with audio tracks and metadata about the tracks and the recording and mixing process.

8: Open Symphony: Supporting Audience-Performer Interaction

Open Symphony reimagines the music experience for a digital age, fostering new alliances between performers, audiences, and our digital selves.

9: Notes and Chords

Harmonic Visualiser is a high-resolution audio analysis tool, while CView performs automatic recognition of chords in music and allows re-authoring of the transcriptions.

10: Multiple-instrument Automatic Music Transcription

This demonstrator project aims to create a general-purpose system for automatic music transcription.

11: Signal Processing Methods for Source Separation in Music Production

Sound source separation applied to the removal of interfering sound sources from studio and live recordings.
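
One widely used family of techniques for this kind of task is non-negative matrix factorisation (NMF) of a magnitude spectrogram. The Python sketch below shows that general approach on random stand-in data; it is not the demonstrator's actual algorithm.

```python
# Illustrative sketch: factorise a magnitude spectrogram V as W @ H with
# multiplicative updates, then use individual components as soft masks.
import numpy as np

def nmf(V, n_components=4, n_iter=200, eps=1e-9):
    """Factorise V (freq x time, non-negative) as W @ H using multiplicative updates."""
    rng = np.random.default_rng(0)
    F, T = V.shape
    W = rng.random((F, n_components)) + eps
    H = rng.random((n_components, T)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).normal(size=(513, 100)))  # stand-in |STFT|
W, H = nmf(V)
# Each component k gives a soft mask (W[:, k:k+1] @ H[k:k+1, :]) / (W @ H) that can
# be applied to the complex STFT to extract or remove one source before resynthesis.
print(W.shape, H.shape)
```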

12: Mobile Music

This demonstrator has developed 'geotracks' and 'geolists': tracks and playlists of existing music that are aligned and adapted to specific journeys.

13: Semantic Dynamic Notation

This demonstrator explores the use of DMOs as inputs to music performance.

14: MusicLynx

A web application for music discovery that enables users to explore an artist similarity graph constructed by linking together various open public data sources and the connections between them.

15: Semantic Audio Feature Extraction (SAFE)

SAFE is a novel data collection architecture for the extraction and retrieval of semantic descriptions of musical timbre, deployed within the digital audio workstation.

16: Semantic Audio Effects

Novel audio effects that exploit Semantic Web technologies and content analysis at the point of content creation.

17: High-level Musical Feature Learning using Neural Networks

High-level musical features such as mood tags can be learned using deep convolutional neural networks.
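
A minimal sketch of that general approach is shown below: a small convolutional network in PyTorch mapping a mel-spectrogram to multi-label tag logits. The layer sizes, tag count, and input shape are assumptions, not the demonstrator's published architecture.

```python
# Illustrative sketch: a small CNN that maps a mel-spectrogram to tag probabilities.
import torch
import torch.nn as nn

N_TAGS = 10            # e.g. mood tags such as "happy", "sad", "relaxed" ...
N_MELS, N_FRAMES = 96, 256

class TagCNN(nn.Module):
    def __init__(self, n_tags=N_TAGS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling over time and frequency
        )
        self.classifier = nn.Linear(64, n_tags)

    def forward(self, x):                     # x: (batch, 1, n_mels, n_frames)
        h = self.features(x).flatten(1)
        return self.classifier(h)             # raw logits, one per tag

model = TagCNN()
spec = torch.randn(8, 1, N_MELS, N_FRAMES)    # a batch of stand-in mel-spectrograms
logits = model(spec)
loss = nn.BCEWithLogitsLoss()(logits, torch.randint(0, 2, (8, N_TAGS)).float())
print(logits.shape, loss.item())              # multi-label training signal
```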

18: Ada Lovelace "Numbers into Notes"

In a series of experiments under the banner of "Numbers into Notes" we have explored the use of algorithmically generated music fragments co-created with a human performer or audience.
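
In that spirit, the sketch below generates a Fibonacci sequence, reduces it modulo 12, and maps the residues onto MIDI pitches to obtain a short melodic fragment; the particular mapping and octave are assumptions chosen for illustration.

```python
# Illustrative sketch: turn an integer sequence into notes by reducing it modulo 12
# and mapping the residues onto pitches in one octave.
def fibonacci(n):
    """First n Fibonacci numbers."""
    seq, a, b = [], 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def to_pitches(numbers, base_midi=60):
    """Map each number modulo 12 to a MIDI pitch in the octave starting at base_midi."""
    return [base_midi + (x % 12) for x in numbers]

fragment = to_pitches(fibonacci(16))
print(fragment)   # [60, 61, 61, 62, 63, 65, 68, ...] -> playable as MIDI notes
```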

19: Muzicodes

Muzicodes are user-defined musical fragments (e.g. a motif, rhythm, or melody) that, when performed and recognised by the system, act as triggers for other media, such as an accompaniment or visuals during a performance.
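
The sketch below illustrates the triggering idea only: a hypothetical "code" is defined as a sequence of pitch intervals, and a cue fires when the tail of the performance matches it. It does not reproduce the actual Muzicodes notation or matcher.

```python
# Illustrative sketch: fire a media cue when the most recent notes match a
# user-defined interval pattern.
def intervals(notes):
    """Successive pitch intervals of a note sequence (MIDI numbers)."""
    return [b - a for a, b in zip(notes, notes[1:])]

# Hypothetical user-defined code: an ascending major triad (+4, +3 semitones)
CODES = {"show_visuals": [4, 3]}

def check_triggers(recent_notes):
    """Return the names of any codes matched by the tail of the performance."""
    fired = []
    played = intervals(recent_notes)
    for name, pattern in CODES.items():
        if len(played) >= len(pattern) and played[-len(pattern):] == pattern:
            fired.append(name)
    return fired

performance = [62, 65, 60, 64, 67]   # ... ends with C-E-G, an ascending major triad
print(check_triggers(performance))   # -> ['show_visuals']
```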

20: CALMA and the Grateful Dead project

CALMA enables innovative applications related to live music in the areas of music informatics and music information retrieval, develops novel ways to navigate big data music collections for an improved user experience, and supports scholarship in popular musicology.

21: Rendering Decomposed Recordings of Musical Performances

Using ontologies, signal processing, and semantic rendering techniques to provide malleable educational and analytical user experiences based on the original audio rather than synthesized material.

22: MixRights: Fair Trade Music Ecosystem

MixRights aims to enable a music ecosystem with fair and transparent royalties, building on emerging MPEG standards for interoperability such as the IM AF format for interactive music services (remixing, karaoke and collaborative music creation) and the MVCO ontology for IP rights tracking.

23: #Scanners

Novel technology for interacting with music in a passive affect loop (how you respond to the music and how the music responds to you).

24: Music Encoding and Linked Data (MELD)

MELD - Music Encoding and Linked Data - is a framework that retrieves, distributes, and processes information addressing semantically distinguishable music elements.

25: Production Workflow in the Studio

This demonstrator focusses on music labelling in the Digital Audio Workstation (DAW) to support navigation and editing of multitrack audio.

26: "Take it to the bridge"

26: "Take it to the bridge"

A platform for multi-way communication between collaborating musicians through the dynamic, distributed annotation of digital music scores, employing the Music Encoding and Linked Data (MELD) framework.

MORE

27: Sonic Visualiser

Sonic Visualiser is an application for viewing and analysing the contents of music audio files.

28: Intelligent Studio Production

Real-time tools for assistive studio production.

29: Deep Pedal

Deep Pedal reveals the secrets of expressive piano playing through analysis and modelling of the pedalling gestures and techniques.

30: The Hexaphonic Guitar

The Hexaphonic Guitar demonstrates the new creative opportunities that become available once the strings of a guitar can be processed individually.

31: Jam with Jamendo

“Jam with Jamendo” brings music learners and unsigned artists together by recommending suitable songs as new and varied practice material.

32: FXive: Procedural Sound Effect Generation

FXive is a real-time online sound effect generation service.

33: Climb! Performance Archive

Climb! is a non-linear composition for Disklavier piano and electronics.

34: SOFA

The SOFA Ontological Fragment Assembler enables the combination of musical fragments into compositions, using semantic annotations to suggest compatible choices.
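
As a rough illustration of annotation-driven suggestion, the sketch below filters hypothetical fragments by shared key and nearby tempo; the annotation fields, data, and tolerance are assumptions, not SOFA's actual logic.

```python
# Illustrative sketch: suggest compatible fragments from simple semantic annotations.
FRAGMENTS = [
    {"name": "bass_groove_1",  "key": "A minor", "bpm": 120},
    {"name": "rhodes_chords",  "key": "A minor", "bpm": 122},
    {"name": "fast_breakbeat", "key": "A minor", "bpm": 170},
    {"name": "synth_lead",     "key": "E major", "bpm": 121},
]

def compatible(seed, candidates, bpm_tolerance=5):
    """Suggest fragments in the same key whose tempo is close to the seed's."""
    return [f for f in candidates
            if f is not seed
            and f["key"] == seed["key"]
            and abs(f["bpm"] - seed["bpm"]) <= bpm_tolerance]

seed = FRAGMENTS[0]
print([f["name"] for f in compatible(seed, FRAGMENTS)])   # -> ['rhodes_chords']
```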

35: rCALMA Environment for Live Music Data Science

rCALMA is a data science environment providing an analytical workflow for investigating large corpora of live music.

36: FAST DJ

FAST DJ is a web-based automatic DJ system and module that can be embedded into any website.
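
One building block of any automatic DJ is the transition between consecutive tracks. The sketch below shows an equal-power crossfade over synthetic sine tones; it is an illustration of that single step, not the FAST DJ implementation.

```python
# Illustrative sketch: equal-power crossfade between the tail of one track and the
# head of the next, using synthetic sine tones as stand-in audio.
import numpy as np

SR = 44100

def equal_power_crossfade(a, b, fade_seconds=2.0, sr=SR):
    """Overlap the tail of `a` with the head of `b` using an equal-power fade."""
    n = int(fade_seconds * sr)
    t = np.linspace(0.0, np.pi / 2, n)
    fade_out, fade_in = np.cos(t), np.sin(t)          # cos^2 + sin^2 = 1 (constant power)
    overlap = a[-n:] * fade_out + b[:n] * fade_in
    return np.concatenate([a[:-n], overlap, b[n:]])

t = np.arange(5 * SR) / SR
track_a = 0.5 * np.sin(2 * np.pi * 220 * t)           # 5 s at 220 Hz
track_b = 0.5 * np.sin(2 * np.pi * 330 * t)           # 5 s at 330 Hz
mix = equal_power_crossfade(track_a, track_b)
print(mix.shape)                                      # 8 s of audio: 5 + 5 - 2 s overlap
```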

37: The Grateful Dead Concert Explorer

A web application for the exploration of Grateful Dead concerts through digitised artefacts and audio recordings.
