Category Archives: News

FAST final celebration event

Friday, 1 November 2019, 1:30 pm – 5:30 pm
The Neuron pod, Whitechapel, Queen Mary University of London

The FAST partners from the Centre for Digital Music (Queen Mary University of London), the Oxford e-Research Centre (University of Oxford) and the Mixed Reality Lab (University of Nottingham) will come together on Friday 1 November to celebrate the outcomes of the five-year EPSRC-funded FAST project, which comes to an end in December 2019. The event will consist of a mixture of talks, industry presentations and project demos. There will be plenty of time for socialising.

Final Programme

1:30 – 1:40 pm: Welcome by Mark Sandler, Queen Mary (FAST Principal Investigator)

1:40 – 2:10 pm: Academic & Principal Co-Investigator Talks

From Assisting Creativity to Human-Machine Co-Creation
Dave de Roure, Oxford

The UKRI Centre for Doctoral Training in Artificial Intelligence and Music
Simon Dixon, Queen Mary

The UKRI Centre for Doctoral Training in Artificial Intelligence and Music (AIM) will train a new generation of researchers who combine state-of-the-art ability in artificial intelligence (AI), machine learning and signal processing with cross-disciplinary sensibility to deliver groundbreaking original research and impact within the UK Creative Industries (CI) and cultural sector. Based at Queen Mary University of London (QMUL), the CDT will receive £6.2M from UKRI, £2M from QMUL and over £1M from industry partners, who include key players in the music industry – Abbey Road Studios, Apple, Deezer, Music Tribe Brands, Roli, Spotify, Steinberg, Solid State Logic and Universal Music – as well as a range of innovative SMEs. AIM will run from 2019 to 2027 and will train five annual cohorts of 12–15 PhD students, starting in September 2019.

Challenges and Opportunities of Using Audio Semantics in Music Production
George Fazekas, Queen Mary

FXive: Transformative Innovations in Sound Design
Joshua Reiss, Queen Mary

Sound effects for games, film and TV are generally sourced from large databases of pre-recorded audio samples, an approach that is inflexible, limited and uncreative. Our alternative is procedural audio, where sounds are created using software algorithms. Supported by the FAST grant, we have launched the high-tech spin-out company FXive. FXive’s transformative innovation allows all sound effects to be created in real time and with intuitive controls, without the use of recorded samples, thus creating fully immersive auditory worlds. We aim to completely replace the reliance on sample libraries with our new approach, which is more creative, more enjoyable, higher quality and more productive. Longer-term, it can lead to fully computer-generated sound design, just as Toy Story revolutionised the industry with fully computer-generated animation.

Post-FAST Opportunities: Musical Knowledge, Inference and Formal Methods for Semantic Deployment
Geraint Wiggins, Queen Mary

The FAST Blues
Steve Benford, Nottingham

 

2:15 – 3:15 pm: FAST project demonstrations

D1. Climb! Interactive Archive (Chris Greenhalgh and Adrian Hazzard, Nottingham)

D2. rCALMA (Kevin Page, Oxford)

D3. Plunderphonics (Florian Thalmann, Queen Mary)

D4. SOFA (John Pybus and Graham Klyne, Oxford)

D5. The Grateful Dead Concert Explorer (Thomas Wilmering, Queen Mary)

D6. Moodplay (tbc) (Alo Allik, Queen Mary)

3:15 – 3:45 pm: Coffee and cakes

3:45 – 4:15 pm: Industry Talk

Music Informatics in Academia and Industry — Personal Insights
Matthias Mauch, Apple

When I started working in music informatics, industry involvement was minimal. I’m going to trace my own journey, from my PhD studies at Queen Mary, via other academic jobs, to my current work in the Music Science team at Apple Media Products. All stages have had their unique fascination, and I will share some insights into the similarities and differences along the way.

4:15 – 5:00 pm: FAST project demos (cont’d)

D7. Carolan Guitar (Chris Greenhalgh and Adrian Hazzard, Nottingham)

D8. Folk Song Network Based on Different Similarity Measures (Cornelia Metzig, Queen Mary)

D9. Jam with Jamendo (Johan Pauwels, Queen Mary)

D10. Lohengrin Digital Companion (Kevin Page, Oxford)

D11. Notes and Chords (Ken O’Hanlon, Queen Mary)

D12. Numbers into Notes (Dave de Roure, Oxford)

D13. Intelligent Music Production (David Moffat, Queen Mary)

5:15 pm: End

6:00 pm: Dinner at Pizza East, 56 Shoreditch High Street, E1 6JJ

Further information

FAST demonstrators:
https://www.semanticaudio.ac.uk/demonstrators/
FAST videos:
https://www.semanticaudio.ac.uk/vimeos/
FAST Industry Day at Abbey Road Studios:
https://www.semanticaudio.ac.uk/events/fast-industry-day/

 


 

FAST at the Tate Modern

The FAST project combined forces with the EPSRC PETRAS Internet of Things project in a public event at Tate Modern on 8–9 February. The event explored how the Internet of Things is changing our lives now, and how it may influence or disrupt our futures – at home, at work and in our environment.

Professor David De Roure (University of Oxford) closed the event with a discussion of IoT, music and creativity in his talk “The making of music: creative algorithmic interventions and the imagination of Ada Lovelace”, which included demos of Arduino-based and algorithmically enhanced electronic instruments developed in the FAST project.

David’s talk was preceded by a piece of music with accompanying film by Alan Chamberlain (University of Nottingham), specially produced for the event alongside the live premiere of his piece “Pen Dinas in Voice” at the Bangor Music Festival on Friday 8 February. The composition used the ‘Numbers into Notes’ software produced by David De Roure, with the same algorithms used in the demonstrations in his talk.

The film can be seen here: https://www.youtube.com/watch?v=Pl3KrfYc3r8

Audio is available here: https://soundcloud.com/alain_du_norde/pen-dinas-in-voice

David De Roure commented, “This was an exciting and lively event, exploring our possible futures in two days of conversations with highly engaged, intrigued and insightful visitors to Tate Modern. For me it was a great opportunity to bring together two of our research projects, FAST and Petras, and to revisit a conversation about machines, music and creativity that has been going on since 1843… if not before”.

Alan Chamberlain added: “The FAST project has been the catalyst for a range of innovative technologies and works of art. By using technologies such as ‘Numbers into Notes’, people are able to engage with and understand how such technologies might be used”.

A news post from the University of Nottingham can be found here – “Musical Composition and its Impact on Culture and Research”.

The event was well attended by over 1300 members of the public across two days, with lively discussion and engagement with an exciting range of demonstrations, including Living Room of the Future, Smart Utopia, Karma Kettles, Human Sensor, Coral Love Story, The Listening Wood, Tales of the Park, Data Feeders, and Move for Me Baby. Further information can be found on the Tate Modern website.

The PETRAS Internet of Things (IoT) Research Hub is a consortium of nine leading UK universities working together to explore critical issues in privacy, ethics, trust, reliability, acceptability, and security.

FAST documentation film

Music’s changing fast, FAST is changing music. Our FAST documentation film is now available to watch (click on one of the two links below):

Full film VIMEO: https://player.vimeo.com/video/307448834
Full film YOUTUBE: https://youtu.be/VF4qQRcGfl8

Filmed at various locations, including the legendary Abbey Road Studios, the short film tells the FAST story. It features Mark Sandler (FAST Principal Investigator), David de Roure (Oxford Co-Investigator), Steve Benford (Nottingham Co-Investigator) and Josh Reiss (Queen Mary Co-Investigator), alongside Mark d’Inverno (FAST Advisory Board chair), pianist and composer Maria Kallionpää, and artist, composer and researcher Tracy Redhead. The background music is by Tracy Redhead (also featured on our FAST LP).

More information about the FAST Industry Day:

FAST Industry Day

FAST Press release: Music’s changing FAST, FAST is changing music

Showcasing the culmination of five years of digital music research, the FAST IMPACt project (Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption), led by Queen Mary University of London, hosted an invite-only industry day at Abbey Road Studios on 25 October.

You can read our press release in full here.

FAST project member Kevin Page chairs DLfM

Seventy international researchers attended the 5th conference on Digital Libraries for Musicology (DLfM) in Paris, France, organised by the FAST project in collaboration with the Unlocking Musicology project and the IReMus centre of the CNRS (Institut de recherche en Musicologie, Centre national de la recherche scientifique).

Dr Kevin Page, senior researcher at FAST partner the University of Oxford, served as programme and general chair for the conference, working with local chairs Dr Christophe Guillotel-Nothmann and Dr Cécile Davy-Rigaux, CNRS Directrice de recherche. This year DLfM was held at the Institut de Recherche et Coordination Acoustique/Musique (IRCAM) in Paris, France, a world-renowned and ground-breaking centre working at the intersection of music and technology.

The programme showcased research into the application of computational and informatics approaches to the study, analysis and organisation of digital music and music-related corpora. The papers were presented across four themed sessions on Technological Advances, Digital Studies, Recognition and Encoding, and Collections, with the proceedings forming part of the ACM ICPS series and available online with open access.

The FAST project’s MELD framework featured in a paper at the conference, co-authored by FAST team member and MELD lead developer David Weigl, alongside colleagues from the University of Oxford e-Research Centre and Faculty of Music. The paper describes how MELD was used to create an interactive exhibition on the work of composer Frederick Delius (1862–1934), using Linked Data to combine digital material from the British Library, Delius Trust, and an annotated string quartet performance recorded earlier this year in Oxford. The technical innovations drive an environment for enriched engagement with musical sources and give an insight into the creative process.

FAST Report on FAST Industry Day @Abbey Road Studios


Introduction by Professor Mark d’Inverno

Music’s changing fast: FAST is changing music. Showcasing the culmination of five years of digital music research, the FAST IMPACt project (Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption), led by Queen Mary University of London, hosted an invite-only industry day at Abbey Road Studios on Thursday 25 October, 2 – 8 pm. Presented by Professor Mark Sandler, Director of the Centre for Digital Music at Queen Mary, the event showcased to artists, journalists and industry professionals the next-generation technologies that will shape the music industry – from production to consumption.


Professor Mark Sandler introducing FAST

FAST is looking at how new technologies can positively disrupt the recorded music industry. Research from across the project was presented to the audience, with work from partners at the University of Nottingham and the University of Oxford shown alongside that from Queen Mary. The aim was that, by the end of the FAST Industry Day, people would gain some idea of how AI and the Semantic Web can couple with signal processing to overturn conventional ways of producing and consuming music. Along the way, industry attendees were able to preview some cool and interesting new ideas, apps and technology showcased by the FAST team.


Panel session members

One hundred and twenty (120) attendees were treated to an afternoon and evening of talks, demonstrations, the Climb! performance, and an expert panel discussion with Jon Eaves (The Rattle), Paul Sanders (state51), Peter Langley (Origin UK), Tracy Redhead (award-winning musician, composer and interactive producer, University of Newcastle, Australia), Maria Kallionpää (composer and pianist, Hong Kong Baptist University) and Mark d’Inverno (Goldsmiths), who chaired the panel. Throughout the day, Rivka Gottlieb, harpist and music therapist, performed musical pieces based on her collaboration with PI David de Roure (Oxford e-Research Centre) and the ‘Numbers into Notes’ project. Other speakers included George Fazekas, who outlined the Audio Commons Initiative, Tracy Redhead and Florian Thalmann, who presented their work on the semantic player technologies, and Ben White, who spoke about the Open Music Archive project (exploring the intersection between art, music and archives).

The FAST Industry Day was opened by Lord Tim Clement-Jones (Chair of Council, Queen Mary University of London) and was compered by Professor Mark d’Inverno (Professor of Computing at Goldsmiths College, London).

Below are some highlights:

Carolan Guitar: Connecting Digital to the Physical – The Carolan Guitar tells its own story. Play the guitar, contribute to its history, scan its decorative patterns and discover its story. Carolan uses a unique visual marker technology that enables the physical instrument to link to the places it’s been, the people who’ve played it and the songs it’s sung, with deep learning techniques used to improve event detection. https://carolanguitar.com

FAST DJ – FAST DJ is a web-based automatic DJ system and plugin that can be embedded into any website. It generates transitions between any pair of successive songs and uses machine learning to adapt to the user’s taste via simple interactive decisions.

Grateful Dead Concert Explorer – A web service for the exploration of recordings of Grateful Dead concerts, drawing its information from various Web sources. It demonstrates how Semantic Audio and Linked Data technologies can produce an improved user experience for browsing and exploring music collections. See Thomas Wilmering explaining more about the Grateful Dead Concert Explorer: https://vimeo.com/297974486

Jam with Jamendo – Jam with Jamendo brings music learners and unsigned artists together by recommending suitable songs as new and varied practice material. In this web app, users are presented with a list of songs based on their selection of chords. They can then play along with the chord transcriptions or use the audio as backing tracks for solos and improvisations. Using AI-generated transcriptions makes it trivial to grow the underlying music catalogue without human effort. See Johan Pauwels explaining more about Jam with Jamendo: https://vimeo.com/297981584

MusicLynx – a web platform for music discovery that collects information and reveals connections between artists from a range of online sources. The information is used to build a network that users can explore to discover new artists and how they are linked together.

The SOFA Ontological Fragment Assembler – enables the combination of musical fragments – Digital Music Objects, or DMOs – into compositions, using semantic annotations to suggest compatible choices.

Numbers into Notes – experiments in algorithmic composition and the relationship between humans, machines, algorithms and creativity; a minimal illustrative sketch of this style of number-to-note mapping appears after this list of highlights. See David de Roure explaining more about the research: https://vimeo.com/297989936

rCALMA Environment for Live Music Data Science – a big data visualisation of musical key across the Live Music Archive, using Linked Data to combine programmes and audio feature analysis. See David Weigl talking about rCALMA: https://vimeo.com/297970119

Climb! Performance Archive – Climb! is a non-linear composition for Disklavier piano and electronics. This web-based archive creates a richly indexed and navigable archive of every performance of the work, allowing audiences and performers to engage with the work in new ways.
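
For readers curious about how a number sequence becomes a melody in tools of this kind, here is a minimal, illustrative Python sketch: it takes a Fibonacci sequence, reduces each number modulo the size of a scale, and maps the result onto MIDI pitches. The choice of sequence, scale and octave spread is an assumption made for the example, not the actual implementation of Numbers into Notes.

# Illustrative sketch only: mapping a number sequence onto notes, in the
# spirit of algorithmic-composition tools such as Numbers into Notes.
# The sequence, scale and octave handling are assumptions for this example.

def fibonacci(n):
    """Return the first n Fibonacci numbers."""
    seq, a, b = [], 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def to_midi_pitches(numbers, scale=(0, 2, 4, 5, 7, 9, 11), base=60):
    """Map integers onto a scale (C major here, starting at middle C)."""
    pitches = []
    for x in numbers:
        degree = x % len(scale)         # reduce the number to a scale degree
        octave = (x // len(scale)) % 3  # spread the result over three octaves
        pitches.append(base + 12 * octave + scale[degree])
    return pitches

if __name__ == "__main__":
    print(to_midi_pitches(fibonacci(16)))

Changing the generating sequence, the modulus or the scale gives a different melody from the same few lines, which is the essence of the approach.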

The FAST project brings together labs from three of the UK’s top universities: Queen Mary’s Centre for Digital Music, the University of Nottingham’s Mixed Reality Lab and the University of Oxford’s e-Research Centre.

More about the FAST Industry Day:
https://www.semanticaudio.ac.uk/events/fast-industry-day

Full list of FAST demonstrators:
https://www.semanticaudio.ac.uk/demonstrators/

News item on Audio Commons demonstrators at the FAST Industry Day:
https://www.audiocommons.org/2018/10/23/abbey-road-industry.html

High-tech spin-out company FXive is formed

Research in the FAST project and several other projects has demonstrated a new way to perform sound design. High-quality, artistic sound effects can be achieved by the use of lightweight and versatile sound synthesis models. Such models do not rely on stored samples, and provide a rich range of sounds that can be shaped at the point of creation.

This system is now live at https://fxive.com (requires the Chrome browser), a web platform for sound effect synthesis. On July 25th, Prof. Josh Reiss and collaborators co-founded the company FXive to commercialise the technology. FXive is now seeking investment and working towards a full commercial launch. FXive will be demonstrated at the FAST Industry Day.
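
To give a flavour of what sample-free sound design means in practice, the sketch below synthesises a simple wind-like effect from filtered noise, with a single intensity control that shapes the sound as it is generated. It is an illustrative toy using only Python’s standard library; the wind model and its parameters are assumptions for the example and are not FXive’s actual synthesis algorithms.

# Minimal sketch of procedural (sample-free) sound synthesis: a wind-like
# effect made from filtered white noise, with one "intensity" control that
# shapes the sound at the point of creation. Illustrative only; not FXive.
import math
import random
import struct
import wave

def wind(duration_s=3.0, intensity=0.5, sample_rate=44100):
    """Return floating-point samples of a simple wind-like sound."""
    samples = []
    lp = 0.0    # one-pole low-pass filter state
    gust = 0.0  # slowly varying gust envelope
    for _ in range(int(duration_s * sample_rate)):
        noise = random.uniform(-1.0, 1.0)
        # Slow random walk for gusts; intensity controls how wild they get.
        gust += random.uniform(-1.0, 1.0) * 0.0005 * (0.5 + intensity)
        gust = max(0.0, min(1.0, gust))
        # Low-pass the noise; a higher cutoff (brighter sound) at high intensity.
        alpha = 0.02 + 0.08 * intensity
        lp += alpha * (noise - lp)
        samples.append(lp * (0.2 + 0.8 * gust))
    return samples

def write_wav(path, samples, sample_rate=44100):
    """Write mono 16-bit samples to a WAV file."""
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(sample_rate)
        f.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767)) for s in samples
        ))

if __name__ == "__main__":
    write_wav("wind.wav", wind(intensity=0.7))

Because the sound is computed rather than played back, the same few lines can produce an unlimited range of variations simply by changing the intensity parameter, which is the core appeal of the procedural approach described above.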

 

Centre releases Music Encoding and Linked Data software

Oxford FAST team members, researchers at the University of Oxford’s e-Research Centre, have announced version 1.0 of their Music Encoding and Linked Data (MELD) framework, a flexible software platform for research which combines digital representations of music – such as audio and notation – with contextual and interpretive knowledge in the Semantic Web.

The release of MELD represents a significant milestone in the Centre’s activities in the £5m EPSRC-funded Fusing Audio and Semantic Technologies (FAST) project, a collaboration with Queen Mary University of London and the University of Nottingham.

“MELD brings an innovative new model for combining multimedia music resources, moving beyond milliseconds and simple labels to capture meaningful associations derived from music theory,” explains Senior Researcher Kevin Page, who leads the FAST Music Flows activity and MELD research within the Centre. Dr Page, a member of the group which produced the W3C Linked Data Platform (LDP) specification, adds: “by extending standards including LDP, Web Annotations, and the Music Encoding Initiative (MEI), MELD provides a flexible, scalable core while simultaneously enabling the detailed application-specific customisations researchers and industry find valuable”.
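
As a rough illustration of the kind of linkage these standards enable, the sketch below builds a W3C Web Annotation (shown as a Python dictionary in JSON-LD form) that connects a measure of an MEI-encoded score to a passage of an audio recording addressed with a Media Fragment URI. The URIs, identifiers and selector values are hypothetical, chosen for the example; this is not MELD’s own data or API.

# Illustrative sketch: a W3C Web Annotation linking an MEI score fragment to
# an audio passage. All URIs and identifiers below are hypothetical.
import json

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "linking",
    # Target: a measure in an MEI encoding, addressed by its xml:id
    # (hypothetical source URI and fragment identifier).
    "target": {
        "type": "SpecificResource",
        "source": "https://example.org/scores/quartet.mei",
        "selector": {
            "type": "FragmentSelector",
            "value": "measure-42",
        },
    },
    # Body: the corresponding stretch of audio, addressed with a W3C Media
    # Fragment (seconds 61.0 to 65.5 of a hypothetical recording).
    "body": {
        "id": "https://example.org/recordings/performance.mp3#t=61.0,65.5",
        "type": "Audio",
    },
}

print(json.dumps(annotation, indent=2))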

Dr David Weigl, principal developer of the MELD framework, recounts that “what’s been fascinating and rewarding is the variety of research we’ve worked on. We’ve effectively created a new instrument to perform a contemporary piece of music, analysed how musicians rehearse and perform, and are now building an interface to explore historical catalogues in the British Library. It really highlights the adaptability of our approach”.

“Climb!”, a non-linear composition for Disklavier piano and electronics by Maria Kallionpää, is an example of one such application. The performance environment for Climb! was built by Nottingham’s Mixed Reality Lab in collaboration with the Oxford FAST team, and combines their Muzicodes software with MELD. Climb! will receive its next performance during our FAST Industry Day at the world-famous Abbey Road Studios, where Oxford researchers will be on hand to demonstrate and explain their innovations to artists, journalists and industry professionals.


In collaboration with colleagues from the Faculty of Music, MELD will also play a supporting role at the forthcoming Digital Delius: Unlocking Digitised Music Manuscripts event at the British Library. It is the technical foundation for an experimental digital exhibition presenting scores and sketches, early recordings, photographs, and concert programmes showcasing the music of British-born composer Frederick Delius (1862–1934). The materials are complemented by expert commentary and an interactive MELD application which situates the role of the items within the creative process.

This was also the theme of a workshop earlier in the year, when Centre researcher David Lewis used MELD to record the performance adaptations made by a student ensemble under the tutelage of the Villiers Quartet.

Professor David De Roure, Oxford FAST Investigator, summarises: “In MELD, we’ve created an implementation of Digital Music Objects, or DMOs. This next generation technology realises the FAST research vision for end-to-end semantics across all stages of the music lifecycle, bringing fantastic creative opportunities to artists, the music industry and consumers.”

MELD is open source and available from GitHub.

News Source: http://www.oerc.ox.ac.uk/news/centre-releases-music-encoding-software (13 Sept 2018)

FAST participates in BBC R & D ‘Sounds Amazing’ event

On Wednesday 2 May, FAST team members from the Centre for Digital Music at Queen Mary, Alo Allik and Josh Reiss, presented their latest research at the Sounds Amazing 2018 event at the BBC in London. The event was attended by researchers and professionals working in the field of spatial audio.


Alo Allik showing MusicLynx

Alo Allik gave a demonstration of MusicLynx, a web application for music discovery that enables users to explore an artist similarity graph constructed by linking together various open public data sources. Josh Reiss gave a demonstration of FXive, a new way to perform sound design. High quality, artistic sound effects can be achieved by the use of lightweight and versatile sound synthesis models. Such models do not rely on stored samples, and provide a rich range of sounds that can be shaped at the point of creation.


Josh Reiss showing FXive

Sounds Amazing was presented by the BBC Academy Fusion project, BBC R&D and the S3A partnership. The event builds on the successful ‘Sound: Now and Next 2015’ conference from BBC R&D. It consisted of a day of talks, panels and a tech expo aimed at inspiring and informing those from production and engineering about the latest developments in the amazing world of audio.

The detailed programme for the event is available below:

Morning: Award-Winning Audio Production, Commissioning and Top Tech Tips.
Matthew Postgate, BBC Chief Technology and Product Officer, leading the BBC’s Design & Engineering division, will open the event. L.J. Rich – Technology presenter (BBC Click), sound designer, inventor and NASA Datanaut – will be the host for the day.

Sounds Dramatic explores how great audio can add drama to your content, whether fact or fiction, radio, TV or emerging VR. Presentations include multi-award-winning podcast producer James Robinson on the heart-stopping drama ‘Tracks’, and the outstanding sound team of Kate Hopkins and Graham Wilde, who were behind ‘Blue Planet II’ and ‘Planet Earth II’ for BBC 1.

A Sound Commission looks at what ticks the Commissioners’ boxes as Ben Chapman – Head of Digital, BBC Radio & Music; Mohit Bakaya – Commissioning Editor, Factual, Radio 4; and Zillah Watson – Commissioning Editor, Virtual Reality, BBC VR Hub share their favourite projects and preferences.

Tips On Top Tech! Ali Shah – Head of Emerging Technology & Strategic Direction at BBC and Chris Pike – Lead Audio R&D Engineer at BBC provide a rapid-fire guide to the latest technology to help you save money and sound great!

12:00-13:30 Networking Lunch and Tech Expo (Delivered by the S3A project)
An opportunity to immerse yourself in futuristic sound experiences, connect with cutting-edge technology experts from universities and industry – and explore exciting new partnerships. Lunch included for ticket holders, held in the Media Café.

 Afternoon: Live, Immersive and Interactive – Innovative Live Performance and Immersive Production Techniques. 

Kicking the Mic! – Fusing live tap dance, looping and a fully sound-reactive LED dress – a short multi-sensory show by groundbreaking artist Laura Kriefman.

 Sound Bites – Immersive Masterclass – Catherine Robinson – Audio Supervisor, BBC explores the differences between 3D, binaural and surround sound and talks about her top production tips, setting up her own Binaural Studio in Wales and the developments happening there in radio, digital, live and VR.

Live and Kicking – Tom Parnell – Senior Audio Supervisor, BBC R&D joins Dr. Paul Ferguson – Associate Professor of Audio Engineering, Edinburgh Napier University and Guto Thomas – Workflow and Mobile Technology Specialist at BBC Cymru Wales for a session exploring innovation in live recording.

Sounds Academic – Professor Trevor Cox – Professor of Acoustic Engineering, University of Salford presents his experiment in object-based audio by sharing a radio drama in a novel way, using mobile phones.

Sound Bites – Breaking The Sound Barrier! Composer and producer Matthew Herbert – BBC Radiophonic Workshop provides us with a glimpse into the razor-sharp cutting edge of audio technology and his experience of breaking new territory.

Talk To Me – Interactive Sound – Mukul Devichand – Editor, Voice at BBC introduces the world of interactive sound and its potential for broadcasters. Henry Cooke – Senior Producer / Creative Technologist in BBC R&D gives us the producer’s view of ‘The Inspection Chamber’, an original interactive audio drama. Mark Savage – Music Reporter, BBC, describes his experience of living for a month with an Apple HomePod, Amazon Echo, Google Home… putting seven speakers to the test and talking to himself.