Category Archives: News

FAST participants attend Sonar+D 2016 festival

From 16-18 June 2016, the Centre for Digital Music (C4DM) at Queen Mary University of London (QMUL), including members of the FAST project, presented a public exhibition of its research at the Sonar+D festival, a high-profile annual event in Barcelona which caters to musicians, the music technology industry, and members of the general public. C4DM was chosen by competitive application for a display booth (roughly 4m x 4m) on the exhibition floor, in a prime location near the entrance to the venue. Twelve C4DM researchers, including PhD students, postdocs, and early- and mid-career academics, attended to showcase their work.

The event was attended by thousands of people, mainly adults but occasionally children too. The booth drew many visitors from both large and small businesses, including several members of the music technology company Focusrite. Several musicians who were performing and speaking at Sonar also visited the booth, including the well-known composer Brian Eno, who took an interest in several of the research projects. The event raised the public profile of C4DM and the individual projects within it.

The projects shown that directly related to FAST were:

  • Bela, an open-source platform for ultra-low-latency audio and sensor processing, which launched on Kickstarter in 2016. Bela attracted significant interest from Sonar attendees;
  • Moodplay, a mobile phone-based system that allows users to collectively control music and lighting effects to express desired emotions;
  • MixRights, a demo of how content reuse is enabled by emerging MPEG standards, such as the IM AF format for interactive music apps and the MVCO ontology for IP rights tracking, driving a shift of power in the music value chain.

Other C4DM projects shown were:

  • Augmented Violin, a sensor-based extension of the traditional violin to give students constructive feedback on their playing;
  • Tape.pm, an interactive object which explores novel ways to record and share an improvisation that is created on a musical instrument;
  • Aural Character of Places, an interactive online demo of soundwalks conducted around London;
  • TouchKeys, a transformation of the piano-style keyboard into an expressive multi-touch control surface, launched on Kickstarter in 2013 and spun out into a company in 2016;
  • Collidoscope, a collaborative audio-visual musical instrument which was a viral hit online with over 10M views;
  • RTSFX (Real-Time Sound Effects), an online library of real-time synthesised (rather than sampled) sound effects for a variety of different objects and environmental sounds.

The Sonar+D exhibition generated substantial publicity for C4DM and the FAST project, with thousands of people attending the booth. The Sonar+D organisation also featured C4DM in its online media and videos, and we were interviewed for a broadcast on Spanish television. Finally, C4DM and FAST attendees at Sonar+D also had the opportunity to see other booths and talks at the event, generating new ideas and connections for future projects.

Professor Mark Sandler elected Fellow of the Royal Academy of Engineering

Professor Mark Sandler is one of 50 of the UK’s top engineers welcomed as new Fellows on the day marking the Academy’s 40th annual general meeting. There are only 1,500 Fellows in total, and Fellowship of the Royal Academy of Engineering is considered one of the greatest national achievements an engineer can receive.

The honour was awarded in recognition of Professor Sandler’s research contributions in digital music, and in particular computer-generated musical analysis.

Commenting on his election, Professor Sandler said: “I am truly honoured to be elected a Fellow of the Royal Academy of Engineering. I think I will be one of a small minority of Fellows interested in the Creative Industries, and so I hope to be able to raise that sector’s profile within the Academy.”

Professor Sandler has been leading the £5 million EPSRC-funded project “Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption” (FAST IMPACt) since June 2014, with UK partners from the University of Nottingham (Professor Steve Benford, Mixed Reality Lab) and the University of Oxford (Professor David De Roure, Oxford e-Research Centre). For a full list of FAST partners, see our Participant page.

Read the full QMUL press release:
http://www.qmul.ac.uk/media/news/items/se/181405.html

Read the full Royal Academy of Engineering press release:
http://www.raeng.org.uk/news/news-releases/2016/september/academy-elects-top-engineers-as-fellows-at-its-40t

Oxford partners participate in the Digital Humanities Summer School

Digital Humanities at Oxford Summer School, 4 – 8 July 2016

The Digital Humanities at Oxford Summer School is the largest in Europe (and second largest in the world). It aims to encourage, inspire, and provide the skills for innovative research across the humanities using digital technologies, and to foster a community around it.

One of the workshops in the Digital Humanities at Oxford Summer School programme was the Digital Musicology workshop, convened by Dr Kevin Page (Oxford e-Research Centre), one of our FAST project partners from Oxford. The workshop provided an introduction to computational and informatics methods that can be, and have been, successfully applied to musicology, and included a strand on musicology in the context of Linked Data. Many of these techniques have their foundations in computer science, library and information science, mathematics and, most recently, Music Information Retrieval (MIR); sessions were delivered by expert practitioners from these fields, presented in the context of their collaborations with musicologists, and by musicologists relating their experiences of these multidisciplinary investigations. The workshop comprised a series of lectures and hands-on sessions, supplemented with reports from musicology research exemplars. Theoretical lectures were paired with practical sessions in which attendees were guided through their own exploration of the topics and tools covered.

Other FAST partners contributing to the Digital Humanities Summer School were Professor Dave De Roure and Dr David Weigl, both from the Oxford e-Research Centre. De Roure’s workshop on Social Humanities also included a session on designing music social machines.

Finally, FAST project member Chris Cannam (Centre for Digital Music, Queen Mary) gave two tutorial sessions at the workshop: ‘Applied computational and informatics methods for enhancing musicology’ and ‘Using computers to analyse recordings: an introduction to signal processing’ (with co-tutor Ben Fields, Oxford). Both sessions introduced the basics of the computational treatment of music recordings, based on the concept of ‘features’ that can be derived from the audio signal by suitable processing. The hands-on session ‘Using computers to analyse recordings’ introduced participants to software for extracting features from recordings and visualising them, and helped them understand how features relate to perceptual and musical concepts.
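
To make the notion of a ‘feature’ concrete, here is a minimal sketch in Python using the open-source librosa library. The tools actually used in the session are not named above, and the filename is a placeholder, so treat this as an illustration of the concept rather than the course material itself.

```python
# Minimal illustration of deriving 'features' from a music recording.
# Uses the open-source librosa library; "recording.wav" is a placeholder.
import librosa

# Load the recording as a one-dimensional signal (mono, 22.05 kHz by default)
y, sr = librosa.load("recording.wav")

# A low-level feature: spectral centroid, loosely related to perceived brightness
centroid = librosa.feature.spectral_centroid(y=y, sr=sr)

# A musically meaningful feature: a chromagram, which maps spectral energy
# onto the twelve pitch classes and underpins chord and key analysis
chroma = librosa.feature.chroma_stft(y=y, sr=sr)

# Higher-level features: an estimated tempo and beat positions
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

print("Estimated tempo (BPM):", tempo)
print("Beats detected:", len(beat_times))
```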

Relevant links with further information:

http://www.oerc.ox.ac.uk/news/digi-humanities-summer-school
http://digital.humanities.ox.ac.uk/dhoxss/2016/
http://digital.humanities.ox.ac.uk/dhoxss/2016/workshops/digitalmusicology
http://digital.humanities.ox.ac.uk/dhoxss/2016/workshops/sochums

Collaboration between FAST and Audiolabs Erlangen

By Sebastian Ewert (Centre for Digital Music, Queen Mary University of London)

FAST project member Sebastian Ewert (Queen Mary University of London) visited our project partner, the International Audio Laboratories Erlangen, a joint institution of the Friedrich-Alexander-University Erlangen-Nuernberg (FAU) and the Fraunhofer Institut fuer Integrierte Schaltungen IIS, in July. In a week-long meeting the partners exchanged their knowledge and experience regarding various aspects of neural networks, a technology currently dominating developments in signal processing and machine learning, with applications ranging from automatically colourising grayscale images (*1) and repainting photos in the style of famous artists (*2), to translating between languages (*3) and automatically steering cars and robots in the real world (*4). The discussion with Prof. Meinard Mueller and his doctoral students from the Semantic Audio Signal Processing group focused on several new concepts for using or developing variants of neural networks capable of extracting different types of semantic information from music recordings.

(*1) http://tinyclouds.org/colorize/
(*2) https://arxiv.org/abs/1508.06576
(*3) https://www.engadget.com/2016/03/11/google-is-using-neural-networks-to-improve-translate/
(*4) http://spectrum.ieee.org/computing/embedded-systems/bringing-big-neural-networks-to-selfdriving-cars-smartphones-and-drones

FAST members present their work at MEC

FAST Report from the Music Encoding Conference, 17-20 May 2016, Montreal, Canada
(by David Weigl)

FAST project members David Weigl and Kevin Page (Oxford e-Research Centre) recently presented their work on Semantic Dynamic Notation at the Music Encoding Conference (MEC) in Montreal, Canada. The MEC is an important annual meeting of academics and industry professionals working on the next generation of digital music notation. The presented work builds on the Music Encoding Initiative (MEI) format for encoding musical documents in a machine-readable structure.

Semantic Dynamic Notation augments MEI using semantic technologies including RDF, JSON-LD, SPARQL, and the Open Annotation data model, enabling the fine-grained incorporation of musical notation within a web of Linked Data. This fusing of music and semantics affords the creation of rich Digital Music Objects supporting contemporary music consumption, performance, and musicological research.
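
As a rough illustration of how such an annotation might be structured, the sketch below builds a Web Annotation-style JSON-LD document that attaches a textual ‘call-out’ to a single note in an MEI encoding. All URIs, identifiers and property values here are invented for illustration and are not the demonstrator’s actual vocabulary.

```python
# Hypothetical sketch: a JSON-LD annotation in the spirit of the Open
# Annotation / Web Annotation data model, targeting one note in an MEI
# document by its xml:id. All URIs and IDs below are invented examples.
import json

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    # The target addresses a fragment of the machine-readable score
    "target": "http://example.org/scores/jam-session.mei#note-0042",
    # The body carries the semantic payload, here a performance 'call-out'
    "body": {
        "type": "TextualBody",
        "value": "Call-out: repeat from the bridge",
    },
}

# Serialise for exchange between performers' devices
print(json.dumps(annotation, indent=2))
```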

The use case served by the presented demonstrator draws inspiration from an informal, late-night FAST project ‘jam session’ at the Oxford e-Research Centre. It enables musicians to annotate and manipulate musical notation in real time during a performance, applying ‘call-outs’ to shape the structural elements of the performance, or signalling to the other players a new piece to transition to. Each performer’s rendered digital score synchronises to each shared action, making score adaptations immediately available to everyone in the session to support the collaborative performance. The supported actions transcend the symbolic representation of the music being played, and reference significant semantic context that can be captured or supplemented by metadata from related material (e.g., about the artist, a particular style, or the music’s structure).

As well as augmenting and enabling dynamic manipulation of musical notation in a performance context, the system captures provenance information, providing insight into the temporal evolution of the performance in terms of the interactions with the musical score. This demonstrates how notation can be combined with semantic context within a Digital Music Object to provide rich additional interpretation, available to musicians in real time during a performance, and to consumers as a performance outcome.


FAST partners participate in the annual CHI conference

The FAST project was in full force at the ACM’s annual Computer-Human Interaction conference – widely known as CHI – in San Jose in May. CHI is the leading conference for research into interfaces and interaction, and was attended by over 3,000 delegates this year.

First up for FAST was the Nottingham team’s work on the Carolan guitar as an example of an ‘accountable artefact’: a physical object that becomes associated with a growing digital record as it is passed among different custodians over its lifetime, and that can be interrogated to tell various stories of provenance, use or personal meaning. Their research paper on Carolan was well received, winning a “Best of CHI” honourable mention (awarded to the top 5% of all submissions) and also being nominated for Best Art paper (the first year for this category). The Nottingham FAST team (Mixed Reality Lab) also presented Carolan as an exhibit at Interactivity, CHI’s hands-on programme of exhibits. FAST also contributed a paper on brain-controlled interfaces for entertainment, specifically a study of the design and experience of our associate artist (and PhD student) Richard Ramchurn’s unique movie #Scanners. Richard, Matthew and the team also exhibited #Scanners at CHI Interactivity, attracting great interest and invitations to stage the work elsewhere.

Queen Mary University of London’s FAST team participated in CHI 2016’s Music and HCI workshop and gave Open Symphony performances at CHI’s Interactivity session. Several FAST demonstrators were presented in the review paper “Crossroads: Interactive Music Systems Transforming Performance, Production and Listening” (http://bit.ly/crossroadsMusicHCI) at the Music and HCI workshop, which gathered international experts and pioneers of the field of Music Interaction.

Read more about this news item on our FAST blog.

Related links:
https://www.nottingham.ac.uk/ComputerScience/Outreach/Carolan-Guitar.aspx
http://isophonics.net/content/opensymphony

FAST participates in The Able Orchestra project

The Able Orchestra project is the latest in a series of projects achieved through the partnership between Orchestras Live and the Nottinghamshire Music Hub. The project was co-produced by County Youth Arts with Orchestras Live and aims to bring world-class orchestral work to under-served parts of the UK. During the project, music was created by young people with disabilities from Fountaindale School and students from Outwood Academy Portland, along with members of the Hallé Orchestra, all inspired by the BBC’s ’Ten Pieces’ initiative. The event took place at the Palace Theatre Mansfield on Monday 9th May 2016.

Amy Dickens, Associate PhD student from the Mixed Reality Laboratory (MRL), University of Nottingham, assisted the Fountaindale School students along with Si Tew (Music Producer), Ronnie Sampson (Electronic musician & singer-songwriter), John Sampson (Music Producer) and Bec Smith (Digital Artist) from Urban Projections. Amy helped the students to create sounds through movement using Leap Motion sensors and helped to facilitate other musical performances with iPads. The involvement with the project has enabled ethnographic study of HCI challenges in accessible music-making and has contributed to Amy’s research in gesture-controlled musical interventions for users with limited movement.

Amy’s involvement in the project came about through a FAST IMPACt meeting with the MRL and potential partners at Nottingham County Youth Arts and the Nottinghamshire Music Hub, where the project was described and opened up to researchers who might be interested in getting involved.

Amy held a design workshop to discuss ideas for gesture-controlled sound in this setting. This led to the development of a prototype, and to the sensor’s introduction to the project and use in the project sessions. It is hoped that data from the project will provide key points from which to adapt the prototype and continue developing technology for a more tailored interaction.

Professor Steve Benford announces ‘Artcodes’

Our partner from Nottingham, Professor Steve Benford, Mixed Reality Lab, just announced Artcodes – “the greatly extended and rebranded Aestheticodes app”.

The Carolan guitar connects to its digital record through its decorative inlay. Specifically, the Celtic knotwork patterns spread around Carolan contain six different computer-readable codes hidden within them, which users can scan using a mobile device to connect to various webpages. The Artcodes app reads the codes in the patterns and then maps them onto particular webpages.

The extended Artcodes app enables users to create and share their own interactive experiences that involve scanning decorated objects. First, they can learn to draw scannable patterns that contain codes. Then they can map these to webpages of their choosing. Alternatively, users can appropriate an existing experience, opening it up and editing its codes to point at different pages of their choosing, as described in the Carolan blog post. Finally, they can share their new experience with other Artcodes users. They can also update their experience later on, resetting its codes to point at new web pages.
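
At its core, an experience is a lookup from recognised codes to web addresses. The sketch below mimics that mapping in Python; the code strings and URLs are invented for illustration, and the real app’s pattern-recognition pipeline and data format are not shown.

```python
# Illustrative sketch of an Artcodes-style 'experience': a mapping from
# codes recognised in a drawn pattern to webpages. The code strings and
# URLs below are invented examples, not real Artcodes data.
experience = {
    "1:1:2:4:4": "http://example.org/carolan/provenance",
    "1:1:3:3:4": "http://example.org/carolan/making-of",
}

def resolve(scanned_code: str) -> str:
    """Return the webpage mapped to a scanned code, or a fallback page."""
    return experience.get(scanned_code, "http://example.org/not-found")

# Appropriating an experience: re-point an existing code at a new page
experience["1:1:2:4:4"] = "http://example.org/my-own-story"

print(resolve("1:1:2:4:4"))  # -> http://example.org/my-own-story
```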

The Carolan guitar team, led by Professor Steve Benford, hope that these features will empower all sorts of creative people to become Artcodes designers and users, and that they can grow an active Artcodes community.

To find out more, visit Professor Steve Benford’s latest Carolan blog post.

FAST partners meet in Paris for their 2nd all-hands meeting

On 15-16 February this year, FAST IMPACt partners met at the University of London Institute in Paris for their 2nd all-hands meeting, in the beautiful surroundings of the Invalides quarter and in sunny Parisian weather. The meeting was a combination of keynote talks, ‘show and tell’ presentations, group discussions and break-out sessions. Forty-five (45) partners participated in the meeting activities.

The first day of the meeting began with an opening session by Professor Mark Sandler, FAST IMPACt Principal Investigator, Queen Mary University of London, who gave an overview of the latest project progress. The first keynote presentation of the day, “Creative Musical Expression with Human-Computer Partnerships”, was delivered by Professor Wendy Mackay (Université Paris-Saclay). Mackay’s presentation described her and her colleagues’ work with contemporary music composers at IRCAM in Paris, which led to a series of interactive composition tools based on interactive paper: composers were given the possibility to express highly diverse musical ideas on paper and make their own associations with powerful composition tools, such as OpenMusic. The goal is to empower creative professionals through reciprocal human-computer partnerships, such that users and interactive systems each learn from and affect each other’s behaviour.

Then the Co-Investigators, Professor Steve Benford (Nottingham), Professor Dave de Roure (Oxford), and Professor Geraint Wiggins (Queen Mary University of London), presented progress reports on five (5) of the project’s eight (8) Work Threads: Inference, Workflow, Interface, Digital Music Objects, and Ethnography and Design. This was followed by the second keynote presentation, by Jean-Julien Aucouturier (IRCAM), on “Musical Friends and Foes: Coordinated Musical Behaviour Can Communicate Social Intent”. He and his team tasked dyads of expert improvisers with using music to communicate a series of inter-personal attitudes, such as being domineering, disdainful or conciliatory, to one another, and then asked external judges to attempt to decode the type of social intent communicated in the recordings of these duets by relying dichotically on either both or only one audio channel. The results obtained establish that music can not only mediate, but also directly communicate, both affiliatory and non-affiliatory social behaviours.

The first day ended with a ‘show and tell’ session in which the postdocs and PhD students presented their latest research related to the project demonstrators.

The morning session of the second day began with a group discussion among the project partners of the results presented so far, facilitated by François Pachet (SONY CSL), followed by the third and final keynote presentation, by Professor Ichiro Fujinaga (McGill University), on the “Single Interface for Music Score Searching and Analysis Project”. The goal of the Single Interface for Music Score Searching and Analysis project (SIMSSA: http://simssa.ca), as Fujinaga explained, is to teach computers to recognize the musical symbols in images of scores and assemble the data on a single website, making it a comprehensive search and analysis system for online musical scores. Based on optical music recognition (OMR) technology, the team are creating an infrastructure and tools for processing music documents, transforming vast music collections into symbolic representations that can be searched, studied, analyzed, and performed.

Outcomes of the 2nd all-hands meeting will be discussed among the project partners over the following months; they will reconvene for their 3rd (midterm) all-hands meeting in the Cotswolds, 5th-8th December 2016.

For photos of the Paris meeting, click here:
https://www.semanticaudio.ac.uk/workshop-photos/?tag=54

Seminar ‘Cross-cultural Music Mood Recognition’

22 April 2016, 2 – 3 pm,
Conference Room (room 278),
Oxford e-Research Centre, 7 Keble Road, Oxford, OX1 3QG

Dr. Xiao Hu will give a seminar on “Cross-cultural Music Mood Recognition” at the Oxford e-Research Centre on 22nd April.

The seminar is open to all. For further information, see:
http://www.oerc.ox.ac.uk/events/cross-cultural-music-mood-recognition

Dr. Xiao Hu is an Assistant Professor in the Division of Information and Technology Studies in the Faculty of Education of the University of Hong Kong. She has been studying music mood recognition and MIR evaluation since 2006 and has won several international awards. Dr. Hu was a tutorial speaker on music affect recognition (2012) and a conference co-chair (2014) of the International Society for Music Information Retrieval (ISMIR) Conference. Dr. Hu obtained her Ph.D. in Library and Information Science and a Master’s in Computer Science from the University of Illinois.