Paper ID | MLSP-23.5
Paper Title | LOOPNET: MUSICAL LOOP SYNTHESIS CONDITIONED ON INTUITIVE MUSICAL PARAMETERS
Authors | Pritish Chandna, Antonio Ramires, Xavier Serra, Emilia Gómez, Universitat Pompeu Fabra, Spain
Session | MLSP-23: Applications in Music and Audio Processing
Location | Gather.Town
Session Time | Wednesday, 09 June, 16:30 - 17:15
Presentation Time | Wednesday, 09 June, 16:30 - 17:15
Presentation | Poster
Topic | Machine Learning for Signal Processing: [MLR-MUSAP] Applications in music and audio processing
Abstract | Loops, seamlessly repeatable musical segments, are a cornerstone of modern music production. Contemporary artists often mix and match various sampled or pre-recorded loops based on musical criteria such as rhythm, harmony and timbral texture to create compositions. Taking such criteria into account, we present LoopNet, a feedforward generative model for creating loops conditioned on intuitive parameters. We leverage Music Information Retrieval (MIR) models as well as a large collection of public loop samples in our study and use the Wave-U-Net architecture to map control parameters to audio. We also evaluate the quality of the generated audio and propose intuitive controls for composers to map the ideas in their minds to an audio loop.
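
As a rough illustration of the conditioning idea described in the abstract (not the authors' implementation), the sketch below shows a minimal Wave-U-Net-style 1D encoder–decoder in PyTorch that broadcasts a small vector of musical control parameters across the time axis and concatenates it with the input waveform. All layer sizes, the conditioning mechanism, and the parameter dimensionality are assumptions made for the example.

```python
# Illustrative sketch only: a small Wave-U-Net-like generator mapping
# (input waveform, control-parameter vector) -> output loop waveform.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DownBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size=15, padding=7)

    def forward(self, x):
        skip = F.leaky_relu(self.conv(x), 0.2)   # features kept for the skip connection
        return skip[:, :, ::2], skip             # decimate by 2 for the next scale


class UpBlock(nn.Module):
    def __init__(self, in_ch, skip_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv1d(in_ch + skip_ch, out_ch, kernel_size=5, padding=2)

    def forward(self, x, skip):
        # upsample to the skip's length, then merge the two feature maps
        x = F.interpolate(x, size=skip.shape[-1], mode="linear", align_corners=False)
        return F.leaky_relu(self.conv(torch.cat([x, skip], dim=1)), 0.2)


class ConditionedWaveUNet(nn.Module):
    def __init__(self, cond_dim=8, channels=(16, 32, 64)):
        super().__init__()
        in_ch = 1 + cond_dim                      # waveform + broadcast condition vector
        self.downs = nn.ModuleList()
        for ch in channels:
            self.downs.append(DownBlock(in_ch, ch))
            in_ch = ch
        self.ups = nn.ModuleList()
        for ch in reversed(channels):             # mirror the encoder
            self.ups.append(UpBlock(in_ch, ch, ch))
            in_ch = ch
        self.out = nn.Conv1d(in_ch, 1, kernel_size=1)

    def forward(self, wav, cond):
        # wav: (batch, 1, samples); cond: (batch, cond_dim) control parameters
        cond_map = cond.unsqueeze(-1).expand(-1, -1, wav.shape[-1])
        x = torch.cat([wav, cond_map], dim=1)
        skips = []
        for down in self.downs:
            x, skip = down(x)
            skips.append(skip)
        for up, skip in zip(self.ups, reversed(skips)):
            x = up(x, skip)
        return torch.tanh(self.out(x))            # waveform constrained to [-1, 1]


if __name__ == "__main__":
    model = ConditionedWaveUNet()
    excerpt = torch.randn(2, 1, 16000)            # e.g. one second of audio at 16 kHz
    controls = torch.rand(2, 8)                   # hypothetical intuitive musical parameters
    print(model(excerpt, controls).shape)         # -> torch.Size([2, 1, 16000])
```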