
Automated Rhythmic Transformation of Musical Audio

authors: Jason A. Hockman, Matthew E. P. Davies, Juan P. Bello, Mark D. Plumbley

abstract: Time-scale transformations of audio signals have traditionally relied exclusively upon manipulations of tempo. We present a novel technique for automatic mixing and synchronization between two musical signals. In this transformation, the original signal assumes the tempo, meter, and rhythmic structure of the model signal, while the extracted downbeats and salient intra-measure infrastructure of the original are maintained.
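As a rough illustration of the core idea (warping one recording onto the beat grid of a model recording), the sketch below stretches each inter-beat segment of an original signal so that its duration approximately matches the corresponding inter-beat interval of the model. This is not the authors' implementation: it relies on off-the-shelf librosa beat tracking and time-stretching, ignores downbeats and intra-measure structure, and the filenames original.wav and model.wav are placeholders.

```python
# Rough beat-synchronous time-scaling sketch (not the paper's method).
# Requires numpy, librosa, and soundfile; filenames are placeholders.
import numpy as np
import librosa
import soundfile as sf

orig, sr = librosa.load("original.wav", sr=None)   # signal to be transformed
model, _ = librosa.load("model.wav", sr=sr)        # signal supplying the rhythm

# Beat positions (in samples) for both signals.
_, orig_beats = librosa.beat.beat_track(y=orig, sr=sr, units="samples")
_, model_beats = librosa.beat.beat_track(y=model, sr=sr, units="samples")

# Stretch each inter-beat segment of the original so its duration
# approximately matches the corresponding inter-beat interval of the model.
segments = []
for i in range(min(len(orig_beats), len(model_beats)) - 1):
    seg = orig[orig_beats[i]:orig_beats[i + 1]]
    target = model_beats[i + 1] - model_beats[i]
    if len(seg) == 0 or target == 0:
        continue
    rate = len(seg) / target                       # >1 shortens, <1 lengthens
    segments.append(librosa.effects.time_stretch(seg, rate=rate))

sf.write("transformed.wav", np.concatenate(segments), sr)
```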


Audio examples (three clips per entry, as labeled in parentheses):

Crossfade Between Genres (clips: Original, Model, Crossfade)
  rumba into breakbeat with rumba rhythm

Original and Transformed (clips: Original, Model, Individual)
  pop with reggae rhythm
  electronic dance with 2-step rhythm

Comparison of Original and Transformed (clips: Original, Model, Mixed**)
  electronic dance with jazz rhythm
  rock with electronic dance rhythm
  hip hop with rumba rhythm
  drum & bass with rock rhythm
  rock with drum & bass rhythm

**MIXED** files consist of (assembled as sketched below):

  1. the model signal on the left channel
  2. the transformed signal on the right channel
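A stereo file with that channel layout could be put together as in the following sketch; model.wav and transformed.wav are placeholder filenames rather than files provided on this page.

```python
# Minimal sketch of assembling a MIXED file: model on the left channel,
# transformed on the right. Filenames are placeholders.
import numpy as np
import soundfile as sf

model, sr = sf.read("model.wav")
transformed, sr_t = sf.read("transformed.wav")
assert sr == sr_t, "both signals must share one sample rate"

# Collapse any multi-channel input to mono and trim to the shorter signal.
to_mono = lambda x: x.mean(axis=1) if x.ndim > 1 else x
model, transformed = to_mono(model), to_mono(transformed)
n = min(len(model), len(transformed))

# Column 0 -> left channel (model), column 1 -> right channel (transformed).
stereo = np.column_stack([model[:n], transformed[:n]])
sf.write("mixed.wav", stereo, sr)
```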