In [ ]:
%pylab inline
import matplotlib.pylab as pylab
pylab.rcParams['figure.figsize'] = 16, 8  # default figure size for this interactive session

TimeSide API

The TimeSide API is built around different core processing units called processors:

  • Decoders (timeside.api.IDecoder) that decode a given audio source and split it into frames for further processing
  • Analyzers (timeside.api.IAnalyzer) that provide signal processing modules to analyze the incoming audio frames
  • Encoders (timeside.api.IEncoder) that encode the incoming frames back into an audio object
  • Graphers (timeside.api.IGrapher) that render graphical representations of the signal or of the corresponding extracted features
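To illustrate what a decoder's frame splitting amounts to, here is a minimal, library-free sketch (this is not TimeSide code; the function name and frame size are made up for the example):

```python
# Illustrative sketch: a decoder conceptually splits the decoded signal
# into fixed-size frames that are fed to the processors downstream.

def split_into_frames(samples, frame_size):
    """Yield successive frames of `frame_size` samples; the last frame
    may be shorter if the signal length is not a multiple of frame_size."""
    for start in range(0, len(samples), frame_size):
        yield samples[start:start + frame_size]

signal = list(range(10))          # stand-in for decoded audio samples
frames = list(split_into_frames(signal, 4))
print(frames)                     # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```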

Decoders

In [ ]:
import timeside.core

from timeside.core import list_processors

list_processors(timeside.core.api.IDecoder)

Analyzers

In [ ]:
list_processors(timeside.core.api.IAnalyzer)

Encoders

In [ ]:
list_processors(timeside.core.api.IEncoder)

Graphers

In [ ]:
list_processors(timeside.core.api.IGrapher)

Processor pipelines

All these processors can be chained to form a process pipeline.
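The `|` chaining seen below can be modeled by overloading Python's `__or__` operator. The following is a hedged sketch of that pattern, not TimeSide's actual implementation (all class and method names here are invented for illustration):

```python
# Sketch of pipe-style chaining via operator overloading (illustrative only).

class Processor:
    """A stand-in processor that only carries a name."""
    def __init__(self, name):
        self.name = name

    def __or__(self, other):
        # Chaining a processor starts a pipeline, then appends the next one.
        return Pipeline([self]) | other

class Pipeline:
    """Collects chained processors in order."""
    def __init__(self, processors):
        self.processors = processors

    def __or__(self, other):
        return Pipeline(self.processors + [other])

    def run(self):
        # A real pipeline would stream frames through each processor;
        # here we just report the chain order.
        return [p.name for p in self.processors]

pipe = Processor('decoder') | Processor('pitch') | Processor('mp3')
print(pipe.run())  # ['decoder', 'pitch', 'mp3']
```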

Let's first define a decoder that reads and decodes audio from a file:

In [ ]:
from timeside.core import get_processor

from timeside.core.tools.test_samples import samples
file_decoder = get_processor('file_decoder')(samples['C4_scale.wav'])

And then instantiate some other processors:

In [ ]:
# Analyzers
pitch = get_processor('aubio_pitch')()
level = get_processor('level')()

# Encoder
mp3 = get_processor('mp3_encoder')('/tmp/guitar.mp3', overwrite=True)

# Graphers
specgram = get_processor('spectrogram_lin')()
waveform = get_processor('waveform_simple')()

Let's now assemble a pipeline from all these processors and run it:

In [ ]:
pipe = (file_decoder | pitch | level | mp3 | specgram | waveform)
pipe.run()

Analyzer results are available through the pipe:

In [ ]:
pipe.results.keys()

or from the analyzer:

In [ ]:
pitch.results.keys()
In [ ]:
pitch.results['aubio_pitch.pitch'].keys()
In [ ]:
pitch.results['aubio_pitch.pitch']
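The results container behaves like a mapping keyed by '<analyzer_id>.<result_name>' strings, as the 'aubio_pitch.pitch' key above shows. A small plain-dict sketch of filtering such a mapping by analyzer (the keys and values below are hypothetical; only 'aubio_pitch.pitch' actually appears in this tutorial):

```python
# Hypothetical result mapping for illustration only.
results = {
    'aubio_pitch.pitch': [261.6, 293.7, 329.6],
    'aubio_pitch.pitch_confidence': [0.9, 0.8, 0.95],
    'level.rms': [-12.5],
}

def results_for(results, analyzer_id):
    """Select the entries produced by one analyzer, using the
    '<analyzer_id>.<result_name>' key convention."""
    prefix = analyzer_id + '.'
    return {k: v for k, v in results.items() if k.startswith(prefix)}

print(sorted(results_for(results, 'aubio_pitch')))
# ['aubio_pitch.pitch', 'aubio_pitch.pitch_confidence']
```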

Grapher results can also be displayed or saved to a file:

In [ ]:
imshow(specgram.render(), origin='lower')
In [ ]:
imshow(waveform.render(), origin='lower')
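A note on `origin='lower'`: it tells imshow to draw row 0 of the rendered array at the bottom of the figure, matching the convention of putting low frequencies at the bottom of a spectrogram. A tiny sketch of that row-order convention, using a nested list as a stand-in for the rendered image:

```python
# With origin='lower', row 0 of the array is drawn at the bottom of the
# figure; with the default origin='upper', row 0 is drawn at the top.
image = [[1, 1],   # row 0: lowest-frequency bin
         [2, 2],
         [3, 3]]   # row 2: highest-frequency bin

# origin='lower' is visually equivalent to flipping the rows and then
# drawing with the default origin='upper':
flipped = image[::-1]
print(flipped[0])  # the highest-frequency row ends up on top
```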
In [ ]:
waveform.render('/tmp/waveform.png')

TimeSide can also be embedded into a web page dynamically. For example, in Telemeta:

In [ ]:
from IPython.display import HTML
HTML('<iframe width=1300 height=260 frameborder=0 scrolling=no marginheight=0 marginwidth=0 src=http://demo.telemeta.org/archives/items/6/player/1200x170></iframe>')