Interactive navigable audio visualization using WebAudio and Canvas.
Create an instance:
var wavesurfer = Object.create(WaveSurfer);
Initialize it with a container element (plus some options):
wavesurfer.init({
container: '#wave',
waveColor: 'violet',
progressColor: 'purple'
});
Subscribe to some events:
wavesurfer.on('ready', function () {
wavesurfer.play();
});
Load an audio file from a URL:
wavesurfer.load('example/media/demo.wav');
Or visualize your audio files via drag'n'drop:
wavesurfer.bindDragNDrop(document.body);
See the example code here.
The init() method accepts the following options:
container – CSS selector or HTML element where the waveform should be drawn. This is the only required parameter.
height – the height of the waveform. 128 by default.
skipLength – number of seconds to skip with the skipForward() and skipBackward() methods (2 by default).
minPxPerSec – minimum number of pixels per second of audio (1 by default).
fillParent – whether to fill the entire container or draw only according to minPxPerSec (true by default).
scrollParent – whether to scroll the container with a lengthy waveform. Otherwise the waveform is shrunk to the container width (see fillParent).
normalize – if true, normalize by the maximum peak instead of 1.0 (false by default).
pixelRatio – 1 by default for performance considerations (see #22), but you can set it to window.devicePixelRatio.
audioContext – use your own previously initialized AudioContext or leave blank.
cursorWidth – 1 px by default.
markerWidth – 1 px by default.
waveColor – the fill color of the waveform.
progressColor – the fill color of the part of the waveform behind the cursor (the played part).
cursorColor – the color of the playback cursor.
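For example, an init() call that overrides several of these defaults might look like the following sketch (the selector and all values below are placeholders, not recommended settings):
wavesurfer.init({
    container: '#wave',                   // the only required option
    height: 128,
    waveColor: 'violet',
    progressColor: 'purple',
    cursorColor: 'navy',
    markerWidth: 2,
    skipLength: 5,                        // skipForward()/skipBackward() will jump 5 seconds
    minPxPerSec: 10,
    scrollParent: true,                   // scroll rather than shrink a long waveform
    normalize: true,
    pixelRatio: window.devicePixelRatio   // sharper rendering on high-DPI screens, at some performance cost
});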
All methods are intentionally public, but the most readily available are the following:
init(params) – initializes with the options listed above.
on(eventName, callback) – subscribes to an event.
load(url) – loads an audio file from a URL via XHR.
play() – starts playback from the current position.
pause() – stops playback.
playPause() – plays if paused, pauses if playing.
stop() – stops playback and goes to the beginning.
skipForward() – skips skipLength seconds forward.
skipBackward() – skips skipLength seconds backward.
skip(offset) – skips a number of seconds from the current position (use a negative value to go backwards).
setVolume(newVolume) – sets the playback volume to a new value (a floating-point value between 0 and 1, 0 being silent and 1 being full volume).
toggleMute() – toggles the volume on and off.
mark(options) – creates a visual marker on the waveform. Options are id (random if not set), position (in seconds), color and width (defaults to the global option markerWidth). Returns a marker object which you can update later (marker.update(options)).
clearMarks() – removes all markers.
bindDragNDrop([dropTarget]) – starts listening to drag'n'drop on an element. The default element is document. Loads the dropped audio.
empty() – clears the waveform as if a zero-length audio file were loaded.
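As a rough sketch of how these methods combine, assuming a wavesurfer instance created and loaded as in the examples above (the positions, volume and color below are placeholders):
wavesurfer.on('ready', function () {
    wavesurfer.setVolume(0.8);      // floating point between 0 and 1
    wavesurfer.playPause();         // starts playback, since nothing is playing yet

    // Drop a marker 10 seconds in, then move it via marker.update()
    var marker = wavesurfer.mark({
        id: 'intro-end',
        position: 10,
        color: 'rgba(0, 255, 0, 0.5)'
    });
    marker.update({ position: 12 });

    wavesurfer.skip(-5);            // jump 5 seconds back from the current position
});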
You can insert your own WebAudio nodes into the graph using the method setFilter
. Example:
// Create a filter node on the same AudioContext that wavesurfer uses
var lowpass = wavesurfer.backend.ac.createBiquadFilter();
wavesurfer.backend.setFilter(lowpass);
You can listen to the following events:
ready – when the audio is loaded, decoded and the waveform drawn.
loading – fires continuously while loading via XHR or drag'n'drop. The callback receives the loading progress in percent (from 0 to 100) and the event target.
seek – on seeking.
play – when playback starts.
mark – when a mark is reached. Passes the mark object.
error – on error. Passes an error message.
Each mark object also fires a reached event when it is played over.
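A sketch of typical event handling, assuming the instance from the examples above (and assuming mark objects expose the same on(eventName, callback) interface for their reached event):
wavesurfer.on('loading', function (percent) {
    console.log('Loading: ' + percent + '%');
});

wavesurfer.on('error', function (message) {
    console.error('wavesurfer error: ' + message);
});

wavesurfer.on('mark', function (marker) {
    console.log('Passed marker ' + marker.id);
});

// Listening on an individual mark, rather than on all marks at once
var marker = wavesurfer.mark({ position: 30 });
marker.on('reached', function () {
    console.log('Playback reached the 30-second mark');
});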
Initial idea by Alex Khokhulin. Many thanks to the awesome contributors!
This work is licensed under a Creative Commons Attribution 3.0 Unported License.