mozSetup( channels , sampleRate )
Defines the channels and sample rate for the generated audio stream
mozWriteAudio( buffer )
Writes the samples, from an array, for the generated audio
mozCurrentSampleOffset()
Gets the current playback position of the audio, denoted in samples
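Taken together, the three methods above let a script generate audio sample by sample. A minimal sketch of tone generation follows; `makeSineSamples` is a helper of our own, not part of the API, and the `moz*` calls (shown commented) work only in Firefox 4+:

```javascript
// Hypothetical helper: fill a buffer with one sine wave's worth of samples.
function makeSineSamples(frequency, sampleRate, length) {
  var samples = new Float32Array(length);
  for (var i = 0; i < length; i++) {
    // Each sample is the sine of the phase at that sample's point in time
    samples[i] = Math.sin(2 * Math.PI * frequency * i / sampleRate);
  }
  return samples;
}

// In Firefox 4+, those samples would be fed to an audio element:
// var audio = new Audio();
// audio.mozSetup(1, 44100);                                // one channel, 44.1 kHz
// audio.mozWriteAudio(makeSineSamples(440, 44100, 44100)); // one second of A440
// audio.mozCurrentSampleOffset();                          // samples played so far
```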
Discussion
This particular implementation of audio has somewhat limited support. In fact, only
Firefox 4+ and Chrome Beta currently support it. As such, it is more an experimental
approach than something geared for mainstream use.
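Given that limited support, any production use should feature-detect the API before calling it. One way to sketch that check (the helper name `supportsAudioData` is our own):

```javascript
// Hypothetical helper: returns true only if the given audio element
// exposes the Mozilla Audio Data API (i.e., we are in Firefox 4+).
function supportsAudioData(audioEl) {
  return !!(audioEl && typeof audioEl.mozSetup === "function");
}

// In the browser:
// if (supportsAudioData(document.createElement("audio"))) {
//   // safe to call mozSetup/mozWriteAudio here
// }
```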
If you happen to be the experimental type, though, check out this short video presentation of what is possible with the Mozilla Audio Data API: http://www.youtube.com/watch?v=1Uw0CrQdYYg.
See Also
The transcript for the “jasmid—MIDI synthesis with JavaScript and HTML5 audio” talk from Barcamp London 8 provides a very high-level discussion of the challenges and practical implications of generating audio on the fly, in the browser: http://matt.west.co.tt/music/jasmid-midi-synthesis-with-javascript-and-html5-audio/.
4.4 Visualizing <audio> Using <canvas>
Problem
You want to create a visualization of your HTML5 audio using canvas .
Solution
This example delivers a rudimentary canvas implementation that visualizes audio with
waveforms (see Figure 4-3 ):
<audio src="audio.ogg"></audio>
<canvas width="512" height="100"></canvas>
<button title="Generate Waveform" onclick="genWave();">Generate Waveform</button>
<script>
  function genWave() {
    var audio = document.getElementsByTagName("audio")[0];
    var canvas = document.getElementsByTagName("canvas")[0];
    var context = canvas.getContext("2d");
    audio.addEventListener("MozAudioAvailable", buildWave, false);
    function buildWave(event) {
      var channels = audio.mozChannels;
      var frameBufferLength = audio.mozFrameBufferLength;