I'm experimenting with the W3C Web Audio API as implemented and enabled in Google Chrome Canary builds (I have 15.0.865.0 canary on Windows 7 x64 at the moment). The demo applications work just fine.
I'd like to use MediaElementAudioSourceNode, because I want to play a larger OGG music file. For such audio files the API says this should be used instead of AudioBufferSourceNode.
I've written a very simple example:
<script type="text/javascript">
// Create an <audio> element pointing at the OGG file.
var audioElement = document.createElement('audio');
audioElement.setAttribute('src', 'dubfx.ogg');

// Route the element through the Web Audio graph to the speakers.
var audioContext = new webkitAudioContext();
var streamingAudioSource = audioContext.createMediaElementSource(audioElement);
streamingAudioSource.connect(audioContext.destination);
</script>
However, I just hear clicks instead of actual audio. I see no errors in the JavaScript console, so I assume the code is okay and that perhaps some initialization is missing. If I call audioElement.play() instead of routing through the Web Audio API, the music plays just fine. Did I miss something, or is the current WebKit implementation broken?
I just talked to Chris Rogers (the spec editor) about this today. The MediaElementAudioSourceNode is exposed in Canary/Chromium builds, but the internals are not hooked up yet. It'll be a while longer before you can use the <audio> tag with the Web Audio API.
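In the meantime, a possible workaround is to fetch the OGG with XMLHttpRequest and play it through an AudioBufferSourceNode instead, which means decoding the whole file into memory (exactly what MediaElementAudioSourceNode is meant to avoid for long tracks). This is only a sketch against the webkit-prefixed API of that era (webkitAudioContext, the synchronous createBuffer(arrayBuffer, mixToMono) overload, noteOn); the exact method names may differ in your build:

<script type="text/javascript">
// Hypothetical fallback: decode the whole OGG into an AudioBufferSourceNode.
var context = new webkitAudioContext();

var request = new XMLHttpRequest();
request.open('GET', 'dubfx.ogg', true);
request.responseType = 'arraybuffer';

request.onload = function () {
  // Synchronous decode of the downloaded ArrayBuffer into an AudioBuffer.
  var buffer = context.createBuffer(request.response, false);

  var source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.noteOn(0); // later builds rename this to start(0)
};

request.send();
</script>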