The stream property of the MediaStreamAudioDestinationNode interface represents a MediaStream containing a single audio MediaStreamTrack with the same number of channels as the node itself. You can use this property to get a stream out of the audio graph and feed it into another construct, such as a MediaRecorder.
Syntax
var audioCtx = new AudioContext();
var destination = audioCtx.createMediaStreamDestination();
var myStream = destination.stream;
Value
A MediaStream.
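The stream can be handed to any consumer of a MediaStream, not only a MediaRecorder. A minimal sketch of feeding it to a media element, assuming an <audio> element with the id "monitor" exists in the document:

var audioCtx = new AudioContext();
var source = audioCtx.createOscillator();
var destination = audioCtx.createMediaStreamDestination();

// Route the oscillator into the destination node rather than the speakers
source.connect(destination);
source.start(0);

// Hand the resulting MediaStream to the assumed <audio id="monitor" controls> element
var monitor = document.getElementById("monitor");
monitor.srcObject = destination.stream;
monitor.play();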
Example
In the following simple example, we create a MediaStreamAudioDestinationNode, an OscillatorNode and a MediaRecorder (the example will therefore only work in Firefox at this time). The MediaRecorder is set up to record information from the MediaStreamAudioDestinationNode.

When the button is clicked, the oscillator starts and the MediaRecorder is started. When the button is clicked again, the oscillator and the MediaRecorder both stop. Stopping the MediaRecorder causes the dataavailable event to fire, and the event data is pushed into the chunks array. After that, the stop event fires, a new Blob of Opus-encoded audio (audio/ogg; codecs=opus) is made from the data in the chunks array, and a new window (tab) is then opened that points to a URL created from the blob.

From here, you can play and save the Opus file.
<!DOCTYPE html>
<html>
  <head>
    <title>createMediaStreamDestination() demo</title>
  </head>
  <body>
    <h1>createMediaStreamDestination() demo</h1>
    <p>Encoding a pure sine wave to an Opus file</p>
    <button>Make sine wave</button>
    <script>
      var b = document.querySelector("button");
      var clicked = false;
      var chunks = [];
      var ac = new AudioContext();
      var osc = ac.createOscillator();
      var dest = ac.createMediaStreamDestination();
      var mediaRecorder = new MediaRecorder(dest.stream);
      osc.connect(dest);

      b.addEventListener("click", function(e) {
        if (!clicked) {
          // First click: start recording and start the oscillator
          mediaRecorder.start();
          osc.start(0);
          e.target.innerHTML = "Stop recording";
          clicked = true;
        } else {
          // Second click: stop both; stopping the recorder delivers the
          // remaining data via dataavailable, then fires the stop event
          mediaRecorder.stop();
          osc.stop(0);
          e.target.disabled = true;
        }
      });

      mediaRecorder.ondataavailable = function(evt) {
        // Push each chunk (blobs) into an array
        chunks.push(evt.data);
      };

      mediaRecorder.onstop = function(evt) {
        // Make a blob out of our blobs, and open it
        var blob = new Blob(chunks, { 'type' : 'audio/ogg; codecs=opus' });
        window.location.href = URL.createObjectURL(blob);
      };
    </script>
  </body>
</html>
Note: You can view this example live, or study the source code, on GitHub.
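To save the recording without leaving the page, one possible variation of the onstop handler appends a download link instead of navigating to the blob URL (the file name sine.opus is an assumption):

mediaRecorder.onstop = function(evt) {
  var blob = new Blob(chunks, { 'type' : 'audio/ogg; codecs=opus' });
  var link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "sine.opus"; // assumed file name
  link.textContent = "Download recording";
  document.body.appendChild(link);
};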
Specifications

Specification | Status | Comment
---|---|---
Web Audio API, The definition of 'stream' in that specification. | Working Draft |