---
title: AudioContext.createMediaStreamDestination()
slug: Web/API/AudioContext/createMediaStreamDestination
translation_of: Web/API/AudioContext/createMediaStreamDestination
---
{{ APIRef("Web Audio API") }}
The `createMediaStreamDestination()` method of the {{ domxref("AudioContext") }} interface creates a new {{domxref("MediaStreamAudioDestinationNode")}} associated with a WebRTC {{domxref("MediaStream")}} representing an audio stream, which may be stored in a local file or sent to another computer.

The {{domxref("MediaStream")}} is created when the node is created, and is accessible via the {{domxref("MediaStreamAudioDestinationNode")}}'s `stream` attribute. This stream can be used in a similar way as a `MediaStream` obtained via {{domxref("navigator.getUserMedia")}}: for example, it can be sent to a remote peer using the `RTCPeerConnection` `addStream()` method.
For more details about media stream destination nodes, check out the {{domxref("MediaStreamAudioDestinationNode")}} reference page.
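As a rough illustration of the `RTCPeerConnection` use case mentioned above, the sketch below routes an oscillator into a destination node and attaches the resulting stream's tracks to a peer connection. This is browser-only code, not a complete application: the signaling step is omitted, and `addTrack()` is used in place of the deprecated `addStream()`.

```js
// Sketch: send audio generated by the Web Audio API to a remote peer.
// Browser-only; assumes a signaling channel exists elsewhere in your app.
var audioCtx = new AudioContext();
var dest = audioCtx.createMediaStreamDestination();

var osc = audioCtx.createOscillator();
osc.connect(dest);
osc.start();

var pc = new RTCPeerConnection();

// Attach each audio track of the destination node's stream to the
// peer connection. addTrack() is the modern replacement for addStream().
dest.stream.getAudioTracks().forEach(function (track) {
  pc.addTrack(track, dest.stream);
});

// From here, continue with the usual offer/answer exchange:
// pc.createOffer().then(...) and your signaling mechanism.
```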
## Syntax

```js
var audioCtx = new AudioContext();
var destination = audioCtx.createMediaStreamDestination();
```

### Returns

A {{domxref("MediaStreamAudioDestinationNode")}}.
## Example

In the following simple example, we create a {{domxref("MediaStreamAudioDestinationNode")}}, an {{ domxref("OscillatorNode") }} and a {{ domxref("MediaRecorder") }} (so the example will only work in browsers that support `MediaRecorder`; at the time of writing, that means Firefox). The `MediaRecorder` is set up to record information from the `MediaStreamAudioDestinationNode`.

When the button is clicked, the oscillator and the `MediaRecorder` are started. When the button is clicked a second time, both stop. Stopping the `MediaRecorder` causes the `dataavailable` event to fire, and the event data is pushed into the `chunks` array. After that, the `stop` event fires: a new blob of type opus is made from the data in the `chunks` array, and the audio element's `src` is set to a URL created from that blob. From here, you can play and save the opus file.
```html
<!DOCTYPE html>
<html>
  <head>
    <title>createMediaStreamDestination() demo</title>
  </head>
  <body>
    <h1>createMediaStreamDestination() demo</h1>
    <p>Encoding a pure sine wave to an Opus file</p>
    <button>Make sine wave</button>
    <audio controls></audio>
    <script>
      var b = document.querySelector("button");
      var clicked = false;
      var chunks = [];
      var ac = new AudioContext();
      var osc = ac.createOscillator();
      var dest = ac.createMediaStreamDestination();
      var mediaRecorder = new MediaRecorder(dest.stream);
      osc.connect(dest);

      b.addEventListener("click", function (e) {
        if (!clicked) {
          mediaRecorder.start();
          osc.start(0);
          e.target.innerHTML = "Stop recording";
          clicked = true;
        } else {
          mediaRecorder.stop();
          osc.stop(0);
          e.target.disabled = true;
        }
      });

      mediaRecorder.ondataavailable = function (evt) {
        // Push each chunk (a blob) into the chunks array
        chunks.push(evt.data);
      };

      mediaRecorder.onstop = function (evt) {
        // Make a blob out of our chunks, and use it as the audio source
        var blob = new Blob(chunks, { 'type': 'audio/ogg; codecs=opus' });
        document.querySelector("audio").src = URL.createObjectURL(blob);
      };
    </script>
  </body>
</html>
```
**Note**: You can view this example live, or study the source code, on GitHub.
## Specifications

| Specification | Status | Comment |
| ------------- | ------ | ------- |
| {{SpecName('Web Audio API', '#widl-AudioContext-createMediaStreamDestination-MediaStreamAudioDestinationNode', 'createMediaStreamDestination()')}} | {{Spec2('Web Audio API')}} | |