How to process audio from the user's microphone

François Beaufort

You can access the user's camera and microphone on the web platform with the Media Capture and Streams API. The getUserMedia() method prompts the user for permission to access a camera and/or microphone and captures the result as a media stream. That stream can then be processed in a separate Web Audio thread with an AudioWorklet, which provides very low-latency audio processing.

The example below shows how to process audio from the user's microphone in a performant way.

// Keep a reference to the microphone stream so it can be stopped later.
let stream;

startMicrophoneButton.addEventListener("click", async () => {
  // Prompt the user to use their microphone.
  stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
  });
  // Create a Web Audio graph and feed the microphone stream into it.
  const context = new AudioContext();
  const source = context.createMediaStreamSource(stream);

  // Load and execute the module script.
  await context.audioWorklet.addModule("processor.js");
  // Create an AudioWorkletNode. The name of the processor is the
  // one passed to registerProcessor() in the module script.
  const processor = new AudioWorkletNode(context, "processor");

  source.connect(processor).connect(context.destination);
  log("Your microphone audio is being used.");
});

stopMicrophoneButton.addEventListener("click", () => {
  // Stop the stream.
  stream.getTracks().forEach(track => track.stop());

  log("Your microphone audio is not used anymore.");
});
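
The handler above assumes that the getUserMedia() call succeeds. The promise can also reject, for example when the user dismisses the permission prompt or no audio input device is available. A minimal sketch of how the click handler could surface this, reusing the same log() helper as the sample (the error names are the standard DOMException names):

startMicrophoneButton.addEventListener("click", async () => {
  try {
    // Prompt the user to use their microphone.
    stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  } catch (error) {
    // "NotAllowedError": access was denied by the user or a permissions policy.
    // "NotFoundError": no audio input device is available.
    log(`Microphone access failed: ${error.name}`);
    return;
  }
  // ...continue with the AudioContext and AudioWorkletNode setup shown above.
});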

// processor.js
// This file is evaluated in the audio rendering thread
// upon context.audioWorklet.addModule() call.

class Processor extends AudioWorkletProcessor {
  process([input], [output]) {
    // Copy the first channel of the input to the output.
    output[0].set(input[0]);
    return true;
  }
}

registerProcessor("processor", Processor);
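
The processor in the example simply copies its input to its output. As an illustration of where actual audio processing would go, here is a rough sketch of a processor that applies a fixed gain and skips processing while the input has no channels yet; the gain-processor.js file name and the "gain-processor" registration name are made up for this sketch:

// gain-processor.js (hypothetical file, loaded with
// context.audioWorklet.addModule("gain-processor.js"))
class GainProcessor extends AudioWorkletProcessor {
  process([input], [output]) {
    // The first input can momentarily have zero channels, e.g. before
    // the microphone stream is connected.
    if (input.length === 0) {
      return true;
    }
    // Scale every sample of every available channel by a fixed factor.
    const channelCount = Math.min(input.length, output.length);
    for (let channel = 0; channel < channelCount; channel++) {
      for (let i = 0; i < input[channel].length; i++) {
        output[channel][i] = input[channel][i] * 0.5;
      }
    }
    return true;
  }
}

registerProcessor("gain-processor", GainProcessor);

The node would then be created with new AudioWorkletNode(context, "gain-processor") instead of "processor".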

Browser support

MediaDevices.getUserMedia()

  • Chrome: 53
  • Edge: 12
  • Firefox: 36
  • Safari: 11

Web Audio

  • Chrome: 35
  • Edge: 12
  • Firefox: 25
  • Safari: 14.1

AudioWorklet

  • Chrome: 66
  • Edge: 79
  • Firefox: 76
  • Safari: 14.1
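
Since these three APIs shipped at different times, a page can feature-detect them before enabling the microphone button. A minimal sketch, assuming the same startMicrophoneButton element as in the example:

function supportsMicrophoneProcessing() {
  // Media Capture and Streams API.
  if (!navigator.mediaDevices?.getUserMedia) {
    return false;
  }
  // Web Audio API with AudioWorklet support.
  return typeof AudioContext !== "undefined" &&
    typeof AudioWorkletNode !== "undefined";
}

if (!supportsMicrophoneProcessing()) {
  startMicrophoneButton.disabled = true;
}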

Further reading

Demo

Open the demo