From 074785cea106179cb3305637055ab0a009ca74f2 Mon Sep 17 00:00:00 2001
From: Peter Bengtsson
Date: Tue, 8 Dec 2020 14:42:52 -0500
Subject: initial commit

---
 .../guide/audio_and_video_manipulation/index.html | 404 +++++++++++++++++++++
 1 file changed, 404 insertions(+)
 create mode 100644 files/ru/web/guide/audio_and_video_manipulation/index.html

(limited to 'files/ru/web/guide/audio_and_video_manipulation/index.html')

diff --git a/files/ru/web/guide/audio_and_video_manipulation/index.html b/files/ru/web/guide/audio_and_video_manipulation/index.html
new file mode 100644
index 0000000000..0fc7587321
--- /dev/null
+++ b/files/ru/web/guide/audio_and_video_manipulation/index.html
@@ -0,0 +1,404 @@
+---
+title: Обработка аудио и видео
+slug: Web/Guide/Audio_and_video_manipulation
+tags:
+  - Видео
+  - Медиа
+  - Обучение
+  - Примеры
+  - Рекомендации
+  - аудио
+translation_of: Web/Guide/Audio_and_video_manipulation
+---
+

The beauty of web technologies is that you can combine different tools. For example, the audio and video streams available in the browser can be manipulated with {{htmlelement("canvas")}}, WebGL or the Web Audio API: you can modify audio and video directly, e.g. add effects to audio (reverb, compression) or to video (black-and-white or sepia filters, etc.). This article explains how to do that.

+
+ +
+


+
+ +

Video manipulation

+ +

It is often useful to be able to grab the pixel data of each individual frame of a video.

+ +

Video and canvas

+ +

The {{htmlelement("canvas")}} element provides a surface for drawing graphics onto web pages. It is very powerful and can be coupled tightly with video.

+ +

The general technique is to:

+ +
  1. Write a frame from the {{htmlelement("video")}} element to an intermediary {{htmlelement("canvas")}} element.
  2. Read the data from the intermediary <canvas> element and manipulate it.
  3. Write the manipulated data to your "display" <canvas>.
  4. Pause and repeat.
+ +

For example, let's process a video to display it in greyscale. In this case, we'll show both the source video and the output greyscale frames. Ordinarily, if you were implementing a "play video in greyscale" feature, you'd probably add display: none to the style of the <video> element, to keep the source video from being drawn to the screen while showing only the canvas with the altered frames.

+ +

HTML

+ +

We can set up our video player and <canvas> element like this:

+ +
<video id="my-video" controls="true" width="480" height="270" crossorigin="anonymous">
+  <source src="http://jplayer.org/video/webm/Big_Buck_Bunny_Trailer.webm" type="video/webm">
+  <source src="http://jplayer.org/video/m4v/Big_Buck_Bunny_Trailer.m4v" type="video/mp4">
+</video>
+
+<canvas id="my-canvas" width="480" height="270"></canvas>
+ +

JavaScript

+ +

This code handles altering the frames.

+ +
var processor = {
+  timerCallback: function() {
+    if (this.video.paused || this.video.ended) {
+      return;
+    }
+    this.computeFrame();
+    var self = this;
+    setTimeout(function () {
+      self.timerCallback();
+    }, 16); // roughly 60 frames per second
+  },
+
+  doLoad: function() {
+    this.video = document.getElementById("my-video");
+    this.c1 = document.getElementById("my-canvas");
+    this.ctx1 = this.c1.getContext("2d");
+    var self = this;
+
+    this.video.addEventListener("play", function() {
+      self.width = self.video.width;
+      self.height = self.video.height;
+      self.timerCallback();
+    }, false);
+  },
+
+  computeFrame: function() {
+    this.ctx1.drawImage(this.video, 0, 0, this.width, this.height);
+    var frame = this.ctx1.getImageData(0, 0, this.width, this.height);
+    var l = frame.data.length / 4;
+
+    for (var i = 0; i < l; i++) {
+      var grey = (frame.data[i * 4 + 0] + frame.data[i * 4 + 1] + frame.data[i * 4 + 2]) / 3;
+
+      frame.data[i * 4 + 0] = grey;
+      frame.data[i * 4 + 1] = grey;
+      frame.data[i * 4 + 2] = grey;
+    }
+    this.ctx1.putImageData(frame, 0, 0);
+
+    return;
+  }
+};  
+ +

Once the page has loaded, you can call:

+ +
processor.doLoad()
+ +

Result

+ +

{{EmbedLiveSample("Video_and_canvas", '100%', 580)}}

+ +

This is a pretty simple example showing how to manipulate video frames using a canvas. For efficiency, you should consider using {{domxref("Window.requestAnimationFrame", "requestAnimationFrame()")}} instead of setTimeout() when running on browsers that support it.
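For instance, the timerCallback() shown above could be rewritten to use requestAnimationFrame(); this is only a sketch, keeping the rest of the processor object unchanged:

timerCallback: function() {
  if (this.video.paused || this.video.ended) {
    return;
  }
  this.computeFrame();
  var self = this;
  // Let the browser schedule the next frame instead of a fixed 16 ms timer.
  requestAnimationFrame(function() {
    self.timerCallback();
  });
},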

+ +
+

Note: Due to potential security issues if your video is on a different domain than your code, you'll need to enable CORS (Cross Origin Resource Sharing) on your video server.

+
+ +

Video and WebGL

+ +

WebGL is a powerful API that uses canvas to draw hardware-accelerated 3D or 2D scenes. You can combine WebGL and the {{htmlelement("video")}} element to create video textures, which means you can put video inside 3D scenes.
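The heart of the technique is re-uploading the current video frame into a WebGL texture on every render pass. A minimal sketch, assuming gl is an already initialized WebGL context and video is a playing {{htmlelement("video")}} element (both names are placeholders here):

var texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
// Video frames are generally not power-of-two sized, so turn off mipmapping
// and clamp the texture to its edges.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

function updateTexture() {
  // Copy the current video frame into the texture, then draw the scene.
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
  // ...render the textured geometry here...
  requestAnimationFrame(updateTexture);
}
requestAnimationFrame(updateTexture);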

+ +

{{EmbedGHLiveSample('webgl-examples/tutorial/sample8/index.html', 670, 510) }}

+ +
+

Note: You can find the source code of this demo on GitHub (see it live also).

+
+ +

Playback rate

+ +

We can also adjust the rate at which audio and video play using a property of the {{htmlelement("audio")}} and {{htmlelement("video")}} elements called {{domxref("HTMLMediaElement.playbackRate", "playbackRate")}}. playbackRate is a number representing the multiple applied to the playback rate; for example, 0.5 represents half speed while 2 represents double speed.

+ +

Note that the playbackRate property works with both <audio> and <video>, but in both cases, it changes the playback speed but not the pitch. To manipulate the audio's pitch you need to use the Web Audio API. See the {{domxref("AudioBufferSourceNode.playbackRate")}} property.

+ +

HTML

+ +
<video id="my-video" controls
+       src="http://jplayer.org/video/m4v/Big_Buck_Bunny_Trailer.m4v">
+</video>
+ +

JavaScript

+ +
var myVideo = document.getElementById('my-video');
+myVideo.playbackRate = 2;
+ + + +

{{ EmbedLiveSample('Playable_code', 700, 425) }}

+ +
+

Note: Try playing the example to hear the effect.
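If the audio is already part of a Web Audio graph, the equivalent setting lives on {{domxref("AudioBufferSourceNode.playbackRate")}}, as mentioned above. A sketch (audioBuffer stands for an AudioBuffer you have decoded earlier):

var context = new AudioContext();
var source = context.createBufferSource();
source.buffer = audioBuffer;       // a previously decoded AudioBuffer
source.playbackRate.value = 0.5;   // half speed; note this also lowers the pitch
source.connect(context.destination);
source.start(0);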

+
+ +

Audio manipulation

+ +

playbackRate aside, to manipulate audio you'll typically use the Web Audio API.

+ +

Choosing an audio source

+ +

The Web Audio API can receive audio from a variety of sources, then process it and send it back out to an {{domxref("AudioDestinationNode")}} representing the output device to which the sound is sent after processing.

If the audio source is...                                                                 | Use this Web Audio node type
------------------------------------------------------------------------------------------|------------------------------------------
An audio track from an HTML {{HTMLElement("audio")}} or {{HTMLElement("video")}} element   | {{domxref("MediaElementAudioSourceNode")}}
A plain raw audio data buffer in memory                                                     | {{domxref("AudioBufferSourceNode")}}
An oscillator generating a sine wave or other computed waveform                             | {{domxref("OscillatorNode")}}
An audio track from WebRTC (such as the microphone input you can get using {{domxref("MediaDevices.getUserMedia", "getUserMedia()")}}) | {{domxref("MediaStreamAudioSourceNode")}}
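As a quick illustration of the last two rows, here is a sketch that plays an oscillator and, separately, routes microphone input obtained with getUserMedia() through the same AudioContext:

var context = new AudioContext();

// An oscillator generating a sine wave.
var oscillator = context.createOscillator();
oscillator.type = "sine";
oscillator.frequency.value = 440;
oscillator.connect(context.destination);
oscillator.start(0);

// Microphone input from WebRTC via getUserMedia().
navigator.mediaDevices.getUserMedia({ audio: true }).then(function(stream) {
  var micSource = context.createMediaStreamSource(stream);
  micSource.connect(context.destination); // beware of feedback through the speakers
});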
+ +

Audio filters

+ +

The Web Audio API has a lot of different filters/effects that can be applied to audio, for example using the {{domxref("BiquadFilterNode")}}.

+ +

HTML

+ +
<video id="my-video" controls
+       src="myvideo.mp4" type="video/mp4">
+</video>
+ +

JavaScript

+ +
var context = new AudioContext(),
+    audioSource = context.createMediaElementSource(document.getElementById("my-video")),
+    filter = context.createBiquadFilter();
+audioSource.connect(filter);
+filter.connect(context.destination);
+
+// Configure filter
+filter.type = "lowshelf";
+filter.frequency.value = 1000;
+filter.gain.value = 25;
+ + + +

{{ EmbedLiveSample('Playable_code_2', 700, 425) }}

+ +
+

Note: Unless you have CORS enabled, to avoid security issues your video should be on the same domain as your code.

+
+ +

Common audio filters

+ +

These are some of the common types of audio filter you can apply:

  - Low pass: allows frequencies below a cutoff frequency through and attenuates frequencies above it.
  - High pass: allows frequencies above a cutoff frequency through and attenuates frequencies below it.
  - Band pass: allows a range of frequencies through and attenuates the frequencies outside it.
  - Low shelf: boosts or attenuates frequencies below a given frequency.
  - High shelf: boosts or attenuates frequencies above a given frequency.
  - Peaking: boosts or attenuates a range of frequencies around a centre frequency.
  - Notch: removes a narrow range of frequencies around a centre frequency.
  - All pass: lets all frequencies through but changes the phase relationship between them.
+

Note: See {{domxref("BiquadFilterNode")}} for more information.

+
+ +

Convolutions and impulses

+ +

It's also possible to apply impulse responses to audio using the {{domxref("ConvolverNode")}}. An impulse response is the sound created after a brief impulse of sound (like a hand clap). An impulse response will signify the environment in which the impulse was created (for example, an echo created by clapping your hands in a tunnel).

+ +

Example

+ +
var convolver = context.createConvolver();
+convolver.buffer = this.impulseResponseBuffer;
+// Connect the graph.
+source.connect(convolver);
+convolver.connect(context.destination);
+
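The snippet above assumes that this.impulseResponseBuffer already holds a decoded impulse response. A sketch of loading one (the file name here is only a placeholder) could look like:

fetch("impulse-response.wav")  // placeholder URL
  .then(function(response) { return response.arrayBuffer(); })
  .then(function(data) { return context.decodeAudioData(data); })
  .then(function(buffer) {
    convolver.buffer = buffer;
  });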
+ +

See this Codepen for an applied (but very, very silly; like, little kids will giggle kind of silly) example.

+ +

Spatial audio

+ +

We can also position audio using a panner node. A panner node—{{domxref("PannerNode")}}—lets us define a source cone as well as positional and directional elements, all in 3D space as defined using 3D cartesian coordinates.

+ +

Example

+ +
var panner = context.createPanner();
+panner.coneOuterGain = 0.2;
+panner.coneOuterAngle = 120;
+panner.coneInnerAngle = 0;
+
+panner.connect(context.destination);
+source.connect(panner);
+source.start(0);
+
+// Position the listener at the origin.
+context.listener.setPosition(0, 0, 0);
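The panner itself can also be positioned and oriented relative to the listener; for example, continuing the sketch above:

// Place the sound slightly to the right of and in front of the listener,
// and point its cone back towards the origin.
panner.setPosition(1, 0, -1);
panner.setOrientation(-1, 0, 1);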
+ +
+

Note: You can find an example on our GitHub repository (see it live also).

+
+ +

JavaScript codecs

+ +

It's also possible to manipulate audio at a low level using JavaScript. This can be useful should you want to create audio codecs.
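Short of writing a full codec, the same kind of low-level access is available through typed arrays. As a sketch, this decodes a file (the URL is a placeholder) and inverts the phase of every sample in the first channel:

var context = new AudioContext();

fetch("some-audio-file.mp3")  // placeholder URL
  .then(function(response) { return response.arrayBuffer(); })
  .then(function(data) { return context.decodeAudioData(data); })
  .then(function(buffer) {
    // Raw PCM samples for channel 0, as a Float32Array.
    var samples = buffer.getChannelData(0);
    for (var i = 0; i < samples.length; i++) {
      samples[i] = -samples[i]; // invert the phase of each sample
    }
  });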

+ +

Libraries currently exist for the following formats:

  - AAC: aac.js
  - ALAC: alac.js
  - FLAC: flac.js
  - MP3: mp3.js
+

Note: At Audiocogs you can try out a few demos; Audiocogs also provides a framework, Aurora.js, which is intended to help you author your own codecs in JavaScript.

+
+ +

Examples

+ + + +

See also

+ +

Tutorials

+ + + +

Reference

+ + + +
{{QuickLinksWithSubpages("/en-US/docs/Web/Apps/Fundamentals/")}}
+ +
-- cgit v1.2.3-54-g00ecf