---
title: NotifyAudioAvailableEvent
slug: Web/API/NotifyAudioAvailableEvent
translation_of: Web/API/NotifyAudioAvailableEvent
---
<p>{{APIRef("Web Audio API")}}{{Non-standard_header}}{{Deprecated_header}}This interface defines the event for audio elements that is triggered when the audio buffer is full.</p>
<h2 id="Attributes">Attributes</h2>
<div id="section_2">
<table class="standard-table">
<tbody>
<tr>
<th scope="col">Attribute</th>
<th scope="col">Type</th>
<th scope="col">Description</th>
</tr>
<tr>
<td><code>frameBuffer</code></td>
<td><a href="/en-US/docs/JavaScript/Typed_arrays/Float32Array"><code>Float32Array</code></a></td>
<td>The <strong>frameBuffer</strong> attribute contains a typed array (<code>Float32Array</code>) with the raw audio data (32-bit float values) obtained from decoding the audio (e.g., the raw data being sent to the audio hardware vs. encoded audio). This is of the form [channel1, channel2, ..., channelN, channel1, channel2, ..., channelN, ...]. All audio frames are normalized to a length of channels * 1024 by default, but could be any length between 512 and 16384 if the user has set a different length using the <strong>mozFrameBufferLength</strong> attribute. <strong>Read only.</strong></td>
</tr>
<tr>
<td><code>time</code></td>
<td><code>float</code></td>
<td>The <strong>time</strong> attribute contains a float representing the time in seconds of the first sample in the <strong>frameBuffer</strong> array since the start of the audio track.</td>
</tr>
</tbody>
</table>
</div>
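<h2 id="Example">Example</h2>
<p>The following is a minimal sketch of how the attributes above might be read. It assumes the event reaches an <code>&lt;audio&gt;</code> element as the non-standard <code>MozAudioAvailable</code> event of the deprecated Firefox Audio Data API; the element lookup and the amplitude calculation are illustrative only.</p>
<pre class="brush: js">
// Hypothetical example: listen for the non-standard MozAudioAvailable
// event on an &lt;audio&gt; element and read the event's attributes.
const audio = document.querySelector("audio");

audio.addEventListener("MozAudioAvailable", (event) =&gt; {
  const samples = event.frameBuffer; // Float32Array of interleaved raw samples
  const startTime = event.time;      // time in seconds of the first sample

  // Illustrative use: compute the mean amplitude of this frame buffer.
  let sum = 0;
  for (let i = 0; i &lt; samples.length; i++) {
    sum += Math.abs(samples[i]);
  }
  console.log("Frame at " + startTime + "s, mean amplitude " + (sum / samples.length));
});
</pre>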