Diffstat (limited to 'files/de/web/api')
-rw-r--r--  files/de/web/api/animationevent/index.html                        190
-rw-r--r--  files/de/web/api/audiocontext/index.html                          232
-rw-r--r--  files/de/web/api/canvasrenderingcontext2d/setlinedash/index.html  179
-rw-r--r--  files/de/web/api/eventtarget/index.html                           174
-rw-r--r--  files/de/web/api/file/getastext/index.html                         78
-rw-r--r--  files/de/web/api/rtcpeerconnection/index.html                     379
-rw-r--r--  files/de/web/api/webxr_device_api/index.html                      298
7 files changed, 0 insertions, 1530 deletions
diff --git a/files/de/web/api/animationevent/index.html b/files/de/web/api/animationevent/index.html
deleted file mode 100644
index 7bd808e0ca..0000000000
--- a/files/de/web/api/animationevent/index.html
+++ /dev/null
@@ -1,190 +0,0 @@
----
-title: AnimationEvent
-slug: Web/API/AnimationEvent
-tags:
- - API
- - Experimental
- - Expérimental(2)
- - Interface
- - NeedsTranslation
- - Reference
- - Référence(2)
- - TopicStub
- - Web Animations
-translation_of: Web/API/AnimationEvent
----
-<p>{{SeeCompatTable}}{{APIRef("Web Animations API")}}</p>
-
-<p>The <strong><code>AnimationEvent</code></strong> interface represents events providing information related to <a href="/en-US/docs/Web/Guide/CSS/Using_CSS_animations">animations</a>.</p>
-
-<p>{{InheritanceDiagram}}</p>
-
-<h2 id="Properties">Properties</h2>
-
-<p><em>Also inherits properties from its parent {{domxref("Event")}}</em>.</p>
-
-<dl>
- <dt>{{domxref("AnimationEvent.animationName")}} {{readonlyInline}}</dt>
- <dd>Is a {{domxref("DOMString")}} containing the value of the {{cssxref("animation-name")}} CSS property associated with the transition.</dd>
- <dt>{{domxref("AnimationEvent.elapsedTime")}} {{readonlyInline}}</dt>
- <dd>Is a <code>float</code> giving the amount of time the animation has been running, in seconds, when this event fired, excluding any time the animation was paused. For an <code>"animationstart"</code> event, <code>elapsedTime</code> is <code>0.0</code> unless there was a negative value for {{cssxref("animation-delay")}}, in which case the event will be fired with <code>elapsedTime</code> containing  <code>(-1 * </code><em>delay</em><code>)</code>.</dd>
- <dt>{{domxref("AnimationEvent.pseudoElement")}} {{readonlyInline}}</dt>
- <dd>Is a {{domxref("DOMString")}}, starting with <code>'::'</code>, containing the name of the <a href="/en-US/docs/Web/CSS/Pseudo-elements" title="/en-US/docs/Web/CSS/Pseudo-elements">pseudo-element</a> the animation runs on. If the animation doesn't run on a pseudo-element but on the element, an empty string: <code>''</code><code>.</code></dd>
-</dl>
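-
-<p>For example, a listener might read these properties when an animation starts; the element id <code>"box"</code> and the CSS animation applied to it are assumptions for illustration:</p>
-
-<pre class="brush: js">var box = document.getElementById("box"); // assumed element with a CSS animation
-
-box.addEventListener("animationstart", function(event) {
-  // event is an AnimationEvent
-  console.log("Animation '" + event.animationName + "' started after " +
-              event.elapsedTime + "s");
-});</pre>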
-
-<h2 id="Constructor">Constructor</h2>
-
-<dl>
- <dt>{{domxref("AnimationEvent.AnimationEvent", "AnimationEvent()")}}</dt>
- <dd>Creates an <code>AnimationEvent</code> event with the given parameters.</dd>
-</dl>
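-
-<p>A synthetic <code>AnimationEvent</code> can also be constructed and dispatched manually; the values below are illustrative assumptions:</p>
-
-<pre class="brush: js">var event = new AnimationEvent("animationstart", {
-  animationName: "slide-in", // assumed @keyframes name
-  elapsedTime: 0.0
-});
-
-document.getElementById("box").dispatchEvent(event); // assumed element</pre>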
-
-<h2 id="Methods">Methods</h2>
-
-<p><em>Also inherits methods from its parent {{domxref("Event")}}</em>.</p>
-
-<dl>
- <dt>{{domxref("AnimationEvent.initAnimationEvent()")}} {{non-standard_inline}}{{deprecated_inline}}</dt>
- <dd>Initializes an <code>AnimationEvent</code> created using the deprecated {{domxref("Document.createEvent()", "Document.createEvent(\"AnimationEvent\")")}} method.</dd>
-</dl>
-
-<h2 id="Specifications">Specifications</h2>
-
-<table class="standard-table">
- <thead>
- <tr>
- <th scope="col">Specification</th>
- <th scope="col">Status</th>
- <th scope="col">Comment</th>
- </tr>
- </thead>
- <tbody>
- <tr>
- <td>{{ SpecName('CSS3 Animations', '#AnimationEvent-interface', 'AnimationEvent') }}</td>
- <td>{{ Spec2('CSS3 Animations') }}</td>
- <td>Initial definition.</td>
- </tr>
- </tbody>
-</table>
-
-<h2 id="Browser_compatibility">Browser compatibility</h2>
-
-<p>{{CompatibilityTable}}</p>
-
-<div id="compat-desktop">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Chrome</th>
- <th>Firefox (Gecko)</th>
- <th>Internet Explorer</th>
- <th>Opera</th>
- <th>Safari</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>
- <p>1.0 {{ property_prefix("webkit") }}</p>
-
- <p>{{CompatChrome(43.0)}}</p>
- </td>
- <td>{{ CompatGeckoDesktop("6.0") }}</td>
- <td>10.0</td>
- <td>12 {{ property_prefix("o") }}<br>
- 12.10<br>
- 15.0 {{ property_prefix("webkit") }}</td>
- <td>4.0 {{ property_prefix("webkit") }}</td>
- </tr>
- <tr>
- <td><code>AnimationEvent()</code> constructor</td>
- <td>
- <p>{{CompatChrome(43.0)}}</p>
- </td>
- <td>{{ CompatGeckoDesktop("23.0") }}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- </tr>
- <tr>
- <td><code>initAnimationEvent()</code> {{non-standard_inline}}{{deprecated_inline}}</td>
- <td>1.0</td>
- <td>{{ CompatGeckoDesktop("6.0") }}<br>
- Removed in {{ CompatGeckoDesktop("23.0") }}</td>
- <td>10.0</td>
- <td>12</td>
- <td>4.0</td>
- </tr>
- <tr>
- <td><code>pseudoelement</code></td>
- <td>{{CompatNo}}</td>
- <td>{{ CompatGeckoDesktop("23.0") }}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<div id="compat-mobile">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Android</th>
- <th>Firefox Mobile (Gecko)</th>
- <th>IE Mobile</th>
- <th>Opera Mobile</th>
- <th>Safari Mobile</th>
- <th>Chrome for Android</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatVersionUnknown}}{{ property_prefix("webkit") }}</td>
- <td>{{ CompatGeckoMobile("6.0") }}</td>
- <td>10.0</td>
- <td>12 {{ property_prefix("o") }}<br>
- 12.10<br>
- 15.0 {{ property_prefix("webkit") }}</td>
- <td>{{CompatVersionUnknown}}{{ property_prefix("webkit") }}</td>
- <td>{{CompatChrome(43.0)}}</td>
- </tr>
- <tr>
- <td><code>AnimationEvent()</code> constructor</td>
- <td>{{CompatNo}}</td>
- <td>{{ CompatGeckoMobile("23.0") }}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatChrome(43.0)}}</td>
- </tr>
- <tr>
- <td><code>initAnimationEvent()</code> {{non-standard_inline}}{{deprecated_inline}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{ CompatGeckoMobile("6.0") }}<br>
- Removed in {{ CompatGeckoMobile("23.0") }}</td>
- <td>10.0</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatNo}}</td>
- </tr>
- <tr>
- <td><code>pseudoelement</code></td>
- <td>{{CompatNo}}</td>
- <td>{{ CompatGeckoMobile("23.0") }}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<h2 id="See_also">See also</h2>
-
-<ul>
- <li><a href="/en-US/docs/CSS/Using_CSS_animations">Using CSS animations</a></li>
- <li>Animation-related CSS properties and at-rules: {{cssxref("animation")}}, {{cssxref("animation-delay")}}, {{cssxref("animation-direction")}}, {{cssxref("animation-duration")}}, {{cssxref("animation-fill-mode")}}, {{cssxref("animation-iteration-count")}}, {{cssxref("animation-name")}}, {{cssxref("animation-play-state")}}, {{cssxref("animation-timing-function")}}, {{cssxref("@keyframes")}}.</li>
-</ul>
diff --git a/files/de/web/api/audiocontext/index.html b/files/de/web/api/audiocontext/index.html
deleted file mode 100644
index cc2c2db92e..0000000000
--- a/files/de/web/api/audiocontext/index.html
+++ /dev/null
@@ -1,232 +0,0 @@
----
-title: AudioContext
-slug: Web/API/AudioContext
-translation_of: Web/API/AudioContext
----
-<p>{{APIRef("Web Audio API")}}</p>
-
-<div>
-<p>The <code>AudioContext</code> interface represents an audio-processing graph built from several audio modules linked together. Each of these modules is a node ({{domxref("AudioNode")}}). An <code>AudioContext</code> controls both the creation of the nodes it contains and the execution of the audio processing or decoding. As a first step, an audio context must always be created, since all operations are performed within this context.</p>
-</div>
-
-<p>An <code>AudioContext</code> can be the target of events; it therefore also implements the {{domxref("EventTarget")}} interface.</p>
-
-<h2 id="Eigenschaften">Eigenschaften</h2>
-
-<dl>
- <dt>{{domxref("AudioContext.currentTime")}} {{readonlyInline}}</dt>
- <dd>Returns a double representing an ever-increasing hardware time in seconds used for scheduling. It starts at <code>0</code>.</dd>
- <dt>{{domxref("AudioContext.destination")}} {{readonlyInline}}</dt>
- <dd>Returns an {{domxref("AudioDestinationNode")}} representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.</dd>
- <dt>{{domxref("AudioContext.listener")}} {{readonlyInline}}</dt>
- <dd>Returns the {{domxref("AudioListener")}} object, used for 3D spatialization.</dd>
- <dt>{{domxref("AudioContext.sampleRate")}} {{readonlyInline}}</dt>
- <dd>Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample-rate of an {{domxref("AudioContext")}} cannot be changed.</dd>
- <dt>{{domxref("AudioContext.state")}} {{readonlyInline}}</dt>
- <dd>Returns the current state of the <code>AudioContext</code>.</dd>
- <dt>{{domxref("AudioContext.mozAudioChannelType")}} {{ non-standard_inline() }} {{readonlyInline}}</dt>
- <dd>Used to return the audio channel that the sound playing in an {{domxref("AudioContext")}} will play in, on a Firefox OS device.</dd>
-</dl>
-
-<h3 id="Event_handlers">Event handlers</h3>
-
-<dl>
- <dt>{{domxref("AudioContext.onstatechange")}}</dt>
- <dd>An event handler that runs when an event of type {{event("statechange")}} has fired. This occurs when the <code>AudioContext</code>'s state changes, due to the calling of one of the state change methods ({{domxref("AudioContext.suspend")}}, {{domxref("AudioContext.resume")}}, or {{domxref("AudioContext.close")}}.)</dd>
-</dl>
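-
-<p>For instance, a handler can simply log the new state; <code>audioCtx</code> is assumed to be an existing <code>AudioContext</code>:</p>
-
-<pre class="brush: js">audioCtx.onstatechange = function() {
-  console.log("AudioContext state is now: " + audioCtx.state);
-};</pre>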
-
-<h2 id="Methoden">Methoden</h2>
-
-<p><em>Implementiert zusätzlich die Methoden der Schnittstelle </em>{{domxref("EventTarget")}}.</p>
-
-<dl>
- <dt>{{domxref("AudioContext.close()")}}</dt>
- <dd>Closes the audio context, releasing any system audio resources that it uses.</dd>
- <dt>{{domxref("AudioContext.createBuffer()")}}</dt>
- <dd>Creates a new, empty {{ domxref("AudioBuffer") }} object, which can then be populated by data and played via an {{ domxref("AudioBufferSourceNode") }}.</dd>
- <dt>{{domxref("AudioContext.createBufferSource()")}}</dt>
- <dd>Creates an {{domxref("AudioBufferSourceNode")}}, which can be used to play and manipulate audio data contained within an {{ domxref("AudioBuffer") }} object. {{ domxref("AudioBuffer") }}s are created using {{domxref("AudioContext.createBuffer")}} or returned by {{domxref("AudioContext.decodeAudioData")}} when it successfully decodes an audio track.</dd>
- <dt>{{domxref("AudioContext.createMediaElementSource()")}}</dt>
- <dd>Creates a {{domxref("MediaElementAudioSourceNode")}} associated with an {{domxref("HTMLMediaElement")}}. This can be used to play and manipulate audio from {{HTMLElement("video")}} or {{HTMLElement("audio")}} elements.</dd>
- <dt>{{domxref("AudioContext.createMediaStreamSource()")}}</dt>
- <dd>Creates a {{domxref("MediaStreamAudioSourceNode")}} associated with a {{domxref("MediaStream")}} representing an audio stream which may come from the local computer microphone or other sources.</dd>
- <dt>{{domxref("AudioContext.createMediaStreamDestination()")}}</dt>
- <dd>Creates a {{domxref("MediaStreamAudioDestinationNode")}} associated with a {{domxref("MediaStream")}} representing an audio stream which may be stored in a local file or sent to another computer.</dd>
- <dt>{{domxref("AudioContext.createScriptProcessor()")}}</dt>
- <dd>Creates a {{domxref("ScriptProcessorNode")}}, which can be used for direct audio processing via JavaScript.</dd>
- <dt>{{domxref("AudioContext.createStereoPanner()")}}</dt>
- <dd>Creates a {{domxref("StereoPannerNode")}}, which can be used to apply stereo panning to an audio source.</dd>
- <dt>{{domxref("AudioContext.createAnalyser()")}}</dt>
- <dd>Creates an {{domxref("AnalyserNode")}}, which can be used to expose audio time and frequency data and for example to create data visualisations.</dd>
- <dt>{{domxref("AudioContext.createBiquadFilter()")}}</dt>
- <dd>Creates a {{domxref("BiquadFilterNode")}}, which represents a second order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.</dd>
- <dt>{{domxref("AudioContext.createChannelMerger()")}}</dt>
- <dd>Creates a {{domxref("ChannelMergerNode")}}, which is used to combine channels from multiple audio streams into a single audio stream.</dd>
- <dt>{{domxref("AudioContext.createChannelSplitter()")}}</dt>
- <dd>Creates a {{domxref("ChannelSplitterNode")}}, which is used to access the individual channels of an audio stream and process them separately.</dd>
- <dt>{{domxref("AudioContext.createConvolver()")}}</dt>
- <dd>Creates a {{domxref("ConvolverNode")}}, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.</dd>
- <dt>{{domxref("AudioContext.createDelay()")}}</dt>
- <dd>Creates a {{domxref("DelayNode")}}, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.</dd>
- <dt>{{domxref("AudioContext.createDynamicsCompressor()")}}</dt>
- <dd>Creates a {{domxref("DynamicsCompressorNode")}}, which can be used to apply acoustic compression to an audio signal.</dd>
- <dt>{{domxref("AudioContext.createGain()")}}</dt>
- <dd>Creates a {{domxref("GainNode")}}, which can be used to control the overall volume of the audio graph.</dd>
- <dt>{{domxref("AudioContext.createOscillator()")}}</dt>
- <dd>Creates an {{domxref("OscillatorNode")}}, a source representing a periodic waveform. It basically generates a tone.</dd>
- <dt>{{domxref("AudioContext.createPanner()")}}</dt>
- <dd>Creates a {{domxref("PannerNode")}}, which is used to spatialise an incoming audio stream in 3D space.</dd>
- <dt>{{domxref("AudioContext.createPeriodicWave()")}}</dt>
- <dd>Creates a {{domxref("PeriodicWave")}}, used to define a periodic waveform that can be used to determine the output of an {{ domxref("OscillatorNode") }}.</dd>
- <dt>{{domxref("AudioContext.createWaveShaper()")}}</dt>
- <dd>Creates a {{domxref("WaveShaperNode")}}, which is used to implement non-linear distortion effects.</dd>
- <dt>{{domxref("AudioContext.createAudioWorker()")}}</dt>
- <dd>Creates an {{domxref("AudioWorkerNode")}}, which can interact with a web worker thread to generate, process, or analyse audio directly. This was added to the spec on August 29 2014, and is not implemented in any browser yet.</dd>
- <dt>{{domxref("AudioContext.decodeAudioData()")}}</dt>
- <dd>Asynchronously decodes audio file data contained in an {{domxref("ArrayBuffer")}}. In this case, the ArrayBuffer is usually loaded from an {{domxref("XMLHttpRequest")}}'s <code>response</code> attribute after setting the <code>responseType</code> to <code>arraybuffer</code>. This method only works on complete files, not fragments of audio files.</dd>
- <dt>{{domxref("AudioContext.resume()")}}</dt>
- <dd>Resumes the progression of time in an audio context that has previously been suspended.</dd>
- <dt>{{domxref("AudioContext.suspend()")}}</dt>
- <dd>Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.</dd>
-</dl>
-
-<h2 id="Obsolete_Methoden">Obsolete Methoden</h2>
-
-<dl>
- <dt>{{domxref("AudioContext.createJavaScriptNode()")}}</dt>
- <dd>Creates a {{domxref("JavaScriptNode")}}, used for direct audio processing via JavaScript. This method is obsolete, and has been replaced by {{domxref("AudioContext.createScriptProcessor()")}}.</dd>
- <dt>{{domxref("AudioContext.createWaveTable()")}}</dt>
- <dd>Creates a {{domxref("WaveTableNode")}}, used to define a periodic waveform. This method is obsolete, and has been replaced by {{domxref("AudioContext.createPeriodicWave()")}}.</dd>
-</dl>
-
-<h2 id="Beispiele">Beispiele</h2>
-
-<p>Grundsätzliche Deklarierung eines Audio Kontextes:</p>
-
-<pre class="brush: js">var audioCtx = new AudioContext();</pre>
-
-<p>Cross-browser variant:</p>
-
-<pre class="brush: js">var AudioContext = window.AudioContext || window.webkitAudioContext;
-var audioCtx = new AudioContext();
-
-var oscillatorNode = audioCtx.createOscillator();
-var gainNode = audioCtx.createGain();
-var finish = audioCtx.destination;
-// etc.</pre>
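-
-<p>A minimal sketch of loading and decoding an audio file with <code>decodeAudioData()</code>; the file name <code>"sound.mp3"</code> is a placeholder:</p>
-
-<pre class="brush: js">var request = new XMLHttpRequest();
-request.open("GET", "sound.mp3", true); // placeholder URL
-request.responseType = "arraybuffer";
-
-request.onload = function() {
-  audioCtx.decodeAudioData(request.response, function(decodedBuffer) {
-    var source = audioCtx.createBufferSource();
-    source.buffer = decodedBuffer;
-    source.connect(audioCtx.destination);
-    source.start(0);
-  });
-};
-
-request.send();</pre>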
-
-<h2 id="Spezifikationen">Spezifikationen</h2>
-
-<table class="standard-table">
- <tbody>
- <tr>
- <th scope="col">Specification</th>
- <th scope="col">Status</th>
- <th scope="col">Comment</th>
- </tr>
- <tr>
- <td>{{SpecName('Web Audio API', '#the-audiocontext-interface', 'AudioContext')}}</td>
- <td>{{Spec2('Web Audio API')}}</td>
- <td> </td>
- </tr>
- </tbody>
-</table>
-
-<h2 id="Browserkompatibilität">Browserkompatibilität</h2>
-
-<div>{{CompatibilityTable}}</div>
-
-<div id="compat-desktop">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Chrome</th>
- <th>Firefox (Gecko)</th>
- <th>Internet Explorer</th>
- <th>Opera</th>
- <th>Safari (WebKit)</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatChrome(10.0)}}{{property_prefix("webkit")}}<br>
- 35</td>
- <td>{{CompatGeckoDesktop(25.0)}} </td>
- <td>{{CompatNo}}</td>
- <td>15.0{{property_prefix("webkit")}}<br>
- 22</td>
- <td>6.0{{property_prefix("webkit")}}</td>
- </tr>
- <tr>
- <td><code>createStereoPanner()</code></td>
- <td>{{CompatChrome(42.0)}}</td>
- <td>{{CompatGeckoDesktop(37.0)}} </td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- </tr>
- <tr>
- <td><code>onstatechange</code>, <code>state</code>, <code>suspend()</code>, <code>resume()</code></td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatGeckoDesktop(40.0)}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<div id="compat-mobile">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Android</th>
- <th>Firefox Mobile (Gecko)</th>
- <th>Firefox OS</th>
- <th>IE Mobile</th>
- <th>Opera Mobile</th>
- <th>Safari Mobile</th>
- <th>Chrome for Android</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatGeckoMobile(37.0)}}</td>
- <td>2.2</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatVersionUnknown}}</td>
- </tr>
- <tr>
- <td><code>createStereoPanner()</code></td>
- <td>{{CompatNo}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatVersionUnknown}}</td>
- </tr>
- <tr>
- <td><code>onstatechange</code>, <code>state</code>, <code>suspend()</code>, <code>resume()</code></td>
- <td>{{CompatNo}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatVersionUnknown}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<h2 id="Siehe_auch">Siehe auch</h2>
-
-<ul style="margin-left: 40px;">
- <li><a href="/en-US/docs/Web_Audio_API/Using_Web_Audio_API">Using the Web Audio API</a></li>
- <li>{{domxref("OfflineAudioContext")}}</li>
-</ul>
diff --git a/files/de/web/api/canvasrenderingcontext2d/setlinedash/index.html b/files/de/web/api/canvasrenderingcontext2d/setlinedash/index.html
deleted file mode 100644
index 38aadbbfe3..0000000000
--- a/files/de/web/api/canvasrenderingcontext2d/setlinedash/index.html
+++ /dev/null
@@ -1,179 +0,0 @@
----
-title: CanvasRenderingContext2D.setLineDash()
-slug: Web/API/CanvasRenderingContext2D/setLineDash
-translation_of: Web/API/CanvasRenderingContext2D/setLineDash
----
-<div>{{APIRef}}</div>
-
-<p>The <code><strong>CanvasRenderingContext2D</strong></code><strong><code>.setLineDash()</code></strong> method of the Canvas 2D API sets the line dash pattern.</p>
-
-<h2 id="Syntax">Syntax</h2>
-
-<pre class="syntaxbox">void <var><em>ctx</em>.setLineDash(segments);</var>
-</pre>
-
-<h3 id="Parameters">Parameters</h3>
-
-<dl>
- <dt><code>segments</code></dt>
- <dd>An {{jsxref("Array")}}. A list of numbers that specifies distances to alternately draw a line and a gap (in coordinate space units). If the number of elements in the array is odd, the elements of the array get copied and concatenated. For example, <code>[5, 15, 25]</code> will become <code>[5, 15, 25, 5, 15, 25]</code>. An empty array clears the dashes, so that a solid line will be drawn.</dd>
-</dl>
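-
-<p>For instance, passing an odd-length array results in the doubled pattern being stored, which {{domxref("CanvasRenderingContext2D.getLineDash()", "getLineDash()")}} then reports:</p>
-
-<pre class="brush: js">ctx.setLineDash([5, 15, 25]);
-console.log(ctx.getLineDash()); // [5, 15, 25, 5, 15, 25]
-
-ctx.setLineDash([]); // back to a solid line</pre>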
-
-<h2 id="Examples">Examples</h2>
-
-<h3 id="Using_the_setLineDash_method">Using the <code>setLineDash</code> method</h3>
-
-<p>This is just a simple code snippet which uses the <code>setLineDash</code> method to draw a dashed line.</p>
-
-<h4 id="HTML">HTML</h4>
-
-<pre class="brush: html">&lt;canvas id="canvas"&gt;&lt;/canvas&gt;
-</pre>
-
-<h4 id="JavaScript">JavaScript</h4>
-
-<pre class="brush: js; highlight:[4]">var canvas = document.getElementById("canvas");
-var ctx = canvas.getContext("2d");
-
-ctx.setLineDash([5, 15]);
-
-ctx.beginPath();
-ctx.moveTo(0,100);
-ctx.lineTo(400, 100);
-ctx.stroke();
-</pre>
-
-<p>Edit the code below and see your changes update live in the canvas:</p>
-
-<div class="hidden">
-<h6 id="Playable_code">Playable code</h6>
-
-<pre class="brush: html">&lt;canvas id="canvas" width="400" height="200" class="playable-canvas"&gt;&lt;/canvas&gt;
-&lt;div class="playable-buttons"&gt;
-  &lt;input id="edit" type="button" value="Edit" /&gt;
-  &lt;input id="reset" type="button" value="Reset" /&gt;
-&lt;/div&gt;
-&lt;textarea id="code" class="playable-code"&gt;
-ctx.setLineDash([5, 15]);
-ctx.beginPath();
-ctx.moveTo(0,100);
-ctx.lineTo(400, 100);
-ctx.stroke();&lt;/textarea&gt;
-</pre>
-
-<pre class="brush: js">var canvas = document.getElementById("canvas");
-var ctx = canvas.getContext("2d");
-var textarea = document.getElementById("code");
-var reset = document.getElementById("reset");
-var edit = document.getElementById("edit");
-var code = textarea.value;
-
-function drawCanvas() {
- ctx.clearRect(0, 0, canvas.width, canvas.height);
- eval(textarea.value);
-}
-
-reset.addEventListener("click", function() {
- textarea.value = code;
- drawCanvas();
-});
-
-edit.addEventListener("click", function() {
- textarea.focus();
-})
-
-textarea.addEventListener("input", drawCanvas);
-window.addEventListener("load", drawCanvas);
-</pre>
-</div>
-
-<p>{{ EmbedLiveSample('Playable_code', 700, 360) }}</p>
-
-<h2 id="Specifications">Specifications</h2>
-
-<table class="standard-table">
- <tbody>
- <tr>
- <th scope="col">Specification</th>
- <th scope="col">Status</th>
- <th scope="col">Comment</th>
- </tr>
- <tr>
- <td>{{SpecName('HTML WHATWG', "scripting.html#dom-context-2d-setlinedash", "CanvasRenderingContext2D.setLineDash")}}</td>
- <td>{{Spec2('HTML WHATWG')}}</td>
- <td> </td>
- </tr>
- </tbody>
-</table>
-
-<h2 id="Browser_compatibility">Browser compatibility</h2>
-
-<p>{{CompatibilityTable}}</p>
-
-<div id="compat-desktop">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Chrome</th>
- <th>Firefox (Gecko)</th>
- <th>Internet Explorer</th>
- <th>Opera</th>
- <th>Safari</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{ CompatGeckoDesktop(27) }}</td>
- <td>{{ CompatIE(11) }}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<div id="compat-mobile">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Android</th>
- <th>Chrome for Android</th>
- <th>Firefox Mobile (Gecko)</th>
- <th>IE Mobile</th>
- <th>Opera Mobile</th>
- <th>Safari Mobile</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{ CompatGeckoMobile(27) }}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatVersionUnknown}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<h2 id="Gecko-specific_notes">Gecko-specific notes</h2>
-
-<ul>
- <li>Starting with Gecko 7.0 {{geckoRelease("7.0")}}, the non-standard property <code>mozDash</code> has been implemented to set and get a dash list. It is deprecated and will be removed in the future; see {{bug(931643)}}. Use <code>setLineDash()</code> instead.</li>
-</ul>
-
-<h2 id="WebKit-specific_notes">WebKit-specific notes</h2>
-
-<ul>
- <li>In WebKit-based browsers (e.g. Safari), the non-standard and deprecated property <code>webkitLineDash</code> is implemented in addition to this method. Use <code>setLineDash()</code> instead.</li>
-</ul>
-
-<h2 id="See_also">See also</h2>
-
-<ul>
- <li>The interface defining it, {{domxref("CanvasRenderingContext2D")}}</li>
- <li>{{domxref("CanvasRenderingContext2D.getLineDash()")}}</li>
- <li>{{domxref("CanvasRenderingContext2D.lineDashOffset")}}</li>
-</ul>
diff --git a/files/de/web/api/eventtarget/index.html b/files/de/web/api/eventtarget/index.html
deleted file mode 100644
index 3ed264119e..0000000000
--- a/files/de/web/api/eventtarget/index.html
+++ /dev/null
@@ -1,174 +0,0 @@
----
-title: EventTarget
-slug: Web/API/EventTarget
-tags:
- - API
- - DOM
- - DOM Events
- - Interface
- - NeedsTranslation
- - TopicStub
-translation_of: Web/API/EventTarget
----
-<p>{{ApiRef("DOM Events")}}</p>
-
-<p><code>EventTarget</code> is an interface implemented by objects that can receive events and may have listeners for them.</p>
-
-<p>{{domxref("Element")}}, {{domxref("document")}}, and {{domxref("window")}} are the most common event targets, but other objects can be event targets too, for example {{domxref("XMLHttpRequest")}}, {{domxref("AudioNode")}}, {{domxref("AudioContext")}}, and others.</p>
-
-<p>Many event targets (including elements, documents, and windows) also support setting <a href="/en-US/docs/Web/Guide/DOM/Events/Event_handlers">event handlers</a> via <code>on...</code> properties and attributes.</p>
-
-<h2 id="Methods">Methods</h2>
-
-<dl>
- <dt>{{domxref("EventTarget.addEventListener()")}}</dt>
- <dd>Registers an event handler of a specific event type on the <code>EventTarget</code>.</dd>
- <dt>{{domxref("EventTarget.removeEventListener()")}}</dt>
- <dd>Removes an event listener from the <code>EventTarget</code>.</dd>
- <dt>{{domxref("EventTarget.dispatchEvent()")}}</dt>
- <dd>Dispatches an event to this <code>EventTarget</code>.</dd>
-</dl>
-
-<h3 id="Additional_methods_for_Mozilla_chrome_code">Additional methods for Mozilla chrome code</h3>
-
-<p>Mozilla extensions for use by JS-implemented event targets to implement on* properties. See also <a href="/en-US/docs/Mozilla/WebIDL_bindings">WebIDL bindings</a>.</p>
-
-<ul>
- <li>void <strong>setEventHandler</strong>(DOMString type, EventHandler handler) {{non-standard_inline}}</li>
- <li>EventHandler <strong>getEventHandler</strong>(DOMString type) {{non-standard_inline}}</li>
-</ul>
-
-<h2 id="Example">Example:</h2>
-
-<h3 id="_Simple_implementation_of_EventTarget" name="_Simple_implementation_of_EventTarget">Simple implementation of EventTarget</h3>
-
-<pre class="brush: js">var EventTarget = function() {
- this.listeners = {};
-};
-
-EventTarget.prototype.listeners = null;
-EventTarget.prototype.addEventListener = function(type, callback) {
- if (!(type in this.listeners)) {
- this.listeners[type] = [];
- }
- this.listeners[type].push(callback);
-};
-
-EventTarget.prototype.removeEventListener = function(type, callback) {
- if (!(type in this.listeners)) {
- return;
- }
- var stack = this.listeners[type];
- for (var i = 0, l = stack.length; i &lt; l; i++) {
- if (stack[i] === callback){
- stack.splice(i, 1);
- return;
- }
- }
-};
-
-EventTarget.prototype.dispatchEvent = function(event) {
- if (!(event.type in this.listeners)) {
- return true;
- }
- var stack = this.listeners[event.type];
- event.target = this;
- for (var i = 0, l = stack.length; i &lt; l; i++) {
- stack[i].call(this, event);
- }
- return !event.defaultPrevented;
-};
-</pre>
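-
-<p>The implementation above can be exercised with a short, self-contained snippet (the event type <code>"hello"</code> is arbitrary):</p>
-
-<pre class="brush: js">var target = new EventTarget();
-
-target.addEventListener("hello", function(event) {
-  console.log("received:", event.detail);
-});
-
-// A plain object is enough for this simple implementation.
-target.dispatchEvent({ type: "hello", detail: "world" });</pre>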
-
-<p>{{ EmbedLiveSample('_Simple_implementation_of_EventTarget') }}</p>
-
-<h2 id="Specifications">Specifications</h2>
-
-<table class="standard-table">
- <tbody>
- <tr>
- <th scope="col">Specification</th>
- <th scope="col">Status</th>
- <th scope="col">Comment</th>
- </tr>
- <tr>
- <td>{{SpecName('DOM WHATWG', '#interface-eventtarget', 'EventTarget')}}</td>
- <td>{{Spec2('DOM WHATWG')}}</td>
- <td>No change.</td>
- </tr>
- <tr>
- <td>{{SpecName('DOM3 Events', 'DOM3-Events.html#interface-EventTarget', 'EventTarget')}}</td>
- <td>{{Spec2('DOM3 Events')}}</td>
- <td>A few parameters are now optional (<code>listener</code>), or accepts the <code>null</code> value (<code>useCapture</code>).</td>
- </tr>
- <tr>
- <td>{{SpecName('DOM2 Events', 'events.html#Events-EventTarget', 'EventTarget')}}</td>
- <td>{{Spec2('DOM2 Events')}}</td>
- <td>Initial definition.</td>
- </tr>
- </tbody>
-</table>
-
-<h2 id="Browser_compatibility">Browser compatibility</h2>
-
-<p>{{CompatibilityTable}}</p>
-
-<div id="compat-desktop">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Chrome</th>
- <th>Edge</th>
- <th>Firefox (Gecko)</th>
- <th>Internet Explorer</th>
- <th>Opera</th>
- <th>Safari (WebKit)</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>1.0</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatGeckoDesktop("1")}}</td>
- <td>9.0</td>
- <td>7</td>
- <td>1.0<sup>[1]</sup></td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<div id="compat-mobile">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Android</th>
- <th>Edge</th>
- <th>Firefox Mobile (Gecko)</th>
- <th>IE Mobile</th>
- <th>Opera Mobile</th>
- <th>Safari Mobile</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>1.0</td>
- <td>{{CompatVersionUnknown}}</td>
- <td>{{CompatGeckoMobile("1")}}</td>
- <td>9.0</td>
- <td>6.0</td>
- <td>1.0</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<p>[1] <code>window.EventTarget</code> does not exist.</p>
-
-<h2 id="See_Also">See Also</h2>
-
-<ul>
- <li><a href="/en-US/docs/Web/Reference/Events">Event reference</a> - the events available in the platform.</li>
- <li><a href="/en-US/docs/Web/Guide/DOM/Events">Event developer guide</a></li>
- <li>{{domxref("Event")}} interface</li>
-</ul>
diff --git a/files/de/web/api/file/getastext/index.html b/files/de/web/api/file/getastext/index.html
deleted file mode 100644
index fefda6647a..0000000000
--- a/files/de/web/api/file/getastext/index.html
+++ /dev/null
@@ -1,78 +0,0 @@
----
-title: File.getAsText()
-slug: Web/API/File/getAsText
-tags:
- - DOM
- - Files
-translation_of: Web/API/File/getAsText
----
-<p>{{APIRef("File API") }}{{non-standard_header}}</p>
-
-<p>{{deprecated_header(7.0)}}</p>
-
-<h2 id="Summary">Summary</h2>
-
-<p>The <code>getAsText</code> method provides the file's data interpreted as text using a given encoding.</p>
-
-<div class="note">
-<p><strong>Note:</strong> This method is obsolete; you should use the {{domxref("FileReader")}} method {{domxref("FileReader.readAsText()","readAsText()")}} instead.</p>
-</div>
-
-<h2 id="Syntaxe">Syntaxe</h2>
-
-<pre>var str = instanceOfFile.getAsText(encoding);</pre>
-
-<h3 id="Parameters">Parameters</h3>
-
-<dl>
- <dt>encoding</dt>
- <dd>A string indicating the encoding to use for the returned data. If this string is empty, UTF-8 is assumed.</dd>
-</dl>
-
-<h3 id="Returns">Returns</h3>
-
-<p>A string containing the file's data interpreted as text in the specified <code>encoding</code>.</p>
-
-<h2 id="Example">Example</h2>
-
-<pre class="brush: js">// fileInput is a HTMLInputElement: <input>
-var fileInput = document.getElementById("myfileinput");
-
-// files is a FileList object (similar to NodeList)
-var files = fileInput.files;
-
-// object for allowed media types
-var accept = {
- binary : ["image/png", "image/jpeg"],
- text : ["text/plain", "text/css", "application/xml", "text/html"]
-};
-
-var file;
-
-for (var i = 0; i &lt; files.length; i++) {
- file = files[i];
-
- // if file type could be detected
- if (file !== null) {
- if (accept.text.indexOf(file.type) &gt; -1) {
- // file is of type text, which we accept
- // make sure it's encoded as utf-8
- var data = file.getAsText("utf-8");
- // modify data with string methods
-
- } else if (accept.binary.indexOf(file.type) &gt; -1) {
- // binary
- }
- }
-}</pre>
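-
-<p>For comparison, a minimal sketch of the recommended {{domxref("FileReader")}} replacement, reading the same <code>file</code> as UTF-8 text:</p>
-
-<pre class="brush: js">var reader = new FileReader();
-reader.onload = function() {
-  // reader.result contains the file's data as a text string
-  console.log(reader.result);
-};
-reader.readAsText(file, "utf-8");</pre>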
-
-<h2 id="Specification">Specification</h2>
-
-<p>Not part of any specification.</p>
-
-<h2 id="See_also">See also</h2>
-
-<ul>
- <li>{{domxref("File")}}</li>
- <li>{{domxref("FileReader")}}</li>
-</ul>
diff --git a/files/de/web/api/rtcpeerconnection/index.html b/files/de/web/api/rtcpeerconnection/index.html
deleted file mode 100644
index df67ef624c..0000000000
--- a/files/de/web/api/rtcpeerconnection/index.html
+++ /dev/null
@@ -1,379 +0,0 @@
----
-title: RTCPeerConnection
-slug: Web/API/RTCPeerConnection
-translation_of: Web/API/RTCPeerConnection
----
-<p>{{APIRef}}{{SeeCompatTable}}</p>
-
-<p>The <strong><code>RTCPeerConnection</code></strong> interface represents a WebRTC connection between the local computer and a remote peer. It is used to handle efficient streaming of data between the two peers.</p>
-
-<div class="note">
-<p><strong>Note:</strong> <code>RTCPeerConnection</code> and {{domxref("RTCSessionDescription")}} are currently prefixed in most browsers, and the {{domxref("navigator.getUserMedia()")}} method is prefixed in many versions of some browsers; you should use code like the following to access these:</p>
-
-<pre class="brush: js">var peerConnection = window.RTCPeerConnection || window.mozRTCPeerConnection ||
- window.webkitRTCPeerConnection || window.msRTCPeerConnection;
-var sessionDescription = window.RTCSessionDescription || window.mozRTCSessionDescription ||
- window.webkitRTCSessionDescription || window.msRTCSessionDescription;
-
-navigator.getUserMedia = navigator.getUserMedia || navigator.mozGetUserMedia ||
- navigator.webkitGetUserMedia || navigator.msGetUserMedia;
-</pre>
-
-<p>Simple code such as this is all it takes to ensure that your project will work on as many versions of as many browsers as possible.</p>
-</div>
-
-<h2 id="Basic_usage">Basic usage</h2>
-
-<p>Basic <code>RTCPeerConnection</code> usage involves negotiating a connection between your local machine and a remote one by generating <a href="http://en.wikipedia.org/wiki/Session_Description_Protocol">Session Description Protocol</a> to exchange between the two. The caller starts the process by sending an offer to the remote machine, which responds by either accepting or rejecting the call.</p>
-
-<p>Both parties (the caller and the called party) need to set up their own <code>RTCPeerConnection</code> instances to represent their end of the peer-to-peer connection:</p>
-
-<pre class="brush: js">var pc = new RTCPeerConnection();
-pc.onaddstream = function(obj) {
- var vid = document.createElement("video");
- document.body.appendChild(vid);
- vid.srcObject = obj.stream;
-}
-
-// Helper functions
-function endCall() {
- var videos = document.getElementsByTagName("video");
- for (var i = 0; i &lt; videos.length; i++) {
- videos[i].pause();
- }
-
- pc.<a href="#close()">close</a>();
-}
-
-function error(err) {
- endCall();
-}
-</pre>
-
-<h3 id="Initializing_the_call">Initializing the call</h3>
-
-<p>If you are the one initiating the call, you would use {{domxref("navigator.getUserMedia()")}} to get a video stream, then add the stream to the <code>RTCPeerConnection</code>. Once that's been done, call {{domxref("RTCPeerConnection.createOffer()")}} to create an offer, configure the offer, then send it to the server through which the connection is being mediated.</p>
-
-<pre class="brush: js">// Get a list of friends from a server
-// User selects a friend to start a peer connection with
-navigator.getUserMedia({video: true}, function(stream) {
- // Adding a local stream won't trigger the onaddstream callback,
- // so call it manually.
- pc.onaddstream({stream: stream});
- pc.<a href="#addStream()">addStream</a>(stream);
-
- pc.<a href="#createOffer()">createOffer</a>(function(offer) {
- pc.<a href="#setLocalDescription()">setLocalDescription</a>(new <span class="nx">RTCSessionDescription</span>(offer), function() {
- // send the offer to a server to be forwarded to the friend you're calling.
- }, error);
- }, error);
-});
-</pre>
-
-<h3 id="Answering_a_call">Answering a call</h3>
-
-<p>On the opposite end, the friend will receive the offer from the server using whatever protocol is being used to do so. Once the offer arrives, {{domxref("navigator.getUserMedia()")}} is once again used to create the stream, which is added to the <code>RTCPeerConnection</code>. An {{domxref("RTCSessionDescription")}} object is created and set up as the remote description by calling {{domxref("RTCPeerConnection.setRemoteDescription()")}}.</p>
-
-<p>Then an answer is created using {{domxref("RTCPeerConnection.createAnswer()")}} and sent back to the server, which forwards it to the caller.</p>
-
-<pre class="brush: js">var offer = getOfferFromFriend();
-navigator.getUserMedia({video: true}, function(stream) {
- pc.onaddstream({stream: stream});
- pc.<a href="#addStream()">addStream</a>(stream);
-
- pc.setRemoteDescription(new <span class="nx">RTCSessionDescription</span>(offer), function() {
- pc.<a href="#createAnswer()">createAnswer</a>(function(answer) {
- pc.<a href="#setLocalDescription()">setLocalDescription</a>(new <span class="nx">RTCSessionDescription</span>(answer), function() {
- // send the answer to a server to be forwarded back to the caller (you)
- }, error);
- }, error);
- }, error);
-});
-</pre>
-
-<h3 id="Handling_the_answer">Handling the answer</h3>
-
-<p>Back on the original machine, the response is received. Once that happens, call {{domxref("RTCPeerConnection.setRemoteDescription()")}} to set the response as the remote end of the connection.</p>
-
-<pre class="brush: js">// pc was set up earlier when we made the original offer
-var answer = getResponseFromFriend();
-pc.setRemoteDescription(new RTCSessionDescription(answer), function() { }, error);
-</pre>
-
-<h2 id="Constructor">Constructor</h2>
-
-<dl>
- <dt>{{domxref("RTCPeerConnection.RTCPeerConnection", "RTCPeerConnection()")}}</dt>
- <dd>Constructor; returns a new <code>RTCPeerConnection</code> object.</dd>
-</dl>
-
-<h2 id="Properties">Properties</h2>
-
-<p><em>This interface inherits properties from its parent interface, {{domxref("EventTarget")}}.</em></p>
-
-<dl>
- <dt>{{domxref("RTCPeerConnection.iceConnectionState")}} {{ReadOnlyInline}}</dt>
- <dd>Returns an enum of type <code>RTCIceConnectionState</code> that describes the ICE connection state for the connection. When this value changes, a {{event("iceconnectionstatechange")}} event is fired on the object.</dd>
- <dt>{{domxref("RTCPeerConnection.iceGatheringState")}} {{ReadOnlyInline}}</dt>
- <dd>Returns an enum of type <code>RTCIceGatheringState</code> that describes the ICE gathering state for the connection.</dd>
- <dt>{{domxref("RTCPeerConnection.localDescription")}} {{ReadOnlyInline}}</dt>
- <dd>Returns a {{domxref("RTCSessionDescription")}} describing the session for the local end of the connection. If it has not yet been set, it can be <code>null</code>.</dd>
- <dt>{{domxref("RTCPeerConnection.peerIdentity")}} {{ReadOnlyInline}}</dt>
- <dd>Returns a <code>RTCIdentityAssertion</code>, that is a couple of a domain name (<code>idp</code>) and a name (<code>name</code>) representing the identity of the remote peer of this connection, once set and verified. If no peer has yet been set and verified, this property will return <code>null</code>. Once set, via the appropriate method, it can't be changed.</dd>
- <dt>{{domxref("RTCPeerConnection.remoteDescription")}} {{ReadOnlyInline}}</dt>
- <dd>Returns a {{domxref("RTCSessionDescription")}} describing the session for the remote end of the connection. If it has not yet been set, it can be <code>null</code>.</dd>
- <dt>{{domxref("RTCPeerConnection.signalingState")}} {{ReadOnlyInline}}</dt>
- <dd>Returns an enum of type <code>RTCSignalingState</code> that describes the signaling state of the local connection. This state describes the SDP offer, that defines the configuration of the connections like the description of the locally associated objects of type {{domxref("MediaStream")}}, the codec/RTP/RTCP options, the candidates gathered by the ICE Agent. When this value changes, a {{event("signalingstatechange")}} event is fired on the object.</dd>
-</dl>
-
-<h3 id="Event_handlers">Event handlers</h3>
-
-<dl>
- <dt>{{domxref("RTCPeerConnection.onaddstream")}}</dt>
- <dd>Is the event handler called when the {{event("addstream")}} event is received. Such an event is sent when a {{domxref("MediaStream")}} is added to this connection by the remote peer. The event is sent immediately after the call {{domxref("RTCPeerConnection.setRemoteDescription()")}} and doesn't wait for the result of the SDP negotiation.</dd>
- <dt>{{domxref("RTCPeerConnection.ondatachannel")}}</dt>
- <dd>Is the event handler called when the {{event("datachannel")}} event is received. Such an event is sent when a {{domxref("RTCDataChannel")}} is added to this connection.</dd>
- <dt>{{domxref("RTCPeerConnection.onicecandidate")}}</dt>
- <dd>Is the event handler called when the {{event("icecandidate")}} event is received. Such an event is sent when a {{domxref("RTCICECandidate")}} object is added to the script.</dd>
- <dt>{{domxref("RTCPeerConnection.oniceconnectionstatechange")}}</dt>
- <dd>Is the event handler called when the {{event("iceconnectionstatechange")}} event is received. Such an event is sent when the value of {{domxref("RTCPeerConnection.iceConnectionState", "iceConnectionState")}} changes.</dd>
- <dt>{{domxref("RTCPeerConnection.onidentityresult")}}</dt>
- <dd>Is the event handler called when the {{event("identityresult")}} event is received. Such an event is sent when an identity assertion is generated, via {{domxref("RTCPeerConnection.getIdentityAssertion()", "getIdentityAssertion()")}}, or during the creation of an offer or an answer.</dd>
- <dt>{{domxref("RTCPeerConnection.onidpassertionerror")}}</dt>
- <dd>Is the event handler called when the {{event("idpassertionerror")}} event is received. Such an event is sent when the associated identity provider (IdP) encounters an error while generating an identity assertion.</dd>
- <dt>{{domxref("RTCPeerConnection.onidpvalidationerror")}}</dt>
- <dd>Is the event handler called when the {{event("idpvalidationerror")}} event is received. Such an event is sent when the associated identity provider (IdP) encounters an error while validating an identity assertion.</dd>
- <dt>{{domxref("RTCPeerConnection.onnegotiationneeded")}}</dt>
- <dd>Is the event handler called when the {{event("negotiationneeded")}} event, sent by the browser to inform that negotiation will be required at some point in the future, is received.</dd>
- <dt>{{domxref("RTCPeerConnection.onpeeridentity")}}</dt>
- <dd>Is the event handler called when the {{event("peeridentity")}} event, sent when a peer identity has been set and verified on this connection, is received.</dd>
- <dt>{{domxref("RTCPeerConnection.onremovestream")}}</dt>
- <dd>Is the event handler called when the {{event("removestream")}} event, sent when a {{domxref("MediaStream")}} is removed from this connection, is received.</dd>
- <dt>{{domxref("RTCPeerConnection.onsignalingstatechange")}}</dt>
- <dd>Is the event handler called when the {{event("signalingstatechange")}} event, sent when the value of {{domxref("RTCPeerConnection.signalingState", "signalingState")}} changes, is received.</dd>
-</dl>
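-
-<p>Among these handlers, <code>onicecandidate</code> is central to signaling: each gathered candidate has to be relayed to the remote peer over your own signaling channel. A minimal sketch, where <code>sendToSignalingServer()</code> is a hypothetical helper:</p>
-
-<pre class="brush: js">pc.onicecandidate = function(event) {
-  if (event.candidate) {
-    // send the candidate to the remote peer via your signaling channel
-    sendToSignalingServer({ candidate: event.candidate });
-  }
-};</pre>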
-
-<h2 id="Methods">Methods</h2>
-
-<dl>
- <dt>{{domxref("RTCPeerConnection.addIceCandidate()")}}</dt>
- <dd>Provides a remote ICE candidate to the ICE Agent (see <a href="#addIceCandidate()">addIceCandidate()</a> below).</dd>
- <dt>{{domxref("RTCPeerConnection.addStream()")}}</dt>
- <dd>Adds a {{domxref("MediaStream")}} as a local source of audio or video. If the negotiation already happened, a new one will be needed for the remote peer to be able to use it.</dd>
- <dt>{{domxref("RTCPeerConnection.close()")}}</dt>
- <dd>Abruptly closes a connection.</dd>
- <dt>{{domxref("RTCPeerConnection.createAnswer()")}}</dt>
- <dd>Creates an answer to the offer received by the remote peer, in a two-part offer/answer negotiation of a connection. The two first parameters are respectively success and error callbacks, the optional third one represent options for the answer to be created.</dd>
- <dt>{{domxref("RTCPeerConnection.createDataChannel()")}}</dt>
- <dd>Creates a new {{domxref("RTCDataChannel")}} associated with this connection. The method takes a dictionary as parameter, with the configuration required for the underlying data channel, like its reliability.</dd>
- <dt>{{domxref("RTCPeerConnection.createDTMFSender()")}}</dt>
- <dd>Creates a new {{domxref("RTCDTMFSender")}}, associated to a specific {{domxref("MediaStreamTrack")}}, that will be able to send {{Glossary("DTMF")}} phone signaling over the connection.</dd>
- <dt>{{domxref("RTCPeerConnection.createOffer()")}}</dt>
- <dd>Creates a request to find a remote peer with a specific configuration. </dd>
- <dt>{{domxref("RTCPeerConnection.generateCertificate()")}}</dt>
- <dd>Creates and stores an X.509 certificate and corresponding private key then returns an {{domxref("RTCCertificate")}}, providing access to it.</dd>
- <dt>{{domxref("RTCPeerConnection.getConfiguration()")}}</dt>
- <dd>Returns the configuration the connection was created with (an {{domxref("RTCConfiguration")}} dictionary).</dd>
- <dt>{{domxref("RTCPeerConnection.getIdentityAssertion()")}}</dt>
- <dd>Initiates the gathering of an identity assertion. This has an effect only if the {{domxref("RTCPeerConnection.signalingState", "signalingState")}} is not <code>"closed"</code>. Applications normally don't need to call this explicitly, as the gathering happens automatically; an explicit call merely allows the need to be anticipated.</dd>
- <dt>{{domxref("RTCPeerConnection.getLocalStreams()")}}</dt>
- <dd>Returns an array of {{domxref("MediaStream")}} associated with the local end of the connection. The array may be empty.</dd>
- <dt>{{domxref("RTCPeerConnection.getRemoteStreams()")}}</dt>
- <dd>Returns an array of {{domxref("MediaStream")}} associated with the remote end of the connection. The array may be empty.</dd>
- <dt>{{domxref("RTCPeerConnection.getStats()")}}</dt>
- <dd>Creates a new {{domxref("RTCStatsReport")}} that contains and allows access to statistics regarding the connection.</dd>
- <dt>{{domxref("RTCPeerConnection.getStreamById()")}}</dt>
- <dd>Returns the {{domxref("MediaStream")}} with the given id that is associated with local or remote end of the connection. If no stream matches, it returns <code>null</code>.</dd>
- <dt>{{domxref("RTCPeerConnection.removeStream()")}}</dt>
- <dd>Removes a {{domxref("MediaStream")}} as a local source of audio or video. If the negotiation already happened, a new one will be needed for the remote peer to stop using it.</dd>
- <dt>{{domxref("RTCPeerConnection.setIdentityProvider()")}}</dt>
- <dd>Sets the Identity Provider (IdP) to the triplet given as parameters: its name, the protocol used to communicate with it (optional) and an optional username. The IdP will be used only when an assertion is needed.</dd>
- <dt>{{domxref("RTCPeerConnection.setLocalDescription()")}}</dt>
- <dd>Changes the local description associated with the connection. The description defines the properties of the connection like its codec. The connection is affected by this change and must be able to support both old and new descriptions. The method takes three parameters, a {{domxref("RTCSessionDescription")}} object to set, and two callbacks, one called if the change of description succeeds, another called if it failed.</dd>
- <dt>{{domxref("RTCPeerConnection.setRemoteDescription()")}}</dt>
- <dd>Changes the remote description associated with the connection. The description defines the properties of the connection like its codec. The connection is affected by this change and must be able to support both old and new descriptions. The method takes three parameters, a {{domxref("RTCSessionDescription")}} object to set, and two callbacks, one called if the change of description succeeds, another called if it failed.</dd>
- <dt>{{domxref("RTCPeerConnection.updateIce()")}}</dt>
- <dd>Updates the ICE Agent's process of gathering local candidates and pinging remote candidates (see <a href="#updateIce()">updateIce()</a> below).</dd>
-</dl>
-
-<h3 id="Constructor_2">Constructor</h3>
-
-<pre>new RTCPeerConnection({{domxref("RTCConfiguration")}} configuration, optional {{domxref("MediaConstraints")}} constraints);</pre>
-
-<div class="note">
-<p><strong>Note:</strong> While the PeerConnection specification reads like passing an RTCConfiguration object is required, Firefox will supply a default if you don't.</p>
-</div>
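-
-<p>For example, a configuration listing a single STUN server could look like the following (the server URL is a placeholder):</p>
-
-<pre class="brush: js">var pc = new RTCPeerConnection({
-  iceServers: [
-    { urls: "stun:stun.example.org" } // placeholder STUN server
-  ]
-});</pre>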
-
-<h2 id="Methods_2">Methods</h2>
-
-<h3 id="createOffer">createOffer</h3>
-
-<p><code>void createOffer({{domxref("RTCSessionDescriptionCallback")}} successCallback, {{domxref("RTCPeerConnectionErrorCallback")}} failureCallback, optional {{domxref("MediaConstraints")}} constraints);</code></p>
-
-<p><code>createOffer()</code> generates a blob of SDP description data that facilitates a peer connection to the local machine. Use it when you have a remote peer connection and want to set up the local one.</p>
-
-<h4 id="Example">Example</h4>
-
-<pre class="prettyprint">var pc = new PeerConnection();
-pc.addStream(video);
-pc.createOffer(function(desc){
- pc.setLocalDescription(desc, function() {
- // send the offer to a server that can negotiate with a remote client
- });
-}</pre>
-
-<h4 id="Arguments">Arguments</h4>
-
-<dl>
- <dt>successCallback</dt>
- <dd>An {{domxref("RTCSessionDescriptionCallback")}} which will be passed a single {{domxref("RTCSessionDescription")}} object</dd>
- <dt>errorCallback</dt>
- <dd>An {{domxref("RTCPeerConnectionErrorCallback")}} which will be passed a single {{domxref("DOMError")}} object</dd>
- <dt>[optional] constraints</dt>
- <dd>An optional {{domxref("MediaConstraints")}} object.</dd>
-</dl>
-
-<h3 id="createAnswer">createAnswer</h3>
-
-<p><code>void createAnswer({{domxref("RTCSessionDescriptionCallback")}} successCallback, {{domxref("RTCPeerConnectionErrorCallback")}} failureCallback, optional {{domxref("MediaConstraints")}} constraints);</code></p>
-
-<p>Respond to an offer sent from a remote connection.</p>
-
-<h4 id="Example_2">Example</h4>
-
-<pre class="line">var pc = new PeerConnection();
-pc.setRemoteDescription(new RTCSessionDescription(offer), function() {
- pc.createAnswer(function(answer) {
- pc.setLocalDescription(answer, function() {
- // send the answer to the remote connection
- })
- })
-});</pre>
-
-<h4 id="Arguments_2">Arguments</h4>
-
-<dl>
- <dt>successCallback</dt>
- <dd>An {{domxref("RTCSessionDescriptionCallback")}} which will be passed a single {{domxref("RTCSessionDescription")}} object</dd>
- <dt>errorCallback</dt>
- <dd>An {{domxref("RTCPeerConnectionErrorCallback")}} which will be passed a single {{domxref("DOMError")}} object</dd>
- <dt>[optional] constraints</dt>
- <dd>An optional {{domxref("MediaConstraints")}} object.</dd>
-</dl>
-
-<h3 id="updateIce()">updateIce()</h3>
-
-<p>updateIce(optional {{domxref("RTCConfiguration")}} configuration, optional {{domxref("MediaConstraints")}} constraints)</p>
-
-<p>The updateIce method updates the ICE Agent process of gathering local candidates and pinging remote candidates. If there is a mandatory constraint called "IceTransports" it will control how the ICE engine can act. This can be used to limit the use to TURN candidates by a callee to avoid leaking location information prior to the call being accepted. This call may result in a change to the state of the ICE Agent, and may result in a change to media state if it results in connectivity being established.</p>
-
-<h4 id="Example_3">Example</h4>
-
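-<p>A minimal, illustrative sketch based on the signature above; the server values are placeholders, the <code>"IceTransports"</code> constraint is the one described in the preceding paragraph, and note that <code>updateIce()</code> was never widely implemented:</p>
-
-<pre class="brush: js">pc.updateIce(
-  { iceServers: [ { urls: "turn:turn.example.org", username: "user", credential: "secret" } ] },
-  { mandatory: { IceTransports: "relay" } }
-);</pre>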
-
-<h3 id="addIceCandidate()">addIceCandidate()</h3>
-
-<p>addIceCandidate ({{domxref("RTCIceCandidate")}} candidate, {{domxref("Function")}} successCallback, {{domxref("RTCPeerConnectionErrorCallback")}} failureCallback);</p>
-
-<p>The addIceCandidate() method provides a remote candidate to the ICE Agent. In addition to being added to the remote description, connectivity checks will be sent to the new candidates as long as the "IceTransports" constraint is not set to "none". This call will result in a change to the connection state of the ICE Agent, and may result in a change to media state if it results in different connectivity being established.</p>
-
-<h4 id="Example_4">Example</h4>
-
-<pre> pc.addIceCandidate(new RTCIceCandidate(candidate));
-</pre>
-
-<h3 id="createDataChannel">createDataChannel</h3>
-
-<p><code>{{domxref("RTCDataChannel")}} createDataChannel ({{domxref("DOMString")}} label, optional {{domxref("RTCDataChannelInit")}} dataChannelDict);</code></p>
-
-<p>Creates a data channel for sending data other than audio or video across the peer connection.</p>
-
-<h4 id="Example_5">Example</h4>
-
-<pre class="brush: js">var pc = new PeerConnection();
-var channel = pc.createDataChannel("Mydata");
-channel.onopen = function(event) {
- <code>channel.send('sending a message');</code>
-}
-channel.onmessage = function(event) { console.log(event.data); }</pre>
-
-<h2 id="Specifications">Specifications</h2>
-
-<table class="standard-table">
- <tbody>
- <tr>
- <th scope="col">Specification</th>
- <th scope="col">Status</th>
- <th scope="col">Comment</th>
- </tr>
- <tr>
- <td>{{SpecName('WebRTC 1.0')}}</td>
- <td>{{Spec2('WebRTC 1.0')}}</td>
- <td>Initial definition.</td>
- </tr>
- </tbody>
-</table>
-
-<h2 id="Browser_compatibility">Browser compatibility</h2>
-
-<div>{{CompatibilityTable}}</div>
-
-<div id="compat-desktop">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Chrome</th>
- <th>Firefox (Gecko)</th>
- <th>Internet Explorer</th>
- <th>Opera</th>
- <th>Safari (WebKit)</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatVersionUnknown}}</td>
- <td> </td>
- <td> </td>
- <td> </td>
- <td> </td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<div id="compat-mobile">
-<table class="compat-table">
- <tbody>
- <tr>
- <th>Feature</th>
- <th>Android</th>
- <th>Android Webview</th>
- <th>Firefox Mobile (Gecko)</th>
- <th>Firefox OS</th>
- <th>IE Mobile</th>
- <th>Opera Mobile</th>
- <th>Safari Mobile</th>
- <th>Chrome for Android</th>
- </tr>
- <tr>
- <td>Basic support</td>
- <td>{{CompatNo}}</td>
- <td>{{CompatVersionUnknown}}</td>
- <td> </td>
- <td> </td>
- <td> </td>
- <td> </td>
- <td> </td>
- <td>{{CompatVersionUnknown}}</td>
- </tr>
- </tbody>
-</table>
-</div>
-
-<h2 id="See_also">See also</h2>
-
-<ul>
- <li><a href="https://github.com/jesup/nightly-gupshup/blob/master/static/js/chat.js">https://github.com/jesup/nightly-gupshup/blob/master/static/js/chat.js</a></li>
- <li><a href="http://www.html5rocks.com/en/tutorials/webrtc/basics/#toc-simple">http://www.html5rocks.com/en/tutorials/webrtc/basics/#toc-simple</a></li>
- <li><a href="http://dev.w3.org/2011/webrtc/editor/webrtc.html">http://dev.w3.org/2011/webrtc/editor/webrtc.html</a></li>
-</ul>
diff --git a/files/de/web/api/webxr_device_api/index.html b/files/de/web/api/webxr_device_api/index.html
deleted file mode 100644
index 69e10d7d3b..0000000000
--- a/files/de/web/api/webxr_device_api/index.html
+++ /dev/null
@@ -1,298 +0,0 @@
----
-title: WebXR Device API
-slug: Web/API/WebXR_Device_API
-translation_of: Web/API/WebXR_Device_API
----
-<p>{{APIRef("WebXR Device API")}}{{Draft}}</p>
-
-<p><span class="seoSummary"><strong>WebXR</strong> ist eine Gruppe von Standards, die zusammen verwendet werden, um das Rendern von 3D-Szenen auf Hardware zu unterstützen, die für die Darstellung virtueller Welten<strong>(Virtuelle Realität</strong>oder <strong>VR )</strong>entwickelt wurde, oder um der realen Welt grafische Bilder hinzuzufügen (<strong>Augmented Reality</strong>oder <strong>AR</strong>).</span> Die <strong>WebXR-Geräte-API</strong> implementiert den Kern des WebXR-Feature-Sets, verwaltet die Auswahl von Ausgabegeräten, rendert die 3D-Szene mit der entsprechenden Bildrate auf das ausgewählte Gerät und verwaltet Bewegungsvektoren, die mit Eingabecontrollern erstellt wurden.</p>
-
-<p>WebXR-kompatible Geräte umfassen vollständig immersive 3D-Headsets mit Bewegungs- und Orientierungsverfolgung, Brillen, die Grafiken über der realen Szene überlagern, die durch die Rahmen gehen, und Handheld-Handys, die die Realität erweitern, indem sie die Welt mit einer Kamera erfassen und diese Szene mit computergenerierten Bildern ergänzen.</p>
-
-<p>To accomplish this, the WebXR Device API provides the following key capabilities:</p>
-
-<ul>
- <li>Find compatible VR or AR output devices</li>
- <li>Render a 3D scene to the device at an appropriate frame rate</li>
- <li>Optionally, mirror the output to a 2D display</li>
- <li>Create vectors representing the movements of input controls</li>
-</ul>
-
-<p>At the most basic level, a scene is presented in 3D by computing the perspective to apply to the scene in order to render it from the point of view of each of the user's eyes, taking into account the typical distance between the eyes, and then rendering the scene twice, once for each eye. The resulting images (or image, if the scene is rendered twice onto a single frame, half per eye) are then displayed to the user.</p>
-
-<p>Since <a href="/en-US/docs/Web/API/WebGL_API">WebGL</a> is used to render the 3D world into the WebXR session, you should first be familiar with WebGL's general usage and with the basics of 3D graphics in general. You most likely won't use the WebGL API directly, but rather one of the frameworks or libraries built on top of WebGL to make it easier to use. Among the most popular of these is <a href="https://threejs.org/">three.js</a>.</p>
-
-<p>A particular benefit of using a library rather than the WebGL API directly is that libraries tend to implement virtual camera functionality. OpenGL (and therefore WebGL, by extension) does not directly offer a camera view, so having a library simulate one on your behalf can make your work much, much easier, especially when writing code that allows free movement through your virtual world.</p>
-
-<h2 id="Important_health_and_safety_reminders">Important health and safety reminders</h2>
-
-<p>Because the entire act of creating a virtual 3D world is essentially a trick which takes advantage of our understanding of how the eyes collect light and how the brain interprets that data, it is important to keep in mind that software designers and developers therefore have a responsibility to be even more careful than usual to ensure that the results are correct.</p>
-
-<p>Defects, misalignments, or distortions can confuse the eyes and the brain, resulting in anything from aching eyes or headaches up to dizziness, vertigo, or potentially severe nausea. It's also important to be alert for anything you may display that could have the potential to trigger seizures, given the all-encompassing nature of VR goggles; the user may not be able to look away quickly from the imagery you present if it's causing distress.</p>
-
-<p>If you have any content that may be of risk to any users, you should provide a warning message. Better safe than sorry!</p>
-
-<h2 id="WebXR_Device_API_concepts_and_usage">WebXR Device API concepts and usage</h2>
-
-<h3 id="WebXR_AR_and_VR">WebXR: AR and VR</h3>
-
-<figure style="background: #eee; padding: 0.5em; border: 1px solid #aaa; border-radius: 1em; max-width: 20em; margin-bottom: 1em; margin-right: 2em; float: left;">
-<figcaption>Example WebXR hardware setup</figcaption>
-<img alt='Sketch of a person in a chair with wearing goggles labelled "Head mounted display (HMD)" facing a monitor with a webcam labeled "Position sensor"' src="https://mdn.mozillademos.org/files/11035/hw-setup.png"></figure>
-
-<p>While the older <a href="/en-US/docs/Web/API/WebVR_API">WebVR API</a> was designed solely to support Virtual Reality (VR), WebXR provides support for both VR and Augmented Reality (AR) on the web. Support for AR functionality is added by the WebXR Augmented Reality Module.</p>
-
-<p>A typical XR device can have either 3 or 6 degrees of freedom and might or might not have an external positional sensor.</p>
-
-<p>The equipment may also include an accelerometer, barometer, or other sensors which are used to sense when the user moves through space, rotates their head, or the like.</p>
-
-<h3 id="WebXR_application_lifecycle">WebXR application lifecycle</h3>
-
-<p>Most applications using WebXR will follow a similar overall design pattern (a condensed code sketch follows the list below):</p>
-
-<ol>
- <li>Check to see if the user's device and browser are both capable of presenting the XR experience you want to provide.
- <ol>
- <li>Make sure the WebXR API is available; if {{domxref("navigator.xr")}} is undefined, you can assume the user's browser and/or device doesn't support WebXR. If it's not supported, disable any user interface used to activate XR features and abort any attempts to enter XR mode.</li>
- <li>Call {{DOMxRef("XR.isSessionSupported","navigator.xr.isSessionSupported()")}}, specifying the WebXR experience mode you want to provide: <code>inline</code>, <code>immersive-vr</code>, or <code>immersive-ar</code>, in order to determine whether or not the type of session you wish to provide is available.</li>
- <li>If the session type you want to use is available, provide the appropriate interface to the user to allow them to activate it.</li>
- </ol>
- </li>
- <li>When the user requests the activation of WebXR functionality by engaging with the user interface enabled above, request an {{DOMxRef("XRSession")}} using the desired mode. This is done by calling {{DOMxRef("XR.requestSession","navigator.xr.requestSession()")}}, again specifying the string indicating the mode you want to enable: <code>inline</code>, <code>immersive-vr</code>, or <code>immersive-ar</code>.</li>
- <li>If the promise returned by <code>requestSession()</code> resolves, use the new {{domxref("XRSession")}} to run the frame loop for the entire duration of the WebXR experience.
- <ol>
- <li>Call the {{domxref("XRSession")}} method {{DOMxRef("XRSession.requestAnimationFrame", "requestAnimationFrame()")}} to schedule the first frame render for the XR device.</li>
- <li>Each <code>requestAnimationFrame()</code> callback should use the information provided about the objects located in the 3D world to render the frame using WebGL.</li>
- <li>Keep calling {{DOMxRef("XRSession.requestAnimationFrame", "requestAnimationFrame()")}} from within the callback to schedule each successive frame to be rendered.</li>
- </ol>
- </li>
- <li>When the time comes, end the XR session; otherwise, continue the loop until the user chooses to exit XR mode.
- <ol>
- <li>To end the XR session yourself, call {{DOMxRef("XRSession.end", "XRSession.end()")}}.</li>
- <li>Include a handler for the {{domxref("XRSession")}} event {{domxref("XRSession.end_event", "end")}} event to be informed when the session is ending, regardless of whether your code, the user, or the browser initiated the termination of the session.</li>
- </ol>
- </li>
-</ol>
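-
-<p>Condensed into code, the lifecycle above looks roughly like the sketch below. It is illustrative only; the rendering work is assumed to happen inside the hypothetical <code>onXRFrame()</code> callback, and error handling is omitted for brevity.</p>
-
-<pre class="brush: js">// Minimal lifecycle sketch; onXRFrame() is a hypothetical callback name.
-async function activateXR() {
-  if (!navigator.xr) {
-    return; // WebXR is not available in this browser.
-  }
-
-  if (!(await navigator.xr.isSessionSupported("immersive-vr"))) {
-    return; // The desired session mode isn't supported on this device.
-  }
-
-  const xrSession = await navigator.xr.requestSession("immersive-vr");
-
-  xrSession.addEventListener("end", () => {
-    // Clean up here, no matter who ended the session.
-  });
-
-  // Start the frame loop.
-  xrSession.requestAnimationFrame(onXRFrame);
-
-  function onXRFrame(time, frame) {
-    // Render the scene for this frame using WebGL, then schedule the next frame.
-    frame.session.requestAnimationFrame(onXRFrame);
-  }
-}</pre>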
-
-<h3 id="Permissions_and_security">Permissions and security</h3>
-
-<p>The WebXR Device API is subject to a number of permission and security controls. While not onerous, they are worth being aware of. These mostly revolve around the fully-immersive <code>immersive-vr</code> session mode, but there are things to be aware of when setting up an AR session, as well.</p>
-
-<h4 id="Immersive_presentation_of_VR">Immersive presentation of VR</h4>
-
-<p>First, any requests to activate the <code>immersive-vr</code> mode are rejected if the domain issuing the request does not have permission to enable an immersive session. This permission comes from the <code>xr-spatial-tracking</code> <a href="/en-US/docs/Web/HTTP/Feature_Policy">feature policy</a>.</p>
-
-<p>Once that check is passed, the request to enter <code>immersive-vr</code> mode is allowed if all of the following are true:</p>
-
-<ul>
- <li>The <code>requestSession()</code> call was issued by code executing within the handler for a user event, or from the startup code for a user-launched <a href="/en-US/docs/Web/Progressive_web_apps">web application</a>.</li>
- <li>The document is considered trustworthy, in that it is responsible and is both currently active and has focus.</li>
- <li>The user's intent to enter immersive VR mode is well understood; see {{anch("User intent")}} below for details.</li>
-</ul>
-
-<p>If all of that is true, the promise returned by <code>requestSession()</code> is resolved, and the new {{domxref("XRSession")}} object is passed into the fulfillment handler. Otherwise, an appropriate exception is thrown, such as <code>SecurityError</code> if the document doesn't have permission to enter immersive mode.</p>
-
-<h4 id="Inline_presentation">Inline presentation</h4>
-
-<p>When you request an {{domxref("XRSession")}} with the mode set to <code>inline</code>, and any features are required or requested, the browser will only allow the session to be created if the call to {{domxref("XR.requestSession", "requestSession()")}} was made by code which is executing expressly due to <strong>user intent</strong>.</p>
-
-<p>Specifically:</p>
-
-<ul>
- <li>If the <code>requestSession()</code> call isn't coming from within the handler executed in response to a user event, and is not being issued while launching a web application, the request is denied and <code>false</code> is delivered to the promise's fulfillment handler.</li>
- <li>If the document making the request isn't the one which is responsible for the script, the request is denied.</li>
- <li>If the document making the request isn't trustworthy, the request is denied and <code>false</code> is returned through the promise's fulfillment routine. A trustworthy document is one which is both responsible and active, and which currently has focus.</li>
- <li>If the user's intent to open an inline XR presentation is not well understood, the request is denied. Understanding of the {{anch("User intent", "user's intent")}} may be either implicit or explicit.</li>
-</ul>
-
-<div class="blockIndicator note">
-<p><strong>Note:</strong> Additional requirements may be put into effect due to the specific features requested by the options object when calling <code>requestSession()</code>.</p>
-</div>
-
-<h4 id="User_intent">User intent</h4>
-
-<p><strong>User intent</strong> is the concept of whether or not an action being performed by code is being performed because of something the user intends to do or not. There are two kinds of user intent: <strong>implicit</strong> and <strong>explicit</strong>.</p>
-
-<p><strong>Explicit user intent</strong> (explicit user consent) is granted when the user has specifically and expressly been asked for permission to perform an action.</p>
-
-<p><strong>Implicit user intent</strong> (implicit user consent) is assumed if either of the following scenarios is the case:</p>
-
-<ul>
- <li>The user has interacted with the document in some way which has in turn caused your request to occur. For example, if you have an "Enter XR mode" button, and the user clicks it, calling <code>requestSession()</code> from the button's {{domxref("Element.click_event", "click")}} event handler will be permitted.</li>
- <li>If your code is executing during the launch of a web application, the runtime may consider the act of launching your web application to qualify as user intent.</li>
-</ul>
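-
-<p>As a concrete illustration of implicit user intent, the sketch below calls <code>requestSession()</code> from a button's <code>click</code> handler. The button element and its ID are hypothetical, and the <code>catch</code> block simply logs the failure.</p>
-
-<pre class="brush: js">// Hypothetical "Enter XR" button; requesting the session from its click
-// handler satisfies the implicit user intent requirement described above.
-const xrButton = document.querySelector("#enter-xr");
-
-xrButton.addEventListener("click", async () => {
-  try {
-    const xrSession = await navigator.xr.requestSession("immersive-vr");
-    // Hand the session off to the rest of the application here.
-  } catch (err) {
-    // For example, a SecurityError is thrown if the xr-spatial-tracking
-    // feature policy does not permit an immersive session.
-    console.error("Unable to start the XR session:", err);
-  }
-});</pre>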
-
-<h3 id="WebXR_availability">WebXR availability</h3>
-
-<p>WebXR is a new API that is still in development, so support is limited to specific devices and browsers, and even on those it may not be enabled by default. There may be options available to allow you to experiment with WebXR even if you don't have a compatible system, however.</p>
-
-<h4 id="WebXR_polyfill">WebXR polyfill</h4>
-
-<p>The team designing the WebXR specification has published a <a href="https://github.com/immersive-web/webxr-polyfill">WebXR polyfill</a> which you can use to simulate WebXR on browsers which don't have support for the WebXR APIs. If the browser supports the older <a href="/en-US/docs/Web/API/WebVR_API">WebVR API</a>, that is used. Otherwise, the polyfill falls back to an implementation which uses Google's Cardboard VR API.</p>
-
-<p>The polyfill is maintained alongside the specification and is kept up to date with it. Additionally, it is updated to maintain compatibility with browsers as their support for WebXR, and for the related technologies the polyfill relies on, changes over time.</p>
-
-<p>Be sure to read the readme carefully; the polyfill comes in several versions depending on what degree of compatibility with newer JavaScript features your target browsers include.</p>
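-
-<p>Under typical usage (check the polyfill's readme for the build variant and options that match your setup), loading it looks roughly like this:</p>
-
-<pre class="brush: js">// Load the polyfill before any WebXR code runs; it is designed to leave
-// browsers with native WebXR support untouched.
-import WebXRPolyfill from "webxr-polyfill";
-
-const polyfill = new WebXRPolyfill();</pre>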
-
-<h4 id="WebXR_API_Emulator_extension">WebXR API Emulator extension</h4>
-
-<p>The <a href="https://mixedreality.mozilla.org/">Mozilla WebXR team</a> has created a <a href="https://blog.mozvr.com/webxr-emulator-extension/">WebXR API Emulator</a> browser extension, compatible with both Firefox and Chrome, which emulates the WebXR API, simulating a variety of compatible devices such as the HTC Vive, the Oculus Go and Oculus Quest, Samsung Gear, and Google Cardboard. With the extension in place, you can open up a developer tools panel that lets you control the position and orientation of the headset and any hand controllers, as well as button presses on the controllers.</p>
-
-<h5 id="Emulator_usage">Emulator usage</h5>
-
-<p>While somewhat awkward compared to using an actual headset, this makes it possible to experiment with and develop WebXR code on a desktop computer, where WebXR isn't normally available. It also lets you perform some basic testing before taking your code to a real device. Be aware, however, that the emulator does not yet completely emulate all of the WebXR API, so you may run into problems you're not expecting. Again, carefully read the readme file and make sure you're aware of the limitations before you begin.</p>
-
-<div class="blockIndicator note">
-<p><strong>Important:</strong> You should <em>always</em> test your code on actual AR and/or VR hardware before releasing or shipping a product! Emulated, simulated, or polyfilled environments are <em>not</em> an adequate substitute for actual testing on physical devices.</p>
-</div>
-
-<h5 id="Getting_the_extension">Getting the extension</h5>
-
-<p>Download the WebXR API Emulator for your supported browser below:</p>
-
-<ul>
- <li><a href="https://chrome.google.com/webstore/detail/webxr-api-emulator/mjddjgeghkdijejnciaefnkjmkafnnje">Google Chrome</a></li>
- <li><a href="https://addons.mozilla.org/en-US/firefox/addon/webxr-api-emulator/">Mozilla Firefox</a></li>
-</ul>
-
-<p>The <a href="https://github.com/MozillaReality/WebXR-emulator-extension">source code for the extension</a> is also available on GitHub.</p>
-
-<h5 id="Emulator_issues_and_notes">Emulator issues and notes</h5>
-
-<p>While this isn't the place for a full article about the extension, there are some specific things worth mentioning.</p>
-
-<p>Version 0.4.0 of the extension was announced on March 26, 2020. It introduced support for augmented reality (AR) through the <a href="https://www.w3.org/TR/webxr-ar-module-1/">WebXR AR Module</a>, which is approaching a stable state. Documentation for AR is forthcoming here on MDN.</p>
-
-<p>Other improvements include updating the emulator to rename the <code>XR</code> interface to {{domxref("XRSystem")}}, introduce support for squeeze (grip) input sources, and add support for the {{domxref("XRInputSource")}} property {{domxref("XRInputSource.profiles", "profiles")}}.</p>
-
-<h2 id="Accessing_the_WebXR_API">Accessing the WebXR API</h2>
-
-<p>To gain access to the WebXR API within the context of a given window, use the {{domxref("navigator.xr")}} property.</p>
-
-<dl>
- <dt>{{domxref("navigator.xr")}} {{ReadOnlyInline}}</dt>
- <dd>This property, added to the {{domxref("Navigator")}} interface, returns the {{domxref("XR")}} object through which the WebXR API is exposed. If this property is missing or <code>null</code>, WebXR is not available.</dd>
-</dl>
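-
-<p>A simple feature-detection sketch based on this property is shown below; <code>updateXRButton()</code> is a hypothetical UI helper used only for illustration.</p>
-
-<pre class="brush: js">// Only expose XR-related user interface if WebXR is actually available.
-if (navigator.xr) {
-  navigator.xr.isSessionSupported("immersive-vr").then((supported) => {
-    updateXRButton(supported);
-  });
-} else {
-  updateXRButton(false); // WebXR is not available at all.
-}</pre>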
-
-<h2 id="WebXR_interfaces">WebXR interfaces</h2>
-
-<dl>
- <dt>{{DOMxRef("XR")}}</dt>
- <dd>The {{domxref("Navigator.xr", "navigator.xr")}} property returns the window's instance of {{domxref("XR")}}, which is the mechanism by which your code accesses the WebXR API. Using the <code>XR</code> interface, you can create {{domxref("XRSession")}}s to represent actual AR and/or VR sessions.</dd>
- <dt>{{DOMxRef("XRFrame")}}</dt>
- <dd>While presenting an XR session, the state of all tracked objects which make up the session is represented by an <code>XRFrame</code>. To get an <code>XRFrame</code>, call the session's {{domxref("XRSession.requestAnimationFrame", "requestAnimationFrame()")}} method, providing a callback which will be called with the <code>XRFrame</code> once it is available. Events which communicate tracking states will also use an <code>XRFrame</code> to contain that information.</dd>
- <dt>{{DOMxRef("XRRenderState")}}</dt>
- <dd>Provides a set of configurable properties which change how the imagery output by an <code>XRSession</code> is composited.</dd>
- <dt>{{DOMxRef("XRSession")}}</dt>
- <dd>Provides the interface for interacting with XR hardware. Once an <code>XRSession</code> is obtained from {{domxref("XR.requestSession()")}}, the session can be used to check the position and orientation of the viewer, query the device for environment information, and present the virtual or augmented world to the user.</dd>
- <dt>{{DOMxRef("XRSpace")}}</dt>
- <dd><code>XRSpace</code> is an opaque base class on which all virtual coordinate system interfaces are based. Positions in WebXR are always expressed in relation to a particular <code>XRSpace</code> at the time at which a particular {{domxref("XRFrame")}} takes place. The space's coordinate system has its origin at a given physical position.</dd>
- <dt>{{DOMxRef("XRReferenceSpace")}}</dt>
- <dd>A subclass of {{domxref("XRSpace")}} which is used to identify a spatial relationship in relation to the user's physical environment. The <code>XRReferenceSpace</code> coordinate system is expected to remain unchanged through the lifespan of the {{domxref("XRSession")}}. The world has no boundaries and extends infinitely in every direction.</dd>
- <dt>{{DOMxRef("XRBoundedReferenceSpace")}}</dt>
- <dd><code>XRBoundedReferenceSpace</code> extends the {{domxref("XRReferenceSpace")}} coordinate system to further include support for a finite world with set boundaries. Unlike <code>XRReferenceSpace</code>, the origin must be located on the floor (that is, <em>y</em> = 0 at the floor). The x and z components of the origin are typically presumed to be located at or near the center of the room or surface.</dd>
- <dt>{{DOMxRef("XRView")}}</dt>
- <dd>Represents a single view into the XR scene for a particular frame. Each <code>XRView</code> corresponds to the video display surface used to present the scene to the user. For example, a given XR device might have two views: one for the left eye and one for the right. Each view has an offset used to shift the position of the view relative to the camera, in order to allow for creating stereographic effects.</dd>
- <dt>{{DOMxRef("XRViewport")}}</dt>
- <dd>Describes a viewport. A viewport is a rectangular portion of a graphic surface.</dd>
- <dt>{{DOMxRef("XRRigidTransform")}}</dt>
- <dd>A transform defined using a position and orientation in the virtual space's coordinate system as described by the {{domxref("XRSpace")}}.</dd>
- <dt>{{DOMxRef("XRPose")}}</dt>
- <dd>Describes a position and orientation in space relative to an {{domxref("XRSpace")}}.</dd>
- <dt>{{DOMxRef("XRViewerPose")}}</dt>
- <dd>Based on {{domxref("XRPose")}}, <code>XRViewerPose</code> specifies the state of a viewer of the WebXR scene as indicated by the XR device. Included is an array of {{domxref("XRView")}} objects, each representing one perspective on the scene. For example, it takes two views to create the stereoscopic view as perceived by human vision: one for the left eye and a second for the right eye. One view is offset to the left slightly from the viewer's position, and the other view is offset to the right by the same distance. The view list can also be used to represent the perspectives of each of the spectators of a scene, in a multi-user environment.</dd>
- <dt>{{DOMxRef("XRInputSource")}}</dt>
- <dd>Represents any input device the user can use to perform targeted actions within the same virtual space as the viewer. Input sources may include devices such as hand controllers, optical tracking systems, and other devices which are explicitly associated with the XR device. Other input devices such as keyboards, mice, and gamepads are not presented as <code>XRInputSource</code> instances.</dd>
- <dt>{{DOMxRef("XRWebGLLayer")}}</dt>
- <dd>A layer which serves as a <a href="/en-US/docs/Web/API/WebGL_API">WebGL</a> frame buffer into which a scene's view is rendered. Using WebGL to render the scene gains substantial performance benefits due to graphics acceleration.</dd>
-</dl>
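-
-<p>The sketch below shows how several of these interfaces typically meet inside a single frame callback. It assumes that an <code>XRSession</code> (<code>xrSession</code>), an <code>XRReferenceSpace</code> (<code>xrReferenceSpace</code>), and a WebGL context (<code>gl</code>) already exist; those variable names are placeholders, not part of the API.</p>
-
-<pre class="brush: js">// One pass through the frame loop: fetch the viewer pose, bind the
-// XRWebGLLayer's framebuffer, then draw the scene once per XRView.
-function onXRFrame(time, frame) {
-  const session = frame.session;
-  session.requestAnimationFrame(onXRFrame);
-
-  const pose = frame.getViewerPose(xrReferenceSpace);
-  if (!pose) {
-    return; // Tracking was lost for this frame.
-  }
-
-  const glLayer = session.renderState.baseLayer;
-  gl.bindFramebuffer(gl.FRAMEBUFFER, glLayer.framebuffer);
-
-  for (const view of pose.views) {
-    const viewport = glLayer.getViewport(view);
-    gl.viewport(viewport.x, viewport.y, viewport.width, viewport.height);
-    // Draw this eye's view here using view.projectionMatrix and
-    // view.transform.
-  }
-}</pre>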
-
-<h3 id="Event_interfaces">Event interfaces</h3>
-
-<p>The following interfaces are used to represent the events used by the WebXR API.</p>
-
-<dl>
- <dt>{{domxref("XRInputSourceEvent")}}</dt>
- <dd>Sent when the state of an {{domxref("XRInputSource")}} changes. This can happen, for example, when the position and/or orientation of the device changes, or when buttons are pressed or released.</dd>
- <dt>{{domxref("XRInputSourcesChangeEvent")}}</dt>
- <dd>Sent to indicate that the set of available input sources has changed for the {{domxref("XRSession")}}.</dd>
- <dt>{{domxref("XRReferenceSpaceEvent")}}</dt>
- <dd>Sent when the state of an {{domxref("XRReferenceSpace")}} changes.</dd>
- <dt>{{domxref("XRSessionEvent")}}</dt>
- <dd>Sent to indicate that the state of an {{domxref("XRSession")}} has changed; for example, when the session ends or its visibility changes.</dd>
-</dl>
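-
-<p>For example, here is a sketch of wiring up two of these events on an existing session; the <code>xrSession</code> variable is assumed to hold a live {{domxref("XRSession")}}.</p>
-
-<pre class="brush: js">// React to changes in the set of available input sources.
-xrSession.addEventListener("inputsourceschange", (event) => {
-  // event.added and event.removed list the input sources that changed.
-  console.log(`${event.added.length} input source(s) added`);
-});
-
-// Find out when the session ends, regardless of who ended it.
-xrSession.addEventListener("end", () => {
-  // Release WebGL resources and restore the 2D user interface here.
-});</pre>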
-
-<h2 id="Extensions_to_the_WebGL_API">Extensions to the WebGL API</h2>
-
-<p>The WebGL API is extended by the WebXR specification to augment the WebGL context to allow it to be used to render views for display by a WebXR device.</p>
-
-<dl>
- <dt>{{domxref("WebGLRenderingContextBase.makeXRCompatible()")}}</dt>
- <dd>Configures the WebGL context to be compatible with WebXR. If the context was not initially created with the {{domxref("WebGLContextAttributes.xrCompatible", "xrCompatible")}} property set to <code>true</code>, you must call <code>makeXRCompatible()</code> prior to attempting to use the WebGL context for WebXR rendering. Returns a {{jsxref("Promise")}} which resolves once the context has been prepared, or is rejected if the context cannot be configured for use by WebXR.</dd>
-</dl>
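-
-<p>A short sketch of both approaches is shown below; it is assumed to run inside an <code>async</code> function, with <code>canvas</code> referring to an existing canvas element and <code>xrSession</code> to an active session.</p>
-
-<pre class="brush: js">// Option 1: create the context as XR-compatible from the start.
-const gl = canvas.getContext("webgl", { xrCompatible: true });
-
-// Option 2: upgrade an existing context before using it with WebXR.
-await gl.makeXRCompatible();
-
-// Either way, the context can then back the session's rendering layer.
-xrSession.updateRenderState({
-  baseLayer: new XRWebGLLayer(xrSession, gl),
-});</pre>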
-
-<h2 id="Guides_and_tutorials">Guides and tutorials</h2>
-
-<p>The following guides and tutorials are a great resource for learning WebXR and the underlying 3D and VR/AR graphics concepts.</p>
-
-<dl>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Fundamentals">Fundamentals of WebXR</a></dt>
- <dd>Before diving into the details of how to create content using WebXR, it may be helpful to read this overview of the technology, which includes introductions to terminology that may be unfamiliar to you, or which may be used in a new way.</dd>
- <dt><a href="/en-US/docs/Web/API/WebGL_API/Matrix_math_for_the_web">Matrix math for the web</a></dt>
- <dd>A guide covering how matrices can be used on the web, including both for CSS transforms and for WebGL purposes, as well as to handle the positioning and orientation of objects in WebXR contexts.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Startup_and_shutdown">Setting up and shutting down a WebXR session</a></dt>
- <dd>Before actually presenting a scene using an XR device such as a headset or goggles, you need to create a WebXR session bound to a rendering layer that draws the scene for presentation in each of the XR device's displays so that the 3D effect can be presented to the user. This guide covers how to create and stop WebXR sessions.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Permissions_and_security">Permissions and security for WebXR</a></dt>
- <dd>The WebXR Device API has several areas of security to contend with, from establishing feature-policy to ensuring the user intends to use the mixed reality presentation before activating it.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Geometry">Geometry and reference spaces in WebXR</a></dt>
- <dd>In this guide, the required concepts of 3D geometry are briefly reviewed, and the fundamentals of how that geometry is represented in WebXR are detailed. Learn how reference spaces are used to position objects—and the viewer—and the differences among the available types of reference space, as well as their use cases.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Spatial_tracking">Spatial tracking in WebXR</a></dt>
- <dd>This guide describes how objects—including the user's body and its parts—are located in space, and how their movement and orientation relative to one another is monitored and managed over time. This article explains the relationship between spaces, poses, viewers, and views.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Rendering">Rendering and the WebXR frame loop</a></dt>
- <dd>Starting with how you schedule frames to be rendered, this guide then continues to cover how to determine the placement of objects in the view and how to then render them into the WebGL buffer used for each of the two eyes' views of the scene.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Cameras">Viewpoints and viewers: Simulating cameras in WebXR </a></dt>
- <dd>WebGL (and therefore WebXR) doesn't really have a concept of a camera, which is the traditional concept used to represent a viewpoint in 3D graphics. In this article, we see how to simulate a camera and how to create the illusion of moving a viewer through a world in which the viewer doesn't really move.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Movement_and_motion">Movement, orientation, and motion: A WebXR example</a></dt>
- <dd>In this example and tutorial, we use information learned throughout the WebXR documentation to create a scene containing a rotating cube which the user can move around using both VR headset and keyboard and mouse.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Bounded_reference_spaces">Using bounded reference spaces</a></dt>
- <dd>In this article, we examine how to use a <code>bounded-floor</code> reference space to define the boundaries of where the viewer can safely move about without leaving the area tracked by their XR hardware or colliding with a physical obstacle. On devices which support it, <code>bounded-floor</code> can be a useful tool in your repertoire.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Performance">WebXR performance guide</a></dt>
- <dd>Recommendations and tips to help you optimize the performance of your WebXR application.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Inputs">Inputs and input sources</a></dt>
- <dd>A guide to input sources and how to efficiently manage the input devices being used to control the WebXR session, and how to receive and process user inputs from those devices.</dd>
- <dt><a href="/en-US/docs/Web/API/WebXR_Device_API/Input_profiles">Using WebXR input profiles</a></dt>
- <dd>A guide to interpreting the {{Glossary("JSON")}} data provided by the <a href="https://github.com/immersive-web/webxr-input-profiles/tree/master/packages/registry">WebXR Input Profiles Registry</a>, which can be used to determine what options and controls are available on the user's available input devices.</dd>
- <dt><a href="/en-US/docs/Web/WebXR_Device_API/Gamepads">Supporting advanced controllers and gamepads in WebXR applications</a></dt>
- <dd>WebXR uses the {{domxref("Gamepad")}} object to describe the controls available on complex input devices (such as hand controllers with multiple buttons and/or axes) and gamepad-like devices. In this guide, learn how to make use of these devices' controls.</dd>
-</dl>
-
-<h2 id="Specifications">Specifications</h2>
-
-<table class="standard-table">
- <tbody>
- <tr>
- <th scope="col">Specification</th>
- <th scope="col">Status</th>
- <th scope="col">Comment</th>
- </tr>
- <tr>
- <td>{{SpecName("WebXR")}}</td>
- <td>{{Spec2("WebXR")}}</td>
- <td>Initial definition.</td>
- </tr>
- </tbody>
-</table>
-
-<h2 id="Browser_compatibility">Browser compatibility</h2>
-
-<p>{{Compat("api.Navigator.xr")}}</p>
-
-<h2 id="See_also">See also</h2>
-
-<ul>
- <li><a href="/en-US/docs/Web/Guide/Graphics">Graphics on the web</a></li>
- <li><a href="/en-US/docs/Learn/JavaScript/Client-side_web_APIs/Drawing_graphics">Drawing graphics</a></li>
- <li><a href="/en-US/docs/Web/API/WebGL_API">WebGL API</a>: Accelerated 2D and 3D graphics on the web</li>
- <li><a href="/en-US/docs/Web/API/Canvas_API">Canvas API</a>: 2D drawing for the web</li>
- <li><a href="/en-US/docs/Web/API/Canvas_API/Tutorial">Canvas tutorial</a></li>
-</ul>