author | Peter Bengtsson <mail@peterbe.com> | 2020-12-08 14:42:52 -0500 |
---|---|---|
committer | Peter Bengtsson <mail@peterbe.com> | 2020-12-08 14:42:52 -0500 |
commit | 074785cea106179cb3305637055ab0a009ca74f2 (patch) | |
tree | e6ae371cccd642aa2b67f39752a2cdf1fd4eb040 /files/pt-br/web/api/webvr_api | |
parent | da78a9e329e272dedb2400b79a3bdeebff387d47 (diff) | |
initial commit
Diffstat (limited to 'files/pt-br/web/api/webvr_api')
-rw-r--r-- | files/pt-br/web/api/webvr_api/index.html | 151
-rw-r--r-- | files/pt-br/web/api/webvr_api/using_the_webvr_api/index.html | 440 |
2 files changed, 591 insertions, 0 deletions
diff --git a/files/pt-br/web/api/webvr_api/index.html b/files/pt-br/web/api/webvr_api/index.html new file mode 100644 index 0000000000..3046e3f086 --- /dev/null +++ b/files/pt-br/web/api/webvr_api/index.html @@ -0,0 +1,151 @@ +--- +title: WebVR API +slug: Web/API/WebVR_API +translation_of: Web/API/WebVR_API +--- +<div>{{draft("The WebVR API documentation is currently being updated to cover the v1.0 spec, therefore some of this information will be out of date. Contact ~~chrisdavidmills if you have any questions about this work.")}}{{DefaultAPISidebar("WebVR API")}}{{SeeCompatTable}}</div> + +<p class="summary">O WebVR oferece suporte para expor dispositivos de realidade virtual - por exemplo, displays montados na cabeça, como o Oculus Rift - a aplicativos da web, permitindo que os desenvolvedores traduzam as informações de posição e movimento do display em movimento dentro de uma cena 3D. Isso tem inúmeras aplicações muito interessantes, de tours virtuais de produtos e aplicativos de treinamento interativo a jogos em primeira pessoa super imersivos.</p> + +<h2 id="Conceitos_e_uso">Conceitos e uso</h2> + +<p><img alt="Sketch of a person in a chair with wearing goggles labelled Head mounted display (HMD) facing a monitor with a webcam labelled Position sensor " src="https://mdn.mozillademos.org/files/11035/hw-setup.png" style="display: block; height: 78px; margin: 0px auto; width: 60%;"></p> + +<p>Todos os dispositivos VR conectados ao computador são retornados pelo método {{domxref("Navigator.getVRDisplays()")}}, que retorna uma matriz de objetos representando os dispositivos conectados, os quais herdam do objeto geral {{domxref("VRDevice")}}. Geralmente, um display montado na cabeça terá dois dispositivos - o próprio display, representado por {{domxref("HMDVRDevice")}}, e uma câmera com sensor de posição que rastreia a posição da sua cabeça, representada por {{domxref("PositionSensorVRDevice")}}.</p> + +<p>O objeto {{domxref("PositionSensorVRDevice")}} contém o método {{domxref("PositionSensorVRDevice.getState", "getState()")}}, que retorna um objeto {{domxref("VRPositionState")}} - este representa o estado do sensor em um dado carimbo de data/hora e inclui propriedades com dados úteis, tais como velocidade, aceleração e orientação atuais, úteis para atualizar a renderização de uma cena a cada quadro de acordo com o movimento do display VR montado na cabeça.</p> + +<p>O método {{domxref("HMDVRDevice.getEyeParameters()")}} retorna um objeto {{domxref("VREyeParameters")}}, que pode ser usado para obter informações do campo de visão - quanto da cena o display montado na cabeça pode ver. O {{domxref("VREyeParameters.currentFieldOfView")}} retorna um objeto {{domxref("VRFieldOfView")}} que contém 4 ângulos que descrevem a vista atual a partir de um ponto central. Você também pode alterar o campo de visão usando {{domxref("HMDVRDevice.setFieldOfView()")}}.</p> + +<h2 id="WebVR_Interfaces">Interfaces WebVR</h2> + +<p>{{domxref("VRDisplay")}}<br> + Representa qualquer dispositivo VR suportado por esta API. 
Ele inclui informações genéricas, como IDs de dispositivo e descrições, bem como métodos para começar a apresentar uma cena VR, recuperar os parâmetros dos olhos e as capacidades do display, e outras funcionalidades importantes.<br> + {{domxref("VRDisplayCapabilities")}}<br> + Descreve as capacidades de um {{domxref("VRDisplay")}} - elas podem ser usadas para testar as capacidades do dispositivo VR, por exemplo, se ele pode retornar informações de posição.<br> + {{domxref("VRPose")}}<br> + Representa o estado de posição em um dado carimbo de data/hora (que inclui orientação, posição, velocidade e aceleração).<br> + {{domxref("VREyeParameters")}}<br> + Fornece acesso a todas as informações necessárias para renderizar corretamente uma cena para cada olho, incluindo informações de campo de visão.<br> + {{domxref("VRFieldOfView")}}<br> + Representa um campo de visão definido por 4 valores de graus diferentes que descrevem a vista a partir de um ponto central.<br> + {{domxref("VRLayer")}}<br> + Representa uma camada a ser apresentada em um {{domxref("VRDisplay")}}.<br> + {{domxref("VRStageParameters")}}<br> + Representa os valores que descrevem a área de palco (stage) para dispositivos que suportam experiências em escala de sala.</p> + +<h2 id="Extensões_para_outras_interfaces">Extensões para outras interfaces</h2> + +<p>{{domxref("Gamepad.displayId")}} {{readonlyInline}}<br> + Retorna o {{domxref("VRDisplay.displayId")}} do {{domxref("VRDisplay")}} associado - o VRDisplay cuja cena exibida o gamepad está controlando.<br> + {{domxref("Navigator.activeVRDisplays")}} {{readonlyInline}}<br> + Retorna uma matriz contendo todos os objetos {{domxref("VRDisplay")}} que estão apresentando no momento ({{domxref("VRDisplay.isPresenting")}} é true).<br> + {{domxref("Navigator.getVRDisplays()")}}<br> + Retorna uma promessa que é resolvida com uma matriz de objetos {{domxref("VRDisplay")}} representando todos os dispositivos VR disponíveis conectados ao computador.<br> + {{domxref("Window.onvrdisplayconnect")}}<br> + Representa um manipulador de eventos que será executado quando um dispositivo VR compatível for conectado ao computador (quando o evento {{event("vrdisplayconnect")}} for acionado).<br> + {{domxref("Window.onvrdisplaydisconnect")}}<br> + Representa um manipulador de eventos que será executado quando um dispositivo VR compatível for desconectado do computador (quando o evento {{event("vrdisplaydisconnect")}} for acionado).<br> + {{domxref("Window.onvrdisplaypresentchange")}}<br> + Representa um manipulador de eventos que será executado quando o estado de apresentação de um dispositivo VR mudar - isto é, passar de apresentando para não apresentando, ou vice-versa (quando o evento {{event("vrdisplaypresentchange")}} for acionado).</p> + +<h2 id="Exemplos">Exemplos</h2> + +<p>Você pode encontrar uma série de exemplos nestes repositórios do GitHub:</p> + +<ul> + <li><a href="https://github.com/aframevr/aframe">A-Frame</a>: Estrutura web de código aberto para a construção de experiências VR. 
Muitos exemplos.</li> + <li><a href="https://github.com/mdn/webvr-tests">mdn/webvr-tests</a>: Demonstrações simples construídas para ilustrar o uso de recursos básicos.</li> + <li><a href="https://github.com/MozVR/">MozVR team</a>: Mais demonstrações, sobre WebVR e muito mais!</li> +</ul> + +<h2 id="Especificações">Especificações</h2> + +<table class="standard-table"> + <tbody> + <tr> + <th scope="col">Specificação</th> + <th scope="col">Status</th> + <th scope="col">Comentario</th> + </tr> + <tr> + <td>{{SpecName('WebVR')}}</td> + <td>{{Spec2('WebVR')}}</td> + <td>Initial definition</td> + </tr> + </tbody> +</table> + +<h2 id="Compatibilidade_do_Browser_(navegador)">Compatibilidade do Browser (navegador)</h2> + +<p>{{CompatibilityTable}}</p> + +<div id="compat-desktop"> +<table class="compat-table"> + <tbody> + <tr> + <th>Feature</th> + <th>Chrome</th> + <th>Chromium</th> + <th>Firefox (Gecko)</th> + <th>Internet Explorer</th> + <th>Opera</th> + <th>Safari (WebKit)</th> + </tr> + <tr> + <td> + <p>Suporte Básico</p> + </td> + <td>{{CompatNo}}</td> + <td>{{CompatVersionUnknown}}</td> + <td>{{CompatVersionUnknown}}</td> + <td>{{CompatNo}}</td> + <td>{{CompatNo}}</td> + <td>{{CompatNo}}</td> + </tr> + </tbody> +</table> +</div> + +<div id="compat-mobile"> +<table class="compat-table"> + <tbody> + <tr> + <th>Feature</th> + <th>Android</th> + <th>Firefox Mobile (Gecko)</th> + <th>IE Phone</th> + <th>Opera Mobile</th> + <th>Safari Mobile</th> + <th>Chrome for Android</th> + <th>Samsung Internet for GearVR</th> + </tr> + <tr> + <td> + <p>Suporte Básico</p> + </td> + <td>{{CompatNo}}</td> + <td>{{CompatVersionUnknown}}</td> + <td>{{CompatNo}}</td> + <td>{{CompatNo}}</td> + <td>{{CompatNo}}</td> + <td>{{CompatVersionUnknown}}</td> + <td>{{CompatVersionUnknown}}</td> + </tr> + </tbody> +</table> +</div> + +<h2 id="Veja_Também">Veja Também</h2> + +<ul> + <li><a href="https://webvr.info">webvr.info</a>- Informações atualizadas sobre WebVR, configuração do navegador e comunidade.</li> + <li><a href="https://iswebvrready.com">webvr.rocks</a>- Informações atualizadas sobre o suporte ao navegador WebVR (incluindo compilações experimentais).</li> + <li><a href="http://mozvr.com/">MozVr.com</a>- Demos, downloads, outros recursos da equipe de VR da Mozilla.</li> + <li><a href="https://aframe.io">A-Frame</a>- A web framework para a construção de experiências VR (com HTML), a partir da equipe Mozilla VR.</li> + <li><a href="http://dsmu.me/ConsoleGameOnWeb/">Console Game on Web</a>- Uma coleção de demonstrações interessantes conceito de jogo, alguns dos quais incluem WebVR.</li> + <li><a href="https://github.com/MozVR/vr-web-examples/tree/master/threejs-vr-boilerplate">threejs-vr-boilerplate</a>- Um modelo de iniciador muito útil para escrever aplicações WebVR.</li> + <li><a href="https://developer.oculus.com/">Oculus Rift homepage</a></li> +</ul> diff --git a/files/pt-br/web/api/webvr_api/using_the_webvr_api/index.html b/files/pt-br/web/api/webvr_api/using_the_webvr_api/index.html new file mode 100644 index 0000000000..88c0697bdb --- /dev/null +++ b/files/pt-br/web/api/webvr_api/using_the_webvr_api/index.html @@ -0,0 +1,440 @@ +--- +title: Using the WebVR API +slug: Web/API/WebVR_API/Using_the_WebVR_API +translation_of: Web/API/WebVR_API/Using_the_WebVR_API +--- +<div>{{APIRef("WebVR API")}}{{deprecated_header}}</div> + +<div class="blockIndicator note"><strong>Note:</strong> WebVR API is replaced by <a href="/en-US/docs/Web/API/WebXR_API">WebXR API</a>. 
WebVR was never ratified as a standard, was implemented and enabled by default in only a handful of browsers, and supported only a small number of devices.</div> + +<p class="summary">The WebVR API is a fantastic addition to the web developer's toolkit, allowing WebGL scenes to be presented in virtual reality displays such as the Oculus Rift and HTC Vive. But how do you get started with developing VR apps for the Web? This article will guide you through the basics.</p> + +<div class="note"> +<p><strong>Note</strong>: The WebVR API's most stable version — 1.1 — has recently been implemented in Firefox 55 (Windows in release version, and Mac OS X on Nightly only) and is also available in Chrome when used with Google Daydream hardware. There is also a later evolution of the spec — 2.0 — but this is at an early stage right now. You can find information on the latest state of the specs at <a href="https://w3c.github.io/webvr/">WebVR Spec Version List</a>.</p> +</div> + +<h2 id="Getting_started">Getting started</h2> + +<p>To get started, you need:</p> + +<ul> + <li>Supporting VR hardware. + <ul> + <li>The cheapest option is to use a mobile device, a supporting browser, and a device mount (e.g. Google Cardboard). This won't be quite as good an experience as dedicated hardware, but you won't need to purchase a powerful computer or dedicated VR display.</li> + <li>Dedicated hardware can be costly, but it does provide a better experience. The most WebVR-compatible hardware at the moment is the HTC Vive and the Oculus Rift. The front page of <a href="https://webvr.info/">webvr.info</a> has some further useful information about available hardware, and which browsers support them.</li> + </ul> + </li> + <li>A computer powerful enough to handle rendering/displaying of VR scenes using your dedicated VR hardware, if required. To give you an idea of what you need, look at the relevant guide for the VR you are purchasing (e.g. <a href="https://www.vive.com/us/ready/">VIVE READY Computers</a>).</li> + <li>A supporting browser installed — the latest <a href="https://www.mozilla.org/en-US/firefox/channel/desktop/">Firefox Nightly</a> or <a href="https://www.google.com/chrome/index.html">Chrome</a> are your best options right now, on desktop or mobile.</li> +</ul> + +<p>Once you have everything assembled, you can test to see whether your setup works with WebVR by going to our <a href="https://mdn.github.io/webvr-tests/aframe-demo/">simple A-Frame demo</a>, and seeing whether the scene renders and whether you can enter VR display mode by pressing the button at the bottom right.</p> + +<p><a href="https://aframe.io/">A-Frame</a> is by far the best option if you want to create a WebVR-compatible 3D scene quickly, without needing to understand a bunch of new JavaScript code. 
It doesn't however teach you how the raw WebVR API works, and this is what we'll get on to next.</p> + +<h2 id="Introducing_our_demo">Introducing our demo</h2> + +<p>To illustrate how the WebVR API works, we'll study our raw-webgl-example, which looks a bit like this:</p> + +<p><img alt="" src="https://mdn.mozillademos.org/files/15121/Capture1.png" style="display: block; height: 761px; margin: 0px auto; width: 1341px;"></p> + +<div class="note"> +<p><strong>Note</strong>: You can find the <a href="https://github.com/mdn/webvr-tests/tree/master/raw-webgl-example">source code of our demo</a> on GitHub, and <a href="https://mdn.github.io/webvr-tests/raw-webgl-example/">view it live</a> also.</p> +</div> + +<div class="note"> +<p><strong>Note</strong>: If WebVR isn't working in your browser, you might need to make sure it is running through your graphics card. For example for NVIDIA cards, if you've got the NVIDIA control panel set up successfully, there will be a context menu option available — right click on Firefox, then choose <em>Run with graphics processor > High-performance NVIDIA processor</em>.</p> +</div> + +<p>Our demo features the holy grail of WebGL demos — a rotating 3D cube. We've implemented this using raw <a href="/en-US/docs/Web/API/WebGL_API">WebGL API</a> code. We won't be teaching any basic JavaScript or WebGL, just the WebVR parts.</p> + +<p>Our demo also features:</p> + +<ul> + <li>A button to start (and stop) our scene from being presented in the VR display.</li> + <li>A button to show (and hide) VR pose data, i.e. the position and orientation of the headset, updated in real time.</li> +</ul> + +<p>When you look through the source code of <a href="https://github.com/mdn/webvr-tests/blob/master/raw-webgl-example/webgl-demo.js">our demo's main JavaScript file</a>, you can easily find the WebVR-specific parts by searching for the string "WebVR" in preceding comments.</p> + +<div class="note"> +<p><strong>Note</strong>: To find out more about basic JavaScript and WebGL, consult our <a href="/en-US/docs/Learn/JavaScript">JavaScript learning material</a>, and our <a href="/en-US/docs/Web/API/WebGL_API/Tutorial">WebGL Tutorial</a>.</p> +</div> + +<h2 id="How_does_it_work">How does it work?</h2> + +<p>At this point, let's look at how the WebVR parts of the code work.</p> + +<p>A typical (simple) WebVR app works like this:</p> + +<ol> + <li>{{domxref("Navigator.getVRDisplays()")}} is used to get a reference to your VR display.</li> + <li>{{domxref("VRDisplay.requestPresent()")}} is used to start presenting to the VR display.</li> + <li>WebVR's dedicated {{domxref("VRDisplay.requestAnimationFrame()")}} method is used to run the app's rendering loop at the correct refresh rate for the display.</li> + <li>Inside the rendering loop, you grab the data required to display the current frame ({{domxref("VRDisplay.getFrameData()")}}), draw the displayed scene twice — once for the view in each eye — then submit the rendered view to the display to show to the user via ({{domxref("VRDisplay.submitFrame()")}}).</li> +</ol> + +<p>In the below sections we'll look at our raw-webgl-demo in detail, and see where exactly the above features are used.</p> + +<h3 id="Starting_with_some_variables">Starting with some variables</h3> + +<p>The first WebVR-related code you'll meet is this following block:</p> + +<pre class="brush: js">// WebVR variables + +var frameData = new VRFrameData(); +var vrDisplay; +var btn = document.querySelector('.stop-start'); +var normalSceneFrame; +var vrSceneFrame; + +var 
poseStatsBtn = document.querySelector('.pose-stats'); +var poseStatsSection = document.querySelector('section'); +poseStatsSection.style.visibility = 'hidden'; // hide it initially + +var posStats = document.querySelector('.pos'); +var orientStats = document.querySelector('.orient'); +var linVelStats = document.querySelector('.lin-vel'); +var linAccStats = document.querySelector('.lin-acc'); +var angVelStats = document.querySelector('.ang-vel'); +var angAccStats = document.querySelector('.ang-acc'); +var poseStatsDisplayed = false;</pre> + +<p>Let's briefly explain these:</p> + +<ul> + <li><code>frameData</code> contains a {{domxref("VRFrameData")}} object, created using the {{domxref("VRFrameData.VRFrameData", "VRFrameData()")}} constructor. This is initially empty, but will later contain the data required to render each frame to show in the VR display, updated constantly as the rendering loop runs.</li> + <li><code>vrDisplay</code> starts uninitialized, but will later on hold a reference to our VR headset ({{domxref("VRDisplay")}} — the central control object of the API).</li> + <li><code>btn</code> and <code>poseStatsBtn</code> hold references to the two buttons we are using to control our app.</li> + <li><code>normalSceneFrame</code> and <code>vrSceneFrame</code> start uninitialized, but later on will hold references to {{domxref("Window.requestAnimationFrame()")}} and {{domxref("VRDisplay.requestAnimationFrame()")}} calls — these will initiate the running of a normal rendering loop, and a special WebVR rendering loop; we'll explain the difference between these two later on.</li> + <li>The other variables store references to different parts of the VR pose data display box, which you can see in the bottom right hand corner of the UI.</li> +</ul> + +<h3 id="Getting_a_reference_to_our_VR_display">Getting a reference to our VR display</h3> + +<p>One of the major functions inside our code is <code>start()</code> — we run this function when the body has finished loading:</p> + +<pre class="brush: js">// start +// +// Called when the body has loaded is created to get the ball rolling. + +document.body.onload = start;</pre> + +<p>To begin with, <code>start()</code> retrieves a WebGL context to use to render 3D graphics into the {{htmlelement("canvas")}} element in <a href="https://github.com/mdn/webvr-tests/blob/master/raw-webgl-example/index.html">our HTML</a>. We then check whether the <code>gl</code> context is available — if so, we run a number of functions to set up the scene for display.</p> + +<pre class="brush: js">function start() { + canvas = document.getElementById("glcanvas"); + + initWebGL(canvas); // Initialize the GL context + + // WebGL setup code here</pre> + +<p>Next, we start the process of actually rendering the scene onto the canvas, by setting the canvas to fill the entire browser viewport, and running the rendering loop (<code>drawScene()</code>) for the first time. This is the non-WebVR — normal — rendering loop.</p> + +<pre class="brush: js"> // draw the scene normally, without WebVR - for those who don't have it and want to see the scene in their browser + + canvas.width = window.innerWidth; + canvas.height = window.innerHeight; + drawScene();</pre> + +<p>Now onto our first WebVR-specific code. First of all, we check to see if {{domxref("Navigator.getVRDisplays")}} exists — this is the entry point into the API, and therefore good basic feature detection for WebVR. 
You'll see at the end of the block (inside the <code>else</code> clause) that if this doesn't exist, we log a message to indicate that WebVR 1.1 isn't supported by the browser.</p> + +<pre class="brush: js"> // WebVR: Check to see if WebVR is supported + if(navigator.getVRDisplays) { + console.log('WebVR 1.1 supported');</pre> + +<p>Inside our <code>if() { ... }</code> block, we run the {{domxref("Navigator.getVRDisplays()")}} function. This returns a promise, which is fulfilled with an array containing all the VR display devices connected to the computer. If none are connected, the array will be empty.</p> + +<pre class="brush: js"> // Then get the displays attached to the computer + navigator.getVRDisplays().then(function(displays) {</pre> + +<p>Inside the promise <code>then()</code> block, we check whether the array length is more than 0; if so, we set the value of our <code>vrDisplay</code> variable to the 0 index item inside the array. <code>vrDisplay</code> now contains a {{domxref("VRDisplay")}} object representing our connected display!</p> + +<pre class="brush: js"> // If a display is available, use it to present the scene + if(displays.length > 0) { + vrDisplay = displays[0]; + console.log('Display found');</pre> + +<div class="note"> +<p><strong>Note</strong>: It is unlikely that you'll have multiple VR displays connected to your computer, and this is just a simple demo, so this will do for now.</p> +</div> + +<h3 id="Starting_and_stopping_the_VR_presentation">Starting and stopping the VR presentation</h3> + +<p>Now we have a {{domxref("VRDisplay")}} object, we can use it do a number of things. The next thing we want to do is wire up functionality to start and stop presentation of the WebGL content to the display.</p> + +<p>Continuing on with the previous code block, we now add an event listener to our start/stop button (<code>btn</code>) — when this button is clicked we want to check whether we are presenting to the display already (we do this in a fairly dumb fashion, by checking what the button <code><a href="/en-US/docs/Web/API/Node/textContent">textContent</a></code> contains).</p> + +<p>If the display is not already presenting, we use the {{domxref("VRDisplay.requestPresent()")}} method to request that the browser start presenting content to the display. This takes as a parameter an array of the {{domxref("VRLayerInit")}} objects representing the layers you want to present in the display.</p> + +<p>Since the maximum number of layers you can display is currently 1, and the only required object member is the {{domxref("VRLayerInit.source")}} property (which is a reference to the {{htmlelement("canvas")}} you want to present in that layer; the other parameters are given sensible defaults — see {{domxref("VRLayerInit.leftBounds", "leftBounds")}} and {{domxref("VRLayerInit.rightBounds", "rightBounds")}})), the parameter is simply [{ source: canvas }].</p> + +<p><code>requestPresent()</code> returns a promise that is fulfilled when the presentation begins successfully.</p> + +<pre class="brush: js"> // Starting the presentation when the button is clicked: It can only be called in response to a user gesture + btn.addEventListener('click', function() { + if(btn.textContent === 'Start VR display') { + vrDisplay.requestPresent([{ source: canvas }]).then(function() { + console.log('Presenting to WebVR display');</pre> + +<p>With our presentation request successful, we now want to start setting up to render content to present to the VRDisplay. 
First of all we set the canvas to the same size as the VR display area. We do this by getting the {{domxref("VREyeParameters")}} for both eyes using {{domxref("VRDisplay.getEyeParameters()")}}.</p> + +<p>We then do some simple math to calculate the total width of the VRDisplay rendering area based on the eye {{domxref("VREyeParameters.renderWidth")}} and {{domxref("VREyeParameters.renderHeight")}}.</p> + +<pre class="brush: js"> // Set the canvas size to the size of the vrDisplay viewport + + var leftEye = vrDisplay.getEyeParameters('left'); + var rightEye = vrDisplay.getEyeParameters('right'); + + canvas.width = Math.max(leftEye.renderWidth, rightEye.renderWidth) * 2; + canvas.height = Math.max(leftEye.renderHeight, rightEye.renderHeight);</pre> + +<p>Next, we {{domxref("Window.cancelAnimationFrame()", "cancel the animation loop")}} previously set in motion by the {{domxref("Window.requestAnimationFrame()")}} call inside the <code>drawScene()</code> function, and instead invoke <code>drawVRScene()</code>. This function renders the same scene as before, but with some special WebVR magic going on. The loop inside here is maintained by WebVR's special {{domxref("VRDisplay.requestAnimationFrame")}} method.</p> + +<pre class="brush: js"> // stop the normal presentation, and start the vr presentation + window.cancelAnimationFrame(normalSceneFrame); + drawVRScene();</pre> + +<p>Finally, we update the button text so that the next time it is pressed, it will stop presentation to the VR display.</p> + +<pre class="brush: js"> btn.textContent = 'Exit VR display'; + });</pre> + +<p>To stop the VR presentation when the button is subsequently pressed, we call {{domxref("VRDisplay.exitPresent()")}}. We also reverse the button's text content, and swap over the <code>requestAnimationFrame</code> calls. You can see here that we are using {{domxref("VRDisplay.cancelAnimationFrame")}} to stop the VR rendering loop, and starting the normal rendering loop off again by calling <code>drawScene()</code>.</p> + +<pre class="brush: js"> } else { + vrDisplay.exitPresent(); + console.log('Stopped presenting to WebVR display'); + + btn.textContent = 'Start VR display'; + + // Stop the VR presentation, and start the normal presentation + vrDisplay.cancelAnimationFrame(vrSceneFrame); + drawScene(); + } + }); + } + }); + } else { + console.log('WebVR API not supported by this browser.'); + } + } +}</pre> + +<p>Once the presentation starts, you'll be able to see the stereoscopic view displayed in the browser:</p> + +<p><img alt="" src="https://mdn.mozillademos.org/files/15123/Capture2.png" style="display: block; margin: 0 auto;"></p> + +<p>You'll learn below how the stereoscopic view is actually produced.</p> + +<h3 id="Why_does_WebVR_have_its_own_requestAnimationFrame">Why does WebVR have its own requestAnimationFrame()?</h3> + +<p>This is a good question. The reason is that for smooth rendering inside the VR display, you need to render the content at the display's native refresh rate, not that of the computer. VR display refresh rates are greater than PC refresh rates, typically up to 90fps, and this rate will often differ from the computer's own refresh rate.</p> + +<p>Note that when the VR display is not presenting, {{domxref("VRDisplay.requestAnimationFrame")}} runs identically to {{domxref("Window.requestAnimationFrame")}}, so if you wanted, you could just use a single rendering loop, rather than the two we are using in our app.</p>
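<p>For illustration only, a single combined loop might look something like the sketch below. This is not the structure used in the rest of this article; it relies on the same <code>vrDisplay</code> and <code>frameData</code> variables as our demo, but the <code>renderBothEyes()</code> and <code>renderSingleView()</code> helpers are hypothetical placeholders for your own drawing code:</p>

<pre class="brush: js">// A sketch of a single combined rendering loop (not the approach used in this demo).
// While the display is not presenting, vrDisplay.requestAnimationFrame() behaves
// like window.requestAnimationFrame(), so one loop can serve both cases.
function drawCombinedScene() {
  // Schedule the next frame on the display's own clock
  vrDisplay.requestAnimationFrame(drawCombinedScene);

  if (vrDisplay.isPresenting) {
    // Presenting: fetch the frame data, draw both eyes, then hand the frame to the headset
    vrDisplay.getFrameData(frameData);
    renderBothEyes(frameData);  // hypothetical helper: draw the left and right eye views
    vrDisplay.submitFrame();
  } else {
    // Not presenting: draw a single, ordinary view to the canvas
    renderSingleView();         // hypothetical helper: the normal drawing code
  }
}</pre>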
<p>We have used two because we wanted to do slightly different things depending on whether the VR display is presenting or not, and keep things separated for ease of comprehension.</p> + +<h3 id="Rendering_and_display">Rendering and display</h3> + +<p>At this point, we've seen all the code required to access the VR hardware, request that we present our scene to the hardware, and start running the rendering loop. Let's now look at the code for the rendering loop, and explain how the WebVR-specific parts of it work.</p> + +<p>First of all, we begin the definition of our rendering loop function — <code>drawVRScene()</code>. The first thing we do inside here is make a call to {{domxref("VRDisplay.requestAnimationFrame()")}} to keep the loop running after it has been called once (this occurred earlier on in our code when we started presenting to the VR display). This call is set as the value of the global <code>vrSceneFrame</code> variable, so we can cancel the loop with a call to {{domxref("VRDisplay.cancelAnimationFrame()")}} once we exit VR presenting.</p> + +<pre class="brush: js">function drawVRScene() { + // WebVR: Request the next frame of the animation + vrSceneFrame = vrDisplay.requestAnimationFrame(drawVRScene);</pre> + +<p>Next, we call {{domxref("VRDisplay.getFrameData()")}}, passing it the name of the variable that we want to use to contain the frame data. We initialized this earlier on — <code>frameData</code>. After the call completes, this variable will contain the data needed to render the next frame to the VR device, packaged up as a {{domxref("VRFrameData")}} object. This contains things like projection and view matrices for rendering the scene correctly for the left and right eye view, and the current {{domxref("VRPose")}} object, which contains data on the VR display such as orientation, position, etc.</p> + +<p>This has to be called on every frame so the rendered view is always up-to-date.</p> + +<pre class="brush: js"> // Populate frameData with the data of the next frame to display + vrDisplay.getFrameData(frameData);</pre> + +<p>Now we retrieve the current {{domxref("VRPose")}} from the {{domxref("VRFrameData.pose")}} property, store the position and orientation for use later on, and send the current pose to the pose stats box for display, if the <code>poseStatsDisplayed</code> variable is set to true.</p> + +<pre class="brush: js"> // You can get the position, orientation, etc. of the display from the current frame's pose + + var curFramePose = frameData.pose; + var curPos = curFramePose.position; + var curOrient = curFramePose.orientation; + if(poseStatsDisplayed) { + displayPoseStats(curFramePose); + }</pre> + +<p>We now clear the canvas before we start drawing on it, so that the next frame is clearly seen, and we don't also see previous rendered frames:</p> + +<pre class="brush: js"> // Clear the canvas before we start drawing on it. + + gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);</pre> + +<p>We now render the view for both the left and right eyes. First of all we need to create projection and view locations for use in the rendering. 
these are {{domxref("WebGLUniformLocation")}} objects, created using the {{domxref("WebGLRenderingContext.getUniformLocation()")}} method, passing it the shader program's identifier and an identifying name as parameters.</p> + +<pre class="brush: js"> // WebVR: Create the required projection and view matrix locations needed + // for passing into the uniformMatrix4fv methods below + + var projectionMatrixLocation = gl.getUniformLocation(shaderProgram, "projMatrix"); + var viewMatrixLocation = gl.getUniformLocation(shaderProgram, "viewMatrix");</pre> + +<p>The next rendering step involves:</p> + +<ul> + <li>Specifying the viewport size for the left eye, using {{domxref("WebGLRenderingContext.viewport")}} — this is logically the first half of the canvas width, and the full canvas height.</li> + <li>Specifying the view and projection matrix values to use to render the left eye — this is done using the {{domxref("WebGLRenderingContext.uniformMatrix", "WebGLRenderingContext.uniformMatrix4fv")}} method, which is passed the location values we retrieved above, and the left matrices obtained from the {{domxref("VRFrameData")}} object.</li> + <li>Running the <code>drawGeometry()</code> function, which renders the actual scene — because of what we specified in the previous two steps, we will render it for the left eye only.</li> +</ul> + +<pre class="brush: js"> // WebVR: Render the left eye’s view to the left half of the canvas + gl.viewport(0, 0, canvas.width * 0.5, canvas.height); + gl.uniformMatrix4fv(projectionMatrixLocation, false, frameData.leftProjectionMatrix); + gl.uniformMatrix4fv(viewMatrixLocation, false, frameData.leftViewMatrix); + drawGeometry();</pre> + +<p>We now do exactly the same thing, but for the right eye:</p> + +<pre class="brush: js"> // WebVR: Render the right eye’s view to the right half of the canvas + gl.viewport(canvas.width * 0.5, 0, canvas.width * 0.5, canvas.height); + gl.uniformMatrix4fv(projectionMatrixLocation, false, frameData.rightProjectionMatrix); + gl.uniformMatrix4fv(viewMatrixLocation, false, frameData.rightViewMatrix); + drawGeometry();</pre> + +<p>Next we define our <code>drawGeometry()</code> function. Most of this is just general WebGL code required to draw our 3D cube. You'll see some WebVR-specific parts in the <code>mvTranslate()</code> and <code>mvRotate()</code> function calls — these pass matrices into the WebGL program that define the translation and rotation of the cube for the current frame</p> + +<p>You'll see that we are modifying these values by the position (<code>curPos</code>) and orientation (<code>curOrient</code>) of the VR display we got from the {{domxref("VRPose")}} object. The result is that, for example, as you move or rotate your head left, the x position value (<code>curPos[0]</code>) and y rotation value (<code>[curOrient[1]</code>) are added to the x translation value, meaning that the cube will move to the right, as you'd expect when you are looking at something and then move/turn your head left.</p> + +<p>This is a quick and dirty way to use VR pose data, but it illustrates the basic principle.</p> + +<pre class="brush: js"> function drawGeometry() { + // Establish the perspective with which we want to view the + // scene. Our field of view is 45 degrees, with a width/height + // ratio of 640:480, and we only want to see objects between 0.1 units + // and 100 units away from the camera. 
+ + perspectiveMatrix = makePerspective(45, 640.0/480.0, 0.1, 100.0); + + // Set the drawing position to the "identity" point, which is + // the center of the scene. + + loadIdentity(); + + // Now move the drawing position a bit to where we want to start + // drawing the cube. + + mvTranslate([ + 0.0 - (curPos[0] * 25) + (curOrient[1] * 25), + 5.0 - (curPos[1] * 25) - (curOrient[0] * 25), + -15.0 - (curPos[2] * 25) + ]); + + // Save the current matrix, then rotate before we draw. + + mvPushMatrix(); + mvRotate(cubeRotation, [0.25, 0, 0.25 - curOrient[2] * 0.5]); + + // Draw the cube by binding the array buffer to the cube's vertices + // array, setting attributes, and pushing it to GL. + + gl.bindBuffer(gl.ARRAY_BUFFER, cubeVerticesBuffer); + gl.vertexAttribPointer(vertexPositionAttribute, 3, gl.FLOAT, false, 0, 0); + + // Set the texture coordinates attribute for the vertices. + + gl.bindBuffer(gl.ARRAY_BUFFER, cubeVerticesTextureCoordBuffer); + gl.vertexAttribPointer(textureCoordAttribute, 2, gl.FLOAT, false, 0, 0); + + // Specify the texture to map onto the faces. + + gl.activeTexture(gl.TEXTURE0); + gl.bindTexture(gl.TEXTURE_2D, cubeTexture); + gl.uniform1i(gl.getUniformLocation(shaderProgram, "uSampler"), 0); + + // Draw the cube. + + gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVerticesIndexBuffer); + setMatrixUniforms(); + gl.drawElements(gl.TRIANGLES, 36, gl.UNSIGNED_SHORT, 0); + + // Restore the original matrix + + mvPopMatrix(); + } +</pre> + +<p>The next bit of the code has nothing to do with WebVR — it just updates the rotation of the cube on each frame:</p> + +<pre class="brush: js"> // Update the rotation for the next draw, if it's time to do so. + + var currentTime = (new Date).getTime(); + if (lastCubeUpdateTime) { + var delta = currentTime - lastCubeUpdateTime; + + cubeRotation += (30 * delta) / 1000.0; + } + + lastCubeUpdateTime = currentTime;</pre> + +<p>The last part of the rendering loop involves us calling {{domxref("VRDisplay.submitFrame()")}} — now all the work has been done and we've rendered the display on the {{htmlelement("canvas")}}, this method then submits the frame to the VR display so it is displayed on there as well.</p> + +<pre class="brush: js"> // WebVR: Indicate that we are ready to present the rendered frame to the VR display + vrDisplay.submitFrame(); +}</pre> + +<h3 id="Displaying_the_pose_position_orientation_etc._data">Displaying the pose (position, orientation, etc.) data</h3> + +<p>In this section we'll discuss the <code>displayPoseStats()</code> function, which displays our updated pose data on each frame. The function is fairly simple.</p> + +<p>First of all, we store the six different property values obtainable from the {{domxref("VRPose")}} object in their own variables — each one is a {{domxref("Float32Array")}}.</p> + +<pre class="brush: js">function displayPoseStats(pose) { + var pos = pose.position; + var orient = pose.orientation; + var linVel = pose.linearVelocity; + var linAcc = pose.linearAcceleration; + var angVel = pose.angularVelocity; + var angAcc = pose.angularAcceleration;</pre> + +<p>We then write out the data into the information box, updating it on every frame. 
We've clamped each value to three decimal places using <code><a href="/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/toFixed">toFixed()</a></code>, as the values are hard to read otherwise.</p> + +<p>You should note that we've used a conditional expression to detect whether the linear acceleration and angular acceleration arrays are successfully returned before we display the data. These values are not reported by most VR hardware as yet, so the code would throw an error if we did not do this (the arrays return <code>null</code> if they are not successfully reported).</p> + +<pre class="brush: js"> posStats.textContent = 'Position: x ' + pos[0].toFixed(3) + ', y ' + pos[1].toFixed(3) + ', z ' + pos[2].toFixed(3); + orientStats.textContent = 'Orientation: x ' + orient[0].toFixed(3) + ', y ' + orient[1].toFixed(3) + ', z ' + orient[2].toFixed(3); + linVelStats.textContent = 'Linear velocity: x ' + linVel[0].toFixed(3) + ', y ' + linVel[1].toFixed(3) + ', z ' + linVel[2].toFixed(3); + angVelStats.textContent = 'Angular velocity: x ' + angVel[0].toFixed(3) + ', y ' + angVel[1].toFixed(3) + ', z ' + angVel[2].toFixed(3); + + if(linAcc) { + linAccStats.textContent = 'Linear acceleration: x ' + linAcc[0].toFixed(3) + ', y ' + linAcc[1].toFixed(3) + ', z ' + linAcc[2].toFixed(3); + } else { + linAccStats.textContent = 'Linear acceleration not reported'; + } + + if(angAcc) { + angAccStats.textContent = 'Angular acceleration: x ' + angAcc[0].toFixed(3) + ', y ' + angAcc[1].toFixed(3) + ', z ' + angAcc[2].toFixed(3); + } else { + angAccStats.textContent = 'Angular acceleration not reported'; + } +}</pre> + +<h2 id="WebVR_events">WebVR events</h2> + +<p>The WebVR spec features a number of events that are fired, allowing our app code to react to changes in the state of the VR display (see <a href="/en-US/docs/Web/API/WebVR_API#Window_events">Window events</a>). For example:</p> + +<ul> + <li>{{event("vrdisplaypresentchange")}} — Fires when the presenting state of a VR display changes — i.e. goes from presenting to not presenting, or vice versa.</li> + <li>{{event("vrdisplayconnect")}} — Fires when a compatible VR display has been connected to the computer.</li> + <li>{{event("vrdisplaydisconnect")}} — Fires when a compatible VR display has been disconnected from the computer.</li> +</ul> + +<p>To demonstrate how they work, our simple demo includes the following example:</p> + +<pre class="brush: js">window.addEventListener('vrdisplaypresentchange', function(e) { + console.log('Display ' + e.display.displayId + ' presentation has changed. Reason given: ' + e.reason + '.'); +});</pre> + +<p>As you can see, the {{domxref("VRDisplayEvent", "event object")}} provides two useful properties — {{domxref("VRDisplayEvent.display")}}, which contains a reference to the {{domxref("VRDisplay")}} the event was fired in response to, and {{domxref("VRDisplayEvent.reason")}}, which contains a human-readable reason why the event was fired.</p> + +<p>This is a very useful event; you could use it to handle cases where the display gets disconnected unexpectedly, stopping errors from being thrown and making sure the user is aware of the situation. 
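</p>

<p>As a rough sketch of that kind of handling, you could also listen for the connect and disconnect events. The <code>notifyUser()</code> helper below is hypothetical; substitute whatever UI feedback your app provides:</p>

<pre class="brush: js">window.addEventListener('vrdisplaydisconnect', function(e) {
  // The headset has gone away: tell the user and reset the demo's start/stop button
  console.log('Display ' + e.display.displayId + ' was disconnected.');
  notifyUser('Your VR display was disconnected.'); // hypothetical helper
  btn.textContent = 'Start VR display';
});

window.addEventListener('vrdisplayconnect', function(e) {
  // A headset has appeared: keep a reference to it so presentation can be started later
  console.log('Display ' + e.display.displayId + ' was connected.');
  vrDisplay = e.display;
});</pre>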
<p>In Google's webvr.info presentation demo, the event is used to run an <a href="https://github.com/toji/webvr.info/blob/master/samples/03-vr-presentation.html#L174"><code>onVRPresentChange()</code> function</a>, which updates the UI controls as appropriate and resizes the canvas.</p> + +<h2 id="Summary">Summary</h2> + +<p>This article has given you the very basics of how to create a simple WebVR 1.1 app, to help you get started.</p>
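<p>To tie those steps together, here is a condensed, non-production sketch of the whole flow under the 1.1 API. The <code>startButton</code>, <code>canvas</code>, and <code>drawEyes()</code> names are placeholders for your own UI element, WebGL canvas, and rendering code:</p>

<pre class="brush: js">// Minimal WebVR 1.1 flow, sketched for reference only; error handling and WebGL setup omitted.
var frameData = new VRFrameData();

navigator.getVRDisplays().then(function(displays) {
  if (displays.length === 0) { return; }
  var display = displays[0];

  // Presentation can only be started from a user gesture, such as a button click
  startButton.addEventListener('click', function() {
    display.requestPresent([{ source: canvas }]).then(function() {
      display.requestAnimationFrame(function render() {
        display.requestAnimationFrame(render); // keep the loop running
        display.getFrameData(frameData);       // pose plus projection/view matrices
        drawEyes(frameData);                   // placeholder: draw the left eye, then the right
        display.submitFrame();                 // hand the finished frame to the headset
      });
    });
  });
});</pre>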