davrous

Members
  • Content count: 644
  • Joined
  • Last visited
  • Days Won: 23

davrous last won the day on August 15

davrous had the most liked content!

3 Followers

About davrous
  • Rank: Advanced Member

Contact Methods
  • Twitter: davrous

Recent Profile Visitors
3,886 profile views
  1. Hi there! I've created a detailed tutorial on how to build a full cross-platform WebGL game (using Babylon.js, of course) that runs everywhere, from mobile touch devices to desktop and up to VR! https://www.davrous.com/2018/08/14/using-webgl-and-pwa-to-build-an-adaptative-game-for-touch-mouse-and-vr-devices/ I've tried to create small samples for each section, often pointing to the Playground, to explain how to do each part. I hope it will help people discover some Babylon.js principles and also generate some forks to create other fun 3D experiences! Finally, I explain how to properly handle offline mode via a Service Worker (coming from PWA) coupled with our embedded IndexedDB layer. As a bonus, you'll discover how to push your game to the Microsoft Store. Hope you'll like it! Cheers, David
  2. davrous

    Rotating a mesh by degrees

    Hi, rotations are in radians, so you need to convert degrees to radians (https://en.wikipedia.org/wiki/Radian): radians = degrees * Math.PI / 180. By the way, 45° == Math.PI / 4, 90° == Math.PI / 2 and 180° == Math.PI. David
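The formula above can be wrapped in a small helper (recent Babylon.js versions also expose a similar `BABYLON.Tools.ToRadians` utility):

```javascript
// Implements the conversion from the post: radians = degrees * Math.PI / 180
function toRadians(degrees) {
  return degrees * Math.PI / 180;
}

// Example usage on a mesh (mesh is assumed to exist in your scene):
// mesh.rotation.y = toRadians(45); // same value as Math.PI / 4
```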
  3. davrous

    How can I force repaint?

    Don't forget that JavaScript is single-threaded. So if your custom format loader is taking all the CPU, the engine won't be able to render anything until you've finished your job. To give more time to the UI thread, you can try inserting pauses between each big mesh load, using setTimeout for instance. You can also try to do as much work as possible in a web worker. But as soon as the worker needs to join the UI thread to send its data back, it will freeze the rendering anyway.
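The setTimeout approach can be sketched like this (a minimal sketch with assumed names: `loadOneMesh` stands for your custom per-mesh loader):

```javascript
// Process a queue of mesh descriptors one at a time, yielding back to the
// browser between each so the UI thread gets a chance to render a frame.
function loadMeshesWithPauses(descriptors, loadOneMesh, onDone) {
  let i = 0;
  function next() {
    if (i >= descriptors.length) {
      onDone();
      return;
    }
    loadOneMesh(descriptors[i++]); // the heavy, CPU-bound part
    setTimeout(next, 0);           // yield to the event loop before the next one
  }
  next();
}
```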
  4. davrous

    Fastest way to display fps

    Interesting discussion. As Jerome said, try to avoid updating the DOM too often, as the simple fact of displaying the FPS will... lower the FPS itself! Probably the best option is to display the FPS counter directly in the 3D canvas using a DynamicTexture or BABYLON.GUI.
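One simple way to limit that cost, whichever display you use, is to throttle how often the counter is refreshed. A sketch (the `display` callback and the frame interval are assumptions for illustration):

```javascript
// Only push the FPS value to the display every N frames, instead of touching
// the DOM (or redrawing a DynamicTexture) on every single frame.
function makeFpsReporter(everyNFrames, display) {
  let frame = 0;
  return function onFrame(fps) {
    if (++frame % everyNFrames === 0) {
      display(fps.toFixed(0)); // e.g. textBlock.text = fps.toFixed(0);
    }
  };
}

// Hypothetical usage in a Babylon.js render loop:
// const report = makeFpsReporter(30, t => fpsLabel.text = t);
// scene.registerBeforeRender(() => report(engine.getFps()));
```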
  5. Patrick is the 3D artist of the MS Babylon.js team, so go for it, @inteja.
  6. davrous

    Using the GearVR controllers

    You can have a look at this 3DSMax-specific feature built by @Luaacro here: https://medium.com/babylon-js/actions-builder-b05e72aa541a?source=linkShare-2f9e89c24e66-1530275332
  7. davrous

    Newbie OnPickTrigger Problem

    Yes, it's by design, to let mobile users get out of VR by double-tapping on the screen. It shouldn't be an issue, as you're not supposed to double-click while in VR, no?
  8. davrous

    Using the GearVR controllers

    We've been using the Actions from the ActionManager via the 3DSMax exporter, which is still today the richest version we've got, with the Actions Builder integrated. You can build the same actions in code by looking at our docs. As soon as the VRHelper finds an action registered on a mesh, it shows a different ray from the controller and lets you activate the action with the trigger. In 3.3 we're working on easily transforming any mesh into an interactive element. Stay tuned!
  9. davrous

    VRExperienceHelper Gamepad Conflict

    Interesting. It's probably a use case I've missed. I'll check that soon. Before creating the VRHelper, did you use a Universal Camera or a Gamepad Camera?
  10. davrous

    GearVR and Oculus Go

    Most people get sick when the camera is moved using a gamepad or thumbsticks; that's why almost all VR experiences & games offer teleportation instead. Some VR experiences offer both. I've decided to only expose a teleportation option in the VR Helper. It's also my favorite way of moving in VR.
  11. davrous

    Using the GearVR controllers

    Hi @MackeyK24, I'm interested in better understanding why you're lost with our approach. I've tried to make it as simple as possible, so if you're lost, I'm missing my goal. Have you watched the BUILD session we've done with @Deltakosh: https://channel9.msdn.com/events/Build/2018/BRK2436 Regarding complexity, all Unity VR developers I've met who have seen our Babylon.js WebVR approach were amazed by its simplicity and productivity. But you're coming from another background, so maybe we need to write better docs. Let's try to answer some of your questions:
    - The VRHelper is your best friend for building WebVR scenes: http://doc.babylonjs.com/how_to/webvr_helper & https://www.davrous.com/2017/12/22/babylon-js-vrexperiencehelper-create-fully-interactive-webvr-apps-in-2-lines-of-code/
    - Building a VR scene in Babylon is not different from building a normal scene, with some particular details worth noting:
      - 1 unit in the Babylon.js world should be considered as 1 meter in VR.
      - You need to pay much more attention to performance, as you need to target 90 fps on high-end HMDs and 60 fps on mobile while doing a double rendering of the scene (left/right eyes). So check the stats/performance tab of the Debug Layer to optimize your scene.
    - Once you've built your scene, switching to WebVR should only be a matter of 1 to 2 lines of code thanks to the helper, which covers:
      - downloading and displaying the current VR headset controllers used (HTC Vive, Oculus Touch, Windows MR, etc.)
      - providing a gaze doing the mesh hit testing for you
      - providing a teleportation option which has been optimized for performance and to reduce motion sickness
    - WebVR is a standard living on top of WebGL. If you watch our BUILD session, you'll see that Microsoft has implemented it in Edge, and Mozilla in Firefox on Windows. On Android, you've got support thanks to Chrome, Samsung Internet or the Oculus Browser. Devices that have WebVR support are: HTC Vive, Oculus, Oculus Go, Gear VR, DayDream and Windows Mixed Reality devices (including HoloLens in the latest build).
    - If the browser or device doesn't support WebVR, we automatically fall back to the VRDeviceOrientation camera, using the sensors to simulate an HMD and doing a double rendering of the scene with lens distortion (which can be disabled). But the VRHelper has been made to hide this complexity.
    To enter WebVR, you are forced, by the specification and for security reasons, to have a user interaction. That's why the VRHelper creates a VR button for you to let you enter VR. This can be customized as described in the documentation. David
  12. davrous

    GearVR and Oculus Go

    Hello, in my article https://www.davrous.com/2017/12/22/babylon-js-vrexperiencehelper-create-fully-interactive-webvr-apps-in-2-lines-of-code/ I provide a sample showing how to use gaze for a timed-selection approach: https://playground.babylonjs.com/#5P51YL It will work on a Cardboard, which has no input controller at all, as well as on Gear VR and Oculus Go. You should always go with the VRHelper, as it has been made to address all VR scenarios. The WebVRCamera is a building block of the VRHelper and should only be used directly for very custom/specific use cases. David
  13. Hello, in your code (https://studio.rocketclowns.com/agnes/canvas/rocketLander/main.js?v=201806111547) you're updating the camera position with this code: camera.position.x += ((userX * 256) - camera.position.x) * 0.05; camera.position.y += ((userY * -128) - camera.position.y) * 0.05; But scene.pointerX is undefined at the beginning, while the mouse is outside the canvas, so camera.position.x & y become NaN and thus the update-audio-parameter function fails. The solution is either to test pointerX and pointerY for undefined before computing your position, or to register your "mousemove" event on the canvas element rather than on the window object. David
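The undefined guard can be sketched as a pure update step (how `userX` is derived from `pointerX` here is an assumption for illustration; the important part is the early return):

```javascript
// Bail out while scene.pointerX is still undefined (no pointer event yet),
// otherwise camera.position.x would become NaN on the very first frames.
function nextCameraX(currentX, pointerX, canvasWidth) {
  if (pointerX === undefined) {
    return currentX; // no pointer data yet: keep the camera where it is
  }
  const userX = pointerX / canvasWidth - 0.5; // assumed normalization
  return currentX + ((userX * 256) - currentX) * 0.05;
}
```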
  14. You're right about the delay. Ideally, I would need to take their audio graph and connect it to the input node of the BABYLON.Sound object. I've seen in their samples & docs that I can provide my own audioContext, which is good, but I haven't seen how to get the output audio node to connect to my own audio graph. It then seems we would need to modify both the audio engine of Babylon.js and tone.js to make this work. Indeed, I don't have the option to customize my audio graph like that today because I hadn't thought about such a use case. For now, we both create an audioContext and both connect to the audio destination (speakers). To make what you'd like work, Babylon.js would need to take control of the audio destination and panning node (to have spatialization) and put the procedural audio generation of tone.js in between.
  15. davrous

    Limit Virtual Joystick Area To Bottom

    Hi, I've been planning for a long time to refactor the VirtualJoystick to use a dynamic texture in the 3D scene, rather than a 2D canvas on top, and to add such customization. I didn't have the feeling it was widely used, so it never made it to my priority list. The current logic is there: https://github.com/BabylonJS/Babylon.js/blob/master/src/Tools/babylon.virtualJoystick.ts So to answer your question, you currently can't do what you'd like to do. But it's fairly easy to take the code of babylon.virtualJoystick.ts and tune it to your needs. The bigger job is refactoring it to render in the 3D scene rather than on a 2D canvas; doing so would enable everything you need out of the box. By the way, to make it work on mobile/touch, you need jQuery PEP to be referenced in your page. David