Search the Community

Showing results for tags 'vr'.

Found 64 results

  1. Hi, I am using Babylon.js with an HTC Vive and WebVR, and I want to pick a point on a mesh (this.wallMesh) with the controller's ray. I don't want to use the default VRHelper interactions. I expected the following code to work, but the rotation of the device behaves completely strangely, so it seems the wrong rotation is being applied. Do you have any tips on how to fix it? Thank you.

        const pickMesh = BABYLON.Mesh.CreateSphere(
          'pickMesh',
          5,
          0.1,
          this.scene,
        );
        const ray = new BABYLON.Ray(
          BABYLON.Vector3.Zero(),
          BABYLON.Vector3.One(),
          100,
        );
        this.scene.registerAfterRender(() => {
          ray.origin = controller.devicePosition;
          ray.direction = controller.deviceRotationQuaternion.toEulerAngles();
          const hit = this.wallMesh!
            .getScene()
            .pickWithRay(ray, (mesh) => mesh === this.wallMesh);
          if (hit) {
            console.log(hit.hit);
            if (hit.pickedPoint) {
              pickMesh.position = hit.pickedPoint;
            }
          }
        });
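     An editorial aside on the likely cause (a hedged guess, not part of the original post): toEulerAngles() returns rotation angles, not a direction vector, so assigning it to ray.direction would explain the strange behavior. A ray direction is usually obtained by rotating the device's local forward vector (0, 0, 1) by the controller's rotation quaternion. The quaternion-times-vector math can be sketched without Babylon:

```javascript
// Rotate vector v = [x, y, z] by unit quaternion q = { x, y, z, w }:
// v' = v + 2 * w * (q_vec × v) + 2 * (q_vec × (q_vec × v))
function rotateByQuaternion(v, q) {
  const [x, y, z] = v;
  // t = 2 * (q_vec × v)
  const tx = 2 * (q.y * z - q.z * y);
  const ty = 2 * (q.z * x - q.x * z);
  const tz = 2 * (q.x * y - q.y * x);
  // v' = v + w * t + (q_vec × t)
  return [
    x + q.w * tx + (q.y * tz - q.z * ty),
    y + q.w * ty + (q.z * tx - q.x * tz),
    z + q.w * tz + (q.x * ty - q.y * tx),
  ];
}

// A 90° rotation about the Y axis carries the forward vector (0,0,1) to (1,0,0),
// whereas the Euler angles of that rotation, (0, π/2, 0), are not a direction at all.
const s = Math.SQRT1_2; // sin(45°) = cos(45°)
const dir = rotateByQuaternion([0, 0, 1], { x: 0, y: s, z: 0, w: s });
console.log(dir); // ≈ [1, 0, 0]
```

     In Babylon terms this is roughly what applying deviceRotationQuaternion to a forward vector would compute; the sketch just shows why Euler angles cannot be dropped in as a ray direction.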
  2. devAxeon

    VR fixed sight

    Hi! I'm looking for a sight (reticle) that does not follow the scene's geometry the way the gaze tracker mesh does. So I tried to create a plane with a texture, parented to the VR camera, centered and placed in front of it. Here's what I've tried: But it isn't always in the center, and it doesn't work without a headset connected (e.g. for Google Cardboard). Furthermore, it would be nice if scene.activeCamera were already the correct camera in the 'onEnteringVR' event. Thank you so much for your help!!
  3. Hi, I would like to prevent the headset from changing the position of the camera, keeping only the device's rotation. Did I miss something? Here's a PG of what I've tried: Tested on Firefox + HTC Vive. Thanks!!
  4. I borrowed an Oculus Go from work; here are some first impressions and questions.

     Overall I'm really impressed with the hardware. It's comfortable to wear, and the resolution and FPS seem fine to me. My only other VR headset experiences have been Cardboard and a Samsung one (I forget which), and the Oculus Go is better than either of those.

     As a dev device: I haven't downloaded any SDK or looked at any dev docs. I just point the device's browser at http://<my laptop's IP address>:8080 and off I go (no JS console, though!). It's amazing how much the following one line of code gets you:

        const vrHelper = scene.createDefaultVRExperience({ createDeviceOrientationCamera: false });

     It gives you a VR icon on the screen. When you tap it, the Oculus Go launches into fullscreen VR. Looking around looks great and works seamlessly out of the box. Worth noting that the Oculus Go is a 3DoF system rather than 6DoF, meaning it only tracks the orientation of the headset (and the controller too? Need to double-check this), not its position.

     Where I have run into some difficulties is with the controller. I'm currently trying to let the user point at a mesh with the controller and pull the trigger to select a point on that mesh. I'm going to list some of these issues in this thread, bearing in mind I've only had the device for a few hours. If anyone has suggestions for these issues, it would be great to hear them.

     1. Getting PickingInfo. So far I've only been able to get PickingInfo with vrHelper.onNewMeshPicked, and this only fires when the pointer first moves over the mesh, regardless of whether the user is pressing one of the buttons or not. The callback for picking up button changes, controller.onButtonStateChange, returns info about the button but no pickInfo. I guess I need to call scene.pick inside the button state change callback; I haven't had time to try this yet.

     2. Customizing the controller mesh. When you enable interactivity with vrHelper.enableInteractions(), the default mesh for the controller is a long, thin cylinder that stretches from where you're pointing right up to your eyes. It's really in-your-face, and I haven't found a way to modify or disable it yet. This has no effect:

        vrHelper.displayGaze = false;

     And neither did trying to apply a custom mesh with controller[0].attachToMesh(mesh). I've got to be offline for a few hours now, but I'll keep experimenting. If anyone has questions or tips, please do post.
  5. I'm able to pick sprites with Scene.pickSprite, but is there a way to pick them with scene.pickWithRay? pickWithRay only seems to return meshes (I've set isPickable to true on the sprites and confirmed that they're pickable with Scene.pickSprite). The use case: I'm trying to pick sprites with a 3DoF VR controller, so I don't have a 2D pointerX and pointerY to work with, only the controller's forward ray. I'm thinking my options are:

     a. Use Vector3.Project to project the pickedPoint of pickWithRay back to screen coordinates (I'm using a skybox, so the ray will always hit something), then put that point into Scene.pickSprite. I tried this quickly, but I think the projected point might be off.

     b. Replace the sprites with billboarded mesh instances that can be hit directly with pickWithRay. I suspect this might be the easiest option?

     Or is there some easier way to pick a sprite with a ray?
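     An editorial sketch of what option (a) involves (my own names and assumptions, not Babylon's API): projecting a 3D point through a perspective camera down to pixel coordinates, which is the job BABYLON.Vector3.Project does given a viewport, boils down to this math:

```javascript
// Project a camera-space point (right-handed, camera looking down -Z)
// to pixel coordinates via a plain perspective projection.
function projectToScreen(point, fovY, width, height) {
  const [x, y, z] = point; // z must be negative (in front of the camera)
  const aspect = width / height;
  const t = Math.tan(fovY / 2);
  const ndcX = x / (-z * t * aspect); // normalized device coords in [-1, 1]
  const ndcY = y / (-z * t);
  return [
    ((ndcX + 1) / 2) * width,        // pixel X: left to right
    (1 - (ndcY + 1) / 2) * height,   // pixel Y: flipped, top to bottom
  ];
}

// A point straight ahead of the camera lands at the screen center.
const px = projectToScreen([0, 0, -5], Math.PI / 3, 1280, 720);
console.log(px); // [640, 360]
```

     If the projected point from the real API "might be off", the Y flip and the viewport transform above are the usual suspects to double-check before abandoning the approach.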
  6. devAxeon

    VR + Mesh.clone()

    Hi, Just found this bug: an error breaks the scene if you try to clone a mesh while in VR mode. Here's a PG: Thanks!!
  7. Hi everyone! I'm trying to move between two cubemaps with an HTC Vive. There are two interactions available: click on a plane with the controller, or use the gaze tracker on that same plane. These two cases work perfectly, but: the controller's ray is blocked by the cube (even with isBlocker = false), and the gazeTrackerMesh is not visible on my plane. Here is a PG:
  8. devAxeon

    VR camera height

    Hi, How about an option in vrHelper to force the height of the camera, even if isInVRMode is true? I'm testing my scene seated, but I want a standing view inside VR. It would be nice to have a forceDefaultHeight option in webVROptions, to always be standing. Thank you!!
  9. devAxeon

    VR teleportation mesh

    Hi, First, great job on the VR implementation!! It's incredibly simple and it works perfectly!! I would like to customize the mesh created in the "_createTeleportationCircles" function. Is it possible to have an option for that? (Or can I already do it?) Many thanks!!
  10. Hello, We are building our scene from data coming from a third-party application, and we would like to navigate through the scene in VR. We are using the WebVR Experience Helper with teleportation enabled. It works pretty well apart from one small issue: calibrating the objects in the environment to the user's height, so that objects that are supposed to be at eye level actually appear at eye level in VR, is something we have to do manually right now. Is there a standard height (from the ground) that the user's eye level is at by default? Or does this depend on ground size/subdivisions? Also, does 1 unit in BabylonJS map to a real-world unit (feet/metres/etc.) in VR? Let me know if my question seems unclear. We are using Windows Mixed Reality headsets for this.
  11. ian

    New Level
  12. If I use VR normally in any Oculus app, meshes don't flicker no matter how far away they are, even with reflective materials or advanced shaders. But when using Babylon WebVR, meshes that are far away flicker heavily. What is causing that flickering? Is it aliasing? I'd think aliasing should affect all edges, not just distant meshes. Also, this only appears inside the VR HMD, not in the split-screen view in the browser on the PC display (hence I couldn't get a screenshot either). I believe this has something to do with GPU parameter settings (anisotropic filtering? anti-aliasing level?).
  13. If we use MirrorTexture-based reflections on any planar surface while using WebVRFreeCamera (viewing in the VR HMD), the reflected image is different for each eye, so it looks inconsistent and random. It can't be perceived as a reflection; it looks like random artefacts, because the reflection doesn't match between the eyes. I can't tell whether the mirror texture should be rendered separately for each eye using the two-camera rig (which Babylon currently seems to do), or whether the same reflection texture should be used for both eyes. If it does need to be rendered per eye (a separate reflectionTexture for each eye), then maybe the two-camera rig has a position issue, and the positions of the internal mirrorTexture capture cameras need to be corrected. All I can tell is that the mirrorTexture is wrong when checking in the VR HMD. @RaananW, mentioning you because of WebVRFreeCamera.
  14. Chrome supports WebVR for Oculus via SteamVR. I tried that with Oculus Touch controllers. Since Oculus support in Chrome is via SteamVR (OpenVR), the Oculus controllers are detected as Vive controllers in the Babylon WebVR implementation. It's not just that the controllers' 3D models are wrongly detected/displayed: the Oculus controller inputs/keys that we have set up for our app, and that work fine in Firefox, don't work when using the same Oculus controllers in Chrome via SteamVR. I know we should wait for official Chrome support for Oculus; just pointing it out here in case it matters.
  15. Converge

    Aliasing in VR mode

    Hey everyone, We are building an environment that contains many thin lines and grilles. In normal mode everything looks fine (with Babylon's AA enabled); however, after entering VR mode (using the VRHelper), jagged edges become so apparent that it's impossible to distinguish some of the details. I have made a few screenshots with a test environment to demonstrate the problem. Solutions like FXAA plugged into the post-processing pipeline make everything blurrier and worse. I'm wondering why this is happening (are the stereo cameras super low-res?) and whether there is a way to deal with it. It would be OK to take some performance hits and, for example, render everything internally at a much higher resolution. What are the good practices for fighting jagged edges in VR in general? Many thanks in advance!
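    One knob relevant to "render internally at a higher resolution" (an editorial note, not a confirmed fix for the VR path): Babylon's engine.setHardwareScalingLevel divides the canvas size by the given level to get the internal render size, so values below 1 supersample. The arithmetic is simply:

```javascript
// Internal render-buffer size for a given canvas size and hardware scaling
// level, following Babylon's setHardwareScalingLevel semantics: size / level.
function renderSize(canvasWidth, canvasHeight, scalingLevel) {
  return [canvasWidth / scalingLevel, canvasHeight / scalingLevel];
}

console.log(renderSize(1920, 1080, 1));   // [1920, 1080], native
console.log(renderSize(1920, 1080, 0.5)); // [3840, 2160], 2x per axis
```

    So engine.setHardwareScalingLevel(0.5) renders at twice the resolution in each dimension, at roughly four times the fill cost; whether the VR helper's stereo cameras honor it in a given Babylon version is worth verifying.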
  16. Perplexus in VR mode, and how I played it.
  17. For a normal camera, we can have a bounding mesh with collisions enabled and thereby restrict camera movement to a certain area. But the WebVR camera is free and doesn't respect collisions, so how do we restrict the user's movement within the model? Note: this is not about room scale and the guardian system; it's about restricting camera movement, including teleportation, to a certain area inside VR.
  18. This post is to help anybody who would like to debug/work on GearVR or other mobile devices over WiFi. There is a pull request that was approved today. The code to implement was literally 10 minutes of mostly copy/paste, while the setup took over an hour to figure out, so I wanted to share it (also for my forgetful future self!).

     What does work on GearVR: the trigger button will select meshes, and the trackpad allows you to rotate. With at least the trigger button working it is somewhat usable now! The todo list at least contains:
     - get a 3D model
     - onTrackpadChangedObservable (doesn't exist, but would be called). The trackpad axis could then show on the mesh where it is selected (i.e. a small circle).
     - position on GamePad is always null by design. It should be held in the hand set in the constructor, which I think defaults to 'right'. That will fix the rays coming from the eyes and make the model visible. I think we should choose a position in front, and the model should align with the forward ray from that point.
     - using the headset device control I was triggering scene pick events, but they did not seem to be where I was looking. So support for the GearVR headset (without a controller) still needs to be added.

     Here is my workflow for debugging a GearVR from Chrome on your desktop. If you connect your device directly via USB with USB debugging enabled, you can already do this, and most of us do. With the GearVR, however, you will need to debug over WiFi, because the USB connection on the GearVR headset is for charging only (the GearVR headset uses your phone's USB connection). First you need adb (install the Android SDK or download the platform-tools).

     Step 1. Connect your Android device via USB and set the TCP port:

        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb devices
        List of devices attached
        * daemon not running. starting it now on port 5037 *
        * daemon started successfully *
        9889234230424e3755 device

        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb tcpip 5555

     Step 2. Disconnect your Android device from USB. Ensure you are connected to the same WiFi network as your desktop. Get your phone's IP address from Settings->About->Status (on an S8).

        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb connect
        connected to
        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb devices
        List of devices attached
        device

     Note: you may need to reconnect your device during development, so leave this prompt open so you can reconnect.

     Open chrome://inspect on your desktop and make sure you are on the Devices tab. Click port forwarding to allow accessing, e.g., localhost:3000. Open Samsung Internet to (i.e. the loopback address instead of 'localhost'). You should see the following under Remote Target in the lower part of the chrome://inspect tab:

        SM-G950W #
        [ ] 'Title of page' <inspect>
        Chrome ... com.brave.browser ...

     Any other debuggable Chromium browsers running will also show up (i.e. Chrome, Brave, etc.). Note that SM-G950W will be your phone's model identifier, followed by your phone's WiFi IP address. Click the inspect link to open DevTools. You can now simulate clicks on your phone in the emulator and view the console. This is how you would, for example, see the button indexes to map to the functionality you need.

     If you are working on BabylonJS itself, you can get away with only 'npm install' and 'gulp typescript', then symbolically link (or copy) the preview release into the node_modules of your project. Otherwise you can use the local PG in BabylonJS. I've tried both and they work well.
  19. little_bigben

    Rotate Camera in VR

    Hello, I'm developing a web VR editor for 360° video experiences, and in this case I would like to force the camera rotation when I change my POV. I tried to use setTarget on ArcRotateCamera, but it doesn't work when scene.createDefaultVRExperience() is called. How can I change my camera's rotation? Thanks, Benjamin
  20. Hi everyone, I'm trying to integrate VR cameras into my project, but I encountered some problems with the VRDeviceOrientationArcRotateCamera. As you can see there: , I can't set the target of the camera on the moving sphere, and I have no idea what's going wrong with my code. If you have ideas about how I can fix that, you are very welcome
  21. Just a question about floorMeshName.

        export interface VRTeleportationOptions {
            floorMeshName?: string; // If you'd like to provide a mesh acting as the floor
            floorMeshes?: Mesh[];
        }

     From that comment I was expecting a different comparison than a substring match (

        if (this._floorMeshName && mesh.name.indexOf(this._floorMeshName) !== -1) {
            return true;
        }

     Otherwise, if you set the floor mesh name to 'ground' and then had another mesh like 'ground-cover', any mesh with 'ground' as a substring would be teleportable.
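     To make the substring concern concrete (a standalone sketch, not Babylon's code): an indexOf check accepts any mesh whose name merely contains the floor name, while a strict comparison does not:

```javascript
// Substring match vs. exact match for a floor-mesh name.
const isFloorBySubstring = (meshName, floorName) => meshName.indexOf(floorName) !== -1;
const isFloorExact = (meshName, floorName) => meshName === floorName;

console.log(isFloorBySubstring('ground-cover', 'ground')); // true: teleportable
console.log(isFloorExact('ground-cover', 'ground'));       // false
```

     The floorMeshes array option sidesteps the naming question entirely, since membership is checked by mesh identity rather than by name.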
  22. Hi everybody: We are starting a VR project and, since we want it to be as cross-platform as possible with a single code base, I'm thinking of giving BJS a chance (maybe this is not the best use case for this awesome engine). This is, indeed, not a genuine 3D engine development, in the sense of 3D scenes with different assets in them (characters, props, effects, ...), but a theatrical experience where the user watches an omni-directional stereo (ODS) pre-rendered CG animation, walking through the streets of a medieval village, going downhill from the cathedral to the palace. We need this approach in order to have high-quality CG. As said, we are not looking for pseudo-VR, consisting of a planar projection of a 360° video on the inner side of a sphere, but a real VR video experience. Under this assumption we'll need a different video stream for each eye, in order to give the user a sense of depth as they look around in every direction: near things look near, far things look far. Of course we'll be rendering text and other CG effect overlays on the video streams. I'm searching through the docs and this forum for a starting point on all this, but I'm only finding pseudo-VR approaches. Is this type of "dual-camera" setup planned to be incorporated into BJS in future versions? Best regards.
  23. Hi, I'm working on VR web apps, but I noticed that when I use VRDeviceOrientationFreeCamera the user can still move the camera with the arrow keys (on PC). I tried to use "VRDeviceOrientationArcRotateCamera", but in the console I get "VRDeviceOrientationArcRotateCamera is not a constructor". What can I do? Thank you all. Edit: I also tried a registerBeforeRender with a (0,0,0) position for the camera, but it still moves with the arrow keys.
  24. brianzinn

    Highlighter not working in VR

    I have the highlighter working in a regular scene. In the VR scene the highlight does not appear where the mesh is, and it is in different positions for the left and right eye. Is this a bug, or is the Highlighter not supported with VR cameras?
  25. brianzinn

    VR other devices

    I have my scene working great in GearVR, but it's fuzzy in Cardboard on a OnePlus. I think what I need to do is calibrate the VR metrics for my viewer. Google Cardboard is:

       Screen-to-lens distance: 39.3mm
       Inter-lens distance: 63.9mm
       Screen vertical alignment: Bottom
       Tray to lens-center distance: 35mm
       k1 distortion coefficient: 0.33582564
       k2 distortion coefficient: 0.55348791

    There is a cardboard calibrator ( where you enter your PPI and it will generate a QR code, using the cardboard click button. It would be cool to have something like that built into BabylonJS. I think I need to work my way from those numbers to the ones used by the camera metrics; does anybody know how? Otherwise maybe I'll try to start a PG using GUI + gaze to alter the metrics, except it looks like the metrics are a constructor parameter for the camera, so that doesn't look like a good option.

       result.hResolution = 1280;
       result.vResolution = 800;
       result.hScreenSize = 0.149759993;
       result.vScreenSize = 0.0935999975;
       result.vScreenCenter = 0.0467999987;
       result.eyeToScreenDistance = 0.0410000011;
       result.lensSeparationDistance = 0.0635000020;
       result.interpupillaryDistance = 0.0640000030;
       result.distortionK = [1.0, 0.219999999, 0.239999995, 0.0];
       result.chromaAbCorrection = [0.995999992, -0.00400000019, 1.01400006, 0.0];
       result.postProcessScaleFactor = 1.714605507808412;
       result.lensCenterOffset = 0.151976421;

    @davrous - Got some ideas? Maybe something that could be a VR Experience Helper option to choose popular viewers (i.e. cardboard/gearvr/etc.) using GUI + gaze, like your demo?
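    An editorial aside on the unit question (my own guess, not from the thread): the camera-metrics fields above appear to be in meters, while the Cardboard profile lists millimeters, so part of the mapping looks like a plain unit conversion, e.g. the 63.9mm inter-lens distance would become 0.0639 for a lens-separation field. The field names in this sketch are assumptions, not a confirmed mapping:

```javascript
// Hypothetical helper: convert Cardboard profile distances (mm) into
// meter-based values resembling the camera-metrics fields above.
// Which profile value feeds which metrics field is an assumption.
const mmToMeters = (mm) => mm / 1000;

function cardboardToMetrics(profile) {
  return {
    lensSeparationDistance: mmToMeters(profile.interLensDistanceMm),
    eyeToScreenDistance: mmToMeters(profile.screenToLensDistanceMm),
  };
}

const m = cardboardToMetrics({ interLensDistanceMm: 63.9, screenToLensDistanceMm: 39.3 });
console.log(m.lensSeparationDistance); // ≈ 0.0639
console.log(m.eyeToScreenDistance);    // ≈ 0.0393
```

    The distortion coefficients (k1/k2 vs. the distortionK array) are a separate, non-trivial mapping that this sketch does not attempt.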