Search the Community

Showing results for tags 'vr'.

Found 49 results

  1. For the normal camera, we can have a bounding mesh with collisions enabled, and that way we can restrict camera movement to a certain area. But the WebVR camera is free and doesn't respect collisions, so how can we restrict the movement of the user in the model? Note: this is not about room scale and the guardian system. This is about restricting camera movement, including teleportation, to a certain area inside VR.
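    Since the WebVR camera ignores collision meshes, one workaround is to clamp the camera (and any teleport destination) to the walkable region every frame. A minimal sketch, assuming an axis-aligned bounding box for the area; the `clampToBounds` helper and the bounds are hypothetical, and in Babylon.js it could be applied from `scene.registerBeforeRender`:

```javascript
// Sketch: clamp a position to an axis-aligned bounding region.
// `min`/`max` are hypothetical bounds of the walkable area; applying this
// to the camera position each frame keeps the user inside the region.
function clampToBounds(pos, min, max) {
  return {
    x: Math.min(Math.max(pos.x, min.x), max.x),
    y: Math.min(Math.max(pos.y, min.y), max.y),
    z: Math.min(Math.max(pos.z, min.z), max.z)
  };
}
```

    The same clamp can be applied to a teleport target before accepting it, which covers the teleportation case mentioned above.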
  2. This post is to help anybody who would like to debug/work on GearVR or other mobile devices over WiFi. There is a pull request that was approved today. The code to implement was literally 10 minutes of mostly copy/paste, while the setup took over an hour to figure out, so I wanted to share it (also for my forgetful future self!).

    What does work on GearVR: the trigger button will select meshes, and the trackpad allows you to rotate. At least with the trigger button working it is somewhat usable now! The todo list at least contains:
    - Get a 3D model.
    - onTrackpadChangedObservable (doesn't exist, but would be called). The trackpad axis could then show on the mesh where it is selected (ie: a small circle).
    - position on GamePad is always null by design. It should be set for the hand given in the constructor, which I think defaults to 'right'. That will fix the rays coming from the eyes and make the model visible. I think choose a position in front, and the model should align with the forward ray from that point.
    - Using the headset device control I was triggering scene pick events, but they did not seem to be where I was looking. So support for the GearVR headset (without a controller) still needs to be added.

    I just wanted to outline my workflow that allows you to debug your GearVR from Chrome on your desktop. If you directly connect your device via USB with USB debugging enabled you can already do this, and most of us already do. With the GearVR, however, you will need to debug over WiFi, as the USB connection on the GearVR headset is for charging only (and the GearVR headset uses the USB connection of your phone). First you need to have adb (install the Android SDK or download the platform-tools).

    Step 1. Connect your Android device to USB and set the TCP port:

        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb devices
        List of devices attached
        * daemon not running. starting it now on port 5037 *
        * daemon started successfully *
        9889234230424e3755 device
        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb tcpip 5555

    Step 2. Disconnect your Android device from USB. Ensure you are connected to the same WiFi network as your desktop. Get your phone's IP address from Settings->About->Status (on an S8).

        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb connect
        connected to
        C:\Program Files (x86)\Android\android-sdk\platform-tools>adb devices
        List of devices attached
        device

    Note: You may need to reconnect your device during development, so leave this prompt open so you can reconnect.

    Open up chrome://inspect on your desktop - make sure you are on the Devices tab. Click port forwarding to allow accessing ie: localhost:3000. Open up Samsung Internet to (so, the loopback address instead of 'localhost'). You should see the following under Remote Target in the lower part of the chrome://inspect tab:

        SM-G950W # [ ] 'Title of page' <inspect> Chrome ... com.brave.browser ...

    Also, any other debuggable Chromium browsers running will show up (ie: Chrome, Brave, etc.). Note that SM-G950W will be your phone model identifier, followed by your phone's WiFi IP address. Click the inspect link (bold above) to open DevTools. You can now simulate clicks on your phone in the emulator and view the console. This is how you would, for example, see the button indexes to map to the functionality you need.

    If you are working on BabylonJS itself you can get away with only 'npm install' and 'gulp typescript', then symbolically link (or copy) the preview release into the node_modules of your project. Otherwise you can use the local PG in BabylonJS. I've tried both and they work well.
  3. Aliasing in VR mode

    Hey everyone, We are building an environment that contains many thin lines and grills. In normal mode everything looks fine (with Babylon's AA enabled); however, after entering VR mode (using the VRHelper), jagged edges become very apparent, to the point that it's impossible to distinguish some of the details. I have made a few screenshots with a test environment to demonstrate the problem. Solutions like FXAA plugged into the post-processing pipeline make everything blurrier and worse. I'm wondering why this is happening (are the stereo cameras super low-res?) and whether there is a way to deal with it. It would be OK to take some performance hit and render everything internally at a much higher resolution, for example. What are the good practices for fighting jagged edges in VR in general? Many thanks in advance!
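    One mitigation worth trying here is exactly the supersampling idea mentioned above: render internally at a higher resolution and let the downscale smooth the thin lines. In Babylon.js the internal resolution is controlled by `Engine.setHardwareScalingLevel`, where values below 1 render above canvas resolution. A sketch of the arithmetic (the helper name is mine):

```javascript
// Sketch: compute a hardware scaling level for N-times supersampling.
// Babylon.js renders at canvasSize / scalingLevel, so a level of 0.5
// doubles the internal resolution in each axis (4x the pixels).
function scalingLevelForSupersampling(factor) {
  if (factor <= 0) throw new RangeError("factor must be positive");
  return 1 / factor;
}

// e.g. engine.setHardwareScalingLevel(scalingLevelForSupersampling(2));
```

    The cost grows with the square of the factor, so on mobile VR a factor of 1.5-2 is usually the practical ceiling.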
  4. Rotate Camera in VR

    Hello, I'm developing a web VR editor for 360 video experiences, and in this case I would like to force the camera rotation when I change my POV. I tried to use setTarget on ArcRotateCamera, but it doesn't work when scene.createDefaultVRExperience() is called. How could I change my camera's rotation? Thanks, Benjamin
  5. Hi everyone, I'm trying to integrate VR cameras into my project, but I encountered some problems with the VRDeviceOrientationArcRotateCamera. As you can see there: , I can't set the target of the camera on the moving sphere, and I have no idea what's going wrong with my code. If you have some ideas about how I can fix that, you are very welcome
  6. Just a question about floorMeshName.

        export interface VRTeleportationOptions {
            floorMeshName?: string; // If you'd like to provide a mesh acting as the floor
            floorMeshes?: Mesh[];
        }

    I was expecting, from that comment, a different comparison than:

        if (this._floorMeshName && === this._floorMeshName) { return true; }

    Otherwise, if you set the floor mesh name to 'ground' and then had another mesh like 'ground-cover', any mesh with 'ground' as a substring would be teleportable.
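    To illustrate the concern, here are the two comparison behaviours side by side; both helper names are hypothetical, not the actual BabylonJS internals:

```javascript
// Hypothetical helpers contrasting substring vs exact floor-name matching.
// A substring check makes every mesh whose name contains the floor name
// count as floor.
function isFloorBySubstring(meshName, floorMeshName) {
  return meshName.indexOf(floorMeshName) !== -1;
}

// An exact comparison only matches the mesh named precisely floorMeshName.
function isFloorExact(meshName, floorMeshName) {
  return meshName === floorMeshName;
}
```

    With floorMeshName set to 'ground', the substring variant also accepts 'ground-cover', which is the behaviour questioned above.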
  7. Hi everybody: We are starting a VR project and, as we want it to be as cross-platform as possible with a single code base, I'm thinking of giving BJS a chance (maybe this is not the best use case for this awesome engine). This is, indeed, not a genuine 3D engine development, in terms of 3D scenes with different assets in them (characters, props, effects, ...), but a theatrical experience where the user sees an omni-directional stereo (ODS) pre-rendered CG animation, walking through the streets of a medieval village, going downhill from the cathedral to the palace. We need this approach in order to have high-quality CG. As said, we are not looking for pseudo-VR, consisting of a planar projection of a 360º video on the inner side of a sphere, but a real VR video experience. Under this assumption we'll need a different video stream for each eye, in order to give the user a sense of depth as they look around in every direction. Near things look near, far things look far. Of course we'll be rendering text and other CG effect overlays on the video streams. I'm searching through the docs and this forum for a starting point on all this, but I'm only finding pseudo-VR approaches. Is this type of "dual-camera" planned to be incorporated into BJS in future versions? Best regards.
  8. Here is one beautiful Chrome extension which allows you to stream your desktop full screen (or browser fullscreen with F11) to your mobile. Especially useful if you enable the VR camera in a desktop game in the Chrome browser and want to stream it to your mobile for your VR gear. And you don't need an extra VR Bluetooth controller, because you can play your game with the keyboard+mouse of your desktop PC and get a mirror of your desktop Chrome in your mobile browser for your VR gear. Even your friends can watch your chat room with the password you give them in the URL. No extra coding, just reuse the Chrome RTC extension. Mobile browsers (Chrome, Firefox, Opera) are receivers of your desktop PC's Chrome browser (the game with the VR camera). Greetings Ian
  9. VR other devices

    I have my scene working great in GearVR, but fuzzy in Cardboard on a OnePlus. I think what I need to do is calibrate the VR metrics for my viewer. Google Cardboard is:

        Screen-to-lens distance: 39.3mm
        Inter-lens distance: 63.9mm
        Screen vertical alignment: Bottom
        Tray to lens-center distance: 35mm
        k1 distortion coefficient: 0.33582564
        k2 distortion coefficient: 0.55348791

    There is a cardboard calibrator ( where you enter your PPI and it will generate a QR code - using the cardboard clicking button. Would be cool to have something built into BabylonJS. I think I need to work my way from those numbers to those used by the camera metrics - does anybody know how? Otherwise maybe I'll try to start a PG using GUI + gaze to alter the metrics - except it looks like the metrics are a constructor parameter for the camera, so that doesn't look like a good option.

        result.hResolution = 1280;
        result.vResolution = 800;
        result.hScreenSize = 0.149759993;
        result.vScreenSize = 0.0935999975;
        result.vScreenCenter = 0.0467999987;
        result.eyeToScreenDistance = 0.0410000011;
        result.lensSeparationDistance = 0.0635000020;
        result.interpupillaryDistance = 0.0640000030;
        result.distortionK = [1.0, 0.219999999, 0.239999995, 0.0];
        result.chromaAbCorrection = [0.995999992, -0.00400000019, 1.01400006, 0.0];
        result.postProcessScaleFactor = 1.714605507808412;
        result.lensCenterOffset = 0.151976421;

    @davrous - Got some ideas - maybe something that could be a VR Experience Helper option to choose popular viewers using GUI + gaze like your demo (ie: cardboard/gearvr/etc.)?
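    As a rough sketch of the conversion: the viewer specs are in millimetres while the camera metrics quoted above are in metres, so the bulk of the work is a unit conversion plus a mapping of names. The helper below is an assumption, not the BabylonJS API, and mapping screen-to-lens distance onto eyeToScreenDistance is my guess:

```javascript
// Sketch (assumption, not the BabylonJS API): convert Cardboard viewer
// parameters (millimetres) into metre-based fields like the result.*
// values quoted above. The screenToLens -> eyeToScreenDistance mapping
// is a guess.
function cardboardToMetrics(viewer) {
  const mm = v => v / 1000; // millimetres to metres
  return {
    interpupillaryDistance: mm(viewer.interLensDistanceMm),
    lensSeparationDistance: mm(viewer.interLensDistanceMm),
    eyeToScreenDistance: mm(viewer.screenToLensDistanceMm),
    distortionK: [1.0, viewer.k1, viewer.k2, 0.0]
  };
}
```

    The remaining fields (screen sizes, lens center offset) would come from the phone's physical screen dimensions, which is where the PPI entry in the calibrator comes in.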
  10. Highlighter not working in VR

    I have the highlighter working in a regular scene. In the VR scene the highlight is not appearing where the mesh is and with left/right eye it is in different positions. Is this a bug or is using the Highlighter not supported with VR cameras?
  11. Virtual Reality, VR with Phaser

    In short, does Phaser have a VR game mode? Well, I'm looking to develop VR games, either played with a headset or by turning your phone. In the past I used A-Frame to do some simple 'look around and spot stuff' kinda games with panorama (HDRI) sky images. I want to take advantage of a real game engine like Phaser, which includes physics, score and time tracking etc., so I don't have to write everything from scratch. Any ideas? Thanks everyone! I'm new here and this is my first post on this forum
  12. Can anybody explain how I can set up a VRDeviceOrientationArcRotateCamera with a gamepad controller? Is this already done or not? How can we attach a gamepad to a VRDeviceOrientationArcRotateCamera? Greetings Ian
  13. Mobile canvas resolution

    Hi everyone, I'm currently making VR apps with BabylonJS, and trying to solve the issue of the canvas resolution. I've already found some discussions about this topic, but oddly none seems to concern the same issue as mine (it's not about anti-aliasing; as far as I understand, it's more about hardware pixels vs. CSS pixels). If I open this playground on my phone: (not VR, but the question is the same for mobile VR and mobile non-VR applications), the pixels are too large. When watching the phone through Google Cardboard it's even worse. This playground is much better, with hardwareScaling set to 0.25. It's fairly logical, because my phone has 4 hardware pixels per "layout pixel". 1) I'm using a ZTE Axon 7 phone, with a 2500*1080 screen. A full-screen canvas has width = 900px, height = 450px in my browser. Do you encounter the same issues with your phone? 2) Is there already an automatic built-in way to set the canvas resolution? Like with the <meta> tag or something? (ie: without adding resizing code in the app) 3) If not, shouldn't we add some code to recognize the screen DPI and set the hardwareScaling accordingly, so it works "out of the box" on mobile? Or is it too CPU-intensive to enforce x16 resolution? Thanks a lot for your inputs! Have a nice week-end. PS: I've also tried the scenes from @davrous' -interesting- article here: , and I'm 90% sure I have the same issues with examples like the Sponza scene, though the fact that there are textures and lighting makes it less obvious.
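    The DPI-aware scaling described in question 3 can be sketched as a small helper: choose a hardware scaling level of 1 / devicePixelRatio so one rendered pixel maps to one hardware pixel, with a cap to bound the cost on very dense screens. The function name and the cap value are assumptions of mine:

```javascript
// Sketch: pick a hardware scaling level so the internal resolution
// matches the physical screen. Babylon.js renders at
// cssPixels / scalingLevel, so 1 / devicePixelRatio maps one rendered
// pixel to one hardware pixel. The cap (default 4) is an arbitrary
// limit to keep the GPU cost bounded on very dense screens.
function nativeScalingLevel(devicePixelRatio, maxRatio = 4) {
  const ratio = Math.min(devicePixelRatio || 1, maxRatio);
  return 1 / ratio;
}

// e.g. engine.setHardwareScalingLevel(nativeScalingLevel(window.devicePixelRatio));
```

    On the phone described above (roughly 4 hardware pixels per layout pixel per axis would be a ratio of 4), this reproduces the hand-picked 0.25 value.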
  14. Hello, I'd like to get some suggestions on how to make a 3D background from images. Let me explain a bit more: given two sets of images for a cubemap/skybox (one set for each eye), how do I render each cubemap separately? In a normal VR scene the camera renders twice (one render per eye) for the same object. For stereo backgrounds, IMHO the VR camera also has to render twice, but for different objects in the same place. How could I solve this? Hopefully the question is not too complicated and makes sense. Thanks for any input
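    One possible way to get a different skybox per eye (an assumption on my part, not a confirmed recipe) is layer masks: Babylon.js cameras and meshes both carry a layerMask, and a mesh is only rendered by a camera when the bitwise AND of the two masks is non-zero. Assigning one mask to the left rig camera and the left-eye skybox, and another to the right pair, would put two skyboxes in the same place but render each through only one eye. The mask constants below are arbitrary:

```javascript
// Sketch: the layer-mask visibility test used for per-camera filtering.
// A mesh renders through a camera when (camera.layerMask & mesh.layerMask)
// is non-zero. The two eye masks are arbitrary high bits.
const LEFT_EYE = 0x10000000;
const RIGHT_EYE = 0x20000000;

function visibleTo(cameraMask, meshMask) {
  return (cameraMask & meshMask) !== 0;
}
// Assign LEFT_EYE to the left rig camera and the left skybox,
// RIGHT_EYE to the right pair; ordinary meshes keep both bits set.
```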
  15. Hello guys, I just completed a little demo: a very small VR scene editor. I'll be very happy if you want to try it and give me some feedback. Don't forget to also take a look at the code. Thanks
  16. Can anybody explain the difference between WebVRFreeCamera and VRDeviceOrientationFreeCamera? Why should we use WebVRFreeCamera? Greetings Ian
  17. Is it possible to mirror the desktop browser's screen to a mobile screen? The keyboard and mouse of the desktop machine should have control. For example, WSAD and the mouse should control the desktop browser's VR, but the screen should also be mirrored on the phone (Android, iPhone or MS phone). Greetings Ian
  18. Hi, I'm working on VR web apps, but I noticed that when I use VRDeviceOrientationFreeCamera the user can still move the camera (on PC) with the arrow keys. I tried to use "VRDeviceOrientationArcRotateCamera", but in the console I get "VRDeviceOrientationArcRotateCamera is not a constructor". What can I do? Thank you all. Edit: I also tried a registerBeforeRender with a (0,0,0) position for the camera, but it still moves with the arrows.
  19. I recently got one of those VR headsets where you put in your phone for the display. It's like Cardboard, only slightly better, with a sturdier plastic phone holder and a head strap. It's pretty cheap but works surprisingly well with the default YouTube app and some 360 videos. It is possible to create VR apps and demos using HTML5/JavaScript; for example, Google VR has a JavaScript library that can display 360 images, and there are Glam.js demos. So I was wondering: is it possible to create HTML5 VR games? It would be pretty cool for escape-the-room games, where you can look around in 360 degrees. Although technically I guess it's possible, I don't think there's a real market for such games. Seeing as HTML is mostly used for small web-portal games, people with an Oculus/HTC Vive probably won't bother with HTML5, and people with a cardboard headset don't have powerful enough phones. Any thoughts?
  20. Hi guys, can somebody help me make the VR camera follow the spline? I tried parenting the camera to a mesh that follows the spline, but the camera stays on the ground. I changed the camera, but I can't see yet why it's not updating the position. Here is my code:

        // Target Cam // VR CAMERA INSTEAD
        var targetCam = new BABYLON.VRDeviceOrientationArcRotateCamera("tcam", camera.position, scene);
        // var targetCam = new BABYLON.TargetCamera("tcam", camera.position, scene);
        targetCam.setTarget(points[t]);
        scene.activeCamera = targetCam;
        var target = BABYLON.Vector3.Zero();
        // animation
        scene.registerBeforeRender(function() {
            //followCam._computeViewMatrix();
            points[i].addToRef(curvect.scale(k / step), pos);
            points[j].addToRef(nextvect.scale(k / step), target);
            pos.addInPlace(norms[i]);
            targetCam.position = pos;
            targetCam.setTarget(target);
            targetCam.getViewMatrix();
            //wagon.position = pos;
            //wagon.rotation = rot;
            k++;
            // next segment
            if (k == step) {
                i = (i + 1) % lg;
                j = (i + 1) % lg;
                t = (i + dt) % lg;
                rot = BABYLON.Vector3.RotationFromAxis(tgts[i], norms[i], binorms[i]);
                points[j].subtractToRef(points[i], curvect);
                points[t].subtractToRef(points[j], nextvect);
                //curvect.normalize();
                //nextvect.normalize();
                k = 0;
            }
        });
        return scene;
  21. WebVR Support (again)

    Checking in to see the support for WebVR in the current edition of Babylon. Currently, Babylon doesn't see WebVR enabled on my copy of Google Chrome (Canary) or Firefox Nightly, and it is recognized but not working in a Chromium experimental build (Feb 7). Currently, the Chromium build with Babylon 3.0 alpha blows up with some undefineds:

        Uncaught TypeError: Cannot set property 'x' of undefined at Matrix.getTranslationToRef (babylon.max.js:2817)

    With Babylon 2.5, you get a stereo screen in Chromium... but the Vive does not respond - it does not show the stereo scene on the desktop window. Advice on any browser or SteamVR setting which will enable the headset is appreciated!
  22. Hi, I am a developer of a VR application and I have a problem with the VRDeviceOrientationFreeCamera related to anti-aliasing. My scene looks good, but when I change to this camera all the models in my scene have a bad appearance, with no anti-aliasing. The resolution of the scene using the vrCamera is very low... What could be the cause?
  23. Hello everyone! I am developing an application with Babylon.js and I have set up the WebVRFreeCamera in my scene. I have been playing with Babylon actions, pickResults etc. using the mouse pointer, and now I am trying to get the mesh that is in the camera's view trajectory so I can trigger actions or activate behaviours. I have tried using BABYLON.Ray and looking around the documentation with no luck. I have also been testing React VR, which has event binding on the meshes in the scene. Something like this:

        <Mesh
            source={{mesh: 'cube.obj', texture: 'cube.jpg'}}
            onEnter={this.OnEnterCallback}
            onExit={this.OnExitCallback}
        />

    It makes it easy to trigger events when gaze-in / gaze-out happens on the different meshes, and I would like to know if you can guide me in the right direction to do this with Babylon. I have seen that the scene has scene.getActiveMeshes, but it returns a SmartArray of the visible meshes in the camera, and it would be cool to have something more precise. I hope you can guide me! Thanks!
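    The gaze enter/exit behaviour can be layered on top of a per-frame pick: cast a ray from the camera along its forward direction each frame, and fire callbacks only when the picked mesh changes. A minimal sketch of that state machine (the factory name is mine; in Babylon.js the per-frame mesh would come from something like `scene.pickWithRay`):

```javascript
// Sketch: turn a per-frame "currently gazed mesh" value into enter/exit
// events, similar to the React VR onEnter/onExit callbacks above.
function createGazeTracker(onEnter, onExit) {
  let current = null;
  return function update(pickedMesh) { // pickedMesh may be null (gazing at nothing)
    if (pickedMesh === current) return; // no change, no events
    if (current !== null) onExit(current);
    if (pickedMesh !== null) onEnter(pickedMesh);
    current = pickedMesh;
  };
}
```

    Calling the returned `update` from a before-render hook with each frame's pick result gives per-mesh gaze events without scanning all active meshes.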
  24. Hi guys, I'm looking for a way to integrate augmented reality into the toolkit; if anybody has ideas about this, please let me know. So far I'm trying with tracking.js; I figure if we can track faces, colors etc. we can bring the video into the VR-mode camera and stick stuff in the 3D world using the tracking.js system. Anyway, I know there are other libraries, but I thought this is a nice and clean way to do some augmented reality, and it has nothing to do with the Babylon framework yet. @MackeyK24 here is my initial code attempt. I put the #video div in the index.html, but I don't get the camera working yet, heh. I need to render the camera device's feed into the canvas. @Deltakosh is there a way to do this already? See the ~ref link.

        public ready(): void {
            // this.scene executes when ready
            var scene = this.scene;
            this.scene.activeCamera.attachControl(this.engine.getRenderingCanvas());
            this.manager.enableUserInput();
            var tracker: any = new tracking.ObjectTracker(['face']);
            tracker.setInitialScale(4);
            tracker.setStepSize(2);
            tracker.setEdgesDensity(0.1);
            tracking.track('#video', tracker);
            var canvas = this.engine.getRenderingCanvas();
            var context = canvas.getContext('2d');
            tracker.on('track', (event) => {
                context.clearRect(0, 0, canvas.width, canvas.height);
       (rect) {
                    context.strokeStyle = '#a64ceb';
                    context.strokeRect(rect.x, rect.y, rect.width, rect.height);
                    context.font = '11px Helvetica';
                    context.fillStyle = '#fff';
                    context.fillText('x: ' + rect.x + 'px', rect.x + rect.width + 5, rect.y + 11);
                    context.fillText('y: ' + rect.y + 'px', rect.x + rect.width + 5, rect.y + 22);
                });
            });
        }
  25. Hi, I noticed that Babylon's implementation of the VR camera rig calculates distortion correction inside the fragment shader. Since the calculations are done per pixel, this results in a steep performance drop, especially on high-density screens, which renders the rig unusable on any mobile phone. On the simplest of scenes, I get only 30fps on a Google Pixel. I wonder why this particular method was chosen over, say, displaying the rendered texture on a dense plane (20x20) and then performing all calculations per vertex of that plane. With this method we would be performing the calculations some 400 times per eye (on a 20x20 mesh), versus over 900,000 times (once for each pixel on a QHD screen, for example). What I am referring to is the 2nd approach described here: . Both the WebVR polyfill and Google VR View use this method, and I notice no performance drop AT ALL when running their examples. The reason I ask is because I am thinking of developing this method for Babylon, simply because the current pixel-based implementation is unfortunately completely unusable. But before I start I'd like to know: is there some underlying problem, inherent to Babylon, that would prevent implementing this method? Thanks
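    For reference, the per-vertex approach evaluates the same radial distortion polynomial as the fragment shader, just once per grid vertex of the warp mesh instead of once per pixel. A sketch of that polynomial with two coefficients k1 and k2 (illustrative, not Babylon's shader code):

```javascript
// Sketch: radial (barrel) distortion applied to a 2D point, the same
// polynomial a mesh-based approach would evaluate at each vertex of a
// 20x20 warp plane instead of at every screen pixel.
function distort(point, k1, k2) {
  const r2 = point.x * point.x + point.y * point.y; // squared distance from lens center
  const scale = 1 + k1 * r2 + k2 * r2 * r2;
  return { x: point.x * scale, y: point.y * scale };
}
```

    Because the polynomial is smooth, linear interpolation between the 400 pre-distorted vertices is visually close to the exact per-pixel result, which is why the mesh-based rigs show no measurable slowdown.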