Search the Community

Showing results for tags 'VR'.

  1. We'd like to present another game we created to showcase the codeless VR capabilities of Verge3D for Blender. Naturally, it is Xmas-themed, so you can put on a VR headset such as the Oculus Quest right now (it works without VR too). Pick up a snowball gun and fend off those sneaky snowmen that pop up in the woods! Launch the game in your Oculus or desktop browser: Click here to launch. Oculus controls: stick on either controller – walk/strafe, grip on either controller – pick up the snowball gun, trigger button on either controller – shoot. Desktop controls: WASD – walk/strafe, Shift – run, Space – jump, LMB – shoot. We wish you great holidays and a Happy New Year! Enjoy!
  2. Hello. Let's brainstorm a bit. We can have our HTML5 games on phones using Cordova. But how do we do the same for the standalone Oculus Quest 1 & 2 (which also run Android)? I haven't tried it, but I guess Cordova would make the games run in 2D mode; we would want to force VR mode somehow so players don't have to switch manually. Interesting links I found so far: BabylonJS native with future Android export: https://www.babylonjs.com/native/ General discussion: https://www.reddit.com/r/WebVR/comments/dd8ah6/how_to_publish_webvr_games_as_native_apps/ Native WebXR browser: https://exokit.org => https://github.com/exokitxr/exokit (looks abandoned, no update since April 2020) Regards nora
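     A rough, hedged sketch of the web-side half of this idea: feature-detect WebXR and set up Babylon's XR helper so the player only has to confirm once. Most browsers still require a user gesture before an immersive session can start, so fully automatic VR entry from a Cordova WebView remains an open question; the function name and fallback behaviour below are just illustrative.

     ```javascript
     // Sketch only: detect 'immersive-vr' support and set up Babylon's WebXR helper.
     // Auto-entering VR without a user gesture is generally blocked by browsers.
     async function setUpVR(scene) {
       if (navigator.xr && await navigator.xr.isSessionSupported('immersive-vr')) {
         // Creates the XR experience plus the "enter VR" button overlay.
         const xr = await scene.createDefaultXRExperienceAsync();
         return xr;
       }
       console.log('immersive-vr not supported; staying in 2D mode');
       return null;
     }
     ```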
  3. I recently got one of those VR headsets where you put your phone in for the display. It's like Cardboard, only slightly better, with a sturdier plastic phone holder and a head strap. It's pretty cheap but works surprisingly well with the default YouTube app and some 360 videos. It is possible to create VR apps and demos using HTML5/JavaScript; for example, Google VR has a JavaScript library that can display 360 images, and there are Glam.js demos. So I was wondering: is it possible to create HTML5 VR games? It would be pretty cool for escape-the-room games, where you can look around in 360 degrees. Although technically I guess it's possible, I don't think there's a real market for such games. Seeing as HTML5 is mostly used for small web-portal games, people with an Oculus/HTC Vive probably won't bother with HTML5, and people with a cardboard headset don't have powerful enough phones. Any thoughts?
  4. Hi, can a 360 virtual tour be done in Babylon.js? Have you ever worked on this topic before? I want to build 360 virtual tours and move between the photos by clicking. If you can help, can you contact me?
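     A minimal sketch of one way to do this with Babylon's PhotoDome: show the current panorama and switch to the next one when a hotspot mesh is clicked. The image URLs and hotspot position are placeholders.

     ```javascript
     // Sketch: a PhotoDome for the current panorama plus a clickable hotspot sphere.
     const panoramas = ['pano_01.jpg', 'pano_02.jpg', 'pano_03.jpg']; // placeholder URLs
     let current = 0;
     const dome = new BABYLON.PhotoDome('tour', panoramas[current], { size: 1000 }, scene);

     const hotspot = BABYLON.MeshBuilder.CreateSphere('hotspot', { diameter: 2 }, scene);
     hotspot.position = new BABYLON.Vector3(0, 0, 10); // placeholder position
     hotspot.actionManager = new BABYLON.ActionManager(scene);
     hotspot.actionManager.registerAction(new BABYLON.ExecuteCodeAction(
       BABYLON.ActionManager.OnPickTrigger,
       () => {
         current = (current + 1) % panoramas.length;
         dome.photoTexture = new BABYLON.Texture(panoramas[current], scene);
       }
     ));
     ```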
  5. Is there any reason why sound is not playing in my WebVR demo in the Oculus browser? I already changed video.muted to false and it still won't play. It's the same demo as in the examples, only with another video that has sound. Also, it doesn't autostart even though I told it to; it only autostarts in non-VR mode on my regular laptop. Here's the link :: http://babylontesting.epizy.com/Three.js/skytime-vr/ Here's the link to the Source :: view-source:http://babylontesting.epizy.com/Three.js/skytime-vr/ { NOTE } :: Copy & paste EVERYTHING above including the 'view-source:' portion into a browser
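     One likely culprit is the browser's autoplay policy: the Oculus browser, like most mobile browsers, blocks unmuted video from starting without a user gesture. A hedged sketch, assuming 'video' is the &lt;video&gt; element feeding the texture:

     ```javascript
     // Sketch: start (and unmute) the video on the first user interaction.
     function startVideoOnFirstInteraction(video) {
       const start = () => {
         video.muted = false;
         video.play().catch((err) => console.warn('Video playback was blocked:', err));
         window.removeEventListener('click', start);
         window.removeEventListener('touchend', start);
       };
       window.addEventListener('click', start);
       window.addEventListener('touchend', start);
     }
     ```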
  6. I'm trying to get post-processing working on a project that should also run in VR, but unfortunately things break as soon as I add scene.createDefaultVRExperience() to the project. This happens even if VR is not activated yet. Is there any way to use post-processing and VR together? Here is a playground highlighting the issue: https://www.babylonjs-playground.com/#T49RDX (just uncomment line 9) Thanks!
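     Not a definitive answer, but one workaround to experiment with: keep the post processes in a DefaultRenderingPipeline and re-attach it to the VR camera when VR is entered. A sketch, assuming a pipeline named 'default':

     ```javascript
     // Sketch: move a DefaultRenderingPipeline between the desktop and VR cameras.
     const vrHelper = scene.createDefaultVRExperience();
     const pipeline = new BABYLON.DefaultRenderingPipeline('default', true, scene, [scene.activeCamera]);

     vrHelper.onEnteringVRObservable.add(() => {
       scene.postProcessRenderPipelineManager.attachCamerasToRenderPipeline('default', vrHelper.webVRCamera);
     });
     vrHelper.onExitingVRObservable.add(() => {
       scene.postProcessRenderPipelineManager.detachCamerasFromRenderPipeline('default', vrHelper.webVRCamera);
     });
     ```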
  7. Hello. I have a scene that uses an ArcRotateCamera and also has VR enabled via the VRExperienceHelper. The problem is that when the user enters VR, the VR camera is not looking at the same point as the non-VR camera, and it is also further away. I've tried to handle the onEnteringVR event, but I'm not able to make it work. I've prepared two simple examples to illustrate the problem. In the first sample, without moving/rotating the camera, when the user enters VR they are looking at the sphere, which is OK, but they are further from the sphere, which is not OK. In the second sample, if the user enters VR without moving the camera, they are looking at the opposite side of the sphere, and the position is also wrong. Any ideas on how to set the correct position/orientation on the VR camera when entering VR? Thanks in advance. Regards.
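     A hedged sketch of the usual starting point: copy the non-VR camera's position and aim at its target from inside onEnteringVRObservable. With a real headset the device still drives the look direction, so only the position copy is fully reliable; 'arcCamera' is the assumed name of the ArcRotateCamera.

     ```javascript
     // Sketch: align the VR camera with the ArcRotateCamera when entering VR.
     vrHelper.onEnteringVRObservable.add(() => {
       const vrCamera = vrHelper.webVRCamera;
       vrCamera.position.copyFrom(arcCamera.position);
       vrCamera.setTarget(arcCamera.target.clone()); // headset rotation may override this
     });
     ```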
  8. Hi, I am using Babylon.js with an HTC Vive and WebVR, and I want to pick a point on a mesh (this.wallMesh) with the controller ray. I don't want to use the default VRHelper interactions. I expected the code below to work, but it seems the rotation is wrong, because the ray direction from the device behaves totally weird. Do you have any tips on how to fix it? Thank you a lot.
     const pickMesh = BABYLON.Mesh.CreateSphere('pickMesh', 5, 0.1, this.scene);
     const ray = new BABYLON.Ray(BABYLON.Vector3.Zero(), BABYLON.Vector3.One(), 100);
     this.scene.registerAfterRender(() => {
       ray.origin = controller.devicePosition;
       ray.direction = controller.deviceRotationQuaternion.toEulerAngles();
       const hit = this.wallMesh!.getScene().pickWithRay(ray, (mesh) => mesh === this.wallMesh);
       if (hit) {
         console.log(hit.hit);
         if (hit.pickedPoint) {
           pickMesh.position = hit.pickedPoint;
         }
       }
     });
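     A hedged sketch of the likely fix: a ray direction must be a unit direction vector, so rotating the controller's local forward axis by its rotation quaternion (rather than passing Euler angles as a direction) is the usual approach. Depending on the Babylon version, controller.getForwardRay(length) may do the same in one call.

     ```javascript
     // Sketch: derive the ray direction from the controller's rotation quaternion.
     const rotationMatrix = new BABYLON.Matrix();
     this.scene.registerAfterRender(() => {
       controller.deviceRotationQuaternion.toRotationMatrix(rotationMatrix);
       ray.origin = controller.devicePosition;
       ray.direction = BABYLON.Vector3.TransformNormal(BABYLON.Vector3.Forward(), rotationMatrix);
       // ...pickWithRay on this.wallMesh exactly as in the snippet above
     });
     ```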
  9. Hi! I'm looking for a sight (reticle) that does not follow the scene's geometry the way the gaze tracker mesh does. So I tried to create a plane with a texture, parented to the VR camera, centered and placed in front of it. Here's what I've tried: https://www.babylonjs-playground.com/#Q1VRX3#2 But it isn't always centered, and it doesn't work without a headset connected (e.g. for Google Cardboard). Furthermore, it would be nice if we could get the correct scene.activeCamera in the 'onEnteringVR' event. Thank you so much for your help!!
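     A minimal sketch of a camera-attached reticle, to be re-parented whenever the active camera changes (e.g. from onEnteringVRObservable); sizes and distances are placeholders.

     ```javascript
     // Sketch: a small, always-on-top reticle plane parented to the active camera.
     function attachReticle(camera, scene) {
       const reticle = BABYLON.MeshBuilder.CreatePlane('reticle', { size: 0.05 }, scene);
       const mat = new BABYLON.StandardMaterial('reticleMat', scene);
       mat.emissiveColor = BABYLON.Color3.White();
       mat.disableLighting = true;
       reticle.material = mat;
       reticle.parent = camera;
       reticle.position.z = 2;        // a couple of units in front of the camera
       reticle.renderingGroupId = 1;  // drawn after the rest of the scene
       reticle.isPickable = false;
       return reticle;
     }
     // Usage idea: create it once, then re-parent it to vrHelper.webVRCamera
     // from vrHelper.onEnteringVRObservable.
     ```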
  10. Hi, I would like to prevent the headset from changing the position of the camera and keep only the device's rotation. Did I miss something? Here's a PG of what I've tried: https://www.babylonjs-playground.com/#9RCQVW Tested on Firefox + HTC Vive. Thanks!!
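     One possible approach, sketched under the assumption that the headset offset is exposed as webVRCamera.devicePosition and is applied on top of webVRCamera.position: compensate for it every frame so the view stays anchored while rotation remains live.

     ```javascript
     // Sketch: pin the head to a fixed point; only headset rotation remains.
     const anchor = new BABYLON.Vector3(0, 1.7, 0); // placeholder anchor point
     scene.onBeforeRenderObservable.add(() => {
       if (vrHelper.isInVRMode) {
         const cam = vrHelper.webVRCamera;
         cam.position.copyFrom(anchor.subtract(cam.devicePosition));
       }
     });
     ```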
  11. I borrowed an Oculus Go from work; here are some first impressions and questions. Overall I'm really impressed with the hardware. It's comfortable to wear, and resolution and FPS seem fine to me. My only other VR headset experiences have been Cardboard and a Samsung one (I forget which), and the Oculus Go is better than either of those.
      As a dev device: I've not downloaded any SDK or looked at any dev docs, I just point the device's browser at http://<my laptop's IP address>:8080 and off I go (no JS console though!). It's amazing how much the following one line of code gets you: const vrHelper = scene.createDefaultVRExperience({ createDeviceOrientationCamera: false }); It gives you a VR icon on the screen; when you tap it, the Oculus Go launches into fullscreen VR. Looking around looks great and works seamlessly out of the box. Worth noting that the Oculus Go is a 3-DoF system rather than 6-DoF, meaning it only tracks the orientation of the headset (and the controller too? Need to double-check this), not its position.
      Where I have run into some difficulties is with the controller. I'm currently trying to make it so the user can point at a mesh with the controller and pull the trigger to select a point on that mesh. I'm going to list some of the issues here, bearing in mind I've only had the device for a few hours. If anyone has suggestions, it would be great to hear them.
      1. Getting PickingInfo. So far I've only been able to get PickingInfo with vrHelper.onNewMeshPicked, which only fires when the pointer first moves over the mesh, regardless of whether the user is pressing one of the buttons or not. The callback for picking up button changes, controller.onButtonStateChange, returns info about the button but no pickInfo. I guess I need to call scene.pick inside the button state change callback; I haven't had time to try this yet (see the sketch after this post).
      2. Customizing the controller mesh. When you enable interactivity with vrHelper.enableInteractions(), the default mesh for the controller is a long, thin cylinder that stretches from where you're pointing right up to your eyes. It's really in-your-face, and I haven't found a way to modify or disable it yet. This doesn't have any effect: vrHelper.displayGaze = false; and neither did trying to apply a custom mesh with controller[0].attachToMesh(mesh).
      I've got to be offline for a few hours now, but I'll keep experimenting. If anyone has questions or tips, please do post.
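     For issue 1, a hedged sketch of picking inside the trigger callback using the controller's forward ray (observable and method names taken from Babylon's WebVR controller API; not verified on a Go):

     ```javascript
     // Sketch: pick along the controller ray when the trigger is pressed.
     vrHelper.onControllerMeshLoadedObservable.add((controller) => {
       controller.onTriggerStateChangedObservable.add((state) => {
         if (state.pressed) {
           const ray = controller.getForwardRay(100);
           const pickInfo = scene.pickWithRay(ray, (mesh) => mesh.isPickable);
           if (pickInfo && pickInfo.hit) {
             console.log('picked point', pickInfo.pickedPoint);
           }
         }
       });
     });
     ```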
  12. I'm able to pick sprites with scene.pickSprite, but is there a way to pick them with scene.pickWithRay? pickWithRay only seems to return meshes (I've set isPickable to true on the sprites and confirmed that they're pickable with scene.pickSprite). The use case is that I'm trying to pick sprites with a 3-DoF VR controller, so I don't have a 2D pointerX and pointerY to work with, only the controller's forward ray. I'm thinking my options are: a. Use Vector3.Project to project the pickedPoint of pickWithRay back to screen space (I'm using a skybox, so the ray will always hit something), then put that point into scene.pickSprite. I tried this quickly, but I think the projected point might be off. b. Replace the sprites with billboarded mesh instances that can be hit directly with pickWithRay. I suspect this might be the easiest option? Or is there some easier way to pick a sprite with a ray?
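     A hedged sketch of option (a). Caveat: in VR the active camera is a two-camera rig, so the projected coordinates may well be off, which would make option (b) the more robust path.

     ```javascript
     // Sketch: ray-pick the scene, project the 3D hit point to screen space,
     // then use those 2D coordinates for sprite picking.
     const pickInfo = scene.pickWithRay(ray); // the skybox guarantees a hit
     if (pickInfo && pickInfo.pickedPoint) {
       const engine = scene.getEngine();
       const screenPos = BABYLON.Vector3.Project(
         pickInfo.pickedPoint,
         BABYLON.Matrix.Identity(),
         scene.getTransformMatrix(),
         scene.activeCamera.viewport.toGlobal(engine.getRenderWidth(), engine.getRenderHeight())
       );
       const spritePick = scene.pickSprite(screenPos.x, screenPos.y);
       if (spritePick && spritePick.hit) {
         console.log('picked sprite', spritePick.pickedSprite.name);
       }
     }
     ```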
  13. Hi, just found this bug: an error breaks the scene if we try to clone a mesh while in VR mode. Here's a PG: https://www.babylonjs-playground.com/#K7YG35 Thanks!!
  14. Hi everyone! I'm trying to move between two cubemaps with an HTC Vive. There are two interactions available: click with the controller on a plane, or use the gaze tracker on that same plane. Both cases work perfectly, but: the controller's ray is blocked by the cube (even with isBlocker = false), and the gazeTrackerMesh is not visible on my plane. Here is a PG: https://playground.babylonjs.com/index.html#M9YVIX
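     Not a confirmed fix, but one thing to try: make the cube non-pickable and restrict the VR helper's selection predicates to the interactive plane ('cube' and 'plane' are assumed mesh variables; the predicates are properties of the VRExperienceHelper).

     ```javascript
     // Sketch: keep the surrounding cube from swallowing the controller ray and gaze.
     cube.isPickable = false;
     vrHelper.raySelectionPredicate = (mesh) => mesh === plane;
     vrHelper.meshSelectionPredicate = (mesh) => mesh === plane;
     ```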
  15. Hi, how about an option in vrHelper to force the height of the camera, even if isInVRMode is true? I'm testing my scene seated, but I want a standing view inside VR. So it would be nice to have a forceDefaultHeight option in webVROptions to always be standing. Thank you!!
  16. Hi, first, great job on the VR implementation!! It's incredibly simple and it works perfectly!! I would like to customize the mesh created in the "_createTeleportationCircles" function. Is it possible to have an option for this? (Or can I already do it?) Many thanks!!
  17. Hello, we are building our scene from data coming from a third-party application and we would like to navigate through the scene in VR. We are using the WebVR Experience Helper with teleportation enabled; it works pretty well apart from one small issue. Calibrating the objects in the environment to the user's height, so that objects that are supposed to be at eye level actually appear at eye level in VR, is something we have to do manually right now. Is there a standard height (from the ground) that the user's eye level is at by default? Or does this depend on ground size/subdivisions? Also, does 1 unit in BabylonJS map to a real unit (feet/metre/etc.) in VR? Let me know if my question seems a little unclear. We are using Windows Mixed Reality headsets for this.
  18. https://gaming.youtube.com/watch?v=cGwsAGfm2MI
  19. If I use VR normally in any Oculus app, meshes don't flicker no matter how far away they are, even with reflective materials or advanced shaders. But when using Babylon WebVR, meshes that are far away flicker heavily. What is causing that flickering? Is it aliasing? I would think aliasing should affect all edges, not just distant meshes. Also, this only appears inside the VR HMD, not in the split-screen view in the browser on the PC display (hence I couldn't get a screenshot either). I believe this has something to do with GPU parameter settings (anisotropic filtering? aliasing level?).
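     A hedged guess plus sketch: shimmering that only affects distant geometry is often depth-buffer z-fighting, which gets worse as the near/far ratio grows. Tightening the camera's depth range or enabling the logarithmic depth option on materials are common mitigations ('vrHelper' and 'material' are assumed variables; the values are placeholders).

     ```javascript
     // Sketch: reduce z-fighting by tightening the depth range and/or using log depth.
     vrHelper.webVRCamera.minZ = 0.5;      // raise the near plane as far as the scene allows
     vrHelper.webVRCamera.maxZ = 500;      // pull in the far plane
     material.useLogarithmicDepth = true;  // available on Standard/PBR materials
     ```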
  20. If we use MirrorTexture-based reflections on any planar surface while using WebVRFreeCamera (and viewing in the VR HMD), the reflected image is different for each eye. So it looks inconsistent and random; it can't be perceived as a reflection, it just looks like random artefacts, because the reflections don't match. I can't work out whether the mirror texture should be obtained separately for each eye using the two-camera rig (which is what Babylon currently seems to do) or whether the reflection texture should be the same for both eyes. If it does need to be obtained with the two-camera rig (a separate reflectionTexture for each eye), then maybe the rig has a position issue and the positions of the internal mirrorTexture capture cameras need to be corrected. All I can say for sure is that the mirrorTexture looks wrong when checked in the VR HMD. @RaananW, mentioning you because of WebVRFreeCamera.
  21. Chrome supports WebVR for Oculus via SteamVR. I tried that with Oculus Touch controllers. Since Oculus support in Chrome goes through SteamVR (OpenVR), the Oculus controllers are detected as Vive controllers in Babylon's WebVR implementation. It's not just that the controller 3D models are wrongly detected/displayed: the Oculus controller inputs/keys that we have set up for our app, and that work fine in Firefox, don't work when using the same Oculus controllers in Chrome + SteamVR. I know we should wait for official Chrome support for Oculus; just pointing it out here in case it matters.
  22. Hey everyone, we are building an environment that contains many thin lines and grills. In normal mode everything looks fine (with Babylon's AA enabled), but after entering VR mode (using the VRHelper) jagged edges become very apparent, to the point that it's impossible to distinguish some of the details. I have made a few screenshots with a test environment to demonstrate the problem. Solutions like FXAA plugged into the post-processing pipeline make everything blurrier and worse. I'm wondering why this is happening (are the stereo cameras rendering at very low resolution?) and whether there is a way to deal with it. It would be OK to take some performance hit and render everything internally at a much higher resolution, for example. What are good practices for fighting jagged edges in VR in general? Many thanks in advance!
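     Two knobs worth trying, sketched here: lower the engine's hardware scaling level so the scene renders internally at a higher resolution (effectively supersampling, at a real GPU cost), and/or enable MSAA on a rendering pipeline. Whether either helps the stereo render targets in this particular setup is untested.

     ```javascript
     // Sketch: supersample by rendering at twice the resolution in each dimension...
     engine.setHardwareScalingLevel(0.5);

     // ...and/or enable MSAA on a default rendering pipeline (WebGL2 required).
     const pipeline = new BABYLON.DefaultRenderingPipeline('aa', true, scene, scene.cameras);
     pipeline.samples = 4;
     ```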
  23. Perplexus in VR mode. Here's how I played it.
  24. For a normal camera, we can have a bounding mesh with collisions enabled and that way restrict camera movement to a certain area. But the WebVR camera is free and doesn't respect collisions, so how do we restrict the user's movement within the model? Note: this is not about room scale and the guardian system; this is about restricting camera movement, including teleportation, to a certain area inside VR.
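     Two hedged ideas, sketched below: pass only the allowed floor mesh to teleportation, and clamp the VR camera to a bounding box every frame as a catch-all. The bounds and the mesh name are placeholders.

     ```javascript
     // Sketch: restrict teleportation to a specific floor mesh...
     vrHelper.enableTeleportation({ floorMeshName: 'allowedGround' });

     // ...and clamp the camera position inside an allowed box each frame.
     const min = new BABYLON.Vector3(-5, 0, -5);
     const max = new BABYLON.Vector3(5, 3, 5);
     scene.onBeforeRenderObservable.add(() => {
       const p = vrHelper.webVRCamera.position;
       p.x = BABYLON.Scalar.Clamp(p.x, min.x, max.x);
       p.y = BABYLON.Scalar.Clamp(p.y, min.y, max.y);
       p.z = BABYLON.Scalar.Clamp(p.z, min.z, max.z);
     });
     ```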
  25. This post is to help anybody who would like to debug/work on GearVR or other mobile devices over WiFi. There is a pull request that was approved today. The code to implement was literally 10 minutes of mostly copy/paste, while the setup took over an hour to figure out, so I wanted to share it (also for my forgetful self in the future!).
      What does work on GearVR: the trigger button will select meshes, and the trackpad lets you rotate. At least with the trigger button working it is somewhat usable now! The todo list at least contains:
      - Get a 3D model.
      - onTrackpadChangedObservable (doesn't exist yet, but would be called). The trackpad axis could then show on the mesh where it is selected (i.e. a small circle).
      - Position on the gamepad is always null by design. It should be held in the hand set in the constructor, which I think defaults to 'right'. That will fix the rays coming from the eyes and make the model visible. I think we should choose a position in front, and the model should align with the forward ray from that point.
      - Using the headset control I was triggering scene pick events, but they did not seem to be where I was looking, so support for the GearVR headset (without a controller) still needs to be added.
      I just wanted to outline my workflow that lets you debug your GearVR from Chrome on your desktop. If you connect your device directly via USB with USB debugging enabled you can already do this, and most of us already do. With the GearVR, however, you will need to debug over WiFi, because the USB connection on the GearVR headset is for charging only (and the GearVR headset uses your phone's USB connector). First you need adb (install the Android SDK or download the platform-tools).
      Step 1. Connect your Android device to USB and set the TCP port:
      C:\Program Files (x86)\Android\android-sdk\platform-tools>adb devices
      List of devices attached
      * daemon not running. starting it now on port 5037 *
      * daemon started successfully *
      9889234230424e3755 device
      C:\Program Files (x86)\Android\android-sdk\platform-tools>adb tcpip 5555
      Step 2. Disconnect your Android device from USB. Ensure you are connected to the same WiFi network as your desktop. Get your phone's IP address from Settings -> About -> Status (on the S8).
      C:\Program Files (x86)\Android\android-sdk\platform-tools>adb connect 192.168.1.77
      connected to 192.168.1.77:5555
      C:\Program Files (x86)\Android\android-sdk\platform-tools>adb devices
      List of devices attached
      192.168.1.77:5555 device
      Note: you may need to reconnect your device during development, so leave this prompt open so you can reconnect.
      Open chrome://inspect on your desktop and make sure you are on the Devices tab. Click port forwarding to allow accessing e.g. localhost:3000. Open Samsung Internet to http://127.0.0.1:3000 (the loopback address instead of 'localhost'). You should see something like the following under Remote Target in the lower part of the chrome://inspect tab:
      SM-G950W #192.168.1.65:5555
      com.sec.android.app.sbrowser.beta
      [ ] 'Title of page' http://127.0.0.1:3000 <inspect>
      Chrome ...
      com.brave.browser ...
      Also, any other debuggable Chromium browsers running will show up (e.g. Chrome, Brave, etc.). Note that SM-G950W will be your phone model identifier, followed by your phone's WiFi IP address. Click the inspect link to open DevTools. You can now simulate clicks on your phone in the emulator and view the console. This is how you would, for example, see the button indexes to map to the functionality you need.
      If you are working on BabylonJS itself, you can get away with just 'npm install' and 'gulp typescript', then symbolically link (or copy) the preview release into your project's node_modules. Otherwise you can use the local PG in BabylonJS. I've tried both and they work well.