Showing results for tags 'vr camera'.

Found 4 results

  1. Hi, I noticed that Babylon's VR camera rig implementation calculates the distortion correction inside the fragment shader. Since the calculations are done per pixel, this causes a steep performance drop, especially on high-density screens, which makes the rig unusable on any mobile phone. On the simplest of scenes I get only 30 fps on a Google Pixel. I wonder why this method was chosen over, say, displaying the rendered texture on a subdivided plane (20x20) and performing all the calculations per vertex of that plane. With that method we would perform the calculation roughly 400 times per eye (on a 20x20 mesh), versus over 900,000 times (once for each pixel on a QHD screen, for example). What I am referring to is the 2nd approach described here: http://smus.com/vr-lens-distortion/ Both the WebVR polyfill and Google VR View use this method, and I notice no performance drop AT ALL when running their examples. The reason I ask is that I am thinking of developing this method for Babylon, simply because the current per-pixel implementation is unfortunately completely unusable. But before I start, I'd like to know whether there is some underlying problem, inherent to Babylon, that would prevent implementing it? (A rough sketch of the per-vertex idea follows this list.) Thanks
  2. Hi all, this is my first post after working with Babylon JS for a few months already, and first I would like to express my gratitude to all the contributors who made this great engine possible. Now for the question. For a few days I have been trying, without success, to get good results using VRDeviceOrientationFreeCamera on a simple Google Cardboard. In the end I decided I could go without the barrel distortion effect (which pretty much doubles the frame rate), but now I am stuck with a camera whose FOV is too large (too wide-angle), which makes for a very disorienting experience. Apparently the .fov property does nothing on VRDeviceOrientationFreeCamera (correct me if I am wrong!), so how can I change the lens (FOV)? I tried using RIG_MODE_STEREOSCOPIC_SIDEBYSIDE_CROSSEYED instead of the VR rig, which makes the .fov property work again, but it results in a strange scaling issue and it doesn't react to roll rotation. I also played with all the VRCameraMetrics properties; none had any effect on the field of view. (A sketch of one thing to try with VRCameraMetrics follows this list.) Thanks for your help.
  3. Hello! Thank you for the new version of Babylon.js. But I have a question about the VR camera. In this version the mouse inputs don't work: http://www.babylonjs-playground.com/#VVCUZ Is there any solution to this problem? Thanks,
  4. Hello, I have a project to build in a short time which must support stereoscopic display on a Samsung Galaxy S6 phone paired with the Samsung Gear VR. I recall that last year I was in a dialog with several developers on this forum, including @JCPalmer and @Deltakosh, where it appeared we were considering stopping full support of the Oculus camera specifically and moving to a more generic stereoscopic camera which could be modified to support almost any VR headset using the stereoscopic BJS camera. I don't recall where we ended up, or if/how BJS continued support of the Oculus camera, but I quickly need to build a reasonably simple VFX project for multiple mobile devices (including Sony and other Android tablets, as well as iOS and iPad), and also to detect the Samsung S6 and switch to a stereoscopic camera which supports the Oculus stereoscopic format. I'm also fine with building two different scenes - one for most mobile devices, and a separate scene for the Galaxy S6 paired with the Gear VR. Can anyone provide a code example which renders and displays stereo video on the S6 attached to the Gear VR, as well as which properties are available in the camera to deliver the best stereo imagery to the Gear VR? I know in our past discussions we covered all of the essential settings for a generic stereo camera, such as convergence, divergence, parallax, interaxial separation, etc., but I don't recall defining any settings specific to the Oculus camera or the necessary rendering settings in support of their stereo format. If anyone has a sample scene and/or camera code which has been tested and works with the Gear VR, this would assist me a great deal - as I won't need to spend the time once again discovering what works best for the Oculus (Gear VR) stereo display - and can focus all of my time on the scene. I have a tight schedule to produce a series of stereo effects and controls as a proof of concept, to show that babylon.js is capable of rendering everything the client has spec'd out for the test, and to demonstrate that babylon.js is the best choice of framework to support both 2D and stereo cameras for future projects - and specifically Oculus at this time, since they are currently working directly with Samsung. Also, any assistance with supporting the Samsung Gear VR bluetooth controller would be highly appreciated, as this is the other additional spec which I must deliver in a scene that works for almost any mobile device (no problem there) plus the Oculus camera for the Gear VR and controller. I have no previous experience using the Gear VR controller, and won't receive the S6, Gear VR, and controller until Thursday - and I have a presentation scheduled for this Monday. Not my call, but I don't see where there should be any problems other than my inexperience with actually rendering to the Gear VR and using their bluetooth controller. As always, thanks for any help and/or examples you might provide. (A generic stereoscopic-rig sketch follows this list.) Oh yeah - and please "wish me luck." DB
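
Regarding result 1: below is a rough, untested sketch of the per-vertex approach from the linked smus.com article, written as plain JavaScript and independent of Babylon's internals. The 20x20 subdivision, the k1/k2 coefficients, and the helper names are illustrative assumptions, and whether the forward or inverse distortion should be baked into the positions or the UVs depends on the convention used by the rest of the pipeline.

    // Brown-Conrady radial distortion for a point in lens-centred [-1, 1] space:
    // r' = r * (1 + k1 * r^2 + k2 * r^4)
    function distort(x, y, k1, k2) {
      var r2 = x * x + y * y;
      var scale = 1 + k1 * r2 + k2 * r2 * r2;
      return [x * scale, y * scale];
    }

    // Build texture coordinates for a (subdiv + 1) x (subdiv + 1) grid of vertices,
    // evaluating the distortion once per vertex. For subdiv = 20 that is 441
    // evaluations per eye, instead of one evaluation per screen pixel in a fragment shader.
    function buildDistortedUVs(subdiv, k1, k2) {
      var uvs = [];
      for (var iy = 0; iy <= subdiv; iy++) {
        for (var ix = 0; ix <= subdiv; ix++) {
          var x = (ix / subdiv) * 2 - 1;                // vertex in [-1, 1]
          var y = (iy / subdiv) * 2 - 1;
          var d = distort(x, y, k1, k2);
          uvs.push((d[0] + 1) * 0.5, (d[1] + 1) * 0.5); // back to [0, 1] texture space
        }
      }
      return uvs;
    }

    // Illustrative coefficients only (roughly Cardboard-like, not measured values).
    var uvs = buildDistortedUVs(20, 0.22, 0.24);
    // These UVs would then be assigned to the per-eye display plane, e.g. with
    // plane.setVerticesData(BABYLON.VertexBuffer.UVKind, uvs), so the plane samples
    // the eye's render-target texture through the pre-distorted coordinates.

The point of the sketch is the cost argument from the post: the distortion function runs once per grid vertex rather than once per screen pixel, and the coefficients can be recomputed only when the viewer profile changes.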
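Regarding result 2: one avenue worth trying (not verified here) is to build custom VRCameraMetrics and pass them to the VRDeviceOrientationFreeCamera constructor rather than setting .fov afterwards, since the rig's per-eye projection appears to be derived from the metrics (roughly 2 * atan(postProcessScaleFactor * vScreenSize / (2 * eyeToScreenDistance))). The numbers below are placeholders, and the constructor signature shown is the BJS 2.x/3.x-era one.

    // Untested sketch: supply custom metrics when constructing the camera instead
    // of changing camera.fov afterwards. Values below are placeholders.
    var metrics = BABYLON.VRCameraMetrics.GetDefault();
    metrics.eyeToScreenDistance = 0.062; // larger than the default should narrow the FOV (assumption)

    var camera = new BABYLON.VRDeviceOrientationFreeCamera(
        "vrCam",
        new BABYLON.Vector3(0, 1.7, -5),
        scene,
        false,   // compensateDistortion flag: skip the barrel-distortion post-process
        metrics);
    camera.attachControl(canvas, true);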
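Regarding result 4: in the absence of a tested Gear VR sample, a generic starting point is the side-by-side stereoscopic rig that is available on any Babylon camera. The user-agent check and the interaxialDistance value below are placeholder assumptions, not Gear VR-calibrated settings.

    // Rough starting point for "one scene, two camera modes": detect the S6 / Gear VR
    // case and switch a normal free camera into a side-by-side stereo rig.
    var camera = new BABYLON.FreeCamera("cam", new BABYLON.Vector3(0, 1.7, -5), scene);
    camera.attachControl(canvas, true);

    var looksLikeGalaxyS6 = /SM-G92\d/i.test(navigator.userAgent); // S6 model codes (assumption)
    if (looksLikeGalaxyS6) {
        camera.setCameraRigMode(
            BABYLON.Camera.RIG_MODE_STEREOSCOPIC_SIDEBYSIDE_PARALLEL,
            { interaxialDistance: 0.0637 }); // separation of the two sub-cameras, in scene units
    }

From there, interaxialDistance is the main stereo parameter exposed by the rig; any distortion or timing expected by the Oculus/Gear VR side, and the bluetooth controller input, would still need to be handled separately.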