MUCH faster VR distortion in Vertex shader - possible?


I noticed that Babylon's VR camera rig implementation calculates the distortion correction inside the fragment shader. Since the calculation runs per pixel, it causes a steep performance drop, especially on high-density screens, which renders the rig unusable on mobile phones. On the simplest of scenes, I get only 30 fps on a Google Pixel.

I wonder why this particular method was chosen over, say, displaying the rendered texture on a dense plane (20x20) and performing all the calculations per vertex of that plane. With this method we would be doing the calculation some 400 times per eye (for a 20x20 mesh), versus over 900 000 times per pixel (on a QHD screen, for example).
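To make the idea concrete, here is a minimal sketch of the per-vertex approach: pre-distort the vertices of a grid mesh using a polynomial barrel-distortion model, so the GPU only interpolates between them. The `distort` function, the coefficients `k1`/`k2`, and the grid size are my own placeholder assumptions, not values from Babylon or any headset profile.

```javascript
// Sketch of per-vertex barrel distortion on a grid mesh.
// k1, k2 are hypothetical lens coefficients; real values would come
// from the headset's device profile.
const k1 = 0.22, k2 = 0.24;

// Distort a point (x, y) in lens-centered coordinates, range [-1, 1],
// using a simple radial polynomial: scale = 1 + k1*r^2 + k2*r^4.
function distort(x, y) {
  const r2 = x * x + y * y;
  const scale = 1 + k1 * r2 + k2 * r2 * r2;
  return [x * scale, y * scale];
}

// Build a 20x20 grid (21x21 = 441 vertices) and pre-distort each
// vertex once, instead of distorting every fragment in the shader.
function buildDistortedGrid(segments = 20) {
  const verts = [];
  for (let j = 0; j <= segments; j++) {
    for (let i = 0; i <= segments; i++) {
      const x = (i / segments) * 2 - 1; // map to [-1, 1]
      const y = (j / segments) * 2 - 1;
      verts.push(distort(x, y));
    }
  }
  return verts;
}

const grid = buildDistortedGrid();
console.log(grid.length); // 441 distortion evaluations per eye
```

The trade-off is accuracy: between vertices the distortion is approximated by linear interpolation, which is why the grid needs to be reasonably dense, but 441 evaluations per eye is still orders of magnitude cheaper than one per fragment.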

What I am referring to is the 2nd approach described here:


Both the WebVR polyfill and Google VR View use this method, and I notice no performance drop AT ALL when running their examples.

The reason I ask is that I am thinking of developing this method for Babylon, simply because the current per-pixel implementation is unfortunately completely unusable. But before I start, I'd like to know: is there some underlying problem, inherent to Babylon, that would prevent implementing this method?



Hello, we started developing the distortion correction at the pixel level because it was easier to test and implement.

But you are completely right: it comes at a high cost. I would really appreciate your help if you can provide a per-vertex implementation (which could be turned on/off).

We could even think about using it as the default and letting users opt in to the per-pixel version.

I'd like to help. Right now I'm juggling several deadlines, but when I finish I will take a stab at it.
Before I start diving into thousands of lines of the current Babylon.js code, where would you recommend I look?
I should have no problem writing the actual shader*, but I don't know anything about how PostProcess is currently implemented in Babylon. I guess I should start there?

(EDIT: * actually Google has published the actual shaders as part of their WebVR polyfill)
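For reference, the polyfill-style technique boils down to a vertex shader along these lines. This is my own sketch, not the polyfill's actual source; the attribute names and the `uDistortion` uniform are placeholders, and the radial polynomial mirrors the grid example above:

```glsl
// Sketch: the eye texture is drawn on a dense grid, and each vertex's
// texture coordinate is warped by the lens distortion polynomial.
// The fragment shader then does nothing but a plain texture lookup.
attribute vec2 position;  // grid vertex position in clip space [-1, 1]
attribute vec2 uv;        // undistorted texture coordinate in [0, 1]
uniform vec2 uDistortion; // (k1, k2), placeholder lens coefficients
varying vec2 vUV;

void main() {
  vec2 p = uv * 2.0 - 1.0; // center on the lens
  float r2 = dot(p, p);
  float scale = 1.0 + uDistortion.x * r2 + uDistortion.y * r2 * r2;
  vUV = (p * scale) * 0.5 + 0.5; // back to [0, 1]
  gl_Position = vec4(position, 0.0, 1.0);
}
```

Because the warped coordinates are interpolated across each triangle, all the expensive math runs once per vertex instead of once per fragment.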
