
Depth-only pass


Kai_

Hi,

 

I am new to Babylon.js and love it for its simplicity and its performance compared to other frameworks like three.js when it comes to WebGL rendering (as opposed to CSS3D or Canvas rendering).

Having quite some experience using desktop OpenGL up to 4.4 and GLSL as well, I recently switched to WebGL to support rendering in the browser client.

 

Now, my question is: how would I attach a GL_DEPTH_COMPONENT24 texture to an FBO and disable color writes via glColorMask using the high-level objects offered by Babylon.js?

I have no trouble rolling my own TypeScript class for that; I just want to know whether something like this already exists and how the API is meant to be used. I had a look at babylon.depthRenderer.ts, but I currently don't know how to compose it into a functioning pipeline with the rest of Babylon.js.

 

What I want to do here is to port my existing hybrid rasterization and path tracing demo (which I did for the LWJGL team here) from desktop OpenGL to WebGL.

And the first thing I am trying to figure out is how to do a depth-only pass with Babylon.js, so that the screen-quad fragment shader can start the shadow rays at the correct scene position. :)

 

Thanks in advance!

 

Cheers,

Kai


Ah, it seems that WebGL does not support rendering depth to texture, as is stated here: http://stackoverflow.com/questions/7255814/webgl-render-depth-to-fbo-texture-does-not-work

And it also seems that I would use scene.enableDepthRenderer().getDepthMap() to get the texture into which the depth.fragment.fx shader writes the depth information (from gl_FragCoord.z).
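If that is the case, the usage would probably be something along these lines (just an untested sketch of mine; the "rayStart" post-process name and the depthSampler uniform are my own, I assume a scene and camera already exist, and I register the fragment shader in BABYLON.Effect.ShadersStore):

var depthRenderer = scene.enableDepthRenderer();   // renders scene depth into a render target
var depthMap = depthRenderer.getDepthMap();        // the RenderTargetTexture holding the depth

// Hypothetical full-screen post-process that will later start the shadow rays;
// for now it just visualizes the depth it samples.
BABYLON.Effect.ShadersStore["rayStartFragmentShader"] =
    "precision highp float;" +
    "varying vec2 vUV;" +
    "uniform sampler2D depthSampler;" +                  // the depth map from above
    "void main(void) {" +
    "    float depth = texture2D(depthSampler, vUV).r;" +
    "    gl_FragColor = vec4(vec3(depth), 1.0);" +       // just visualize the depth for now
    "}";

var postProcess = new BABYLON.PostProcess("rayStart", "rayStart", [], ["depthSampler"], 1.0, camera);
postProcess.onApply = function (effect) {
    effect.setTexture("depthSampler", depthMap);
};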

WebGL/OpenGL ES 2.0 seems like a shock when coming from OpenGL 4.3  :wacko:


Thank you.

 

But nonetheless, great work with Babylon.js to you and the other David!  :)

 

I recently watched you both in the Babylon.js introduction video on Microsoft Virtual Academy; great work with that, too!

 

Babylon.js is the first WebGL framework I have found featuring everything our company needs to build a great web-enabled solution for one of our big customers: visualizing TOGAF IT landscapes as a city, like what CodeCity does, but at a larger scale and hopefully with a bit more fidelity. :)

So it's not going to be a game.

 

But I wanted to have some kind of global illumination, or at least ambient occlusion (not screen-space AO, but real AO), to make the city look more like http://www.local-guru.net/blender/cube_town.png (image).

 

It also seems that WebGL 1.0 / OpenGL ES 2.0 does not support Shader Storage Buffer Objects, so I guess I will be using RGB/XYZ textures to hold the scene objects, which currently consist only of axis-aligned boxes with just min/max corner coordinates.
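Just to sketch the layout I have in mind (my own convention, nothing Babylon-specific: two RGB texels per box, min corner followed by max corner, in a (2·N) x 1 float texture, which needs OES_texture_float; "gl" and "boxes" are assumed to exist already):

// boxes is assumed to be an array of { min: [x, y, z], max: [x, y, z] } objects
var data = new Float32Array(boxes.length * 2 * 3);
boxes.forEach(function (box, i) {
    data.set(box.min, i * 6);      // texel 2*i     : min corner
    data.set(box.max, i * 6 + 3);  // texel 2*i + 1 : max corner
});
var boxTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, boxTex);
// NPOT texture in WebGL 1.0: no mipmaps, NEAREST filtering, CLAMP_TO_EDGE wrapping
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, boxes.length * 2, 1, 0, gl.RGB, gl.FLOAT, data);

In the fragment shader the two corners of box i could then be fetched with texture2D() at x coordinates (2.0 * i + 0.5) / (2.0 * numBoxes) and (2.0 * i + 1.5) / (2.0 * numBoxes).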

Hopefully, ES 3.1 will find its way into WebGL soon.  ;)

 

Cheers,

Kai


I am liking Babylon.js more and more with every second I spend on it!  :)
 
Right now I have decided not to write out depth information from gl_FragCoord.z (which, strangely, no one seems to know exactly what it contains or how to linearize it to get the view-space distance to the camera :) ), but instead to simply render the view-space position into RGB/XYZ.
And for that I need the viewMatrix and the projectionMatrix separately.
And voilà, a look at babylon.scene.ts shows that it provides exactly these via _viewMatrix and _projectionMatrix or their respective getters.
Wonderful!
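So the pass would roughly look like this (my own sketch, not actual Babylon.js code; the attribute/uniform names are placeholders):

// Vertex shader: carry the view-space position over to the fragment shader.
attribute vec3 position;
uniform mat4 view;        // the scene's view matrix
uniform mat4 projection;  // the scene's projection matrix
varying vec3 vViewPos;
void main(void) {
    vec4 viewPos = view * vec4(position, 1.0);
    vViewPos = viewPos.xyz;
    gl_Position = projection * viewPos;
}

// Fragment shader: write the view-space position into an RGB float render target.
precision highp float;
varying vec3 vViewPos;
void main(void) {
    gl_FragColor = vec4(vViewPos, 1.0);
}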

 

 

We are working on SSAO FYI :)

Yes, I saw that SSAO shader there. But speaking of gl_FragCoord.z, I also saw that the depth.fragment.fx shader outputs some "depth" using gl_FragCoord.z divided by "far" (which I guess is the camera far plane distance). But does this fraction make sense? I mean, is it correlated with the actual depth or view distance to the scene? Or am I missing something?

 

I did an experiment with that shader (using a small desktop OpenGL demo) and read back what it outputs into a render target, and those values are quite small.  :)

I mean, gl_FragCoord.z is 0.0 at the near plane and 1.0 at the far plane, so dividing by far will give you values between 0.0 and 1.0/far.
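To make that concrete: with near = 0.1 and far = 100.0, a fragment 50 units in front of the camera already has gl_FragCoord.z ≈ 0.999, so the shader outputs roughly 0.999 / 100.0 ≈ 0.01, which fits the small values I read back.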

 

Anyway, I will use the plain view-space position to avoid the hassle of converting gl_FragCoord.z back to linear view space.  :)


Hi David,
 
thanks for your replies and support so far!
 

 

Actually z is not clamped.

Hmm... empirics say otherwise, I'm afraid.  :unsure:

Also, the GLSL 1.3 spec states in chapter 7.2: "The z component is the depth value that would be used for the fragment's depth if no shader contained any writes to gl_FragDepth."

 

Have you actually tried reading back the values from a floating-point texture render target?

 
I have now tried it on three different PCs with different graphics cards and drivers, and what I get when simply projecting a quad facing the viewer are the following results for this perspective projection setup:

near =   0.1
far  = 100.0

Now the table with the actual values (the first column is the actual z-distance from the camera in linear view space, the second is the gl_FragCoord.z value):

z-dist   gl_FragCoord.z
-----------------------
 0.11    0.09100008
 0.20    0.5005005
 1.00    0.9009009
10.0     0.990991
50.0     0.998999
90.0     0.9998888
99.0     0.99998987
So it seems the values are in fact between 0.0 and 1.0, and non-linear.
This is also what some of my web research regarding what gl_FragCoord.z contains has turned up: [1], [2], [3]
 
The main() of my fragment shader is simply this:
  gl_FragColor = vec4(gl_FragCoord.z, 0, 0, 1);

And I am reading back the values of the floating point texture via glGetTexImage.

 
 
[1] is actually quite good at explaining how to linearize the value again.
There is a comment by oc2ki saying that converting back to linear view space can be achieved via:
float Z = gl_ProjectionMatrix[3].z/(gl_FragCoord.z * -2.0 + 1.0 - gl_ProjectionMatrix[2].z);
But anyway: for doing SSAO with depth-discontinuity checks, I would not rely too much on actual depth values, because of their non-linearity and the precision loss when converting back, but would simply use a G-buffer with the linear view-space z values written out to an RGB/XYZ texture.
It could be combined nicely with the normals, which are needed anyway: normals go into RGB and linear view-space z goes into A.
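Something along these lines for the G-buffer fragment shader (my own sketch; the varying names are placeholders):

precision highp float;
varying vec3 vViewNormal;  // view-space normal from the vertex shader
varying float vViewZ;      // linear view-space z from the vertex shader
void main(void) {
    // normals into RGB, linear view-space z into A
    gl_FragColor = vec4(normalize(vViewNormal), vViewZ);
}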

 

Cheers,

Kai

I have tried it with a native OpenGL ES 2.0 program on a tablet with Tegra K1 CPU/GPU, since WebGL 1.0 is derived from OpenGL ES 2.0.

Furthermore, the OpenGL ES 2.0 Shading Language Specification also states under 7.2 that "The z component is the depth value that will be used for the fragment's depth."

And it would be most inconvenient and unpleasant for a lot of people if Khronos happened to change the semantics of gl_FragCoord.z there between desktop OpenGL and ES.  ;)

But I will now also try it with a native WebGL program and let you know!

 

Cheers,

Kai


Here is a small showcase of rendering into an FBO texture and reading back the values with glReadPixels. It prints the read-back value to the console. In this particular case my output is 0.9989989995956421 for near = 0.1, far = 100.0 and zDist = 50.0.

 

I am using glMatrix.js for the projection matrix and view matrix computation. 

<html>
<body>
<script src="glMatrix.min.js"></script>
<canvas id="webgl" width="1" height="1"></canvas>
<script type="text/javascript">
function draw() {
    try {
        gl = document.getElementById("webgl")
            .getContext("experimental-webgl");
        if (!gl) { throw "x"; }
    } catch (err) {
        throw "Your web browser does not support WebGL!";
    }

    var zDist = 50.0;
    var near = 0.1;
    var far = 100.0;

    // Query extension
    var OES_texture_float = gl.getExtension('OES_texture_float');
    if (!OES_texture_float) {
        throw new Error("No support for OES_texture_float");
    }

    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.FLOAT, null);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
    var prog = gl.createProgram();
    var addshader = function(type, source) {
        var s = gl.createShader(type);
        gl.shaderSource(s, source);
        gl.compileShader(s);
        if (!gl.getShaderParameter(s, gl.COMPILE_STATUS)) {
            throw "Could not compile "+type+" shader:\n\n"+gl.getShaderInfoLog(s);
        }
        gl.attachShader(prog, s);
    };
    addshader(gl.VERTEX_SHADER,
        "attribute vec3 pos;"+
        "uniform mat4 projection;"+
        "uniform mat4 view;"+
        "void main() {"+
        "    gl_Position = projection * view * vec4(pos, 1.0);"+
        "}");
    addshader(gl.FRAGMENT_SHADER,
        "void main() {"+
        "    gl_FragColor = vec4(gl_FragCoord.z, 0, 0, 0);"+
        "}");
    gl.linkProgram(prog);
    if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
        throw "Could not link the shader program!";
    }
    gl.useProgram(prog);

    var mvMatrix = mat4.create();
    var pMatrix = mat4.create();
    mat4.lookAt([0.0, 0.0, zDist], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0], mvMatrix);
    mat4.perspective(45, 1, near, far, pMatrix);

    gl.uniformMatrix4fv(gl.getUniformLocation(prog, 'projection'), false, pMatrix);
    gl.uniformMatrix4fv(gl.getUniformLocation(prog, 'view'), false, mvMatrix);

    gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
        -1, -1, 0,
         1, -1, 0,
         1,  1, 0,
        -1,  1, 0
    ]), gl.STATIC_DRAW);
    var attr = gl.getAttribLocation(prog, "pos");
    gl.enableVertexAttribArray(attr);
    gl.vertexAttribPointer(attr, 3, gl.FLOAT, false, 0, 0);

    gl.viewport(0, 0, 1, 1);
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
    var pixels = new Float32Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.FLOAT, pixels);
    console.log(pixels[0]);
}
draw();
</script>
</body>
</html>

Unfortunately, the above does not work on Firefox 35, but it does on Chrome 41, 42 (Canary) and IE11, because (and that is just unbelievable... :) WebGL... *sigh*) the WebGL spec does not define color-renderable floating-point formats and also does not define FLOAT as a valid type for readPixels. Under 5.14.12 "Reading back pixels" it states: "Only two combinations of format and type are accepted. The first is format RGBA and type UNSIGNED_BYTE. The second is an implementation-chosen format."


Yeah,  ;) I guess I've spent too many years of my life reading OpenGL core and extension specs.  :D

Was just not up-to-date on the specifics of ES 2.0 and WebGL. Have to catch up on that.

But I would very much like to help you guys out on Babylon.js, as I personally think it's a great undertaking to further WebGL usage and development and it is just what I need right now. If I can I will contribute my soon-to-come path tracing renderer with a first ambient occlusion integrator to Babylon.


Hello and welcome Kai_ :)

 

The depth calculation should be:

float depth = gl_FragCoord.z / gl_FragCoord.w;

gl_FragCoord.z behaves the same way in OpenGL and OpenGL ES: it stores the z value of gl_Position, but divided by gl_Position.w.

After each vertex transformation (I mean at the end of the vertex shader's main function), OpenGL divides the (x, y, z) of gl_Position by w and sets w to 1 / w.

 

So, to find the value of gl_Position.z before the division, you must multiply gl_FragCoord.z by gl_Position.w. Then:

float depth = gl_FragCoord.z / (1 / gl_FragCoord.w) = gl_FragCoord.z * gl_FragCoord.w;

According to the depth renderer, if we want a depth value in the [0, camera.maxZ] interval, we should do:

float depth = (gl_FragCoord.z / gl_FragCoord.w) / far;

Hope it helped you! :)


Hello Luaacro,
 
many thanks for your suggestion!
 
In fact, I also tried what you suggested not long ago, as it was proposed by a variety of stackoverflow, gamedev and other posts on the web.
But again, empirically, this also does not give linear view-space z.

And there are people on the web trying to explain why that is:
 
 
 
The whole back-to-linear-view-space computation is a whole lot more complicated, as the opengl.org wiki article here tries to explain with a lot of math:
 
 
Here is also an interesting post explaining in detail the computation of gl_FragCoord.z from the gl_Vertex attribute:
 
 

it [gl_FragCoord.z] stores the z value of gl_Position but divided by gl_Position.w

In fact, gl_FragCoord.z is the gl_Position.z before the division made by OpenGL.

According to the mentioned articles (and I have yet to find the relevant section in the OpenGL spec, so I would be very glad if someone could point me to it), I think this is wrong.

The OpenGL ES 2.0 spec is only saying under section 3.8: "The built-in variable gl_FragCoord holds the window coordinates x, y, z, and 1/w for the fragment."

So we just have to figure out what "window coordinates" means for the z-dimension.  :wacko:
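For reference, the relation my web research keeps turning up (my own summary, assuming the default [0, 1] depth range and a standard symmetric perspective projection, with near and far passed in as uniforms) would be:

float ndcZ  = gl_FragCoord.z * 2.0 - 1.0;                             // back to NDC [-1, 1]
float viewZ = 2.0 * near * far / (far + near - ndcZ * (far - near));  // positive view-space distance

I still have to verify that one against my table above, though.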
 
But I think we are getting there! I guess  ;)
 
Cheers,
Kai

Okay, guys. The answer was right in front of me, mentioned by oc2ki in the opengl.org thread https://www.opengl.org/discussion_boards/showthread.php/164089-gl_FragCoord-z#post_1159540 (and also by BionicBytes in https://www.opengl.org/discussion_boards/showthread.php/176043-Please-help-gl_fragcoord-to-world-coordinates#post_1229449).

 

I was just too lazy to try it out.  :D 

float Z = gl_ProjectionMatrix[3].z/(gl_FragCoord.z * -2.0 + 1.0 - gl_ProjectionMatrix[2].z);

I tried this and it actually works, regardless of the "near" or "far" settings or the field-of-view:

precision highp float;
uniform mat4 projection;
uniform float far;
void main() {
    float Z = projection[3].z/(gl_FragCoord.z * 2.0 - 1.0 + projection[2].z);
    float clamped = Z/far;
    gl_FragColor = vec4(gl_FragCoord.z, Z, clamped, 0);
}

There, Z gives the actual linear distance along the view-z-dimension from the camera center (not from the near plane!) to the fragment.

And clamped then gives that linear distance mapped to [0...1].

I negated the proposed solution because it was giving me negative values; I guess the z axis flips going into NDC space. This way, the value of Z is always positive.
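As a quick sanity check (my own arithmetic, with the near = 0.1, far = 100.0 setup from above): the projection matrix has projection[2].z = -(far + near) / (far - near) ≈ -1.002 and projection[3].z = -2.0 * far * near / (far - near) ≈ -0.2002. For a fragment at z-distance 10, gl_FragCoord.z ≈ 0.990991, so Z = -0.2002 / (2.0 * 0.990991 - 1.0 - 1.002) = -0.2002 / -0.02002 ≈ 10.0, which is exactly the view-space distance.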

 

Cheers,

Kai


I made a mistake myself when writing my post: gl_FragCoord.z is ALWAYS gl_Position.z / gl_Position.w; I was confused ^^

I don't agree, because (gl_FragCoord.z / gl_FragCoord.w) / far is linearized in the [0, camera.maxZ] interval. In your shader you can try to test a value using the rule of three.

 

They say 

float Z = gl_ProjectionMatrix[3].z/(gl_FragCoord.z * -2.0 + 1.0 - gl_ProjectionMatrix[2].z);

because multiplying the projection matrix by (Px, Py, Pz, 1.0) gives -Pz in the w component of the result. The above line just reverses the calculation by passing the projection matrix to the fragment shader.


Hey Luaacro,

 

thanks for your corrections and clarifications.

 

I may not be that good at math  :) but from what I see of those computations and the values they produce, they do not give the linear view-space z-distance, I am afraid.

Please don't take this the wrong way; I do not want to be offensive or rude in any way.

It's just that I tried (gl_FragCoord.z / gl_FragCoord.w), both with and without dividing by far, and it gives values that do not seem to be directly related to the view-space z-distance. And I have yet to find an interpretation of what these values really mean.  :)

 

For sensible values of far and near, where near is close to zero, it might "look" linear (plus/minus some delta for floating-point imprecision), but it is actually not.

 

And by linear I mean that our proposed 'z' computation should be a linear function of the actual z-distance, and at best the identity function (plus/minus floating-point imprecision).

 

If you want to reproduce it, I include my test source here. It's basically the same as what I posted above, just with your computation added:

function draw() {
    var gl = document.getElementById("webgl").getContext("experimental-webgl");

    var zDist = 20.0; // <- change this to alter the z-distance to the camera
    var near = 10.0;
    var far = 100.0;
    var OES_texture_float = gl.getExtension('OES_texture_float');
    if (!OES_texture_float) {
        throw new Error("No support for OES_texture_float");
    }
    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.FLOAT, null);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
    var prog = gl.createProgram();
    var addshader = function(type, source) {
        var s = gl.createShader(type);
        gl.shaderSource(s, source);
        gl.compileShader(s);
        if (!gl.getShaderParameter(s, gl.COMPILE_STATUS)) {
            throw "Could not compile "+type+" shader:\n\n"+gl.getShaderInfoLog(s);
        }
        gl.attachShader(prog, s);
    };
    addshader(gl.VERTEX_SHADER,
        "attribute vec3 pos;"+
        "uniform mat4 projection;"+
        "uniform mat4 view;"+
        "void main() {"+
        "    gl_Position = projection * view * vec4(pos, 1.0);"+
        "}");
    addshader(gl.FRAGMENT_SHADER,
        "precision highp float;"+
        "uniform mat4 projection;"+
        "uniform float far;"+
        "void main() {"+
        "    float z1 = projection[3].z/(gl_FragCoord.z * 2.0 - 1.0 + projection[2].z);"+
        "    float z1Clamped = z1/far;"+
        "    float z2 = gl_FragCoord.z / gl_FragCoord.w;"+
        "    float z2Clamped = z2 / far;"+
        "    gl_FragColor = vec4(z1, z1Clamped, z2, z2Clamped);"+
        "}");
    gl.linkProgram(prog);
    if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
        throw "Could not link the shader program!";
    }
    gl.useProgram(prog);
    gl.disable(gl.CULL_FACE);
    var mvMatrix = mat4.create();
    var pMatrix = mat4.create();
    mat4.lookAt([0.0, 0.0, zDist], [0.0, 0.0, 0.0], [0.0, 1.0, 0.0], mvMatrix);
    mat4.perspective(45, 1, near, far, pMatrix);
    gl.uniformMatrix4fv(gl.getUniformLocation(prog, 'projection'), false, pMatrix);
    gl.uniformMatrix4fv(gl.getUniformLocation(prog, 'view'), false, mvMatrix);
    gl.uniform1f(gl.getUniformLocation(prog, 'far'), far);

    gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([
        -1, -1, 0,
         1, -1, 0,
         1,  1, 0,
        -1,  1, 0
    ]), gl.STATIC_DRAW);
    var attr = gl.getAttribLocation(prog, "pos");
    gl.enableVertexAttribArray(attr);
    gl.vertexAttribPointer(attr, 3, gl.FLOAT, false, 0, 0);
    gl.viewport(0, 0, 1, 1);
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
    var pixels = new Float32Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.FLOAT, pixels);
    console.log('expected: ' + zDist);
    console.log('z1: ' + pixels[0]); // <- this value should be very close/identical to zDist
    console.log('z1Clamped: ' + pixels[1]);
    console.log('z2: ' + pixels[2]); // <- this value should also be very close/identical to zDist
    console.log('z2Clamped: ' + pixels[3]);
}

Following are some outputs from the program for various values of near, far and zDist.

zDist is the actual view-space z-distance from the camera center to the fragment.

z1 is the computation proposed in the opengl.org thread.

z2 is your computation (.z / .w).

near      far      zDist    z1                   z2
-----------------------------------------------------------------------------
 0.1      100.0    10.0      9.999945640563964    9.90990924835205
 0.1      100.0    20.0     19.999773025512695   19.919918060302734
 0.1      100.0    90.0     90.00005340576172    89.989990234375
10.0      100.0    11.0     11.0                  1.1111115217208862   <- z2 is way off
10.0      100.0    20.0     20.000001907348633   11.111111640930175    <- here, too
10.0      100.0    90.0     90.00000762939453    88.8888931274414      <- z2 a bit off
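(Doing the algebra on the z2 column: for a standard perspective matrix, clip-space w is the positive view-space distance of the vertex, and gl_FragCoord.w is 1 / clip-space w. So gl_FragCoord.z / gl_FragCoord.w is really gl_FragCoord.z * zDist, which only approximates zDist when gl_FragCoord.z is already close to 1.0, i.e. when near is tiny compared to the fragment's distance. That is exactly the pattern in the table: close for near = 0.1, way off for near = 10.0.)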

Maybe I am misunderstanding what your computation is supposed to produce. But what I want is a computation that produces the actual view-space z-distance from gl_FragCoord.z and the other parameters available to a fragment shader (be they implicit or explicit uniforms).

 

Cheers,

Kai


Oh, I know what is happening!

Don't worry, you're not being offensive at all; this is a very interesting question and remark :)

 

In fact, using your method ensures that you'll find exactly the gl_Position.z value, because you're using the original values of the perspective matrix. Using z/w, we'll have a loss of precision due to the division by 1.0.

The division by 1.0 is needed because gl_FragCoord must hold homogeneous coordinates for the other calculations OpenGL (ES) will perform.

 

But, to save performance in your shaders, it is advisable to use z/w.

 

Cheers,


Thanks for your information!
 

Using z/w, we'll have a loss of precision due to the division by 1.0.

Wha? Okay, but then there is a pretty big loss of precision for the fourth row in the values table above...

When the value should have been 11.0 but was actually 1.11??? :) That's almost a hundred percent off...  :)
I still do not believe that .z/.w actually gives us the linear z-distance  :D  and nothing can convince me otherwise anymore.  :P
 
Another performant solution, as also remarked by the same person in that opengl.org post, is to just use an interpolated varying from the vertex to the fragment shader.
That spares us all the hassle of dividing here and projecting there... :)

I mean, really, if we have the vertex information anyway, then why not just use it?  ;)

Then I would just do this in the vertex shader:
attribute vec3 inputVertexPosition;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
varying float viewZ; // z-distance in linear view-space
void main(void) {
  vec4 viewPosition = viewMatrix * vec4(inputVertexPosition, 1.0);
  viewZ = viewPosition.z;
  gl_Position = projectionMatrix * viewPosition;
}
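And the matching fragment shader would then be as simple as this (my sketch; note that view-space z is negative in front of the camera in OpenGL conventions, so negate it if you want a positive distance):

precision highp float;
varying float viewZ; // interpolated linear view-space z from the vertex shader
void main(void) {
    gl_FragColor = vec4(-viewZ, 0.0, 0.0, 1.0); // positive view-space distance in the red channel
}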

Cheers,

Kai


Hey,

 

I'm following this thread closely, and I just tried hacking the current BJS depth renderer a bit: in the depth fragment shader, I changed this line:

float depth = gl_FragCoord.z / far;

Into this one:

float depth = gl_FragCoord.z / gl_FragCoord.w / far;

And it works quite well! Also, much simpler than passing the projection matrix as a uniform.

 

Just thought I'd let you guys know.


Hello jahow,

 

it's great that you are doing this for Babylon.js!

 

Yeah, it's a nice discussion we're having here.  :D

 

And it works quite well! Also, much simpler than passing the projection matrix as a uniform.

Okay, then do it this way, if the end results look okay; that is all that counts at the end of the day in computer graphics. Always has been.  ;)

 

Cheers,

Kai


Ah, no, it's okay. haha.  :)

 

I think I am through with it now, having actually spent two days straight reading specs, doing a whole lot of web research, experimenting here and there, and finally concluding:

ah, what the hell, I'll just pass the view-space position as a varying from the vertex shader.

That settles it for me, at least.

 

But you can go on.  :D

