
Deferred shading: Can I use GeometryBufferRenderer to get the normal sampler if I'm using custom shader code?


leht

Hello,

I'm trying to make deferred shading work in my application. I use a custom shader that writes data into the GBuffer like this:
 

  ....
    //diffuse
    gl_FragData[0] = vec4(colorFromLUT.rgb, alpha);
    //normal
    gl_FragData[1] = vec4(colorFromLUT.rgb, alpha); //vec4(gradientW.xyz, 1.0);
    //position
    gl_FragData[2] = vPos;

....

Then I create a custom PostProcess class:

import { PostProcess, Engine, Effect, Vector2 } from "babylonjs";

export default class DeferredShadingPostProcess extends PostProcess {
	/**
	 * Creates a new instance of DeferredShadingPostProcess.
	 * @param name The name of the effect.
	 * @param scene The scene to apply the render pass to.
	 * @param options The required width/height ratio to downsize to before computing the render pass. (Use 1.0 for full size)
	 * @param camera The camera to apply the render pass to.
	 * @param samplingMode The sampling mode to be used when computing the pass. (default: 0)
	 * @param engine The engine the post process will be applied to. (default: current engine)
	 * @param textureType Type of textures used when performing the post process. (default: 0)
	 * @param blockCompilation If compilation of the shader should not be done in the constructor. The updateEffect method can be used to compile the shader at a later time. (default: false)
	 */
	constructor(name, scene, options, camera, samplingMode = 0, engine = null, textureType = Engine.TEXTURETYPE_UNSIGNED_INT, blockCompilation = false) {
		// Register the fragment shader in the shaders store once
		if (typeof Effect.ShadersStore["deferredShadingFragmentShader"] === "undefined") {
			Effect.ShadersStore["deferredShadingFragmentShader"] = `
uniform sampler2D textureSampler; 
uniform sampler2D normalSampler;
uniform sampler2D positionSampler;
uniform vec3 cameraPosition;

varying vec2 vUV;

void main( void )
{
    vec4 image = texture2D( textureSampler, vUV );
    vec4 position = texture2D( positionSampler, vUV );
    vec4 normal = normalize(texture2D( normalSampler, vUV ));
    
    // vec3 light = vec3(50,100,50);
    // vec3 lightDir = normalize(light - position.xyz);
    
    // vec3 eyeDir = normalize(cameraPosition - position.xyz);
    // vec3 vHalfVector = normalize(lightDir.xyz + eyeDir);
    
    //gl_FragColor = image * max(dot(normal.xyz, lightDir), 0.0) + vec4(pow(max(dot(normal.xyz, vHalfVector),0.0), 100.0) * 1.5);
	gl_FragColor = normal;
}
			`;
		}
		
		if (engine == null) {
			engine = scene.getEngine();
		}

		super(name, "deferredShading", ["screenSize", "cameraPosition"], ["normalSampler", "positionSampler"], options, camera, samplingMode, engine, false, "#define GEOMETRY_SUPPORTED", textureType, undefined, null, blockCompilation);
		
		this._geometryBufferRenderer = scene.enableGeometryBufferRenderer();
		
		if (!this._geometryBufferRenderer) {
			// Geometry buffer renderer is not supported. So, work as a passthrough.
			console.log("Multiple Render Target support needed to compute deferred shading");
			this.updateEffect();
		} else {
			// Geometry buffer renderer is supported.
			this._geometryBufferRenderer.enablePosition = true;

			this.onApply = (effect) => {
				effect.setVector2("screenSize", new Vector2(this.width, this.height));
				effect.setVector3("cameraPosition", this.getCamera().position);

				if (this._geometryBufferRenderer) {
					effect.setTexture("normalSampler", this._geometryBufferRenderer.getGBuffer().textures[1]);
					effect.setTexture("positionSampler", this._geometryBufferRenderer.getGBuffer().textures[2]);
				}
			};
		}
    }
}

When I test by outputting the textureSampler, it shows OK. But when I test by outputting the normalSampler, it just shows a black screen, even if I put the same data in gl_FragData[0] and gl_FragData[1].

I guess I'm misunderstanding the GeometryBufferRenderer; maybe it's only used with StandardMaterial, not a custom ShaderMaterial.

Could you tell me how to set & get the normal buffer in the GBuffer correctly?
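For reference, here is how I understand the built-in renderer is meant to be used (a minimal sketch, assuming an existing Babylon.js scene; the texture indices are what I see in the version I'm on):

// The GeometryBufferRenderer fills its GBuffer in its own render pass, with its own "geometry"
// effect, independently of any gl_FragData writes done by a custom ShaderMaterial in the main pass.
const gbr = scene.enableGeometryBufferRenderer();
if (gbr) {
	gbr.enablePosition = true;           // also render the position texture
	const gBuffer = gbr.getGBuffer();    // a MultiRenderTarget
	// textures[0] = depth, textures[1] = normal, textures[2] = position
	console.log("GBuffer texture count:", gBuffer.textures.length);
}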

@Deltakosh, thank you! Now I realize what the issue is: I didn't change the geometry buffer shader!

How can I use custom shader code rather than the built-in 'geometry' shader, while still making it available to the MRT post process after its pass? Do I need to make a clone of GeometryBufferRenderer?


But what about the different uniforms that I use in my own shader?

Going another way, I tried to clone GeometryBufferRenderer & GeometryBufferRendererComponent, but I still could not use the GBuffer. It shows something on the screen, but it's different from the main inputTexture set on the PostProcess class. My classes are attached below.

I still don't understand how a post process's inputTexture gets set, or how the buffer renderer gets its data into the GBuffer.

helpers.vrBufferRenderer.js

helpers.vrBufferRendererComponent.js
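Here is my rough mental model so far, in case someone can correct it (a simplification; "postProcess" and "bufferRenderer" stand for my post process and cloned renderer instances):

// 1. The buffer renderer owns a MultiRenderTarget (the GBuffer); the scene renders its meshes
//    into it as a render-target pass, using the renderer's own effect, before the main camera
//    pass. That should be how data gets into the GBuffer.
// 2. Because the camera has post processes, the main pass renders into the first post
//    process's inputTexture, and that texture is what gets bound as "textureSampler".
// 3. Any other sampler (normalSampler, positionSampler, ...) only gets data if I bind it
//    myself, for example in onApply:
postProcess.onApply = (effect) => {
	effect.setTexture("normalSampler", bufferRenderer.getGBuffer().textures[1]);
};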


In more detail, what I get when using "textureSampler" (which is bound to the inputTexture) is:

[screenshot attachment]

But if I use the GBuffer, even textures[0], I get:

[screenshot attachment]

One thing I do understand: it shows in the bottom-left corner because I use a camera viewport of 1/4 of the screen (so it renders into a 1/4 corner of the GBuffer). But I don't know why it doesn't show the texture as expected...
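For the viewport part, my plan is to pass the camera viewport to the effect and remap vUV before sampling the GBuffer. A sketch of the idea (the "cameraViewport" uniform name is my own; it would also have to be added to the uniforms list and declared in the fragment shader):

// Inside onApply, in addition to the existing bindings:
const vp = this.getCamera().viewport;    // x, y, width, height in [0..1]
effect.setFloat4("cameraViewport", vp.x, vp.y, vp.width, vp.height);
// and in the fragment shader, remap before sampling:
//     vec2 uv = cameraViewport.xy + vUV * cameraViewport.zw;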


After some hours at night, I found two bugs of my own that caused the buffer renderer to show something different from the original! Thank God, it shows the same picture now!

I can confirm that cloning the GeometryBufferRenderer with our custom shader effect does the job for the GBuffer & deferred shading, so far.


When cloning, there are 3 things that I need to change (sketched right after this list):

- createEffect in the isReady function: change the shader name, the uniforms list, the samplers list, ...

- renderSubMesh: change the uniform bindings; I take most of the values from mesh.material

- the texture count in the GBuffer
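For the first point, the effect compiled in my cloned isReady ends up looking roughly like this ("vrBuffer", "steps" and "lutSampler" are placeholders for my own shader name, uniforms and samplers):

// Sketch of the createEffect change; called from the cloned isReady() with its defines string.
function createVRBufferEffect(scene, defines) {
	const attribs = [
		BABYLON.VertexBuffer.PositionKind,
		BABYLON.VertexBuffer.NormalKind,
		BABYLON.VertexBuffer.UVKind
	];
	return scene.getEngine().createEffect(
		"vrBuffer",                                    // custom shader name instead of "geometry"
		attribs,
		["world", "viewProjection", "view", "steps"],  // original uniforms + my material's uniforms
		["lutSampler"],                                // my material's samplers
		defines
	);
}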

Actually, I still don't know why we need to clone the effect when we already have the material attached to the meshes. Maybe it's to avoid issues that can occur if our scene has more than one mesh with different materials? If so, we still can't be sure that the single effect in this custom buffer renderer can handle those differences.

Maybe it would be better to have a mechanism for cloning effects, and then a buffer renderer that makes it easy to plug in the custom effect we want to clone. As I mention above, the customization points could be "createEffect" & "customRenderFunction".
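Just to illustrate, a purely hypothetical shape for such a helper (nothing like this exists in Babylon.js today):

// Hypothetical API, only to illustrate the idea of pluggable customization points.
class CustomBufferRenderer {
	constructor(scene, options) {
		this._scene = scene;
		this._createEffect = options.createEffect;                  // replaces the hard-coded "geometry" effect
		this._customRenderFunction = options.customRenderFunction;  // replaces the default renderSubMesh binding
		this._textureCount = options.textureCount;                  // how many textures the GBuffer holds
	}
}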

