
Implementing custom Shader Effects; extending the WebGlRenderer existing pipelines; where did EffectLayer go?


the-simian

Hello, I've been looking into this, and I was looking for some insight on how to use or extend the shader pipeline in Phaser 3. Just to cover my bases, here are a few things I've found. The first was this article: https://phaser.io/phaser3/devlog/75 . I am assuming that the EffectLayer is no longer there, as I no longer see it in the documentation. I did see some of the examples of the lighting pipeline and was able to get that working, whereby you could target an image, call `setPipeline('Light2D')`, and get normal maps to work properly.

What I am asking is: how would one add more 'pipelines' or integrate their own shaders into a Phaser 3 game? A good example might be implementing an underwater effect, or doing occlusion lighting with decent performance (sort of like illuminated.js).

I noticed the pipelines being registered here: https://github.com/photonstorm/phaser/blob/6e82760c997a1006f6d58a99cf7c58bb52d4b4aa/src/renderer/webgl/pipelines/index.js , but I didn't see the right way to add more, nor any real examples of the pipelines besides the `ForwardDiffuseLightPipeline` (which is the `Light2D` pipeline).

 

I did notice there was a pipelines registry here: https://github.com/photonstorm/phaser/blob/cf8e2cfd60b1202483ac596d7bd0bb1110e80c8d/src/renderer/webgl/WebGLRenderer.js#L451 , but I wasn't 100% sure how to register more, or whether there is a correct convention for them.

It's totally possible I've missed something here or there in the docs, and I'd love someone to point me in the right direction if they know more. If anyone wants to see an example of the Light2D pipeline working, I'd be happy to share. One question for the authors of the engine, if you see this: why did you hardcode the light limit in that pipeline to 10? Performance, or another consideration?

Thanks in advance for any guidance.


I've been working on this for a few days, and I've managed to get some answers. I'll post what I know, but I think I'll need to follow up with better information later, when I have time to compose it.

OK, so the first thing: EffectLayer is totally gone. The entire system for working with WebGL was overhauled, and it's significantly better now in Phaser 3. I can't speak to performance, but the organization seems better. Unfortunately there's zero documentation here, so I'm just reading the source and testing.

As for extending or adding a custom pipeline, you can do something like this:

inside of your game config...

 

window.Phaser = Phaser;
//just a fairly normal config here.
const config = {
  type: Phaser.AUTO,
  width: constants.WIDTH,
  height: constants.HEIGHT,
  physics: {
    default: 'arcade',
    arcade: {
      gravity: { y: 300 },
      debug: false
    }
  },
  scene: [GameScene],
  callbacks: {
    //THIS IS THE PART I AM TALKING ABOUT!
    postBoot: game => {
      game.renderer.addPipeline('Custom', new CustomPipeline(game));
    }
  }
};

const game = new Phaser.Game(config);
window.game = game;


You can use the postBoot callback to add more pipelines.
 

Now, where you might have used `.setPipeline('Light2D')` or somesuch, you can instead use `.setPipeline('Custom')`.

As for the contents of that pipeline, you'll want to write it like the existing pipelines. I'm still experimenting with this part and will follow up for anyone else reading who is curious how to do this.
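To give a flavour of what I mean, here's a rough sketch of a custom pipeline. This is based on my reading of the source rather than any docs, so treat it as an assumption: I'm building on `TextureTintPipeline` (the default texture pipeline) and passing a custom `fragShader`. The grayscale effect and the name `createGrayscalePipeline` are just placeholders of mine.

```javascript
// A minimal grayscale fragment shader. The uniform/varying names
// (uMainSampler, outTexCoord) match what TextureTintPipeline appears
// to use internally -- verify against your Phaser version.
const grayscaleFrag = `
precision mediump float;
uniform sampler2D uMainSampler;
varying vec2 outTexCoord;
void main(void) {
  vec4 color = texture2D(uMainSampler, outTexCoord);
  float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
  gl_FragColor = vec4(vec3(gray), color.a);
}
`;

// Browser-only: call this from the postBoot callback shown above.
// Extending/configuring TextureTintPipeline is an assumption on my
// part; it's the pipeline the stock sprite rendering uses, so it
// seems a sensible base for texture effects.
function createGrayscalePipeline(game) {
  return new Phaser.Renderer.WebGL.Pipelines.TextureTintPipeline({
    game: game,
    renderer: game.renderer,
    fragShader: grayscaleFrag
  });
}
```

Then in postBoot you'd do something like `game.renderer.addPipeline('Gray', createGrayscalePipeline(game));` and later `sprite.setPipeline('Gray')`.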
 


  • 6 months later...

Hi the-simian, thanks for this post, it really helped me!
Do you know how I can use two custom pipelines at the same time?

This is what I've tried:

// This works
this.cameras.main.setRenderToTexture(this.customPipeline);
// This works, but replaces the previous one
this.cameras.main.setRenderToTexture(this.customPipeline2);

// This breaks
this.cameras.main.setRenderToTexture([this.customPipeline, this.customPipeline2]);

Thanks!


  • 1 month later...

colormono, this might be a little late, but I'll post here in case anyone else comes a-googling.

One way to use two pipelines is by setting one on the camera, and the other on each sprite/image that needs it.

var customPipeline1 = game.renderer.addPipeline('Custom1', new CustomPipeline1(game));
var customPipeline2 = game.renderer.addPipeline('Custom2', new CustomPipeline2(game));

var mySprite = this.add.sprite(10, 10, 'blah');

mySprite.setPipeline('Custom1');
this.cameras.main.setRenderToTexture('Custom2');

This works for me (though I'm getting some strange Y-axis flipping behaviour in the sprite shader; I've not delved into it further).

It does feel a little hacky, but this will get two shaders going in a pinch.

Note I have no idea whether setting a pipeline on hundreds of individual sprites hurts performance, so use this power wisely :D

 

EDIT:

Ah, one other way is to use multiple cameras placed in the same position. Apply one pipeline to each camera as needed, then use the camera.ignore option to split your sprites/images across the various cameras (see here for camera.ignore: http://labs.phaser.io/edit.html?src=src\camera\ignore.js).

From my testing, one limitation is that you only get one pipeline effect per image/sprite. If a sprite is seen by two cameras, each with a different custom pipeline, you only see the pipeline effect of the camera in front. I assume that's because the original image (texture) is always passed to each camera's pipeline (rather than the output of the "behind" camera's pipeline being fed as input to the "front" camera's pipeline).
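To make the camera idea concrete, here's a sketch of how I'd wire it up inside a Scene's create(). The function name, the pipeline names 'Custom1'/'Custom2', and the sprite groups are all my own placeholders, and this is browser-only Phaser code, so take it as an illustration rather than gospel.

```javascript
// Layer two pipeline effects by giving each camera its own pipeline
// and splitting the display list between them with camera.ignore.
// Call from a Scene's create(), e.g. setupLayeredCameras(this, a, b).
function setupLayeredCameras(scene, spritesA, spritesB) {
  // The main camera renders only spritesA, through the first pipeline.
  scene.cameras.main.ignore(spritesB);
  scene.cameras.main.setRenderToTexture('Custom1');

  // A second full-size camera at the same position renders only
  // spritesB, through the second pipeline.
  var cam2 = scene.cameras.add();
  cam2.ignore(spritesA);
  cam2.setRenderToTexture('Custom2');
  return cam2;
}
```

Remember the limitation above: each sprite still only gets the effect of the frontmost camera that can see it, so keep the two groups disjoint.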

