Search the Community

Showing results for tags 'postprocessing'.

Found 4 results

  1. Hello, and first of all sorry for the noob question. I am trying to convert this Three.js shader to Babylon, to no avail. I used … as a base and managed to arrive at …, and now I'm kinda stuck. Can anyone help? TIA
  2. I am really confused about pipeline = new BABYLON.DefaultRenderingPipeline(...) vs. pipeline = new BABYLON.StandardRenderingPipeline(...) vs. postProcess = new BABYLON.ImageProcessingPostProcess(...). To me all of them are post-processing, so I don't understand: when should I use each? Can I combine all three? I see that the standard pipeline has things the default pipeline doesn't have. I also see that the standard pipeline is "no longer maintained", yet it has awesome features like volumetric light, camera dirt, and adaptive HDR. How am I supposed to do all that with the default rendering pipeline? I see no explanation of that. There is also var ssao = new BABYLON.SSAORenderingPipeline(...), making things even more confusing. What is the best practice if I want to combine most of the effects? I want to combine: dirt, adaptive HDR, bloom, vignette, MSAA, grain, ColorCurves, LUT color correction, DOF, and lens flare. Is volumetric lighting gone from the default pipeline too? (I had almost all of these post-processing effects in place in Three.js, and I need them now in my Babylon project.) Thanks in advance. P.S.: I am looking at these pages:
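For what it's worth, most of the effects in that list map onto toggles of a single DefaultRenderingPipeline, while volumetric light and lens flares live in separate classes that can run alongside it. A minimal sketch (assuming an existing scene and camera; property names follow the DefaultRenderingPipeline API, and the second constructor argument enables HDR):

```javascript
// One pipeline instance drives most of the listed effects.
var pipeline = new BABYLON.DefaultRenderingPipeline("default", true, scene, [camera]);

pipeline.samples = 4;                    // MSAA
pipeline.fxaaEnabled = true;             // FXAA antialiasing
pipeline.bloomEnabled = true;            // bloom
pipeline.grainEnabled = true;            // film grain
pipeline.depthOfFieldEnabled = true;     // DOF

// Vignette, color curves, and LUT grading sit on the image-processing stage.
pipeline.imageProcessingEnabled = true;
pipeline.imageProcessing.vignetteEnabled = true;
pipeline.imageProcessing.colorCurvesEnabled = true;
pipeline.imageProcessing.colorGradingEnabled = true; // LUT color correction
```

Volumetric light scattering (BABYLON.VolumetricLightScatteringPostProcess) and lens flares (BABYLON.LensFlareSystem) are not part of the default pipeline, but they are independent objects and can be created in the same scene alongside it.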
  3. I notice that when I use the FxaaPostProcess, my scene retains its alpha transparency. This is important because I am overlaying my scene on a background image behind the canvas. When I use the ColorCorrectionPostProcess, however, the background becomes black. Does anyone have a way around this? If I insert my background into the scene, it will either move with the camera or be affected by the ColorCorrectionPostProcess, neither of which I want.
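One setup worth double-checking (a sketch, not a confirmed fix; whether a given post-process preserves alpha ultimately depends on what its fragment shader writes to the alpha channel) is to request a transparent backbuffer and a fully transparent clear color:

```javascript
// "renderCanvas" is a hypothetical element id for this example.
var canvas = document.getElementById("renderCanvas");

// The third argument is passed through as WebGL context attributes,
// so alpha / premultipliedAlpha control the backbuffer's transparency.
var engine = new BABYLON.Engine(canvas, true, { alpha: true, premultipliedAlpha: false });
var scene = new BABYLON.Scene(engine);

// Alpha 0 so the page background shows through wherever nothing is drawn.
scene.clearColor = new BABYLON.Color4(0, 0, 0, 0);
```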
  4. Hey, I have another question which might be easily answered by somebody who is very familiar with the Pixi sources. I'm in the process of writing some custom classes to render a particle system. The particle rendering is all fine and working, and I've made some pretty good progress so far: I have a small customized PIXI.Container with a PIXI.ObjectRenderer and a custom shader (much like a sprite is rendered through the SpriteRenderer). As I said, this works rock solid at the moment. I have learned a lot from the existing particle/sprite renderers, and I'm really impressed by how many great examples of coding can be found in the PIXI v3 sources.
     NOW: after getting this far, I want more: post-processing of my render result. I thought this would be easy, just plug in some filters and you're done, but there seems to be a problem with my rendering order. Either filters are not applied at all, filters do not behave normally (changing over time) and interfere with sprites on the parent container, or filters only run when I combine two of them. I tried to create a very minimal example, but it's still a lot of code. See this? That rectangle is blurring over time. Or this: a gray filter is applied, but it's still red. Try to apply both filters to the stage: nothing happens. Make a sprite added to the stage visible, and now both filters are triggered.
     My assumption: my custom renderer (RenderTester in the demos) is not called at the right position, so things get mixed up during rendering. I couldn't find any relevant container.children checks, so I don't think that is the cause. I currently override PIXI.Container#_renderWebGL; I had a version where I overrode PIXI.Container#renderWebGL and called the FilterManager myself with push & pop, but I could never get my expected result, which is: 1. render whatever there is to render in my container with the custom ObjectRenderer, then 2. take the result and apply some post-processing (blurring, thresholds, alpha blending, ...). Is my filter approach correct? Or should I create my own RenderTarget workflow, even though the FilterManager is basically doing exactly what I want (managing some RenderTargets to process on)? I compared my Container/ObjectRenderer combination with the existing Sprite/SpriteRenderer and couldn't find anything different that would prevent my container from applying filter post-processing correctly. Any advice? Did I miss something really obvious? Thanks!

     var RenderTester = augment(PIXI.Container, function (uber) {
         this.constructor = function (texture) {
             this._texture = texture;
         };

         // Does this override prevent the filters from being applied?
         this._renderWebGL = function (renderer) {
             renderer.setObjectRenderer(renderer.plugins.tempRenderer);
             renderer.plugins.tempRenderer.render(this);
         };
     });
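One thing worth checking (a guess based on how PIXI v3 batching works, not a verified fix): ObjectRenderers batch their geometry and only draw it on flush(), which normally happens when the renderer switches plugins. If the FilterManager pops its render target before the batch is flushed, the geometry is drawn after the filter pass, and the filter appears to do nothing. Flushing explicitly inside _renderWebGL would rule that out (tempRenderer is the custom plugin name from the post above):

```javascript
this._renderWebGL = function (renderer) {
    renderer.setObjectRenderer(renderer.plugins.tempRenderer);
    renderer.plugins.tempRenderer.render(this);

    // Force the batched geometry to be drawn now, while the
    // FilterManager's render target is still bound.
    renderer.plugins.tempRenderer.flush();
};
```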