Showing results for tags 'shader'.

Found 100 results

  1. Hi, I'm trying to reproduce an ocean shader like this one: https://jbouny.github.io/fft-ocean/#day I know we have waterMaterial in Babylon (which is gorgeous!) but it's not quite what I'm after: it only works with windForce/windDirection, which looks like you're constantly moving, or as if the water is constantly flowing. I've tried to use the shader you can find at https://github.com/jbouny/fft-ocean but it's just waaaay over my head! First of all, I can't figure out how to split it into vertex/fragment shaders to use in Babylon. Any shader/GLSL gurus out there, any help hugely appreciated!
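     A possible starting point for the splitting part (a sketch only; the GLSL bodies are placeholders, not the fft-ocean code, and "ocean" is just a name I picked). A ShaderMaterial takes separate vertex and fragment sources, so the fft-ocean displacement would go into the vertex part and the shading into the fragment part:
     BABYLON.Effect.ShadersStore["oceanVertexShader"] = [
         "precision highp float;",
         "attribute vec3 position;",
         "attribute vec2 uv;",
         "uniform mat4 worldViewProjection;",
         "uniform float time;",
         "varying vec2 vUV;",
         "void main() {",
         "    vec3 p = position;",
         "    p.y += 0.1 * sin(p.x * 4.0 + time); // placeholder wave displacement",
         "    vUV = uv;",
         "    gl_Position = worldViewProjection * vec4(p, 1.0);",
         "}"
     ].join("\n");
     BABYLON.Effect.ShadersStore["oceanFragmentShader"] = [
         "precision highp float;",
         "varying vec2 vUV;",
         "void main() { gl_FragColor = vec4(0.0, 0.3, 0.5, 1.0); }"
     ].join("\n");
     var oceanMaterial = new BABYLON.ShaderMaterial("ocean", scene,
         { vertex: "ocean", fragment: "ocean" },
         { attributes: ["position", "uv"], uniforms: ["worldViewProjection", "time"] });
     scene.registerBeforeRender(function () {
         oceanMaterial.setFloat("time", performance.now() / 1000); // drive the waves
     });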
  2. I'm trying to develop something like this using Babylon.js (the example is implemented in Unity 3D): I suppose this can be implemented using a shader. The user can move a sphere with inverted normals, and the shader has to calculate whether this sphere is painted depending on its position in the z-buffer. Is it possible to do this using Babylon's ShaderMaterial? Thanks in advance.
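     One hedged idea for the depth comparison (assuming a ShaderMaterial on the sphere; "depthSampler" and "screenSize" are names I made up):
     // Render the scene depth to a texture and feed it to the sphere's material.
     var depthRenderer = scene.enableDepthRenderer();
     sphereMaterial.setTexture("depthSampler", depthRenderer.getDepthMap());
     sphereMaterial.setVector2("screenSize",
         new BABYLON.Vector2(engine.getRenderWidth(), engine.getRenderHeight()));
     // In the fragment shader, sample depthSampler at gl_FragCoord.xy / screenSize and compare it
     // with this fragment's own depth to decide where the sphere "paints" (check how the depth
     // renderer encodes depth in your Babylon version before comparing).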
  3. Quick question: Is there any way to fade out a ShaderMaterial'd mesh? mesh.material.alpha = X does not seem to have any effect on the alpha value, unless I am doing something wrong. Am I missing something, or is it not possible? (I can probably fake it by adding an alpha uniform to the fragment shader and explicitly passing it in; however, this would be painful to animate.) Also, if I go the uniform route, I'm not sure how much overhead (if any) it adds to set the uniform value for the shader before render. I've noticed that a lot of games similar to mine (digital card games) will have their shader effects timed in sync for a particular effect, which led me to the assumption that there is probably a performance reason behind it (because having the effects fire at different start times would look better). But maybe I'm way off the mark with that assumption; anyone happen to know?
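     For what it's worth, a sketch of the uniform route (assuming the material was created with needAlphaBlending so blending is enabled; setting one float uniform per frame is cheap):
     var mat = new BABYLON.ShaderMaterial("fade", scene,
         { vertex: "custom", fragment: "custom" },
         { attributes: ["position", "uv"],
           uniforms: ["worldViewProjection", "alpha"],
           needAlphaBlending: true });
     // the fragment shader would end with something like: gl_FragColor = vec4(color, alpha);
     var fade = 1.0;
     scene.registerBeforeRender(function () {
         fade = Math.max(0.0, fade - scene.getEngine().getDeltaTime() / 1000.0); // ~1-second fade
         mat.setFloat("alpha", fade);
     });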
  4. Hi all, I found something weird while optimizing my code: I was using InstancedMesh as much as possible, and switched all my materials to ShaderMaterial. And then boom, everything disappeared. After a bit of research, it seems instanced meshes will simply not be drawn when their source has a ShaderMaterial assigned. Playground repro: http://www.babylonjs-playground.com/#TWDEA#6 Uncomment lines 40 to 44 to see for yourself. It's actually a bit more complex than that: when you assign a shader material to it, even the source mesh disappears if it has been instanced (?), but only if its instances are in the field of view. I'm currently looking at the BJS code and the ANGLE_instanced_arrays doc (the extension used for drawing instanced meshes), but I thought I'd come here to fish for ideas... FYI, these are two errors I noticed in the log when this problem showed up (the errors are not always there): "drawElementsInstancedANGLE: at least one enabled attribute must have a divisor of 0" and "glDrawElementsInstancedANGLE: attempt to draw with all attributes having non-zero divisors". Thanks
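     For reference, the engine's own shaders make a vertex shader instance-aware with the instancing includes and the per-instance world0..world3 attributes; whether ShaderMaterial wires this up in your version is exactly what seems to be in question, but the pattern looks roughly like this (a sketch):
     // vertex shader side (the INSTANCES define must be active for instanced draws)
     #include<instancesDeclaration>
     attribute vec3 position;
     uniform mat4 viewProjection;
     void main() {
         #include<instancesVertex>   // builds finalWorld from world0..world3 (or the world uniform)
         gl_Position = viewProjection * finalWorld * vec4(position, 1.0);
     }
     and on the JS side, the instance attributes have to be listed so the effect can bind them (defines shown here only for the instanced case):
     var mat = new BABYLON.ShaderMaterial("inst", scene, { vertex: "inst", fragment: "inst" },
         { attributes: ["position", "world0", "world1", "world2", "world3"],
           uniforms: ["viewProjection"],
           defines: ["#define INSTANCES"] });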
  5. Hello everyone! I'm making a fluid simulation effect, and I found a demo: http://jeeliz.com/demos/water/ that I want to clone. The demo is written with WebGL + shaders, but I don't know WebGL, so I'm worried. I'm trying to port it to Babylon.js and there are a few things I don't know how to do. I'm really stuck, and I hope the Babylon community can help me.
     1.
     var texture_water = GL.createTexture();
     GL.bindTexture(GL.TEXTURE_2D, texture_water);
     GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
     GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
     GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE);
     GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE);
     GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, 512, 512, 0, GL.RGBA, GL.FLOAT, null);
     2.
     var quad_vertex = [-1, -1, 1, -1, 1, 1, -1, 1];
     var QUAD_VERTEX = GL.createBuffer();
     GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
     GL.bufferData(GL.ARRAY_BUFFER, new Float32Array(quad_vertex), GL.STATIC_DRAW);
     var quad_faces = [0, 1, 2, 0, 2, 3];
     var QUAD_FACES = GL.createBuffer();
     GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
     GL.bufferData(GL.ELEMENT_ARRAY_BUFFER, new Uint16Array(quad_faces), GL.STATIC_DRAW);
     GL.vertexAttribPointer(SHP_VARS.rendering.position, 2, GL.FLOAT, false, 8, 0);
     GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
     GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
     GL.disableVertexAttribArray(SHP_VARS.rendering.position);
     3.
     GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
     GL.disableVertexAttribArray(SHP_VARS.water.position);
     GL.framebufferTexture2D(GL.FRAMEBUFFER, GL.COLOR_ATTACHMENT0, GL.TEXTURE_2D, texture_normals, 0);
     GL.useProgram(SHP_NORMALS);
     GL.enableVertexAttribArray(SHP_VARS.normals.position);
     GL.bindTexture(GL.TEXTURE_2D, texture_water);
     GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
     GL.disableVertexAttribArray(SHP_VARS.normals.position);
     GL.bindFramebuffer(GL.FRAMEBUFFER, null);
     GL.flush();
     With the above code, how do I port this to Babylon.js? Thanks so much!
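     A rough mapping to Babylon (hedged; I haven't ported that demo myself): the float texture in (1) corresponds to a RawTexture or RenderTargetTexture, the full-screen quad in (2) is what a PostProcess gives you for free, and the framebuffer ping-pong in (3) is rendering into RenderTargetTextures. A sketch of (1):
     // Float RGBA texture, nearest filtering, clamped wrapping (Babylon counterpart of block 1).
     var waterTexture = new BABYLON.RawTexture(
         null, 512, 512,
         BABYLON.Engine.TEXTUREFORMAT_RGBA, scene,
         false,                                  // no mipmaps
         false,                                  // don't invert Y
         BABYLON.Texture.NEAREST_SAMPLINGMODE,
         BABYLON.Engine.TEXTURETYPE_FLOAT);
     waterTexture.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
     waterTexture.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;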
  6. So, I'm trying to convert a shader from Shadertoy. I'm close but still can't get it working. Also, in my actual scene it doesn't seem to be working at all, but it's hard to tell whether that's related to the issue I'm having with the conversion, since I need to rotate the sphere to get it to show up to begin with. The shader is here (it appears blank at first, but if you rotate it you'll start to see the fire; the actual effect I'm going for you will only see if you rotate it just right, so that the fire starts with the white in the middle and fills up the sphere): http://cyos.babylonjs.com/#M11GKA The source shader is here: https://www.shadertoy.com/view/lsf3RH The one place where I was not sure how to proceed was mapping over the iResolution variable (which Shadertoy states is the viewport resolution). I played around with a bunch of different things and ended up trying the camera input, which works, but requires rotating the mesh to see it at all. Anyone know which input maps over to the viewport resolution (or how to get it), and/or what I am doing wrong/missing here?
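     In case it helps, the usual stand-in for iResolution is a vec2 uniform that you fill with the render size yourself (names here are just my choice; declare "iResolution" in the ShaderMaterial's uniforms list):
     var engine = scene.getEngine();
     function updateResolution() {
         shaderMaterial.setVector2("iResolution",
             new BABYLON.Vector2(engine.getRenderWidth(), engine.getRenderHeight()));
     }
     updateResolution();
     window.addEventListener("resize", updateResolution); // keep it in sync with the canvas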
  7. I'm trying to start with this example: http://phaser.io/examples/v2/filters/blue-dots I downloaded the zip, but it seems to need a PHP server. I tried to run the example using just the files and folders, but that doesn't seem to work. Can you give me some help? Thank you. Best regards.
  8. Error in PBR

    Hi, what is the best way to debug errors in shaders? I have a mysterious error from PBR:
    ERROR: 0:1492: 'glossiness' : undeclared identifier
    ERROR: 0:1492: 'computeHemisphericLighting' : no matching overloaded function found
    ERROR: 0:1492: 'assign' : cannot convert from 'const mediump float' to 'structure'
    ERROR: 0:1573: 'computeLighting' : no matching overloaded function found
    ERROR: 0:1573: 'assign' : cannot convert from 'const mediump float' to 'structure'
    I can't really find the source of it, as the error only shows up if I change the starting camera angle.
  9. Hey Folks! For my custom shader I want to use the object's normals in view space. Therefore I need the gl_NormalMatrix. I know how to construct it (the inverted and transposed MVMatrix), but I don't want to construct it manually for each object. I found this thread, [SOLVED] - Shader Programs Attributes And Uniforms, but there was no hint on what gl_NormalMatrix is called in Babylon. I also searched the Babylon git repository, but could not find where the shader attributes and such are declared. Can anyone please point me in the right direction? Thank you for your time -Mainequin
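     A hedged sketch of what I would try: ShaderMaterial can supply the built-in "worldView" matrix, and for uniformly scaled objects transforming the normal by it (with w = 0) behaves like gl_NormalMatrix; for non-uniform scaling you would still have to compute the inverse-transpose on the CPU and upload it per object (e.g. with setMatrix):
     // vertex shader; list "worldView" and "worldViewProjection" in the uniforms option
     uniform mat4 worldView;
     uniform mat4 worldViewProjection;
     attribute vec3 position;
     attribute vec3 normal;
     varying vec3 vViewNormal;
     void main() {
         vViewNormal = normalize((worldView * vec4(normal, 0.0)).xyz); // fine for uniform scaling only
         gl_Position = worldViewProjection * vec4(position, 1.0);
     }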
  10. I am trying to use this lib: https://github.com/pixijs/pixi-tilemap and it seems to use a shader rather than sprites to render the tilemap? Is that more efficient? Why? The memory usage is the same, right? Thanks.
  11. Here's a helper function I made to create each shader plugin with just a simple function call like this: createShaderPlugin(name, vertShader, fragShader, uniformDefaults); Then you can create each sprite that you want to be drawn with your custom shader like this: var sprite = createShaderPluginSprite(name, size, uniforms); Update: I started a GitHub repo for it here if you want to check it out. I also made this little CodePen to demonstrate the usage with some comments. The encapsulated code is based on the plugin example and on the worldTransform/vertices calculations that I boosted from PIXI.Sprite. I've tried to optimize and condense for the use case where the Sprite doesn't have any Texture to be filtered.
  12. Help with shader waterfall

    Hello guys, I'm stuck with shaders. Maybe you can help me. The objective is to make a waterfall with a shader. As of now I have this frag shader:
    precision mediump float;
    varying vec2 vTextureCoord;
    uniform sampler2D uSampler;
    uniform vec4 filterArea;
    uniform vec2 dimensions;
    uniform float speed;
    uniform float time;
    #pragma glslify: noise = require("glsl-noise/simplex/3d")
    vec2 cartToIso(vec2 pos) {
        vec2 res = pos;
        res.x = pos.x - pos.y;
        res.y = (pos.x + pos.y) / 2.0;
        return res;
    }
    void main() {
        vec2 pixelCoord = vTextureCoord * filterArea.xy;
        vec2 coord = pixelCoord / dimensions;
        vec2 iso = cartToIso(pixelCoord);
        float x = pixelCoord.x * 0.1;
        float y = dimensions.y / pixelCoord.y + (speed * time * 10.0);
        float z = time;
        vec3 vector = vec3(x, y, z);
        vec3 noise = vec3(noise(vector));
        gl_FragColor = vec4(noise, 1.0);
    }
    It gives me a nice waterfall result (video attached). But the target is to make it isometric (in reality dimetric); look at the pic attached. Is there a way to do this? I'd appreciate any help. waterfall.mp4
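    One untested direction, since cartToIso is already there: run the pixel coordinate through it before building the noise input, so the scrolling follows the isometric axes instead of the screen's vertical:
    // inside main(), replacing the x/y computation (a sketch, not verified)
    vec2 iso = cartToIso(pixelCoord);
    float x = iso.x * 0.1;
    float y = dimensions.y / iso.y + (speed * time * 10.0);
    vec3 vector = vec3(x, y, time);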
  13. Car paint shader

    Hi, I started to learn GLSL and shaders, so I decided to port a shader to Babylon. Source: the original WebGL shader from three.js. Babylon.js version: PG. Right now it is just a direct port, but maybe someone can help me merge it better with the Babylon.js ecosystem. (Right now it doesn't work in Mac Safari; I will check why.)
  14. Hi, while I'm studying WebGL, I'm wondering how to use a vertex shader with Pixi.js. Using fragment shaders is quite clear in my mind, but I'm still confused about how to send vertex coordinates to a vertex shader in Pixi.js. For instance, assume we have this vertex shader:
     // vertex shader
     attribute vec3 aVertexPosition;
     uniform mat4 uMVMatrix;
     uniform mat4 uPMatrix;
     void main(void) {
         gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
     }
     and we have these vertex coordinates:
     const vertices = [
         0.0, 1.0, 0.0,
         -1.0, -1.0, 0.0,
         1.0, -1.0, 0.0
     ];
     How could I draw this triangle (this is supposed to be a triangle) with this vertex shader and Pixi.js? Thanks!
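     For completeness, a sketch with the geometry/mesh API (this is the Pixi v5-style API; in v4 the closest equivalents are PIXI.mesh.Mesh or a custom renderer plugin):
     const app = new PIXI.Application();
     document.body.appendChild(app.view);
     const vertexSrc = `
         attribute vec2 aVertexPosition;
         uniform mat3 translationMatrix;
         uniform mat3 projectionMatrix;
         void main(void) {
             gl_Position = vec4((projectionMatrix * translationMatrix * vec3(aVertexPosition, 1.0)).xy, 0.0, 1.0);
         }`;
     const fragmentSrc = `
         void main(void) { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }`;
     const geometry = new PIXI.Geometry()
         .addAttribute('aVertexPosition', [0, -100, -100, 100, 100, 100], 2); // x,y per vertex
     const triangle = new PIXI.Mesh(geometry, PIXI.Shader.from(vertexSrc, fragmentSrc));
     triangle.position.set(200, 200);
     app.stage.addChild(triangle);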
  15. Is there a way to call a specific function, or get an event, when the renderer finishes rendering a specific sprite, or when a shader has been executed?
  16. Shader translation

    Hi, I need help with shaders in Pixi. I'm trying to translate and rotate a sprite with shaders. I am using PIXI.Filter, but I can't find any example with vertex shaders (only fragment examples). Could someone provide an example? Thanks
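    A sketch of a PIXI.Filter with a custom vertex shader (based on the default v4 filter vertex shader, plus an illustrative uOffset uniform; untested):
    const vertexSrc = [
        'attribute vec2 aVertexPosition;',
        'attribute vec2 aTextureCoord;',
        'uniform mat3 projectionMatrix;',
        'uniform vec2 uOffset;   // extra translation in filter-area pixels (made-up name)',
        'varying vec2 vTextureCoord;',
        'void main(void) {',
        '    vec2 pos = aVertexPosition + uOffset;',
        '    gl_Position = vec4((projectionMatrix * vec3(pos, 1.0)).xy, 0.0, 1.0);',
        '    vTextureCoord = aTextureCoord;',
        '}'
    ].join('\n');
    const filter = new PIXI.Filter(vertexSrc, undefined, { uOffset: [20, 0] }); // default fragment shader
    sprite.filters = [filter];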
  17. I know that you can manipulate existing vertices with a shader and make their visual position different from their physical location. Is there a way to deform the space between vertices? I doubt it, but I figured it was worth asking. I'm trying to figure out how to blend terrain without T-junctions and possibly have the GPU handle the whole load.
  18. [solved] Cyos

    Hi guys! I have this problem of [Varyings with the same name but different type, or statically used varyings in fragment shader are not declared in vertex shader:] in the console of the shader playground; here's my code.
    Vertex shader:
    /**
     * Example Vertex Shader
     * Sets the position of the vertex by setting gl_Position
     */
    // Set the precision for data types used in this shader
    precision highp float;
    precision highp int;
    // Default THREE.js uniforms available to both fragment and vertex shader
    uniform mat4 modelMatrix;
    uniform mat4 worldViewProjection;
    uniform mat4 world;
    uniform mat4 viewMatrix;
    uniform mat3 normalMatrix;
    /////
    // Default uniforms provided by ShaderFrog.
    uniform vec3 cameraPosition;
    uniform float time;
    uniform float v;
    // Default attributes provided by THREE.js. Attributes are only available in the
    // vertex shader. You can pass them to the fragment shader using varyings
    attribute vec3 position;
    attribute vec3 normal;
    attribute vec2 uv;
    attribute vec2 uv2;
    // Examples of variables passed from vertex to fragment shader
    varying vec3 vPosition;
    varying vec3 vNormal;
    varying vec2 vUv;
    varying vec2 vUv2;
    void main() {
        // To pass variables to the fragment shader, you assign them here in the
        // main function. Traditionally you name the varying with vAttributeName
        vNormal = normal;
        vUv = uv;
        vUv2 = uv2;
        vPosition = position;
        // This sets the position of the vertex in 3d space. The correct math is
        // provided below to take into account camera and object data.
        gl_Position = worldViewProjection * world * vec4( vPosition, 1.0 );
    }
    Fragment shader:
    precision highp float;
    precision highp int;
    uniform vec2 resolution;
    uniform float time;
    varying vec2 vUv;
    vec3 mod289(vec3 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
    vec4 mod289(vec4 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
    vec4 permute(vec4 x) { return mod289(((x * 34.0) + 1.0) * x); }
    vec4 taylorInvSqrt(vec4 r) { return 1.79284291400159 - 0.85373472095314 * r; }
    float snoise(vec3 v) {
        const vec2 C = vec2(1.0 / 6.0, 1.0 / 3.0);
        const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
        vec3 i = floor(v + dot(v, C.yyy));
        vec3 x0 = v - i + dot(i, C.xxx);
        vec3 g = step(x0.yzx, x0.xyz);
        vec3 l = 1.0 - g;
        vec3 i1 = min(g.xyz, l.zxy);
        vec3 i2 = max(g.xyz, l.zxy);
        vec3 x1 = x0 - i1 + C.xxx;
        vec3 x2 = x0 - i2 + C.yyy;
        vec3 x3 = x0 - D.yyy;
        i = mod289(i);
        vec4 p = permute(permute(permute(i.z + vec4(0.0, i1.z, i2.z, 1.0)) + i.y + vec4(0.0, i1.y, i2.y, 1.0)) + i.x + vec4(0.0, i1.x, i2.x, 1.0));
        float n_ = 0.142857142857;
        vec3 ns = n_ * D.wyz - D.xzx;
        vec4 j = p - 49.0 * floor(p * ns.z * ns.z);
        vec4 x_ = floor(j * ns.z);
        vec4 y_ = floor(j - 7.0 * x_);
        vec4 x = x_ * ns.x + ns.yyyy;
        vec4 y = y_ * ns.x + ns.yyyy;
        vec4 h = 1.0 - abs(x) - abs(y);
        vec4 b0 = vec4(x.xy, y.xy);
        vec4 b1 = vec4(x.zw, y.zw);
        vec4 s0 = floor(b0) * 2.0 + 1.0;
        vec4 s1 = floor(b1) * 2.0 + 1.0;
        vec4 sh = -step(h, vec4(0.0));
        vec4 a0 = b0.xzyw + s0.xzyw * sh.xxyy;
        vec4 a1 = b1.xzyw + s1.xzyw * sh.zzww;
        vec3 p0 = vec3(a0.xy, h.x);
        vec3 p1 = vec3(a0.zw, h.y);
        vec3 p2 = vec3(a1.xy, h.z);
        vec3 p3 = vec3(a1.zw, h.w);
        vec4 norm = taylorInvSqrt(vec4(dot(p0, p0), dot(p1, p1), dot(p2, p2), dot(p3, p3)));
        p0 *= norm.x;
        p1 *= norm.y;
        p2 *= norm.z;
        p3 *= norm.w;
        vec4 m = max(0.6 - vec4(dot(x0, x0), dot(x1, x1), dot(x2, x2), dot(x3, x3)), 0.0);
        m = m * m;
        return 42.0 * dot(m * m, vec4(dot(p0, x0), dot(p1, x1), dot(p2, x2), dot(p3, x3)));
    }
    void main() {
        vec2 div = vec2(10, 10);
        vec2 uv = vUv.xy / resolution.xy * div.xy;
        vec3 v = vec3(uv.x + sin(time) * 0.2, uv.y + cos(time) * 0.2, time / 10.0);
        float noise = snoise(v);
        vec2 resolution = vec2(1, 1);
        uv = vUv.xy / resolution.xy * div.xy;
        vec3 v2 = vec3(uv.x, uv.y, time / 5.0);
        noise = sin(noise * 3.14 * (sin(time) + snoise(v2) * 2.0) * 0.75);
        float darkenFactor = 0.2;
        float darkenValue = darkenFactor;
        div = vec2(5, 5);
        uv = vUv.xy / resolution.xy * div.xy;
        vec3 v3 = vec3(uv.x, uv.y, time / 2.0);
        darkenValue = darkenValue * snoise(v3);
        vec3 v4 = vec3(uv.x * 1000.0, uv.y * 1000.0, time);
        float b = snoise(v4) * 0.1;
        gl_FragColor = vec4(1.0 - darkenValue + (noise * (darkenValue + 0.2)) - b, noise - b, b, 1.0);
    }
    Pinging the shader-lord @NasimiAsl
  19. I want to draw a texture on the line that the user draws on the screen, like a mouse trail, so I followed this tutorial. The only thing it doesn't answer for me is sending custom UVs to the shaders. I vaguely remember that in C++ I can create a struct or something and the shader will know how to handle it; it seems that is not the case here. I can't even figure out how the "index" property in the vertex shader gets there. Even so, I can't use the "index" approach, as the line won't be a full rectangle as in the example. Right now, I copied & pasted the entire lines2d classes and am trying to customize them to be able to draw a texture on there, but I don't know how to send the custom UVs that I've calculated to the shader. Can anyone explain that to me? Or am I looking at this the wrong way?
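     One hedged option (Pixi v4 API): instead of extending the internal line classes, PIXI.mesh.Mesh accepts your own vertices, UVs and indices directly, and its default shader already feeds the UVs through to the fragment shader:
     // two floats per point; update these arrays each frame as the trail changes
     var vertices = new Float32Array([0, 0,   100, 0,   100, 100,   0, 100]);
     var uvs      = new Float32Array([0, 0,   1, 0,     1, 1,       0, 1]);
     var indices  = new Uint16Array([0, 1, 2,   0, 2, 3]);
     var trail = new PIXI.mesh.Mesh(PIXI.Texture.fromImage('trail.png'), // hypothetical asset
                                    vertices, uvs, indices,
                                    PIXI.mesh.Mesh.DRAW_MODES.TRIANGLES);
     app.stage.addChild(trail);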
  20. Shadows in Custom Shader

    Dear Babylon JS community, we as a company have decided that we want to use Babylon JS for a larger project. For this we have specific requirements for what the shaders have to be able to do. I will first state the problem I am trying to solve and then give the context, for possible alternative solutions.
    PROBLEMS: For our more complex shader computations we want to integrate shadows from at least one shadow generator in a custom shader. For reasons of confidentiality I cannot submit our current project code, which is why I created this test playground: http://www.babylonjs-playground.com/#VZKI0U We want to get the influence of all shadows on a fragment as a float value in the shader for further computations. For this we encountered the following problems: - mapping to shadow-map coordinates seems to be wrong - using functions like computeShadow() from #include<shadowsFragmentFunctions> yields a not-declared error - computeShadow() always yields 1.0 as a result.
    COURSE OF EVENTS: We started playing around with the standard material and shadow generators and quickly got them to work. We wrote a small utility function for setting up the shadow generators, which you can find at the top of the linked playground code. After this we played around with uploading textures into our custom shaders and were able to create the desired effects. We looked into uploading the shadow map and the shadow-generator parameters into the shader, which was successful. You can find the uploads at lines 113-115 of the linked playground code. Since we do not want to write the mapping to shadow-map coordinates ourselves, we looked for existing code, which we found in the shadowsVertex.fx, shadowsFragment.fx and shadowsFragmentFunctions.fx files. While trying to get the mapping right, we encountered the aforementioned problems. We were not able to get correct results for the shadow UV coordinates, shader includes like the above-mentioned #include<shadowsFragmentFunctions> yield a "computeShadow() has not been declared" error when used in the code after the statement, and the code we have currently copied from these files always seems to yield 1.0 as a result for the shadow intensity. We are turning to you now because we are at a point where we cannot find the errors in our approach/code anymore. We are required to use Babylon JS version 2.5 in our project; although it didn't seem to make a difference for the shader code we looked through, I wanted to mention it.
    CONTEXT: Our scene is basically shadeless, with multiple materials per object, distributed via a mask. Therefore we combine a precomputed light texture (for individual objects) with a diffuse texture and multiple material textures blended via a mask texture. Since we require no lighting computation, we just want the shadow values to give some visual depth to the scene. Therefore the standard material seems insufficient for our purposes, hence the reliance on a custom shader. I saw code that created a custom material with the standard shaders and then replaced parts of the vertex and fragment code via a function. We would be ready to do this kind of code insertion if it yields correct shadow information. Sadly I cannot find the example project for this anymore, so if you could provide a link to a similar source it would be much appreciated. Thank you sincerely for your time and help. With best regards from the green heart of Germany, The Mainequin Team
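    In case it is useful for cross-checking the playground, this is roughly the manual route I would compare against (a sketch, not verified on 2.5): upload the shadow map and the generator's transform matrix, then do the projection and depth compare by hand instead of relying on the include.
    // JS side
    shaderMaterial.setTexture("shadowMap", shadowGenerator.getShadowMap());
    shaderMaterial.setMatrix("lightMatrix", shadowGenerator.getTransformMatrix());
    and on the shader side:
    // fragment shader; vWorldPos is a varying carrying the world-space position from the vertex shader
    uniform sampler2D shadowMap;
    uniform mat4 lightMatrix;
    varying vec3 vWorldPos;
    float shadowFactor() {
        vec4 clip = lightMatrix * vec4(vWorldPos, 1.0);
        vec3 ndc = clip.xyz / clip.w;
        vec2 uv = ndc.xy * 0.5 + 0.5;
        if (uv.x < 0.0 || uv.x > 1.0 || uv.y < 0.0 || uv.y > 1.0) { return 1.0; }
        float mapDepth = texture2D(shadowMap, uv).x;  // note: may be RGBA-packed depending on generator settings
        float currentDepth = ndc.z * 0.5 + 0.5;
        return currentDepth > mapDepth + 0.0005 ? 0.4 : 1.0;  // 0.4 ~ shadow darkness
    }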
  21. Using UV Map on New Material

    Hello guys, I was wondering if there is a way to keep using a mesh's UV map after changing its material. My game currently uses a cell shader, so I have been giving meshes ShaderMaterials and then calling ShaderMaterial.setTexture() to give them the same texture as before. This successfully applies the cell shader and texture, but the UV map is lost in the process, so the texture no longer lines up. I've looked through the mesh, material, submaterials, and texture attributes, and all I can find are u/v offset, scale, and Ang (which are 0, 1, 0 respectively), but in the .babylon file for the model I see an attribute called "uvs" which I'm assuming is what I need to copy. Thanks in advance!
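    For what it's worth, the UVs should still be in the mesh's vertex data (the "uvs" array in the .babylon file becomes the "uv" vertex attribute); a ShaderMaterial only sees them if you list the attribute and pass it through yourself. A sketch:
    var cellMat = new BABYLON.ShaderMaterial("cell", scene,
        { vertex: "cell", fragment: "cell" },
        { attributes: ["position", "normal", "uv"],   // "uv" is the important part
          uniforms: ["worldViewProjection"] });
    cellMat.setTexture("textureSampler", myTexture);
    // vertex shader:   attribute vec2 uv; varying vec2 vUV;  ->  vUV = uv;
    // fragment shader: uniform sampler2D textureSampler; varying vec2 vUV;  ->  texture2D(textureSampler, vUV)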
  22. Hi guys! I'm trying to figure out how to apply a custom frag shader in a filter. The intended effect is a simple CRT warp over a sprite. Here is what I got so far: https://jsfiddle.net/djq6kjx4/1/ but it should look something like the image in the attachment. As you can see from the example above, the warp effect is mostly visible toward one corner. I'm just starting with shaders, but I think the reason is that vTextureCoord is somehow off? I tried to use the mapCoords and unmapCoords from https://github.com/pixijs/pixi-filters/blob/master/src/pixelate/pixelate.frag in a similar way, without success (I have no idea what those do). Some time after, I tried to use gl_FragCoord directly like this: vec2 coord = gl_FragCoord.xy / dimensions.xy; which seems to do the trick, but the texture comes out flipped. I'm sure that can be fixed, but I don't think this is a good path to follow. Right? Any hint would be much appreciated. Thanks
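     A hedged sketch of the normalization step (it mirrors the filter-area guide in the next result): convert vTextureCoord to pixels with filterArea, normalize by the sprite dimensions you pass in yourself, warp, then map back before sampling:
     uniform vec4 filterArea;
     uniform vec2 dimensions;   // fill in filter.apply from input.sourceFrame (see the guide below)
     ...
     vec2 pixelCoord = vTextureCoord * filterArea.xy;
     vec2 coord = pixelCoord / dimensions;              // now in [0,1] across the sprite: warp this
     vec2 warped = crtWarp(coord);                      // your warp function
     gl_FragColor = texture2D(uSampler, warped * dimensions / filterArea.xy);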
  23. Guide to pixi-V4 filters

    V4 filters differ from V3. You can't just drop a shader in and assume that texture coords are in the [0,1] range. I am sorry that you have to learn all of that, and I will make sure that the process is easier for pixi-v5.
    Filter Area
    Thanks to @bQvle and @radixzz
    First, let's work with the AREA. When you apply a filter to a container, PIXI calculates its bounding box, and we are working with that bounding box. Invariant: the maximal vTextureCoord multiplied by "filterArea.xy" is the real size of the bounding box. Don't try to reason about it too much: it's like that for performance reasons, and it's not logical in a user-experience sense. Neither the vTextureCoord dimensions nor filterArea.xy are predictable on their own, but their product is what we need. The area can have padding, so please don't use it to get "displacement texture" coordinates or any secondary textures you are adding to the shader; use the "mapped matrix" for that (see below). If you want the pixel coordinates, use "uniform filterArea"; it will be passed to the filter automatically.
    uniform vec4 filterArea;
    ...
    vec2 pixelCoord = vTextureCoord * filterArea.xy;
    They are in pixels. That won't work if we want something like "fit an ellipse into the bounding box". So, let's pass the dimensions too! PIXI doesn't do it automatically; we need a manual fix:
    filter.apply = function(filterManager, input, output) {
        this.uniforms.dimensions[0] = input.sourceFrame.width;
        this.uniforms.dimensions[1] = input.sourceFrame.height;
        // draw the filter...
        filterManager.applyFilter(this, input, output);
    }
    Let's combine it in the shader!
    uniform vec4 filterArea;
    uniform vec2 dimensions;
    ...
    vec2 pixelCoord = vTextureCoord * filterArea.xy;
    vec2 normalizedCoord = pixelCoord / dimensions;
    Here's the fiddle: https://jsfiddle.net/parsab1h/ . You can see that the shader uses "map" and "unmap" to get to that pixel.
    Now let's assume that you somehow need real coordinates on screen for that thing. You can use another component of filterArea, zw:
    uniform vec4 filterArea;
    ...
    vec2 screenCoord = (vTextureCoord * filterArea.xy + filterArea.zw);
    I don't have an example for that, but maybe you need that value for something?
    Fitting problem
    Thanks to @adam13531 at github. One small problem: those values become wrong when PIXI tries to fit the bounding box; here's the fiddle: http://jsfiddle.net/xbmhh207/1/ Please use this line to fix it:
    filter.autoFit = false;
    Bleeding problem
    Thanks to @bQvle. The temporary textures used by FilterManager can have some bad pixels, and they can bleed. For example, the displacementSprite can look through the edge; try to move the mouse at the bottom edge of http://pixijs.github.io/examples/#/filters/displacement-map.js. You see that transparent (black) zone, but it could be ANYTHING if it weren't clamped. To make sure this doesn't happen in your case, please use clamping after you map coordinates:
    uniform vec4 filterClamp;
    vec2 pixelCoord = WE_CALCULATED_IT_SOMEHOW
    vec2 unmappedCoord = pixelCoord / filterArea.xy;
    vec2 clampedCoord = clamp(unmappedCoord, filterClamp.xy, filterClamp.zw);
    vec4 rgba = texture2D(uSampler, clampedCoord);
    Both filterClamp and filterArea are provided by FilterManager; you don't have to calculate or pass them in "filter.apply". Here's the pixi code that takes care of that: https://github.com/pixijs/pixi.js/blob/dev/src/core/renderers/webgl/managers/FilterManager.js#L297
    OK, now we have a "transparent" zone instead of random pixels. But what if we want it to fit completely?
    displacementFilter.filterArea = app.screen; // not necessary, but I prefer to do it.
    displacementFilter.padding = 0;
    That'll do it. Why did I modify filterArea there? PIXI will "fit" it anyway, right? I don't want PIXI to spend time calculating the bounds of the container, because the maggots are actually changing it, crawling in places that we don't see! No extra transparent space, and if you put it into http://pixijs.github.io/examples/#/filters/displacement-map.js and move the mouse to the bottom edge, you'll see the grass.
    Mapped matrix
    When you want to use an extra texture in the filter, you need to position it as a sprite somewhere. We are working with a sprite that is not renderable but exists in the stage; its transformation matrix will be used to position your texture in the filter. Please use https://github.com/pixijs/pixi.js/blob/dev/src/filters/displacement/DisplacementFilter.js and http://pixijs.github.io/examples/#/filters/displacement-map.js as an example. Look for the mapped matrix:
    this.uniforms.filterMatrix = filterManager.calculateSpriteMatrix(this.maskMatrix, this.maskSprite);
    maskMatrix is a temporary transformation that you have to create for the filter; you don't need to fill it. The sprite has to be added into the stage tree and positioned properly. You can only use textures that are not trimmed or cropped. If you want the texture to be repeated, like a fog, make sure it has power-of-two dimensions, and specify it on the baseTexture before it is uploaded to the GPU/rendered for the first time!
    texture.baseTexture.wrapMode = PIXI.WRAP_MODES.REPEAT;
    If you want to use an atlas texture as a secondary input for a filter, please wait for pixi-v5 or do it yourself: add clamping uniforms, use them in the shader and make better mapping in "filterMatrix".
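    Putting the pieces above together, a minimal end-to-end v4 filter might look like this (a sketch; the effect itself is a placeholder):
    const fragSrc = `
        varying vec2 vTextureCoord;
        uniform sampler2D uSampler;
        uniform vec4 filterArea;
        uniform vec4 filterClamp;
        uniform vec2 dimensions;
        void main(void) {
            vec2 pixelCoord = vTextureCoord * filterArea.xy;     // to pixels
            vec2 coord = pixelCoord / dimensions;                // to [0,1] over the bounding box
            coord.y += 0.05 * sin(coord.x * 20.0);               // placeholder effect
            vec2 unmapped = coord * dimensions / filterArea.xy;  // back to texture coords
            gl_FragColor = texture2D(uSampler, clamp(unmapped, filterClamp.xy, filterClamp.zw));
        }`;
    const filter = new PIXI.Filter(null, fragSrc, { dimensions: [0, 0] });
    filter.autoFit = false;
    filter.apply = function (filterManager, input, output) {
        this.uniforms.dimensions[0] = input.sourceFrame.width;
        this.uniforms.dimensions[1] = input.sourceFrame.height;
        filterManager.applyFilter(this, input, output);
    };
    sprite.filters = [filter];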
  24. ShaderMaterial is broken in 3.0

    After wrestling countless hours with a problem (as you do), I came to the conclusion that ShaderMaterial is broken in 3.0. More accurately, I can't get texture lookups to work in the vertex shader, but they do work in Babylon 2.5. Add this line to any vertex shader and it won't compile: vec4 test = texture2D(textureSampler, uv);