Search the Community

Showing results for tags 'shader'.



Found 108 results

  1. Is there a way to pass a video texture to a shader, similar to the way an image texture is passed?

        shaderMaterial.setTexture("textureSampler", new BABYLON.Texture(imgTexture, scene));

    I'm wondering if there are more ways to use video textures in Babylon. I have seen that I can set a BABYLON.VideoTexture as the diffuseTexture of a material, but that seems limiting. What if I want to manipulate an object whose material has a video texture, in the vertex + fragment shaders?
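    Since BABYLON.VideoTexture extends BABYLON.Texture, it should be possible to pass one to ShaderMaterial.setTexture like any other texture. A minimal sketch (the file path and uniform name are placeholders):

        // Create a video texture (name, url array, scene, generateMipMaps)
        var videoTexture = new BABYLON.VideoTexture("video", ["textures/myVideo.mp4"], scene, true);

        // Bind it to the custom shader's sampler, exactly as with an image texture
        shaderMaterial.setTexture("textureSampler", videoTexture);

        // The underlying HTMLVideoElement may need an explicit play() call
        videoTexture.video.play();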
  2. Optical illusions

    I'm messing around with shaders and made an optical illusion, sort of by accident: https://www.babylonjs-playground.com/#WX2PRW#4 I'm a fan of M.C. Escher, so maybe one day I'll try to build one of his scenes that depends on perspective... Does anybody else have one to share?
  3. Hi, I'm new to Babylon.js and trying to use it to create geometry with animated vertices and an animated procedural texture. I'm animating the vertices in the vertex shader. For the procedural texture, I tried to follow the instructions at https://doc.babylonjs.com/how_to/how_to_use_procedural_textures as well as the playground example: https://www.babylonjs-playground.com/#24C4KC#17 The problem with the example is that I can't really find a complete implementation with the shader/config.json files. I have a couple of basic questions as well. When creating a custom procedural texture from an external folder with the config.json and custom.fragment.fx files, is that the only fragment shader that can be used in the scene, or can a BABYLON.ShaderMaterial be used in addition? I'm having a hard time grasping the concept of a 'fragment shader' procedural texture vs. a fragment shader as the last step of the WebGL pipeline. Thanks.
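    For reference, a custom procedural texture renders its own fragment shader offscreen into a texture and lives alongside any ShaderMaterial in the scene; the two can coexist. A minimal sketch, assuming the documented CustomProceduralTexture API (the folder name and shader body are placeholder examples):

        // ./textures/wave/config.json:
        //   { "animate": true, "refreshrate": 1, "uniforms": [], "sampler2Ds": [] }
        //
        // ./textures/wave/custom.fragment.fx:
        //   varying vec2 vUV;
        //   void main(void) { gl_FragColor = vec4(vUV, 0.0, 1.0); }

        var proc = new BABYLON.CustomProceduralTexture("wave", "./textures/wave", 512, scene);
        myMaterial.diffuseTexture = proc; // sampled like any other texture, independent of any ShaderMaterial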
  4. I'm experimenting with GLSL shaders in Pixi 4.4 and was trying to make some that would take in two images, a base and an overlay. The shaders would then replace either the hue, saturation, or value of the pixels in the base image with the H/S/V of the corresponding pixel in the overlay image. Transparency on the overlay means "no change." For my tests, I used a 100x100 red square and the following 100x100 "striped" overlays (the Hue overlay, Saturation overlay, and Value overlay, respectively). Results were only partially consistent, and all wrong. Here are the results of the Hue and Saturation shaders against a black background: the stripes are twice as tall as they should be, and the bottom-most stripe is outright missing, as though either the overlay or the base were resized at some point in the process. But Value takes the cake: not only do we have the same problem as above, but "teeth" have appeared off the edge of the 100x100 image (4px in each direction, making a 108x100 image), there are outlines around every stripe, and if you zoom in especially close you can see that some of the outlines are actually 2 pixels tall (one of near-black, one of another dark colour), none of which is in the original Value overlay! I'm at a loss to tell whether the problem(s) originate in my shader code or in Pixi, especially since tutorials around the net are mum about how to create a second sampler2D uniform in Pixi as opposed to other setups. I do want to work with Pixi for this project, however, so a fix for Pixi would be appreciated if the problem really does come from there. Here's the HTML/GLSL code. Please don't mind the if statement; I've already had a few ideas on how to get rid of it:

        <html>
        <head>
            <meta content="text/html;charset=utf-8" http-equiv="Content-Type">
            <meta content="utf-8" http-equiv="encoding">
            <style>
                body { background-color: black; margin: 0; overflow: hidden; }
                p { color: white; }
            </style>
        </head>
        <body>
        <script type="text/javascript" src="libs/pixi.js"></script>
        <script id="shader" type="shader">
            #ifdef GL_ES
            precision mediump float;
            #endif

            varying vec2 vTextureCoord;
            uniform sampler2D uSampler; // The base image
            uniform sampler2D overlay;  // The overlay

            vec3 rgb2hsv(vec3 c) {
                vec4 K = vec4(0.0, -1.0 / 3.0, 2.0 / 3.0, -1.0);
                vec4 p = mix(vec4(c.bg, K.wz), vec4(c.gb, K.xy), step(c.b, c.g));
                vec4 q = mix(vec4(p.xyw, c.r), vec4(c.r, p.yzx), step(p.x, c.r));
                float d = q.x - min(q.w, q.y);
                float e = 1.0e-10;
                return vec3(abs(q.z + (q.w - q.y) / (6.0 * d + e)), d / (q.x + e), q.x);
            }

            vec3 hsv2rgb(vec3 c) {
                vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
                vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
                return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
            }

            void main(void) {
                vec4 baseRGB = texture2D(uSampler, vTextureCoord);
                vec4 overlayRGB = texture2D(overlay, vTextureCoord);
                if (overlayRGB.a > 0.0) {
                    vec3 baseHSV = rgb2hsv(baseRGB.rgb);
                    vec3 overlayHSV = rgb2hsv(overlayRGB.rgb);
                    // Hue
                    // vec3 resultHSV = vec3(overlayHSV.x, baseHSV.y, baseHSV.z);
                    // Saturation
                    // vec3 resultHSV = vec3(baseHSV.x, overlayHSV.y, baseHSV.z);
                    // Value
                    vec3 resultHSV = vec3(baseHSV.x, baseHSV.y, overlayHSV.z);
                    vec3 resultRGB = hsv2rgb(resultHSV);
                    gl_FragColor = vec4(resultRGB.rgb, baseRGB.a);
                } else {
                    gl_FragColor = baseRGB;
                }
            }
        </script>
        <script type="text/javascript" src="replaceTest.js"></script>
        </body>
        </html>

    And here's the JS:

        var width = window.innerWidth;
        var height = window.innerHeight;
        var renderer = new PIXI.WebGLRenderer(width, height);
        document.body.appendChild(renderer.view);

        var stage = new PIXI.Container();

        var sprite = PIXI.Sprite.fromImage('flat.png');
        sprite.x = width / 2; // Set it at the center of the screen
        sprite.y = height / 2;
        sprite.anchor.set(0.5); // Make sure the center point of the image is at its center, instead of the default top left
        stage.addChild(sprite);

        // Create a uniforms object to send to the shader
        var uniforms = {};
        uniforms.overlay = {
            type: 'sampler2D',
            value: PIXI.Texture.fromImage('stripesVal.png') // or stripesSat, stripesHue, etc.
        };

        // Get shader code as a string
        var shaderCode = document.getElementById("shader").innerHTML;

        // Create our Pixi filter using our custom shader code
        var rasShader = new PIXI.Filter(null, shaderCode, uniforms);
        console.log(rasShader.uniforms);
        sprite.filters = [rasShader];

        function update() {
            requestAnimationFrame(update);
            renderer.render(stage);
        }
        update();

    Any help would be appreciated!
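    Not a confirmed diagnosis, but one plausible culprit: in Pixi v4 a filter does not run over the sprite's texture directly; it runs over a temporary, padded filter area, so vTextureCoord is not a 0..1 coordinate over the 100x100 image, and a second texture sampled with the same coordinates won't line up (which would explain both the stretching and the extra pixels off the edge). Pixi v4 supplies a filterArea uniform that can remap the coordinates; a sketch, where dimensions is assumed to be a custom uniform you pass in with the sprite's pixel size:

        uniform vec4 filterArea;  // provided by Pixi v4: xy is the filter area's width/height
        uniform vec2 dimensions;  // assumed custom uniform: the sprite's size in pixels

        void main(void) {
            vec4 baseRGB = texture2D(uSampler, vTextureCoord);
            // Remap padded filter-area coords to 0..1 over the sprite before
            // sampling the overlay, so both textures line up pixel-for-pixel
            vec2 overlayCoord = vTextureCoord * filterArea.xy / dimensions;
            vec4 overlayRGB = texture2D(overlay, overlayCoord);
            // ... HSV replacement as before ...
        }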
  5. I am trying to get terrain splatmap support into the toolkit: a terrain built with up to 12 separate textures, the 12 matching normal maps (if using bump), and the up to 4 actual splatmaps (alphamaps) used to 'SPLAT' the textures onto the terrain. That's a total of 28 additional textures (besides any lightmap or reflection textures) needed to create a max-detail terrain, which is way too many textures for WebGL: my browser only supports a max of 16 textures at once, even in a textureArray, and IPHONES only support 8 (GLSL MAX_TEXTURE_IMAGE_UNITS). So I created a texture atlas system in the Babylon Toolkit (I also use this for my texture atlas baking tools) to pack textures into a texture atlas and return an array of UV-coordinate rectangle structs for each tile or cell in the atlas.

    Example atlasRect array:

        "atlasRect1": [ 0.0, 0.0, 0.5, 0.25 ],
        "atlasRect2": [ 0.0, 0.5, 0.5, 0.25 ],
        "atlasRect3": [ 0.25, 0.0, 0.5, 0.25 ],
        "atlasRect4": [ 0.25, 0.5, 0.5, 0.25 ],
        "atlasRect5": [ 0.5, 0.0, 0.5, 0.25 ],
        "atlasRect6": [ 0.75, 0.0, 0.5, 0.25 ],
        "atlasRect7": [ 0.5, 0.5, 0.5, 0.25 ],

    The matching atlasInfo array contains the texture tile or cell information (uScale, vScale, uOffset, vOffset):

        "atlasInfo1": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo2": [ 100.0, 100.0, 0.0, 0.0 ],
        "atlasInfo3": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo4": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo5": [ 100.0, 100.0, 0.0, 0.0 ],
        "atlasInfo6": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo7": [ 80.0, 80.0, 0.0, 0.0 ],

    In the texture atlas image I create, note that the first tile or cell is at the bottom left. I don't know all the ins and outs of what to multiply by what to get the desired effect. I need to get the tile or cell from the texture atlas and use atlasInfo.xy (uScale and vScale). Here are the main two functions I added to the shader:

        vec4 textureAtlas2D(sampler2D atlas, vec4 rect, vec2 uv, vec2 offset) {
            vec2 atlasUV = vec2((uv.x * rect.w) + rect.x, (uv.y * rect.z) + rect.y);
            return texture2D(atlas, atlasUV + offset);
        }

        vec4 textureFract2D(sampler2D atlas, vec4 rect, vec2 scale, vec2 uv, vec2 offset) {
            vec2 fractUV = fract(uv * scale);
            return textureAtlas2D(atlas, rect, fractUV, offset);
        }

    textureAtlas2D uses the rectangle that holds the UV coords from above to fetch just the desired cell. This works great... EXCEPT IT DOES NOT SCALE. The only code I could find after months of Google searching on GLSL texture atlas tiling (or scaling) was to use the GLSL fract() function to REPEAT into the texture atlas, giving you scale. So I created textureFract2D as a wrapper to incorporate 'uvScale'.
    Example snippet from my splatmap shader calling the texture atlas functions:

        #ifdef DIFFUSE
            // Splatmaps
            #ifdef splatmapDef
                vec4 splatColor = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor1 = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor2 = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor3 = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor4 = vec4(0.0, 0.0, 0.0, 0.0);

                // Base splat colors (no texture tiling)
                if (splatmapRects > 0.0) { baseColor1 = textureAtlas2D(splatmap, splatmapRect1, vTerrainUV, uvOffset); }
                if (splatmapRects > 1.0) { baseColor2 = textureAtlas2D(splatmap, splatmapRect2, vTerrainUV, uvOffset); }
                if (splatmapRects > 2.0) { baseColor3 = textureAtlas2D(splatmap, splatmapRect3, vTerrainUV, uvOffset); }
                if (splatmapRects > 3.0) { baseColor4 = textureAtlas2D(splatmap, splatmapRect4, vTerrainUV, uvOffset); }

                // Texture atlas colors (use texture tiling)
                if (atlasInfos > 0.0 && atlasRects > 0.0) {
                    splatColor = textureFract2D(diffuseSampler, atlasRect1, atlasInfo1.xy, vTerrainUV, uvOffset) * baseColor1.r;
                    if (atlasInfos > 1.0 && atlasRects > 1.0) { splatColor += textureFract2D(diffuseSampler, atlasRect2, atlasInfo2.xy, vTerrainUV, uvOffset) * baseColor1.g; }
                    if (atlasInfos > 2.0 && atlasRects > 2.0) { splatColor += textureFract2D(diffuseSampler, atlasRect3, atlasInfo3.xy, vTerrainUV, uvOffset) * baseColor1.b; }
                    // Second splat colors
                    if (atlasInfos > 3.0 && atlasRects > 3.0) { splatColor += textureFract2D(diffuseSampler, atlasRect4, atlasInfo4.xy, vTerrainUV, uvOffset) * baseColor2.r; }
                    if (atlasInfos > 4.0 && atlasRects > 4.0) { splatColor += textureFract2D(diffuseSampler, atlasRect5, atlasInfo5.xy, vTerrainUV, uvOffset) * baseColor2.g; }
                    if (atlasInfos > 5.0 && atlasRects > 5.0) { splatColor += textureFract2D(diffuseSampler, atlasRect6, atlasInfo6.xy, vTerrainUV, uvOffset) * baseColor2.b; }
                    // Third splat colors
                    if (atlasInfos > 6.0 && atlasRects > 6.0) { splatColor += textureFract2D(diffuseSampler, atlasRect7, atlasInfo7.xy, vTerrainUV, uvOffset) * baseColor3.r; }
                    if (atlasInfos > 7.0 && atlasRects > 7.0) { splatColor += textureFract2D(diffuseSampler, atlasRect8, atlasInfo8.xy, vTerrainUV, uvOffset) * baseColor3.g; }
                    if (atlasInfos > 8.0 && atlasRects > 8.0) { splatColor += textureFract2D(diffuseSampler, atlasRect9, atlasInfo9.xy, vTerrainUV, uvOffset) * baseColor3.b; }
                    // Final splat colors
                    if (atlasInfos > 9.0 && atlasRects > 9.0) { splatColor += textureFract2D(diffuseSampler, atlasRect10, atlasInfo10.xy, vTerrainUV, uvOffset) * baseColor4.r; }
                    if (atlasInfos > 10.0 && atlasRects > 10.0) { splatColor += textureFract2D(diffuseSampler, atlasRect11, atlasInfo11.xy, vTerrainUV, uvOffset) * baseColor4.g; }
                    if (atlasInfos > 11.0 && atlasRects > 11.0) { splatColor += textureFract2D(diffuseSampler, atlasRect12, atlasInfo12.xy, vTerrainUV, uvOffset) * baseColor4.b; }
                }
                baseColor = splatColor;
            #else
                baseColor = texture2D(diffuseSampler, vDiffuseUV + uvOffset);
            #endif

            #ifdef ALPHATEST
                if (baseColor.a < 0.4)
                    discard;
            #endif

            #ifdef ALPHAFROMDIFFUSE
                alpha *= baseColor.a;
            #endif

            baseColor.rgb *= vDiffuseInfos.y;

            #ifdef splatmapDef
                #ifdef BUMP
                    //normalW = perturbNormals(viewDirectionW, baseColor1, baseColor2, baseColor3, baseColor4, uvOffset, atlas1UV, atlas2UV, atlas3UV, atlas4UV, atlas5UV, atlas6UV, atlas7UV, atlas8UV, atlas9UV, atlas10UV, atlas11UV, atlas12UV);
                #endif
                #ifdef TWOSIDEDLIGHTING
                    normalW = gl_FrontFacing ? normalW : -normalW;
                #endif
            #endif
        #endif

    My problem is I don't know how to grab the cell info using the UV coords and apply the scaling needed.
    If I use the textureFract2D GLSL fract() to do the scaling, I get edge seams (I will post screenshots in the next post). @Deltakosh pointed me in the particleSystem direction, because I guess it does something like this for SPRITESHEET support, using what they call a 'cellIndex' to get to the tile or cell. There is a bunch of code that deals with the sprite sheet width and some calculations to get a rowOffset and columnOffset. I don't have that kind of info; as I explained above, I have the ACTUAL UV COORDS for each tile or cell in the texture atlas. But I am STILL a game dev newbie; I don't know what I need to do to use those UV coords and info.xy (uScale, vScale) to get the desired effect. This is the gist of what the particle system does for texture atlas or sprite sheet support:

        //vec2 offset = options.zw;        // Dunno what this is - ???
        //attribute float cellIndex;       // Dunno what this is - ???
        //uniform vec3 particlesInfos;     // x (number of rows) y (number of columns) z (rowSize)
        //#ifdef ANIMATESHEET
        //float rowOffset = floor(cellIndex / particlesInfos.z);
        //float columnOffset = cellIndex - rowOffset * particlesInfos.z;
        //vec2 uvScale = particlesInfos.xy;
        //vec2 uvOffset = vec2(offset.x, 1.0 - offset.y);
        //vUV = (uvOffset + vec2(columnOffset, rowOffset)) * uvScale;
        //#else
        //vUV = offset;
        //#endif

    I have no idea how to take this and adapt it to using the actual UV coords WITH SCALE. PLEASE, ANYBODY, I AM BEGGING (AGAIN)... HELP ME RE-WRITE textureFract2D to get the desired effect. Here are my shader programs so far: splatmap.vertex.fx, splatmap.fragment.fx. UPDATE: You can download the Test Terrain export project, edit the shader in the src/shader folder directly, and just hit refresh in your browser to see the effect. Look at the next post for example screenshots... THANKS FOR READING THIS FAR. Pinging @Deltakosh and @Sebavan and @Pryme8 and @adam and, last but not least, my man 'Wingy' at @Wingnut... Any thoughts, guys??? Yo @NasimiAsl... the Shader Guru... maybe you can have another crack at it, but this time use the Test Terrain project from above and change the splatmap shader and hit refresh... even better than the playground, you get the whole export project... easy access.
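    A hedged sketch of one standard fix for fract()-at-tile-boundary seams, assuming the seams come from mipmapping (the UV jump at each tile edge produces huge screen-space derivatives, forcing a tiny mip level for one pixel). The idea is to compute gradients from the continuous, un-wrapped UVs and sample with explicit gradients. It needs the OES_standard_derivatives and EXT_shader_texture_lod extensions, and it assumes rect is laid out as (uOffset, vOffset, uSize, vSize), which differs from the z/w mix in the post's textureAtlas2D:

        #extension GL_OES_standard_derivatives : enable
        #extension GL_EXT_shader_texture_lod : enable

        vec4 textureFract2D(sampler2D atlas, vec4 rect, vec2 scale, vec2 uv, vec2 offset) {
            vec2 tiled = uv * scale;                        // continuous tiling coordinates
            vec2 cellUV = rect.xy + fract(tiled) * rect.zw; // wrapped into this atlas cell
            // Derivatives of the continuous coords have no discontinuity at tile seams
            vec2 dx = dFdx(tiled) * rect.zw;
            vec2 dy = dFdy(tiled) * rect.zw;
            return texture2DGradEXT(atlas, cellUV + offset, dx, dy);
        }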
  6. Water shader approach?

    I'm struggling with an approach for adding a water effect to an area of a container. I'd like the area in green to distort the area below it via a shader, but also to have the ability for this water area to "rise/fall". The rise/fall is no problem when it's just a sprite I can move up and down as necessary, but being able to apply a shader to just that specific area is throwing me for a loop. Adding a filter directly to the sprite won't distort anything, since the filter doesn't account for the pixels behind it. Is there a way to capture this particular area, save it to a buffer, apply a filter, and put it back in place? RenderTexture seems like it might be in the right direction, but I can't see where I can grab an area of an existing container... Any help would be much appreciated!
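    One possible direction (a sketch, assuming Pixi v4; the names are placeholders): a filter doesn't have to sit on the water sprite itself. It can sit on the container holding the pixels behind the water, confined to the water rectangle via filterArea, and that rectangle can then be animated for the rise/fall:

        // A displacement map drives the distortion (any tiling noise image works)
        var dispSprite = PIXI.Sprite.fromImage('displacement.png');
        stage.addChild(dispSprite);
        var dispFilter = new PIXI.filters.DisplacementFilter(dispSprite);

        // Filter the background container, but only inside the water rectangle
        backgroundContainer.filters = [dispFilter];
        backgroundContainer.filterArea = new PIXI.Rectangle(waterX, waterY, waterW, waterH);

        // Rise/fall is then just animating the rectangle's y each frame
        backgroundContainer.filterArea.y = newWaterLevel;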
  7. Playing with shader

    Hello guys, see the GIF please: https://ibb.co/kZkrBR. I wonder how to do something like this in Babylon.js. I did this in an OpenGL project by sending a value to a uniform in the fragment shader. Should I do the same with Babylon.js? Should I modify the Babylon.js shaders, or is there a better way to do it?
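    Assuming the effect is driven by a single animated value, the Babylon equivalent of updating a uniform each frame is a ShaderMaterial plus setFloat in a before-render callback, which avoids modifying Babylon's built-in shaders. A minimal sketch (shader and uniform names are placeholders):

        var mat = new BABYLON.ShaderMaterial("effect", scene,
            { vertex: "custom", fragment: "custom" },
            { attributes: ["position", "uv"],
              uniforms: ["worldViewProjection", "cutoff"] });
        mesh.material = mat;

        // Equivalent of calling glUniform1f every frame in the OpenGL version
        var t = 0;
        scene.registerBeforeRender(function () {
            t += scene.getEngine().getDeltaTime() / 1000.0;
            mat.setFloat("cutoff", t);
        });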
  8. This is a small tutorial in which we learn how to transfer an existing shader from the Shadertoy library to PlayCanvas. Mastering this process opens up many possibilities for using open-source shaders and effects in your projects. http://pirron.one/playingincanvas/color-palettes-using-shaders This tutorial is for advanced users and assumes that you are already aware of how the PlayCanvas shader chunks system works. You can study this tutorial to learn more about how to begin with PlayCanvas and shaders.
  9. I'm trying to develop something like this using Babylon.js (the example is implemented in Unity 3D): I suppose this can be implemented using a shader. The user can move a sphere with inverted normals, and the shader has to calculate whether the sphere is painted depending on its position in the z-buffer. Is it possible to do this using a Babylon ShaderMaterial? Thanks in advance.
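    It should be possible, at least in outline; a sketch combining ShaderMaterial with Babylon's depth renderer (the shader names are placeholders, and the actual depth-comparison logic in the fragment shader is left out):

        // Render scene depth into a texture the custom shader can compare against
        var depthRenderer = scene.enableDepthRenderer();

        var mat = new BABYLON.ShaderMaterial("zPaint", scene,
            { vertex: "zPaint", fragment: "zPaint" },
            { attributes: ["position"],
              uniforms: ["worldViewProjection"],
              samplers: ["depthMap"] });
        mat.setTexture("depthMap", depthRenderer.getDepthMap());
        mat.backFaceCulling = false; // show the inside of the inverted-normals sphere

        sphere.material = mat;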
  10. Quick question: is there any way to fade out a ShaderMaterial'd mesh? mesh.material.alpha = X does not seem to have any effect on the alpha value, unless I am doing something wrong. Am I missing something, or is it not possible? (I can probably fake it by adding an alpha uniform to the fragment shader and explicitly passing it in; however, this would be painful to animate.) Also, if I go the uniform route, I'm not sure how much overhead (if any) setting the uniform value for the shader before render adds. I've noticed that a lot of games similar to mine (digital card games) have their shader effects timed in sync for a particular effect, which led me to the assumption that there is probably a performance reason behind it (because having the effects fire at different start times would look better). But maybe I'm way off the mark with that assumption; does anyone happen to know?
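    For what it's worth, the uniform route is only a few lines, and per-frame uniform updates are cheap. A sketch, assuming a ShaderMaterial created with needAlphaBlending so Babylon renders it through the transparent pipeline (the uAlpha name is a placeholder):

        var mat = new BABYLON.ShaderMaterial("card", scene,
            { vertex: "card", fragment: "card" },
            { attributes: ["position", "uv"],
              uniforms: ["worldViewProjection", "uAlpha"],
              needAlphaBlending: true });

        // The fragment shader would end with something like:
        //   gl_FragColor = vec4(color.rgb, color.a * uAlpha);

        scene.registerBeforeRender(function () {
            mat.setFloat("uAlpha", fadeValue); // animate fadeValue however you like
        });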
  11. Hello everyone! I'm making a fluid simulation effect. I found a demo, http://jeeliz.com/demos/water/, and I want to clone it. The demo is written with WebGL + shaders, but I don't know WebGL, so I'm worried. I'm trying to port it to Babylon.js, and there are a few things I don't know how to do. I'm really stuck; I hope the Babylon community can help me.

    1. Creating the water texture:

        var texture_water = GL.createTexture();
        GL.bindTexture(GL.TEXTURE_2D, texture_water);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE);
        GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, 512, 512, 0, GL.RGBA, GL.FLOAT, null);

    2. The full-screen quad:

        var quad_vertex = [-1, -1, 1, -1, 1, 1, -1, 1];
        var QUAD_VERTEX = GL.createBuffer();
        GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
        GL.bufferData(GL.ARRAY_BUFFER, new Float32Array(quad_vertex), GL.STATIC_DRAW);
        var quad_faces = [0, 1, 2, 0, 2, 3];
        var QUAD_FACES = GL.createBuffer();
        GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
        GL.bufferData(GL.ELEMENT_ARRAY_BUFFER, new Uint16Array(quad_faces), GL.STATIC_DRAW);
        GL.vertexAttribPointer(SHP_VARS.rendering.position, 2, GL.FLOAT, false, 8, 0);
        GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
        GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
        GL.disableVertexAttribArray(SHP_VARS.rendering.position);

    3. The simulation passes:

        GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
        GL.disableVertexAttribArray(SHP_VARS.water.position);
        GL.framebufferTexture2D(GL.FRAMEBUFFER, GL.COLOR_ATTACHMENT0, GL.TEXTURE_2D, texture_normals, 0);
        GL.useProgram(SHP_NORMALS);
        GL.enableVertexAttribArray(SHP_VARS.normals.position);
        GL.bindTexture(GL.TEXTURE_2D, texture_water);
        GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
        GL.disableVertexAttribArray(SHP_VARS.normals.position);
        GL.bindFramebuffer(GL.FRAMEBUFFER, null);
        GL.flush();

    With the above code, how do I port this to Babylon.js? Thanks so much!
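    A rough mapping to Babylon concepts, as a sketch rather than a working port: render-to-texture corresponds to RenderTargetTexture, and a full-screen quad driven by a fragment shader is essentially what a PostProcess encapsulates (the "waterSim" shader name is a placeholder for an entry in BABYLON.Effect.ShadersStore):

        // (1) A 512x512 float texture with NEAREST sampling and clamped wrapping
        var waterRT = new BABYLON.RenderTargetTexture("water", 512, scene,
            false, true, BABYLON.Engine.TEXTURETYPE_FLOAT, false,
            BABYLON.Texture.NEAREST_SAMPLINGMODE);
        waterRT.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
        waterRT.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;

        // (2)+(3) The quad setup and draw call are what a PostProcess does internally
        var simPass = new BABYLON.PostProcess("sim", "waterSim", [], ["waterSampler"], 1.0, camera);
        simPass.onApply = function (effect) {
            effect.setTexture("waterSampler", waterRT); // feed the previous pass's output in
        };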
  12. So, I'm trying to convert a shader from Shadertoy. I'm close but still can't get it working. Also, in my actual scene it doesn't seem to be working at all, but it's hard to tell whether that's related to the conversion issue, since I need to rotate the sphere to get it to show up to begin with. The shader is here (it appears blank at first, but if you rotate it you'll start to see the fire; the actual effect I am going for appears only if you rotate it just right, so that you see the fire starting with the white in the middle and filling up the sphere): http://cyos.babylonjs.com/#M11GKA The source shader is here: https://www.shadertoy.com/view/lsf3RH The one place I was not sure how to proceed was mapping over the iResolution variable (which Shadertoy states is the viewport resolution). I played around with a bunch of different things and ended up trying the camera input, which works but requires rotating the mesh to see anything. Does anyone know which input maps to the viewport resolution (or how to get it), and/or what I am doing wrong or missing here?
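    For iResolution specifically, the engine reports the render size directly; a sketch, assuming shaderMaterial is a ShaderMaterial with iResolution declared as a vec2 uniform in the fragment shader:

        // Re-sent every frame so it stays correct when the canvas is resized
        scene.registerBeforeRender(function () {
            var engine = scene.getEngine();
            shaderMaterial.setVector2("iResolution",
                new BABYLON.Vector2(engine.getRenderWidth(), engine.getRenderHeight()));
        });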
  13. I tried to start with this example: http://phaser.io/examples/v2/filters/blue-dots I used the zip download, but it seems to need a PHP server. I tried running the example from just the files and folders, but it doesn't work. Can you give me some help? Thank you. Best regards.
  14. Error in PBR

    Hi, what is the best way to debug errors in shaders? I am getting a mysterious error from PBR:

        Error: ERROR: 0:1492: 'glossiness' : undeclared identifier
        ERROR: 0:1492: 'computeHemisphericLighting' : no matching overloaded function found
        ERROR: 0:1492: 'assign' : cannot convert from 'const mediump float' to 'structure'
        ERROR: 0:1573: 'computeLighting' : no matching overloaded function found
        ERROR: 0:1573: 'assign' : cannot convert from 'const mediump float' to 'structure'

    I can't really find the source of it, as the error only shows up if I change the starting camera angle.
  15. Hey folks! For my custom shader I want to use the object's normals in view space. Therefore I need the gl_NormalMatrix. I know how to construct it (the inverted and transposed modelview matrix), but I don't want to construct it manually for each object. I found this thread, [SOLVED] - Shader Programs Attributes And Uniforms, but there was no hint about what gl_NormalMatrix is called in Babylon. I also searched the Babylon git repository, but could not find where the shader attributes and such are declared. Can anyone please point me in the right direction? Thank you for your time. -Mainequin
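    A possible workaround (an assumption; I'm not aware of a built-in gl_NormalMatrix uniform in Babylon): compute the inverse-transpose of worldView on the CPU each frame and pass it as a custom uniform, assuming a ShaderMaterial that declares uniform mat4 normalMatrix:

        // Recompute inverse-transpose(world * view) for this mesh every frame
        scene.registerBeforeRender(function () {
            var worldView = mesh.getWorldMatrix().multiply(scene.getViewMatrix());
            var normalMatrix = BABYLON.Matrix.Transpose(BABYLON.Matrix.Invert(worldView));
            shaderMaterial.setMatrix("normalMatrix", normalMatrix);
        });

        // In the vertex shader:
        //   vec3 viewNormal = normalize((normalMatrix * vec4(normal, 0.0)).xyz);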
  16. I am trying to use this lib: https://github.com/pixijs/pixi-tilemap It seems to use shaders rather than sprites to render the tilemap. Is that more efficient? Why? The memory usage is the same, right? Thanks.
  17. Help with shader waterfall

    Hello guys, I'm stuck with shaders; maybe you can help me. The objective is to make a waterfall with a shader. As of now I have this fragment shader:

        precision mediump float;

        varying vec2 vTextureCoord;
        uniform sampler2D uSampler;
        uniform vec4 filterArea;
        uniform vec2 dimensions;
        uniform float speed;
        uniform float time;

        #pragma glslify: noise = require("glsl-noise/simplex/3d")

        vec2 cartToIso(vec2 pos) {
            vec2 res = pos;
            res.x = pos.x - pos.y;
            res.y = (pos.x + pos.y) / 2.0;
            return res;
        }

        void main() {
            vec2 pixelCoord = vTextureCoord * filterArea.xy;
            vec2 coord = pixelCoord / dimensions;
            vec2 iso = cartToIso(pixelCoord);
            float x = pixelCoord.x * 0.1;
            float y = dimensions.y / pixelCoord.y + (speed * time * 10.0);
            float z = time;
            vec3 vector = vec3(x, y, z);
            vec3 noise = vec3(noise(vector));
            gl_FragColor = vec4(noise, 1.0);
        }

    It gives me a nice waterfall result (video in the attachment: waterfall.mp4). But the target is to make it isometric (in reality dimetric); look at the picture in the attachment. Is there a way to do this? I'd appreciate any help.
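    A hedged guess at a starting point (purely an assumption about the intended look): the cartToIso helper is already defined but its result is never used; feeding the iso-projected coordinates into the noise instead of the raw pixel coordinates might give the diagonal, dimetric flow:

        void main() {
            vec2 pixelCoord = vTextureCoord * filterArea.xy;
            vec2 iso = cartToIso(pixelCoord); // project into iso space before sampling noise
            float x = iso.x * 0.1;
            float y = dimensions.y / iso.y + (speed * time * 10.0);
            vec3 n = vec3(noise(vec3(x, y, time)));
            gl_FragColor = vec4(n, 1.0);
        }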
  18. Car paint shader

    Hi, I started to learn GLSL and shaders, so I decided to port a shader to Babylon. Source: the original WebGL shader from three.js. Babylon.js version: PG. Right now it is just a direct port, but maybe someone can help me merge it better with the Babylon.js ecosystem. (Right now it doesn't work in Safari on Mac; I will check why.)
  19. Is there a way to call a specific function or event when the renderer finishes rendering a specific sprite, or an event when a shader has executed?
  20. Hi, while I'm studying WebGL, I'm wondering how to use a vertex shader with Pixi.js. Using fragment shaders is quite clear in my mind, but I'm still confused about how to send vertex coordinates to a vertex shader in Pixi.js. For instance, assume we have this vertex shader:

        // vertex shader
        attribute vec3 aVertexPosition;
        uniform mat4 uMVMatrix;
        uniform mat4 uPMatrix;

        void main(void) {
            gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
        }

    and these vertex coordinates:

        const vertices = [
             0.0,  1.0, 0.0,
            -1.0, -1.0, 0.0,
             1.0, -1.0, 0.0
        ];

    How could I draw this triangle (it is supposed to be a triangle) with this vertex shader and Pixi.js? Thanks!
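    One route that avoids raw WebGL entirely (a sketch, assuming Pixi v4's mesh namespace; Pixi supplies its own vertex shader for meshes, so this answers "how do I get my vertices drawn" rather than "how do I run that exact shader"):

        // 2D positions, UVs, and indices for one triangle
        var vertices = new Float32Array([0, -100, -100, 100, 100, 100]);
        var uvs = new Float32Array([0.5, 0.0, 0.0, 1.0, 1.0, 1.0]);
        var indices = new Uint16Array([0, 1, 2]);

        var tri = new PIXI.mesh.Mesh(PIXI.Texture.WHITE, vertices, uvs, indices);
        tri.position.set(200, 200);
        stage.addChild(tri); // Pixi's mesh shader applies the projection/model transforms

        // For a fully custom vertex shader, a PIXI.Filter also accepts a vertex
        // source as its first argument: new PIXI.Filter(vertexSrc, fragmentSrc)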
  21. Here's a helper function I made to create each shader plugin with just a simple function call like this:

        createShaderPlugin(name, vertShader, fragShader, uniformDefaults);

    Then you can create each sprite that you want to be drawn with your custom shader like this:

        var sprite = createShaderPluginSprite(name, size, uniforms);

    Update: I started a GitHub repo for it here if you want to check it out. I also made this little CodePen to demonstrate the usage with some comments. The encapsulated code is based on the plugin example and on the worldTransform/vertices calculations that I boosted from PIXI.Sprite. I've tried to optimize and condense it for the use case where the Sprite doesn't have any Texture to be filtered.
  22. Shader translation

    Hi, I need help with shaders in Pixi. I'm trying to translate and rotate a sprite with shaders. I am using PIXI.Filter, but I can't find any examples with vertex shaders (only fragment examples). If someone could provide an example, thanks.
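    A sketch of a PIXI.Filter with a custom vertex shader, assuming Pixi v4 (whose default filter vertex shader uses these attribute/uniform names; the uOffset uniform is my own addition):

        var vertexSrc = `
            attribute vec2 aVertexPosition;
            attribute vec2 aTextureCoord;
            uniform mat3 projectionMatrix;
            uniform vec2 uOffset; // custom translation, in pixels of the filter area
            varying vec2 vTextureCoord;

            void main(void) {
                vec2 pos = aVertexPosition + uOffset;
                gl_Position = vec4((projectionMatrix * vec3(pos, 1.0)).xy, 0.0, 1.0);
                vTextureCoord = aTextureCoord;
            }`;

        // null fragment source keeps Pixi's default pass-through fragment shader
        var filter = new PIXI.Filter(vertexSrc, null);
        filter.uniforms.uOffset = [20.0, 0.0]; // shift 20px right
        sprite.filters = [filter];

    A rotation would work the same way: build a 2x2 rotation from a float uniform and apply it to pos before the projection multiply.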
  23. I know that you can manipulate existing vertices with a shader and make their visual position different from their physical location. Is there a way to deform the space between vertices? I doubt it, but I figured it was worth asking. I'm trying to figure out how to blend terrain without T-junctions and possibly have the GPU handle the whole load.
  24. [solved] Cyos

    Hey guys! I have this problem of [Varyings with the same name but different type, or statically used varyings in fragment shader are not declared in vertex shader:] in the console of the shader playground; here's my code.

    Vertex shader:

        /**
         * Example Vertex Shader
         * Sets the position of the vertex by setting gl_Position
         */

        // Set the precision for data types used in this shader
        precision highp float;
        precision highp int;

        // Default THREE.js uniforms available to both fragment and vertex shader
        uniform mat4 modelMatrix;
        uniform mat4 worldViewProjection;
        uniform mat4 world;
        uniform mat4 viewMatrix;
        uniform mat3 normalMatrix;

        // Default uniforms provided by ShaderFrog.
        uniform vec3 cameraPosition;
        uniform float time;
        uniform float v;

        // Default attributes provided by THREE.js. Attributes are only available in the
        // vertex shader. You can pass them to the fragment shader using varyings
        attribute vec3 position;
        attribute vec3 normal;
        attribute vec2 uv;
        attribute vec2 uv2;

        // Examples of variables passed from vertex to fragment shader
        varying vec3 vPosition;
        varying vec3 vNormal;
        varying vec2 vUv;
        varying vec2 vUv2;

        void main() {
            // To pass variables to the fragment shader, you assign them here in the
            // main function. Traditionally you name the varying with vAttributeName
            vNormal = normal;
            vUv = uv;
            vUv2 = uv2;
            vPosition = position;

            // This sets the position of the vertex in 3d space. The correct math is
            // provided below to take into account camera and object data.
            gl_Position = worldViewProjection * world * vec4(vPosition, 1.0);
        }

    Fragment shader:

        precision highp float;
        precision highp int;

        uniform vec2 resolution;
        uniform float time;

        varying vec2 vUv;

        vec3 mod289(vec3 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
        vec4 mod289(vec4 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
        vec4 permute(vec4 x) { return mod289(((x * 34.0) + 1.0) * x); }
        vec4 taylorInvSqrt(vec4 r) { return 1.79284291400159 - 0.85373472095314 * r; }

        float snoise(vec3 v) {
            const vec2 C = vec2(1.0 / 6.0, 1.0 / 3.0);
            const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
            vec3 i = floor(v + dot(v, C.yyy));
            vec3 x0 = v - i + dot(i, C.xxx);
            vec3 g = step(x0.yzx, x0.xyz);
            vec3 l = 1.0 - g;
            vec3 i1 = min(g.xyz, l.zxy);
            vec3 i2 = max(g.xyz, l.zxy);
            vec3 x1 = x0 - i1 + C.xxx;
            vec3 x2 = x0 - i2 + C.yyy;
            vec3 x3 = x0 - D.yyy;
            i = mod289(i);
            vec4 p = permute(permute(permute(i.z + vec4(0.0, i1.z, i2.z, 1.0)) + i.y + vec4(0.0, i1.y, i2.y, 1.0)) + i.x + vec4(0.0, i1.x, i2.x, 1.0));
            float n_ = 0.142857142857;
            vec3 ns = n_ * D.wyz - D.xzx;
            vec4 j = p - 49.0 * floor(p * ns.z * ns.z);
            vec4 x_ = floor(j * ns.z);
            vec4 y_ = floor(j - 7.0 * x_);
            vec4 x = x_ * ns.x + ns.yyyy;
            vec4 y = y_ * ns.x + ns.yyyy;
            vec4 h = 1.0 - abs(x) - abs(y);
            vec4 b0 = vec4(x.xy, y.xy);
            vec4 b1 = vec4(x.zw, y.zw);
            vec4 s0 = floor(b0) * 2.0 + 1.0;
            vec4 s1 = floor(b1) * 2.0 + 1.0;
            vec4 sh = -step(h, vec4(0.0));
            vec4 a0 = b0.xzyw + s0.xzyw * sh.xxyy;
            vec4 a1 = b1.xzyw + s1.xzyw * sh.zzww;
            vec3 p0 = vec3(a0.xy, h.x);
            vec3 p1 = vec3(a0.zw, h.y);
            vec3 p2 = vec3(a1.xy, h.z);
            vec3 p3 = vec3(a1.zw, h.w);
            vec4 norm = taylorInvSqrt(vec4(dot(p0, p0), dot(p1, p1), dot(p2, p2), dot(p3, p3)));
            p0 *= norm.x;
            p1 *= norm.y;
            p2 *= norm.z;
            p3 *= norm.w;
            vec4 m = max(0.6 - vec4(dot(x0, x0), dot(x1, x1), dot(x2, x2), dot(x3, x3)), 0.0);
            m = m * m;
            return 42.0 * dot(m * m, vec4(dot(p0, x0), dot(p1, x1), dot(p2, x2), dot(p3, x3)));
        }

        void main() {
            vec2 div = vec2(10, 10);
            vec2 uv = vUv.xy / resolution.xy * div.xy;
            vec3 v = vec3(uv.x + sin(time) * 0.2, uv.y + cos(time) * 0.2, time / 10.0);
            float noise = snoise(v);
            vec2 resolution = vec2(1, 1);
            uv = vUv.xy / resolution.xy * div.xy;
            vec3 v2 = vec3(uv.x, uv.y, time / 5.0);
            noise = sin(noise * 3.14 * (sin(time) + snoise(v2) * 2.0) * 0.75);
            float darkenFactor = 0.2;
            float darkenValue = darkenFactor;
            div = vec2(5, 5);
            uv = vUv.xy / resolution.xy * div.xy;
            vec3 v3 = vec3(uv.x, uv.y, time / 2.0);
            darkenValue = darkenValue * snoise(v3);
            vec3 v4 = vec3(uv.x * 1000.0, uv.y * 1000.0, time);
            float b = snoise(v4) * 0.1;
            gl_FragColor = vec4(1.0 - darkenValue + (noise * (darkenValue + 0.2)) - b, noise - b, b, 1.0);
        }

    Pinging the shader lord @NasimiAsl