Showing results for tags 'shader'.
Found 120 results

  1. Hi! I did a Game Jam recently. Our group used the p5 library because someone suggested it to us, and it did indeed help us develop an OK prototype in a short amount of time. The game wasn't perfect at all, of course, because 48 hours is still really short, but one of the main flaws that bothered me was the graphics integration: our artists liked working with pixel art, and we used a tileset to create the background map, which didn't tile right, with grey lines appearing between tiles. On top of that, the other sprites didn't look nice at high resolution either, because of linear filtering. After the Jam ended, I decided to try to fix these. I found the way to activate the nearest filter in p5, which made the graphics look way better, but still not as nice as I wanted. After googling a bit, I found this article, which was exactly what I was looking for. Then I noticed that in order to use shaders, I needed to switch the context to WebGL, which meant changing most of the code. This didn't actually bother me, so I started working on it. It was quite a nightmare (because of p5's bugs and lack of integration...), but when I got to the point where I had basically the same game as before, just switched to a WebGL context (without the shaders), I noticed HUGE performance drops (I'm talking 10 FPS for displaying some 400 sprites in a 500 * 500 context!!!). That's when I decided that I needed to switch to a different library. A little more googling later, I opted for the PIXI library, which, I must agree, is a nice library to work with, even though the official documentation is lacking a bit of information. I started recoding the game from scratch once again, and as soon as I could draw the map on screen, I tried to implement the pixel art shader; I copy-pasted the code and... magic! It didn't work. I was actually not that surprised, and decided to play around with the GLSL to see where the problem was. That's when I started noticing strange things: weird texture offsets that I tried to adjust for manually, which didn't work because the offset changed depending on the scaling of the image... and then the texture itself, scaling itself up and down when I scaled the sprite... After a while, and thanks to more googling, I found out that Pixi does some pre-processing on both the texture and the texture coordinates which are passed to the fragment shaders. I tried adjusting for it, I tried a lot, I got close, but it's still really messy, and float precision starts doing strange things that make the textures look even worse than before. That's where I am now. I tried searching for other libraries, but most of them seemed to be either 3D libraries or probably as weak as p5, so I'm here to ask: is there a good library I can work with that won't bring me such problems? Or is there something I can do to fix my problem with PIXI? Or should I just resort to using WebGL without any additional library (which at this point seems like the best solution to me)?
  2. Hi guys! Since I started using BJS, I have always been kind of stuck with lightmap blending in my materials. I was working with the standard workflow, but now that I'm testing PBR I still have the same issue, hence this thread to understand the blocking factor more deeply. After many, many tests, I have never succeeded in getting (basically) the same render from my 3D lighting scene into BJS, so I consistently have to tweak my hundreds of lightmap files in Photoshop to increase contrast/luminosity. These lightmaps come from the usual RenderToTexture passes of 3D software: 3DSMax & VRay (from a RawTotalLighting pass), or Blender & Cycles (from a Diffuse Direct & Indirect pass). Having to manually tweak these lightmaps forces me to light my scenes blind (which is of course very annoying) and prevents a WYSIWYG workflow (like: tweak a lightmap > save > reload the BJS scene > re-tweak > etc.). So here is a test scene to discuss: http://www.babylonjs-playground.com/#59ZXVF#11 (sources) (here is the lightmap of the room). You can easily switch or add a lightmap mode using line 32; the default here is the lightmap file set in lightmapTexture and used as a shadowmap. Because I'm a deviant person, I wanted to compare with another engine, so I made a scene on https://playcanv.as/b/OuZIGpY0/ : here you can see that the white data in my lightmap files is read in a better way; it burns the materials a little, and I think the dark data is darker too. (Note that PlayCanvas allows you to send HDR files and convert them to RGBM, giving an interesting result, but in the link above I used exactly the same PNG file as in BJS, of course.) And the result is totally satisfying. So, have I missed a magic parameter? Does the shader need a little tweaking? Is my workflow totally wrong? How do you personally deal with lightmaps?
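    (For reference, a minimal sketch of the lightmap hookup being discussed, as I understand the StandardMaterial API; the file name is made up:)

        var mat = new BABYLON.StandardMaterial("room", scene);
        var lightmap = new BABYLON.Texture("room_lightmap.png", scene); // hypothetical file
        mat.lightmapTexture = lightmap;
        mat.useLightmapAsShadowmap = true;  // modulates the lighting instead of adding to it
        // the other modes to compare: mat.useLightmapAsShadowmap = false,
        // or assigning the same texture to mat.ambientTexture instead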
  3. I wanted to make a tiled floor in the past. In that case, it was a tool scene to test walking poses, so performance did not matter. It was also meant to make it easy to count distance. When I tried to do it with a texture, it was always blurry. I just wrote a quick script to make a mesh with a bunch of black squares, and set the clear color of the scene to white. Here is what I got. Now, I would like the same effect & quality for a real scene (no shadows required). I am not sure in advance how big each square should be, so it needs to be adjustable to try different sizes, and I do not want to use white for the scene's clear color. Is this a good candidate for a procedural texture? If so, it seems there are a number of variations in the framework & extensions (ShaderBuilder). What would you recommend? Here is the mesh:
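    (For comparison, a minimal sketch of doing the squares fully in a ShaderMaterial instead of a procedural texture; the shader name and the squareSize uniform are mine, not from the framework:)

        BABYLON.Effect.ShadersStore["gridVertexShader"] = `
        precision highp float;
        attribute vec3 position;
        uniform mat4 worldViewProjection;
        varying vec3 vPos;
        void main(void) {
            vPos = position;
            gl_Position = worldViewProjection * vec4(position, 1.0);
        }`;
        BABYLON.Effect.ShadersStore["gridFragmentShader"] = `
        precision highp float;
        varying vec3 vPos;
        uniform float squareSize;
        void main(void) {
            // checker pattern in object space: black squares on white
            vec2 cell = floor(vPos.xz / squareSize);
            float checker = mod(cell.x + cell.y, 2.0);
            gl_FragColor = vec4(vec3(checker), 1.0);
        }`;
        var gridMat = new BABYLON.ShaderMaterial("grid", scene, "grid", {
            attributes: ["position"],
            uniforms: ["worldViewProjection", "squareSize"]
        });
        gridMat.setFloat("squareSize", 1.0); // adjustable, as required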
  4. Hi Folks, just out of curiosity, how do we avoid expensive glUseProgram calls internally in Babylon.js? Do we sort meshes by shader so that we can minimize shader changes?
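    (For anyone else wondering, the idea the question describes looks roughly like this; a sketch only, not Babylon's actual renderer code, and renderQueue / programId / drawMesh are hypothetical names:)

        // sort by a per-program key so consecutive meshes share the bound program
        renderQueue.sort(function (a, b) { return a.programId - b.programId; });
        var lastProgram = null;
        for (var i = 0; i < renderQueue.length; i++) {
            var item = renderQueue[i];
            if (item.program !== lastProgram) { // bind only on real changes
                gl.useProgram(item.program);
                lastProgram = item.program;
            }
            drawMesh(item); // hypothetical draw helper
        }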
  5. Hello BJS community! I just began to understand HDR textures, gamma correction and so on, in order to learn how to do IBL. In this process, I used BABYLON.HDRCubeTexture to convert my equirectangular HDR texture to a usable environment HDR cubemap, as explained here. Then I need to apply a convolution to this cubemap to obtain my final irradiance cubemap, which I will sample during IBL. To compute my irradiance cubemap from the environment cubemap, I use a RenderTargetTexture. Up to there, everything works fine! In the tutorial linked above, the author uses OpenGL and doesn't worry about output colors exceeding the [0..1] range. It's useful to keep HDR textures until the last step, where he tone maps the result. I learned the hard way that it's not as simple with WebGL: when I store colors outside the [0..1] range and then sample the result in another shader, the values have been clamped to [0..1]. This Stack Overflow question taught me that not only do I need to use a floating-point texture, but I also have to render to a floating-point framebuffer. Or something like that; I never dived into pure WebGL code. To render to a floating-point framebuffer, I need to enable the EXT_color_buffer_float extension (only available with WebGL 2), but that doesn't seem to be enough. I think I also need to configure the framebuffer with pure WebGL code. So, my question is: is it possible yet to render colors outside the [0..1] range using Babylon.js? How? If this isn't ready yet, I'll normalize and denormalize data at each step, of course. But I would love to know if doing it the ideal way is possible. Thank you a lot in advance!
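    (What I would try first, as a sketch; not verified, but RenderTargetTexture accepts a texture type parameter, and a float type should let values outside [0..1] survive between passes:)

        var irradianceRT = new BABYLON.RenderTargetTexture(
            "irradiance", 512, scene,
            false,                            // generateMipMaps
            true,                             // doNotChangeAspectRatio
            BABYLON.Engine.TEXTURETYPE_FLOAT  // float pixels; still needs the extension support discussed above
        );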
  6. Best way to handle Character with shadow

    I was wondering what is the best way to handle a shadow for a "character". I made a character and I want to put a shadow at the bottom, a kind of "circle" with opacity; for convenience, let's say it's going to be a transparent PNG. The ways I can think of:
    - (1) Just add the sprite to the game; this leads to duplicated calculations in the update part, because I have to update both the "character" sprite and its shadow;
    - (2) Add the shadow as a child sprite of the "character"; this leads to having the shadow on top of the sprite instead of behind it;
    - (3) Add the "character" sprite as a child of the shadow sprite; this seems illogical, but it works somehow;
    - (4) Create a group and add both the shadow and the sprite (I haven't tested it, but most likely this requires defining specific properties on the group for the size, the position, and the overlap boundaries for collision relative to the "character" sprite and the shadow); see the sketch after this list;
    - (5) Use a filter (shader) applied to the whole game, with my character's position as a uniform, to render the shadow directly on the "groundLayer".
    What do you think, please? Is there a better way that I am missing? Thanks.
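    (A sketch of option (4), assuming Phaser 2; the 'shadow' and 'hero' asset keys and the offsets are made up:)

        var character = game.add.group();
        var shadow = character.create(0, 0, 'shadow');
        shadow.anchor.set(0.5);
        var hero = character.create(0, -6, 'hero'); // created after the shadow, so drawn on top of it
        hero.anchor.set(0.5, 1.0);
        // moving the group moves both sprites, so the update logic stays in one place
        character.x = 100;
        character.y = 200;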
  7. Hi there - I'm going down a rabbit hole trying to implement a color grading / LUT shader for PIXI. Color grading is where you use a sprite as a lookup table to quickly transform one set of colors to another - this is handy for applying real-time contrast and color adjustments. I'm using this as a reference: https://www.defold.com/tutorials/grading/ I've created a filter/shader using the code in the link above:

        var ColorGradingShader = function(LUTSprite) {
            var my = this;
            var code = `
            precision lowp float;
            uniform vec4 filterArea;
            varying vec2 vTextureCoord;
            uniform sampler2D uSampler;
            uniform sampler2D lut;

            #define MAXCOLOR 15.0
            #define COLORS 16.0
            #define WIDTH 256.0
            #define HEIGHT 16.0

            void main() {
                vec4 px = texture2D(uSampler, vTextureCoord.xy);
                float cell = px.b * MAXCOLOR;
                float cell_l = floor(cell);
                float cell_h = ceil(cell);
                float half_px_x = 0.5 / WIDTH;
                float half_px_y = 0.5 / HEIGHT;
                float r_offset = half_px_x + px.r / COLORS * (MAXCOLOR / COLORS);
                float g_offset = half_px_y + px.g * (MAXCOLOR / COLORS);
                vec2 lut_pos_l = vec2(cell_l / COLORS + r_offset, g_offset);
                vec2 lut_pos_h = vec2(cell_h / COLORS + r_offset, g_offset);
                vec4 graded_color_l = texture2D(lut, lut_pos_l);
                vec4 graded_color_h = texture2D(lut, lut_pos_h);
                vec4 graded_color = mix(graded_color_l, graded_color_h, fract(cell));
                gl_FragColor = graded_color;
            }
            `;
            PIXI.Filter.call(my, null, code);
            my.uniforms.lut = LUTSprite.texture;
        }

        ColorGradingShader.prototype = Object.create(PIXI.Filter.prototype);
        ColorGradingShader.prototype.constructor = ColorGradingShader;
        export default ColorGradingShader;

    I then add this to my top-level container:

        // relevant code from a wrapping class
        this.colorGradingSprite = new PIXI.Sprite.fromImage('/img/lut16.png');
        this.pixiContainer.filters = [this.colorGradingFilter];

    When using any LUT image, including the default without any color adjustments, I go from this: to this: I'm assuming there are some adjustments necessary to either the shader code or to how the LUT sprite itself is being loaded - I have no clue... Any help would be greatly appreciated! And for those curious, here's my end goal: Thanks, Sean
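    (A hunch worth ruling out, assuming Pixi v4 and not verified: with a 256x16 power-of-two image, Pixi may generate mipmaps, and since LUT lookup coordinates jump wildly between neighbouring screen pixels, the GPU can pick tiny mip levels and wash everything out. Something like:)

        var lutTexture = PIXI.Texture.fromImage('/img/lut16.png');
        lutTexture.baseTexture.mipmap = false; // keep plain linear filtering:
        // the shader's half-texel offsets rely on it within a cell
        my.uniforms.lut = lutTexture;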
  8. Hi, I'm making this post to share some unexpected experiences with shader code. You may run into a lot of them, but you can fix them easily.
  9. Hi, does anyone know why changing the value of "pointSize" doesn't influence the size of the points of a ShaderMaterial?

        shaderMaterial.pointsCloud = true;
        shaderMaterial.pointSize = 10;
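    (A guess at the cause, not verified against the source: pointSize is only injected into Babylon's built-in shaders, while a custom ShaderMaterial has to write gl_PointSize itself, e.g. through a uniform of your own:)

        BABYLON.Effect.ShadersStore["pointsVertexShader"] = `
        precision highp float;
        attribute vec3 position;
        uniform mat4 worldViewProjection;
        uniform float pointSize; // custom uniform standing in for shaderMaterial.pointSize
        void main(void) {
            gl_Position = worldViewProjection * vec4(position, 1.0);
            gl_PointSize = pointSize; // WebGL only sizes points from inside the vertex shader
        }`;
        // ...
        shaderMaterial.setFloat("pointSize", 10.0);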
  10. I want to draw a lot of spheres with different colors and matrices. Since they all share the same geometry, instanced drawing will be an effective way to do it. Babylon provides built-in instanced meshes with different matrices, but other instance attributes like 'color' don't seem to be supported. So I tried to write a custom material, but the documentation about custom materials/shaders doesn't seem to cover instanced drawing. Can anyone give me an example or any tips on writing a custom instanced material? Thanks.
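    (A sketch of the per-instance color route, assuming a Babylon version recent enough to have registerInstancedBuffer; the buffer then shows up in a custom vertex shader as a regular vec4 attribute named "color":)

        var sphere = BABYLON.MeshBuilder.CreateSphere("s", { diameter: 1 }, scene);
        sphere.registerInstancedBuffer("color", 4); // 4 floats per instance
        sphere.instancedBuffers.color = new BABYLON.Color4(1, 0, 0, 1);

        var inst = sphere.createInstance("s1");
        inst.position.x = 2;
        inst.instancedBuffers.color = new BABYLON.Color4(0, 1, 0, 1);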
  11. I took a quick look at the source code, and it seems that we have no way to update only one element of a uniform array? Just like below:

        // in JavaScript at init time
        var someVec2Element0Loc = gl.getUniformLocation(someProgram, "u_someVec2[0]");
        var someVec2Element1Loc = gl.getUniformLocation(someProgram, "u_someVec2[1]");
        var someVec2Element2Loc = gl.getUniformLocation(someProgram, "u_someVec2[2]");

        // at render time
        gl.uniform2fv(someVec2Element0Loc, [1, 2]);  // set element 0
        gl.uniform2fv(someVec2Element1Loc, [3, 4]);  // set element 1
        gl.uniform2fv(someVec2Element2Loc, [5, 6]);  // set element 2

    Well, I need to hack it like this:

        var locs = engine.getUniforms(material.getEffect()._program, ['test1[0]']);
        engine.setFloat3(locs[0], 0.0, 1.0, 0.0);
  12. Hi Folks, I have a question about the shaders used in Babylon. I saw some special stuff like the line below. I searched for __decl__lightFragment but could not find it, and this line doesn't seem to be standard GLSL grammar. Could someone explain how it works?

        #include<__decl__lightFragment>[0..maxSimultaneousLights]

    The background is that I am using the CustomMaterial made by NasimiAsl, and I would like to add some custom uniforms like the lights do, so the stuff above interests me. Thanks.
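    (My understanding of that syntax, as a behavioural sketch rather than Babylon's actual source: #include<...> is resolved by Babylon's Effect class from its shader include store before the GLSL ever reaches the driver, and the [0..N] suffix repeats the included block once per index, substituting a {X} placeholder. So an include body such as

        uniform vec4 vLightData{X};

    expanded with [0..maxSimultaneousLights] where maxSimultaneousLights is 2 becomes:

        uniform vec4 vLightData0;
        uniform vec4 vLightData1;

    which is why the name never appears verbatim in the compiled shader.)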
  13. Is there a way to pass a video texture to a shader, similar to the way an image texture is passed?

        shaderMaterial.setTexture("textureSampler", new BABYLON.Texture(imgTexture, scene));

    I'm wondering if there are more ways to use video textures in Babylon. I have seen that I can set a BABYLON.VideoTexture as the diffuseTexture of a material, but that seems limiting. What if I want to manipulate an object whose material has a video texture, in the vertex + fragment shaders?
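    (I would expect VideoTexture to drop straight into setTexture, since it is a Texture subclass; a sketch, with a made-up file name:)

        var videoTex = new BABYLON.VideoTexture("video", ["textures/water.mp4"], scene, true);
        shaderMaterial.setTexture("textureSampler", videoTex); // sampled like any other texture in your shaders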
  14. Optical illusions

    I'm messing around with shaders and made an optical illusion, sort of by accident: https://www.babylonjs-playground.com/#WX2PRW#4 I'm a fan of M.C. Escher, so maybe one day I will try to build one of his scenes that depends on perspective... Does anybody else have one to share?
  15. Hi, I'm new to Babylon.js and trying to use it to create geometry that has animated vertices and an animated procedural texture. I'm animating the vertices in the vertex shader. For the procedural texture, I tried to follow the instructions at https://doc.babylonjs.com/how_to/how_to_use_procedural_textures as well as checking the playground example: https://www.babylonjs-playground.com/#24C4KC#17 The problem with the example is that I can't really find a complete implementation with the shader/config.json files. And I have a couple of basic questions as well. When creating a custom procedural texture with an external folder containing the config.json and custom.fragment.fx files, is that the only fragment shader that can be used in the scene, or can a BABYLON.ShaderMaterial be used additionally? I'm having a hard time grasping the concept of the 'fragment shader' of a procedural texture VS a fragment shader as the last step of the WebGL pipeline. Thanks.
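    (As I understand it, the two are independent: a CustomProceduralTexture renders its fragment shader off-screen into a texture, and that texture can then feed any material in the scene, including a ShaderMaterial that has its own fragment shader. A sketch, assuming the folder layout from the how-to:)

        // renders ./textures/custom/custom.fragment.fx (with config.json) into a 512x512 texture
        var proc = new BABYLON.CustomProceduralTexture("proc", "./textures/custom", 512, scene);
        // the result is just a texture: usable on a StandardMaterial...
        material.diffuseTexture = proc;
        // ...or sampled inside a separate ShaderMaterial's own fragment shader
        shaderMaterial.setTexture("textureSampler", proc);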
  16. I'm experimenting with GLSL shaders in Pixi 4.4, and was trying to make some that would take in two images, the base and an overlay. The shaders would then replace either the Hue, Saturation, or Value of the pixels in the base image with the H/S/V of the corresponding pixel in the overlay image. Transparency on the overlay means "no change." For my tests, I used a 100x100 red square and the following 100x100 "striped" overlays: That's the Hue overlay, Saturation overlay, and Value overlay, respectively. Results were only partially consistent, and all wrong. Here are the results of the Hue and Saturation shaders against a black background. Okay, so the stripes are twice as tall as they should be, and the bottom-most stripe is outright missing, as though either the overlay or the base were resized at some point in the process. But Value takes the cake: not only do we have the same problem as above, but there are "teeth" that have appeared off the edge of the 100x100 image (4px in each direction, making a 108x100 image), and there are outlines around every stripe; if you zoom in especially close, you can see that some of the outlines are actually 2 pixels tall, one of near-black and one of another dark colour, none of which is in the original Value overlay! I'm at a loss to tell whether the problem(s) originate in my shader code or in Pixi, especially since tutorials around the net are mum about how to create a second sampler2D uniform in any other setup but Pixi. I do want to work with Pixi for this project, however, so a fix for Pixi would be appreciated if the problem really is from there. Here's the HTML/GLSL code. Please don't mind the if statement; I've already had a few ideas on how to get rid of it:

        <html>
        <head>
            <meta content="text/html;charset=utf-8" http-equiv="Content-Type">
            <meta content="utf-8" http-equiv="encoding">
            <style>
                body { background-color: black; margin: 0; overflow: hidden; }
                p { color: white; }
            </style>
        </head>
        <body>
        <script type="text/javascript" src="libs/pixi.js"></script>
        <script id="shader" type="shader">
        #ifdef GL_ES
        precision mediump float;
        #endif
        varying vec2 vTextureCoord;
        uniform sampler2D uSampler; // The base image
        uniform sampler2D overlay;  // The overlay

        vec3 rgb2hsv(vec3 c) {
            vec4 K = vec4(0.0, -1.0 / 3.0, 2.0 / 3.0, -1.0);
            vec4 p = mix(vec4(c.bg, K.wz), vec4(c.gb, K.xy), step(c.b, c.g));
            vec4 q = mix(vec4(p.xyw, c.r), vec4(c.r, p.yzx), step(p.x, c.r));
            float d = q.x - min(q.w, q.y);
            float e = 1.0e-10;
            return vec3(abs(q.z + (q.w - q.y) / (6.0 * d + e)), d / (q.x + e), q.x);
        }

        vec3 hsv2rgb(vec3 c) {
            vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
            vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
            return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
        }

        void main(void) {
            vec4 baseRGB = texture2D(uSampler, vTextureCoord);
            vec4 overlayRGB = texture2D(overlay, vTextureCoord);
            if (overlayRGB.a > 0.0) {
                vec3 baseHSV = rgb2hsv(baseRGB.rgb);
                vec3 overlayHSV = rgb2hsv(overlayRGB.rgb);
                // Hue
                // vec3 resultHSV = vec3(overlayHSV.x, baseHSV.y, baseHSV.z);
                // Saturation
                // vec3 resultHSV = vec3(baseHSV.x, overlayHSV.y, baseHSV.z);
                // Value
                vec3 resultHSV = vec3(baseHSV.x, baseHSV.y, overlayHSV.z);
                vec3 resultRGB = hsv2rgb(resultHSV);
                gl_FragColor = vec4(resultRGB.rgb, baseRGB.a);
            } else {
                gl_FragColor = baseRGB;
            }
        }
        </script>
        <script type="text/javascript" src="replaceTest.js"></script>
        </body>
        </html>

    And here's the JS:

        var width = window.innerWidth;
        var height = window.innerHeight;
        var renderer = new PIXI.WebGLRenderer(width, height);
        document.body.appendChild(renderer.view);

        var stage = new PIXI.Container();
        var sprite = PIXI.Sprite.fromImage('flat.png');
        sprite.x = width / 2; // Set it at the center of the screen
        sprite.y = height / 2;
        sprite.anchor.set(0.5); // Make sure the center point of the image is at its center, instead of the default top left
        stage.addChild(sprite);

        // Create a uniforms object to send to the shader
        var uniforms = {};
        uniforms.overlay = {
            type: 'sampler2D',
            value: PIXI.Texture.fromImage('stripesVal.png') // or stripesSat, stripesHue, etc
        };

        // Get shader code as a string
        var shaderCode = document.getElementById("shader").innerHTML;

        // Create our Pixi filter using our custom shader code
        var rasShader = new PIXI.Filter(null, shaderCode, uniforms);
        console.log(rasShader.uniforms);
        sprite.filters = [rasShader];

        function update() {
            requestAnimationFrame(update);
            renderer.render(stage);
        }
        update();

    Any help would be appreciated!
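    (A hunch about the half-size stripes, based on how Pixi v4 filters work; hedged, not verified against 4.4 specifically: a filter's vTextureCoord is normalized to the temporary, padded filter texture rather than to the sprite, so sampling the overlay with it stretches the overlay. Remapping through Pixi's filterArea uniform, with the sprite's size passed in as a custom uniform, should line the two images up. "dimensions" below is my name, not a Pixi one:)

        // extra uniform fed from JS:
        //   uniforms.dimensions = { type: 'vec2', value: [100.0, 100.0] }; // sprite size in px
        uniform vec4 filterArea;   // provided automatically by Pixi v4
        uniform vec2 dimensions;   // custom: the base sprite's width/height

        void main(void) {
            // vTextureCoord spans the padded filter texture; undo that padding so
            // (0,0)..(1,1) covers exactly the sprite before sampling the overlay
            vec2 spriteCoord = vTextureCoord * filterArea.xy / dimensions;
            vec4 baseRGB = texture2D(uSampler, vTextureCoord);
            vec4 overlayRGB = texture2D(overlay, spriteCoord);
            // ... HSV replacement as before ...
        }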
  17. I am trying to get terrain splatmap support into the toolkit: a terrain built with up to 12 separate textures, with the 12 matching normal maps (if using bump), plus the up to 4 actual splatmaps (alphamaps) used to 'SPLAT' the textures onto the terrain. That's a total of 28 additional textures (besides any lightmap or reflection textures) needed to create a max-detail terrain. That's way too many textures for WebGL: my browser only supports using a max of 16 textures at once, even in a textureArray, and IPHONES only support 8 textures at once... see GLSL MAX_TEXTURE_IMAGE_UNITS. So I created a texture atlas system in the Babylon Toolkit (I also use this for my texture atlas baking tools) to pack textures into a texture atlas and return an array of UV-coordinate rectangle structs, one for each tile or cell in the atlas. Example atlasRect array:

        "atlasRect1": [ 0.0, 0.0, 0.5, 0.25 ],
        "atlasRect2": [ 0.0, 0.5, 0.5, 0.25 ],
        "atlasRect3": [ 0.25, 0.0, 0.5, 0.25 ],
        "atlasRect4": [ 0.25, 0.5, 0.5, 0.25 ],
        "atlasRect5": [ 0.5, 0.0, 0.5, 0.25 ],
        "atlasRect6": [ 0.75, 0.0, 0.5, 0.25 ],
        "atlasRect7": [ 0.5, 0.5, 0.5, 0.25 ],

    The matching atlasInfo array contains the texture tile or cell information (uScale, vScale, uOffset, vOffset):

        "atlasInfo1": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo2": [ 100.0, 100.0, 0.0, 0.0 ],
        "atlasInfo3": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo4": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo5": [ 100.0, 100.0, 0.0, 0.0 ],
        "atlasInfo6": [ 80.0, 80.0, 0.0, 0.0 ],
        "atlasInfo7": [ 80.0, 80.0, 0.0, 0.0 ],

    In the texture atlas image I create, note that the first tile or cell is the bottom left. I don't know all the ins and outs of what to multiply by what to get the desired effect. Now I need to get the tile or cell from the texture atlas and use atlasInfo.xy (uScale and vScale). Here are the two main functions I added to the shader:

        vec4 textureAtlas2D(sampler2D atlas, vec4 rect, vec2 uv, vec2 offset) {
            vec2 atlasUV = vec2((uv.x * rect.w) + rect.x, (uv.y * rect.z) + rect.y);
            return texture2D(atlas, atlasUV + offset);
        }

        vec4 textureFract2D(sampler2D atlas, vec4 rect, vec2 scale, vec2 uv, vec2 offset) {
            vec2 fractUV = fract(uv * scale);
            return textureAtlas2D(atlas, rect, fractUV, offset);
        }

    textureAtlas2D uses the rectangle holding the UV coords from above to get just the 'desired cell'. This works great... EXCEPT IT DOES NOT SCALE... The ONLY code I could find after months of Google searching for GLSL texture atlas tiling (or scaling) was to use the GLSL fract() function to REPEAT into the texture atlas, giving you scale. So I created textureFract2D as a wrapper to incorporate 'uvScale'. Example snippet from my splatmap shader calling the texture atlas functions:

        #ifdef DIFFUSE
            // Splatmaps
            #ifdef splatmapDef
                vec4 splatColor = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor1 = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor2 = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor3 = vec4(0.0, 0.0, 0.0, 0.0);
                vec4 baseColor4 = vec4(0.0, 0.0, 0.0, 0.0);
                // Base splat colors (No Texture Tiling)
                if (splatmapRects > 0.0) { baseColor1 = textureAtlas2D(splatmap, splatmapRect1, vTerrainUV, uvOffset); }
                if (splatmapRects > 1.0) { baseColor2 = textureAtlas2D(splatmap, splatmapRect2, vTerrainUV, uvOffset); }
                if (splatmapRects > 2.0) { baseColor3 = textureAtlas2D(splatmap, splatmapRect3, vTerrainUV, uvOffset); }
                if (splatmapRects > 3.0) { baseColor4 = textureAtlas2D(splatmap, splatmapRect4, vTerrainUV, uvOffset); }
                // Texture atlas colors (Use Texture Tiling)
                if (atlasInfos > 0.0 && atlasRects > 0.0) {
                    splatColor = textureFract2D(diffuseSampler, atlasRect1, atlasInfo1.xy, vTerrainUV, uvOffset) * baseColor1.r;
                    if (atlasInfos > 1.0 && atlasRects > 1.0) { splatColor += textureFract2D(diffuseSampler, atlasRect2, atlasInfo2.xy, vTerrainUV, uvOffset) * baseColor1.g; }
                    if (atlasInfos > 2.0 && atlasRects > 2.0) { splatColor += textureFract2D(diffuseSampler, atlasRect3, atlasInfo3.xy, vTerrainUV, uvOffset) * baseColor1.b; }
                    // Second splat colors
                    if (atlasInfos > 3.0 && atlasRects > 3.0) { splatColor += textureFract2D(diffuseSampler, atlasRect4, atlasInfo4.xy, vTerrainUV, uvOffset) * baseColor2.r; }
                    if (atlasInfos > 4.0 && atlasRects > 4.0) { splatColor += textureFract2D(diffuseSampler, atlasRect5, atlasInfo5.xy, vTerrainUV, uvOffset) * baseColor2.g; }
                    if (atlasInfos > 5.0 && atlasRects > 5.0) { splatColor += textureFract2D(diffuseSampler, atlasRect6, atlasInfo6.xy, vTerrainUV, uvOffset) * baseColor2.b; }
                    // Third splat colors
                    if (atlasInfos > 6.0 && atlasRects > 6.0) { splatColor += textureFract2D(diffuseSampler, atlasRect7, atlasInfo7.xy, vTerrainUV, uvOffset) * baseColor3.r; }
                    if (atlasInfos > 7.0 && atlasRects > 7.0) { splatColor += textureFract2D(diffuseSampler, atlasRect8, atlasInfo8.xy, vTerrainUV, uvOffset) * baseColor3.g; }
                    if (atlasInfos > 8.0 && atlasRects > 8.0) { splatColor += textureFract2D(diffuseSampler, atlasRect9, atlasInfo9.xy, vTerrainUV, uvOffset) * baseColor3.b; }
                    // Final splat colors
                    if (atlasInfos > 9.0 && atlasRects > 9.0) { splatColor += textureFract2D(diffuseSampler, atlasRect10, atlasInfo10.xy, vTerrainUV, uvOffset) * baseColor4.r; }
                    if (atlasInfos > 10.0 && atlasRects > 10.0) { splatColor += textureFract2D(diffuseSampler, atlasRect11, atlasInfo11.xy, vTerrainUV, uvOffset) * baseColor4.g; }
                    if (atlasInfos > 11.0 && atlasRects > 11.0) { splatColor += textureFract2D(diffuseSampler, atlasRect12, atlasInfo12.xy, vTerrainUV, uvOffset) * baseColor4.b; }
                }
                baseColor = splatColor;
            #else
                baseColor = texture2D(diffuseSampler, vDiffuseUV + uvOffset);
            #endif
            #ifdef ALPHATEST
                if (baseColor.a < 0.4)
                    discard;
            #endif
            #ifdef ALPHAFROMDIFFUSE
                alpha *= baseColor.a;
            #endif
            baseColor.rgb *= vDiffuseInfos.y;
            #ifdef splatmapDef
                #ifdef BUMP
                    //normalW = perturbNormals(viewDirectionW, baseColor1, baseColor2, baseColor3, baseColor4, uvOffset, atlas1UV, atlas2UV, atlas3UV, atlas4UV, atlas5UV, atlas6UV, atlas7UV, atlas8UV, atlas9UV, atlas10UV, atlas11UV, atlas12UV);
                #endif
                #ifdef TWOSIDEDLIGHTING
                    normalW = gl_FrontFacing ? normalW : -normalW;
                #endif
            #endif
        #endif

    My problem is I don't know how to grab the cell info using the UV coords and apply the scaling needed. If I use the textureFract2D GLSL fract to do the scaling, I get edge seams (I will post screenshots in the next post). @Deltakosh pointed me in the particleSystem direction, because I guess it does something like using a SPRITESHEET and what they are calling a 'cellIndex' to get to the tile or cell. There is a bunch of code that deals with the sprite sheet width and some calculations to get a rowOffset and a columnOffset. Well, I don't have that kind of info; like I explained above, I have the ACTUAL UV COORDS for each tile or cell in the texture atlas. But I am STILL A GAME DEV NEWBIE, and I don't know what I need to do to use those UV coords and info.xy (uScale, vScale) to get the desired effect. This is the gist of what the particle system does for texture atlas or sprite sheet support:

        //vec2 offset = options.zw; // Dunno what this is - ???
        //attribute float cellIndex; // Dunno what this is - ???
        //uniform vec3 particlesInfos; // x (number of rows) y (number of columns) z (rowSize)
        //#ifdef ANIMATESHEET
        //float rowOffset = floor(cellIndex / particlesInfos.z);
        //float columnOffset = cellIndex - rowOffset * particlesInfos.z;
        //vec2 uvScale = particlesInfos.xy;
        //vec2 uvOffset = vec2(offset.x, 1.0 - offset.y);
        //vUV = (uvOffset + vec2(columnOffset, rowOffset)) * uvScale;
        //#else
        //vUV = offset;
        //#endif

    I have no idea how to take this and adapt it to using the actual UV coords WITH SCALE. PLEASE, ANYBODY, I AM BEGGING (AGAIN)... HELP ME RE-WRITE textureFract2D to get the desired effect. Here are my shader programs so far: splatmap.vertex.fx splatmap.fragment.fx

    UPDATE: You can download the Test Terrain export project, edit the shader in the src/shader folder directly, and just hit refresh in your browser to see the effect. Look at the next post for example screenshots... THANKS FOR READING THIS FAR. Pinging @Deltakosh and @Sebavan and @Pryme8 and @adam and, last but not least, my man 'Wingy' at @Wingnut... Any thoughts, guys??? Yo @NasimiAsl... The Shader Guru... Maybe you can have another crack at it, but this time use the Test Terrain project from above and change the splatmap shader and hit refresh... Even better than the playground: you get the whole export project, for easy access.
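    (One direction that might be worth trying, hedged: it targets the usual cause of fract() atlas seams, namely mip-level discontinuities where the tiled UV wraps, and assumes the OES_standard_derivatives and EXT_shader_texture_lod extensions are available in WebGL 1. Same contract as the textureFract2D above; only the sampling changes:)

        #extension GL_OES_standard_derivatives : enable
        #extension GL_EXT_shader_texture_lod : enable

        vec4 textureFract2D(sampler2D atlas, vec4 rect, vec2 scale, vec2 uv, vec2 offset)
        {
            vec2 tiled = uv * scale;      // continuous tiling coordinate
            vec2 fractUV = fract(tiled);  // wrapped into one cell
            vec2 atlasUV = vec2((fractUV.x * rect.w) + rect.x,
                                (fractUV.y * rect.z) + rect.y) + offset;
            // take the derivatives from the continuous coordinate so the mip level
            // does not jump at the wrap line (the jump is what draws the seams)
            vec2 dx = dFdx(tiled) * vec2(rect.w, rect.z);
            vec2 dy = dFdy(tiled) * vec2(rect.w, rect.z);
            return texture2DGradEXT(atlas, atlasUV, dx, dy);
        }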
  18. Water shader approach?

    I'm struggling with an approach to add a water effect to an area of a container. I'd like to have the area in green distort the area below it via a shader, but also have the ability for this water area to "rise/fall". The rise/fall is no problem when it's just a sprite I can move up and down as necessary, but being able to apply a shader to just that specific area is throwing me for a loop. Adding a filter directly to the sprite won't distort anything, since it isn't accounting for the pixels behind it. Is there a way to capture this particular area, save it to a buffer, apply a filter, and put it back in place? RenderTexture seems like it might be in the right direction, but I can't see where I can grab an area of an existing container... Any help would be much appreciated!
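    (One direction that may work, assuming Pixi v4; a sketch only, where background and displacementSprite are my names: a display object's filterArea limits where its filters run, and that rectangle can track the water line.)

        var waterFilter = new PIXI.filters.DisplacementFilter(displacementSprite);
        background.filters = [waterFilter];

        function setWaterLevel(waterTop) {
            // the filter only processes pixels below the water line
            background.filterArea = new PIXI.Rectangle(
                0, waterTop, renderer.width, renderer.height - waterTop);
        }
        setWaterLevel(300); // rise/fall by moving the rectangle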
  19. Playing with shader

    Hello guys, see the GIF please: https://ibb.co/kZkrBR I wonder how to do something like this in Babylon.js. I did this in an OpenGL project by sending a value to a uniform in the fragment shader. Should I do the same with Babylon.js? Should I modify Babylon.js's shaders, or is there a better way to do that?
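    (The uniform-driven approach carries over directly, without touching Babylon's built-in shaders; a sketch, where the "progress" uniform name is mine:)

        var mat = new BABYLON.ShaderMaterial("fx", scene, "fx", {
            attributes: ["position", "uv"],
            uniforms: ["worldViewProjection", "progress"]
        });
        var t = 0;
        scene.registerBeforeRender(function () {
            t += scene.getEngine().getDeltaTime() / 1000;
            mat.setFloat("progress", t); // same idea as glUniform1f in the OpenGL version
        });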
  20. This is a small tutorial in which we learn how to transfer an existing shader from the Shadertoy library to PlayCanvas. Mastering this process opens up many possibilities for using open-source shaders and effects in your projects. http://pirron.one/playingincanvas/color-palettes-using-shaders This tutorial is for advanced users and assumes that you are already aware of how the PlayCanvas shader chunks system works. You can study this tutorial to learn more about how to get started with PlayCanvas and shaders.
  21. I'm trying to develop something like this using Babylon.js (the example is implemented in Unity 3D): I suppose this can be implemented using a shader. The user can move a sphere with inverted normals, and the shader has to calculate whether this sphere is painted, depending on its position in the z-buffer. Is it possible to do this using Babylon's ShaderMaterial? Thanks in advance.
  22. Quick question: is there any way to fade out a ShaderMaterial'd mesh? mesh.material.alpha = X does not seem to have any effect on the alpha value, unless I am doing something wrong. Am I missing something, or is it not possible? (I can probably fake it by adding an alpha uniform to the fragment shader and explicitly passing it in; however, this would be painful to animate.) Also, if I go the uniform route, I'm not sure how much overhead (if any) it adds to set the uniform value for the shader before render. I've noticed that a lot of games similar to mine (digital card games) have their shader effects timed in sync for a particular effect, which led me to the assumption that there is probably a performance reason behind it (because having the effects fire at different start times would look better). But maybe I'm way off the mark with that assumption; does anyone happen to know?
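    (For the uniform route, a sketch of what I'd expect to work; the "fade" uniform name is mine. ShaderMaterial has to be told it alpha-blends, and the per-frame cost of setFloat is a single glUniform1f, which should be negligible:)

        var mat = new BABYLON.ShaderMaterial("card", scene, "card", {
            attributes: ["position", "uv"],
            uniforms: ["worldViewProjection", "fade"],
            needAlphaBlending: true // without this the material renders opaque
        });
        mat.setFloat("fade", 0.5);
        // fragment shader side: gl_FragColor = vec4(color.rgb, color.a * fade);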
  23. Hello everyone! I'm making a fluid simulation effect. I found a demo, http://jeeliz.com/demos/water/, and I want to clone it. The demo is written with WebGL + shaders, but I don't know WebGL, so I'm worried. I'm trying to port it to Babylon.js, and there are a few things I don't know how to do. I'm really stuck; I hope the Babylon community can help me.

        1.
        var texture_water = GL.createTexture();
        GL.bindTexture(GL.TEXTURE_2D, texture_water);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE);
        GL.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE);
        GL.texImage2D(GL.TEXTURE_2D, 0, GL.RGBA, 512, 512, 0, GL.RGBA, GL.FLOAT, null);

        2.
        var quad_vertex = [-1, -1, 1, -1, 1, 1, -1, 1];
        var QUAD_VERTEX = GL.createBuffer();
        GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
        GL.bufferData(GL.ARRAY_BUFFER, new Float32Array(quad_vertex), GL.STATIC_DRAW);
        var quad_faces = [0, 1, 2, 0, 2, 3];
        var QUAD_FACES = GL.createBuffer();
        GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
        GL.bufferData(GL.ELEMENT_ARRAY_BUFFER, new Uint16Array(quad_faces), GL.STATIC_DRAW);
        GL.vertexAttribPointer(SHP_VARS.rendering.position, 2, GL.FLOAT, false, 8, 0);
        GL.bindBuffer(GL.ARRAY_BUFFER, QUAD_VERTEX);
        GL.bindBuffer(GL.ELEMENT_ARRAY_BUFFER, QUAD_FACES);
        GL.disableVertexAttribArray(SHP_VARS.rendering.position);

        3.
        GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
        GL.disableVertexAttribArray(SHP_VARS.water.position);
        GL.framebufferTexture2D(GL.FRAMEBUFFER, GL.COLOR_ATTACHMENT0, GL.TEXTURE_2D, texture_normals, 0);
        GL.useProgram(SHP_NORMALS);
        GL.enableVertexAttribArray(SHP_VARS.normals.position);
        GL.bindTexture(GL.TEXTURE_2D, texture_water);
        GL.drawElements(GL.TRIANGLES, 6, GL.UNSIGNED_SHORT, 0);
        GL.disableVertexAttribArray(SHP_VARS.normals.position);
        GL.bindFramebuffer(GL.FRAMEBUFFER, null);
        GL.flush();

    With the above code, how do I port this to Babylon.js? Thanks so much!
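    (A rough Babylon equivalent for part 1, as a sketch rather than a drop-in port. Parts 2 and 3, the full-screen quad and the ping-pong between framebuffers, roughly correspond to Babylon post-processes or RenderTargetTextures, which draw that quad for you:)

        // float RGBA texture with nearest filtering and clamping, matching the raw GL calls
        var textureWater = new BABYLON.RawTexture(
            null, 512, 512,
            BABYLON.Engine.TEXTUREFORMAT_RGBA, scene,
            false, false, // no mipmaps, no invertY
            BABYLON.Texture.NEAREST_SAMPLINGMODE,
            BABYLON.Engine.TEXTURETYPE_FLOAT
        );
        textureWater.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
        textureWater.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;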
  24. So, I'm trying to convert a shader from Shadertoy. I'm close, but still can't get it working. Also, in my actual scene it doesn't seem to be working at all, but it's hard to tell if that's related to the issue I am having with the conversion, since I need to rotate the sphere to get it to show up to begin with. The shader is here (it appears blank at first, but if you rotate it you'll start to see the fire; the actual effect I am going for you will see only if you rotate it just right, so that you see the fire starting with the white in the middle and filling up the sphere): http://cyos.babylonjs.com/#M11GKA The source shader is here: https://www.shadertoy.com/view/lsf3RH The one place I was not sure how to proceed was mapping over the iResolution variable (which Shadertoy states is the viewport resolution). I played around with a bunch of different things and ended up trying the camera input, which works but requires rotating the mesh to see it at all. Does anyone know what input would map over to the viewport resolution (or how to get it), and/or what I am doing wrong or missing here?
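    (For the iResolution part specifically, a sketch of what I'd try: feed the engine's render size into a uniform of your own each frame, assuming "iResolution" is declared in the ShaderMaterial's uniforms list:)

        var engine = scene.getEngine();
        scene.registerBeforeRender(function () {
            shaderMaterial.setVector2("iResolution",
                new BABYLON.Vector2(engine.getRenderWidth(), engine.getRenderHeight()));
        });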
  25. I'm trying to get started with this example: http://phaser.io/examples/v2/filters/blue-dots I used the download zip, but it seems to need a PHP server. I tried to work around that by using just the files and folders needed to run the example, but it doesn't seem to work. Can you give me some help? Thank you. Best regards.