Search the Community

Showing results for tags 'shader'.




Found 128 results

  1. I am trying to implement an "Additive Shader" (from the space shooter tutorial) where BLACK pixels are transparent (or do NOT ADD) and the rest of the color adds on top... Does the BabylonJS community already have a shader that does something like that? If not, I will have to make one... I tried to start off by just returning a transparent color:

```glsl
void main(void) {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
}
```

    I have "needsAlphaBlending = true" on the shader material options object, BUT I STILL SEE A BLACK SQUARE (a little less bright, but still there)... I would assume that setting a color of rgba(0, 0, 0, 0) would make EVERY pixel transparent... but it does not. Any help or info would be very cool
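A note on the additive idea above: an alpha of 0 only makes a pixel transparent under normal alpha blending; with additive blending the alpha channel is effectively ignored and black contributes nothing by construction, which is exactly the space-shooter glow trick. In Babylon this is typically done by setting the material's `alphaMode` to `BABYLON.Engine.ALPHA_ADD` rather than by writing rgba(0, 0, 0, 0). A tiny JS model of the ONE/ONE blend equation:

```javascript
// A minimal JS model of GL additive blending (ONE, ONE):
// out = src * 1 + dst * 1, clamped to [0, 1].
// Under this blend mode alpha is irrelevant, and a black source
// pixel simply adds nothing to what is already on screen.
function additiveBlend(src, dst) {
    return src.map((c, i) => Math.min(1, c + dst[i]));
}

// Black source leaves the destination untouched:
console.log(additiveBlend([0, 0, 0], [0.2, 0.4, 0.6])); // -> [0.2, 0.4, 0.6]

// A bright source adds on top (and clamps at 1):
console.log(additiveBlend([0.9, 0.5, 0.0], [0.2, 0.4, 0.6]));
```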
  2. Hello, I'm exploring the matrix functions inside Babylon.js. A matrix lib can be easy, see here: http://glmatrix.net/docs/module-mat4.html or very hard, see here: https://doc.babylonjs.com/api/classes/babylon.matrix I kind of understand where the problem occurs, but after 3 unsuccessful hours I think it might be better to ask the community. Thanks https://www.babylonjs-playground.com/#Y7Q181
  3. florentd

    Creating custom filter

    Hello, I'm trying to create a custom filter in Phaser to do dithering. I used this great resource as a reference: http://alex-charlton.com/posts/Dithering_on_the_GPU/ Here is my code (I modified the DotScreenFilter code):

```js
PIXI.DotScreenFilter = function() {
    PIXI.AbstractFilter.call(this);

    this.passes = [this];

    // set the uniforms
    this.uniforms = {
        scale: { type: '1f', value: 1 },
        angle: { type: '1f', value: 5 },
        dimensions: { type: '4fv', value: [0, 0, 0, 0] },
        indexMatrix4x4: { type: '16i', value: [0, 8, 2, 10, 12, 4, 14, 6, 3, 11, 1, 9, 15, 7, 13, 5] }
    };

    this.fragmentSrc = [
        'precision mediump float;',
        'varying vec2 vTextureCoord;',
        'varying vec4 vColor;',
        'uniform vec4 dimensions;',
        'uniform sampler2D uSampler;',
        'uniform float angle;',
        'uniform float scale;',
        'uniform int indexMatrix4x4[16];',
        'float indexValue() {',
        '    int x = int(mod(gl_FragCoord.x, 4));',
        '    int y = int(mod(gl_FragCoord.y, 4));',
        '    return float(indexMatrix4x4[(x + y * 4)]) / 16.0;',
        '}',
        'float dither(float color) {',
        '    float closestColor = (color < 0.5) ? 0.0 : 1.0;',
        '    float secondClosestColor = 1.0 - closestColor;',
        '    float d = indexValue();',
        '    float distance = abs(closestColor - color);',
        '    return (distance < d) ? closestColor : secondClosestColor;',
        '}',
        'void main() {',
        '    gl_FragColor = vec4(vec3(dither(vColor.a)), 1);',
        '}'
    ];
};

PIXI.DotScreenFilter.prototype = Object.create(PIXI.AbstractFilter.prototype);
PIXI.DotScreenFilter.prototype.constructor = PIXI.DotScreenFilter;

/**
 * The scale of the effect.
 * @property scale
 * @type Number
 */
Object.defineProperty(PIXI.DotScreenFilter.prototype, 'scale', {
    get: function() { return this.uniforms.scale.value; },
    set: function(value) { this.dirty = true; this.uniforms.scale.value = value; }
});

/**
 * The radius of the effect.
 * @property angle
 * @type Number
 */
Object.defineProperty(PIXI.DotScreenFilter.prototype, 'angle', {
    get: function() { return this.uniforms.angle.value; },
    set: function(value) { this.dirty = true; this.uniforms.angle.value = value; }
});
```

    When I run it, I get these errors: Any ideas what is wrong?
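To sanity-check the ordered-dithering logic from the filter above without involving the GPU, the Bayer-matrix math ports directly to JS (same 4x4 matrix and same `distance < d` rule as the shader):

```javascript
// 4x4 Bayer matrix, as in the filter's indexMatrix4x4 uniform.
const indexMatrix4x4 = [
     0,  8,  2, 10,
    12,  4, 14,  6,
     3, 11,  1,  9,
    15,  7, 13,  5
];

// Threshold for the pixel at (x, y), normalized to [0, 1).
function indexValue(x, y) {
    return indexMatrix4x4[(x % 4) + (y % 4) * 4] / 16;
}

// Snap a single channel `color` in [0, 1] to 0 or 1, using the
// positional threshold to decide borderline cases.
function dither(color, x, y) {
    const closest = color < 0.5 ? 0 : 1;
    const secondClosest = 1 - closest;
    const d = indexValue(x, y);
    const distance = Math.abs(closest - color);
    return distance < d ? closest : secondClosest;
}

console.log(dither(0.4, 1, 0)); // threshold at (1,0) is 8/16 = 0.5
```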
  4. Is there already a way to make a displacement filter in Phaser 3? Something like this: https://pixijs.io/examples/#/filters/displacement-map.js I would like to use it for explosions (shockwave effect). Any ideas on implementing this effect in P3? Regards
  5. Fenopiù

    Phaser 2.6.2 & shader vs new iPad

    Good morning everyone! I use a shader in my game that creates a starfield and then, after a while, slowly makes it disappear from the center out to the whole screen. On every device I've tested (PC, Android, iPhone 5c, first iPad, MacBook Pro, etc...) it works as I want. On the new iPad, the hole in the center, instead of being "transparent" or "invisible", is totally black. Does anyone have a clue how to solve this?
  6. hi, I'm making this post to share some unexpected behavior in shader code; you run into it a lot, but you can fix it easily
  7. See PG http://www.babylonjs-playground.com/#VFQCJR and take a look at console errors. Unable to compile effect: What I'm trying to do here is have the diffuse texture apply to the first set of UVs and the emissive texture apply to the second set of UVs (using emissiveTexture.coordinatesIndex = 2). I'm not sure if I'm doing this right (I couldn't find much documentation) or if it's a bug.
  8. I was unable to transform a mesh instance when the original mesh had a shader material. Is this a bug? I have created a playground example using a box mesh. If you open the debug -> mesh panel, you will see there are 6 meshes, but only 5 are displayed. This is because one of the instances in the second row was not transformed properly. On the other hand, clones of meshes with a shader material can be transformed properly. https://playground.babylonjs.com/#B2NZ1M#1
  9. Hello, is there a way to access texcoord1 in the shader as an attribute? I know I can access texcoord0 using `attribute vec2 uv;`, but uv0/uv1/uv2 etc. don't seem to do anything. This is essential for using lightmaps
  10. Hi guys! Since I started using BJS, I have always been kind of stuck with lightmap blending in my materials. I was working with the standard workflow, but now that I'm testing PBR I still have the same issue, hence this thread to dig deeper into the blocking factor. I have never managed to get (basically) the same render from my 3D lighting scene in BJS, after many many tests, so I consistently had to tweak my hundreds of lightmap files in Photoshop to increase contrast/luminosity. These lightmaps could come from the usual RenderToTexture passes of 3D software: 3DSMax & VRay (from a RawTotalLighting pass), or Blender & Cycles (from a Diffuse Direct & Indirect pass). Having to manually tweak these lightmaps forces me to light my scenes in a blind way (which is of course very annoying), and keeps me from a WYSIWYG workflow (like: tweak a lightmap > save > reload BJS scene > retweak > etc). So here is a test scene to discuss: http://www.babylonjs-playground.com/#59ZXVF#11 (sources) (here is the lightmap of the room). You can easily switch or add lightmap modes using line 32; the default here is the lightmap file set in lightmapTexture, used as a shadowMap. Because I'm a deviant person, I wanted to compare with another engine, so I made a scene on https://playcanv.as/b/OuZIGpY0/ : here you can see that the white data in my lightmap files is read in a better way; it burns the materials a little, and I think the dark data is darker too. (Note that PlayCanvas allows you to send HDR files and convert them to RGBM, giving an interesting result, but in the link above I used exactly the same png file as in BJS, of course.) And the result is totally satisfying. So, have I missed a magic parameter? Does the shader need a little tweaking? Is my workflow totally wrong? How do you personally deal with lightmaps?
  11. Hi! I did a Game Jam recently. Our group used the p5 library because someone suggested it to us, and indeed it helped us develop an OK prototype in a short amount of time. The game wasn't perfect at all of course, because 48 hours is still really short, but one of the main flaws that bothered me was the graphics integration; our artists liked working with pixel art, and we used a tileset to create the background map, which didn't tile right, with grey lines appearing between tiles; on top of that, the other sprites didn't look nice at high resolution either, because of linear filtering. After the Jam ended, I decided to try to fix these. I found the way to activate the nearest filter in p5, which made the graphics look way better, but still not as nice as I wanted. After googling a bit, I found this article, which was exactly what I was looking for. Then I noticed... in order to use shaders, I needed to switch the context to WebGL, which meant changing most of the code. This didn't actually bother me, so I started working on it. It was quite a nightmare (because of p5's bugs and lack of integration...) but when I got to the point where I had basically the same game as before, but switched to a WebGL context (without the shaders), I noticed HUGE performance drops (I'm talking 10 FPS for displaying some 400 sprites in a 500 * 500 context!!!). That's when I decided that I needed to switch to a different library. A little more googling later, I opted for the PIXI library, which, I must agree, is a nice library to work with, even though the official documentation is lacking a bit of information. I started recoding the game from scratch once again, and as soon as I could draw the map on screen, I tried to implement the pixel art shader; I copy-pasted the code and... magic! It didn't work. I was actually not that surprised, and decided to play around with the GLSL to see where the problem was.
    That's when I started noticing strange things: weird texture offsets that I tried to adjust for manually, which didn't work because the offset changed depending on the scaling of the image... and then the texture itself, scaling itself up and down when I scaled the sprite... After a while, and thanks to more googling, I found out that Pixi does some pre-processing on both the texture and the texture coordinates which are passed to the fragment shaders. I tried adjusting for it, I tried a lot, I got close, but it's still really messy, and float precision starts doing strange things that make the textures look even worse than before. That's where I am now. I tried searching for other libraries, but most of them seemed to be either 3D libraries, or probably as weak as p5, so I'm here to ask: is there a good library that I can work with which won't bring me such problems? Or is there something I can do to fix my problem with PIXI? Or should I just resort to using WebGL without any additional library (which at this point seems like the best solution to me)?
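The Pixi coordinate pre-processing described above trips up a lot of custom filters. In Pixi v4, a filter renders into a temporary texture that is usually larger than the display object, and `vTextureCoord` is normalized to that temporary texture, not to the sprite; the `filterArea` uniform (width, height, x, y) is what lets you recover pixel positions. A minimal JS model of that mapping (the function names are mine, not Pixi API):

```javascript
// Convert a filter-space texture coordinate to pixel coordinates,
// given filterArea = [width, height, x, y] of the filtered region
// inside the temporary render texture.
function filterCoordToPixel(vTextureCoord, filterArea) {
    return [
        vTextureCoord[0] * filterArea[0],
        vTextureCoord[1] * filterArea[1]
    ];
}

// And back again: pixel coordinates to filter-space coordinates.
function pixelToFilterCoord(pixel, filterArea) {
    return [pixel[0] / filterArea[0], pixel[1] / filterArea[1]];
}

console.log(filterCoordToPixel([0.5, 0.25], [512, 256, 0, 0])); // -> [256, 64]
```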
  12. I wanted to make a tiled floor in the past. In that case, it was a tool scene to test walking poses, so performance did not matter. It was also meant to make it easy to count distance. When I tried to do it with a texture, it was always blurry. I just wrote a quick script to make a mesh with a bunch of black squares, and set the clear color of the scene to white. Here is what I got. Now, I would like the same effect & quality for a real scene (no shadows required). I am not sure in advance how big each square should be, so it needs to be adjustable to try different sizes, and I do not want to use white as the scene's clear color. Is this a good candidate for a procedural texture? If so, it seems there are a number of variations in the framework & extensions (ShaderBuilder). What would you recommend? Here is the mesh:
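On the procedural question above: a checkerboard is a natural fit for a procedural texture, because the pattern is one line of math, stays crisp at any zoom, and the square count is a tweakable parameter. The core function in JS form (a sketch; a Babylon CustomProceduralTexture or ShaderBuilder version would evaluate the same expression per fragment):

```javascript
// Checkerboard lookup: u, v in [0, 1), n squares per side.
// Returns 0 for one color and 1 for the other; in a fragment
// shader this value would select between the two tile colors.
function checker(u, v, n) {
    return (Math.floor(u * n) + Math.floor(v * n)) % 2;
}

console.log(checker(0, 0, 8));       // first square
console.log(checker(0.13, 0, 8));    // one square to the right: flipped
console.log(checker(0.13, 0.13, 8)); // diagonal neighbor: same as first
```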
  13. Hi Folks, Just out of curiosity, how do we avoid expensive glUseProgram calls internally in Babylon.js? Do we sort meshes by shader so that we can minimize shader changes?
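On the sorting idea in the question above: a common engine strategy (not necessarily Babylon's actual internals) is to give each effect/program an id and sort the opaque render queue by it, so the program only changes on transitions. A toy model that just counts the switches:

```javascript
// Count how many times glUseProgram would be called for a render
// queue, assuming the program changes whenever effectId changes.
function countProgramSwitches(queue) {
    let switches = 0;
    let current = null;
    for (const mesh of queue) {
        if (mesh.effectId !== current) {
            switches++;
            current = mesh.effectId;
        }
    }
    return switches;
}

const queue = [
    { name: 'a', effectId: 1 }, { name: 'b', effectId: 2 },
    { name: 'c', effectId: 1 }, { name: 'd', effectId: 2 }
];
console.log(countProgramSwitches(queue)); // 4 switches unsorted

const sorted = [...queue].sort((m, n) => m.effectId - n.effectId);
console.log(countProgramSwitches(sorted)); // 2 switches after sorting by effect
```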
  14. Hello BJS community! I have just begun to understand HDR textures, gamma correction and so on, in order to learn how to do IBL. In this process, I used BABYLON.HDRCubeTexture to convert my equirectangular HDR texture to a usable environment HDR cubemap, as explained here. Then, I need to apply a convolution to this cubemap to obtain my final irradiance cubemap that I will sample during IBL. To compute the irradiance cubemap from the environment cubemap, I use a RenderTargetTexture. Up to there, everything works fine! In the above tutorial link, the guy uses OpenGL and doesn't worry about having output colors exceeding the [0..1] range. It's useful to keep HDR textures until the last step, where he tone maps his result. I learned the hard way that it's not as simple with WebGL. When I store color outside the [0..1] range and then sample this result in another shader, the result has been clamped to [0..1]. This stackoverflow question taught me that not only do I need to use a floating point texture, but I also have to render to a floating point frame buffer. Or something like that, I never dived into pure WebGL code. To render to a floating point frame buffer, I need to enable the EXT_color_buffer_float extension (only available with WebGL 2), but it doesn't seem to be enough; I think I also need to configure the framebuffer with pure WebGL code. So, my question is: Is it possible yet to render colors outside the [0..1] range using BabylonJS? How? If this is not ready yet, I'll normalize and denormalize data at each step, of course. But I would love to know if doing it the ideal way is possible. Thank you a lot in advance!
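For the "normalize and denormalize at each step" fallback mentioned above, one workable scheme is RGBM encoding (the same trick the PlayCanvas comparison in post 10 relies on): store RGB divided by a shared multiplier, and the multiplier itself in the alpha channel, so the whole HDR value fits in an ordinary [0, 1] RGBA texture. A JS sketch; the range constant of 8 is a common but arbitrary choice, not a BJS API:

```javascript
// RGBM: pack HDR RGB (up to RGBM_RANGE) into [0,1] RGBA.
const RGBM_RANGE = 8;

function rgbmEncode([r, g, b]) {
    // Shared multiplier, quantized the way an 8-bit alpha channel would be.
    let m = Math.max(r, g, b, 1e-6) / RGBM_RANGE;
    m = Math.min(1, Math.ceil(m * 255) / 255);
    return [r / (m * RGBM_RANGE), g / (m * RGBM_RANGE), b / (m * RGBM_RANGE), m];
}

function rgbmDecode([r, g, b, m]) {
    return [r * m * RGBM_RANGE, g * m * RGBM_RANGE, b * m * RGBM_RANGE];
}

// An HDR value well outside [0,1] survives the round trip:
console.log(rgbmDecode(rgbmEncode([4, 2, 0.5])));
```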
  15. Nephelococcygia

    Best way to handle Character with shadow

    I was wondering what is the best way to handle a shadow for a "character". I made a character and I want to put a shadow at the bottom, a kind of "circle" with opacity; for convenience let's say it's going to be a transparent PNG. The ways I can think of:

    - (1) Just add the sprite to the game; this leads to duplicated calculations in the update part, because I have to update both the "character" sprite and its shadow;
    - (2) Add the shadow as a child sprite of the "character"; this leads to having the shadow on top of the sprite instead of behind it;
    - (3) Add the "character" sprite as a child of the shadow sprite; this seems illogical, but it works somehow;
    - (4) Create a group and add both the shadow and the sprite; (I haven't tested it, but most likely) this means defining specific properties on the group for the size, the position and the overflow boundaries for collision, relative to the "character" sprite and the shadow;
    - (5) Use a filter (shader) applied to the whole game, with my character position as a uniform, to render the shadow directly on the "groundLayer".

    What do you think? Is there a better way that I am missing? Thanks.
  16. Hi there - I'm going down a rabbit hole trying to implement a color grading / LUT shader for PIXI. Color grading is where you use a sprite as a lookup table to quickly transform one set of colors to another; this is handy for applying realtime contrast and color adjustments. I'm using this as a reference: https://www.defold.com/tutorials/grading/ I've created a filter/shader using the code in the link above:

```js
var ColorGradingShader = function(LUTSprite) {
    var my = this;
    var code = `
        precision lowp float;
        uniform vec4 filterArea;
        varying vec2 vTextureCoord;
        uniform sampler2D uSampler;
        uniform sampler2D lut;

        #define MAXCOLOR 15.0
        #define COLORS 16.0
        #define WIDTH 256.0
        #define HEIGHT 16.0

        void main() {
            vec4 px = texture2D(uSampler, vTextureCoord.xy);
            float cell = px.b * MAXCOLOR;
            float cell_l = floor(cell);
            float cell_h = ceil(cell);
            float half_px_x = 0.5 / WIDTH;
            float half_px_y = 0.5 / HEIGHT;
            float r_offset = half_px_x + px.r / COLORS * (MAXCOLOR / COLORS);
            float g_offset = half_px_y + px.g * (MAXCOLOR / COLORS);
            vec2 lut_pos_l = vec2(cell_l / COLORS + r_offset, g_offset);
            vec2 lut_pos_h = vec2(cell_h / COLORS + r_offset, g_offset);
            vec4 graded_color_l = texture2D(lut, lut_pos_l);
            vec4 graded_color_h = texture2D(lut, lut_pos_h);
            vec4 graded_color = mix(graded_color_l, graded_color_h, fract(cell));
            gl_FragColor = graded_color;
        }
    `;
    PIXI.Filter.call(my, null, code);
    my.uniforms.lut = LUTSprite.texture;
}

ColorGradingShader.prototype = Object.create(PIXI.Filter.prototype);
ColorGradingShader.prototype.constructor = ColorGradingShader;

export default ColorGradingShader;
```

    I then add this to my top level container:

```js
// relevant code from a wrapping class
this.colorGradingSprite = new PIXI.Sprite.fromImage('/img/lut16.png');
this.pixiContainer.filters = [ this.colorGradingFilter ];
```

    When using any LUT image, including the default without any color adjustments, I go from this: to this: I'm assuming there are some adjustments necessary to either the shader code, or to how the LUT sprite itself is being loaded; I have no clue... Any help would be greatly appreciated! And for those curious, here's my end goal: Thanks, Sean
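When debugging a LUT filter like the one above, it can help to lift the lookup-coordinate math out of GLSL and check it in plain JS: if the computed coordinates are sane, the problem is more likely in how the LUT texture is loaded or filtered than in the math. A port of the shader's cell/offset computation (low cell only, for brevity), for a 16x16x16 LUT packed as a 256x16 strip:

```javascript
// Same constants as the shader's #defines.
const MAXCOLOR = 15, COLORS = 16, WIDTH = 256, HEIGHT = 16;

// Where in the 256x16 LUT strip the color (r, g, b), each in [0, 1],
// should be sampled (low blue cell only; the shader also samples the
// high cell and mixes).
function lutPos(r, g, b) {
    const cell = b * MAXCOLOR;
    const cellL = Math.floor(cell);
    const halfPxX = 0.5 / WIDTH;
    const halfPxY = 0.5 / HEIGHT;
    const rOffset = halfPxX + (r / COLORS) * (MAXCOLOR / COLORS);
    const gOffset = halfPxY + g * (MAXCOLOR / COLORS);
    return [cellL / COLORS + rOffset, gOffset];
}

// Black samples the center of the very first texel:
console.log(lutPos(0, 0, 0));
// White stays inside the last cell, never past the texture edge:
console.log(lutPos(1, 1, 1));
```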
  17. hi, does anyone know why changing the value of "pointSize" doesn't influence the size of the points of a shaderMaterial?

```js
shaderMaterial.pointsCloud = true;
shaderMaterial.pointSize = 10;
```
  18. I want to draw a lot of spheres with different colors and matrices. Since they all share the same geometry, instanced drawing will be an effective way to do it. Babylon provides a built-in instanced mesh with different matrices, but other instance attributes like 'color' seem not to be supported. So I tried to write a custom material, but the documents about custom materials/shaders don't seem to cover instanced drawing. Can anyone give me an example or any tips on writing a custom instanced material? Thanks.
  19. I took a quick look at the source code and it seems that we have no way to update only one element of a uniform array? Just like below...

```js
// in JavaScript at init time
var someVec2Element0Loc = gl.getUniformLocation(someProgram, "u_someVec2[0]");
var someVec2Element1Loc = gl.getUniformLocation(someProgram, "u_someVec2[1]");
var someVec2Element2Loc = gl.getUniformLocation(someProgram, "u_someVec2[2]");

// at render time
gl.uniform2fv(someVec2Element0Loc, [1, 2]); // set element 0
gl.uniform2fv(someVec2Element1Loc, [3, 4]); // set element 1
gl.uniform2fv(someVec2Element2Loc, [5, 6]); // set element 2
```

    Well, I need to hack it like this...

```js
var locs = engine.getUniforms(material.getEffect()._program, ['test1[0]']);
engine.setFloat3(locs[0], 0.0, 1.0, 0.0);
```
  20. Hi Folks, I have a question about the shaders used in Babylon. I saw some special stuff like the line below. I searched for __decl__lightFragment but I could not find it, and this line doesn't seem to be standard GLSL grammar. Could someone explain how it works?

    #include<__decl__lightFragment>[0..maxSimultaneousLights]

    The background is that I am using the CustomMaterial made by NasamiAsl, and I would like to add some custom uniforms like the lights, so the stuff above interests me. Thanks
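As far as I can tell, the `[0..maxSimultaneousLights]` suffix in the question above is handled by Babylon's effect preprocessor rather than by GLSL itself: the include body is emitted once per index, with an index placeholder substituted each time. A toy model of that expansion (a sketch of the idea, not Babylon's real implementation; `{X}` as the placeholder syntax follows the shader include files):

```javascript
// Expand an include body `count` times, replacing the {X} placeholder
// with the repetition index, roughly what a repeated-include directive
// turns into before the GLSL compiler ever sees it.
function expandRepeatedInclude(body, count) {
    const parts = [];
    for (let i = 0; i < count; i++) {
        parts.push(body.replace(/\{X\}/g, String(i)));
    }
    return parts.join('\n');
}

console.log(expandRepeatedInclude('uniform vec4 vLightData{X};', 2));
// uniform vec4 vLightData0;
// uniform vec4 vLightData1;
```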
  21. Hi all, I found something weird while optimizing my code: I was using InstancedMesh as much as possible, and switched all my materials to ShaderMaterial. And then boom, everything disappeared. After a bit of research, it seems instanced meshes will simply not be drawn when their source has a ShaderMaterial assigned. Playground repro: http://www.babylonjs-playground.com/#TWDEA#6 Uncomment lines 40 to 44 to see for yourself. It's actually a bit more complex than that: when you assign a shader material to it, even the source mesh disappears if it has been instantiated (?), but only if its instances are in the field of view. I'm currently looking at BJS code and the ANGLE_instanced_arrays doc (the extension used for drawing instanced meshes), but I thought I'd come here to fish for ideas... FYI these are two errors I noticed in the log when this problem showed up (the errors are not always there):

    drawElementsInstancedANGLE: at least one enabled attribute must have a divisor of 0
    glDrawElementsInstancedANGLE: attempt to draw with all attributes having non-zero divisors

    Thanks
  22. Hi, I’m new to Babylon.js, and I am trying to use it to create geometry that has animated vertices and an animated procedural texture. I’m animating the vertices in the vertex shader. For the procedural texture, I tried to follow the instructions: https://doc.babylonjs.com/how_to/how_to_use_procedural_textures as well as checking the playground example: https://www.babylonjs-playground.com/#24C4KC#17 The problem with the example is that I can’t really find a complete implementation with the shader/config.json files. And I have a couple of basic questions as well. When creating a custom procedural texture with an external folder containing the config.json and custom.fragment.fx files, is that the only fragment shader that can be used in the scene? Or can a BABYLON.ShaderMaterial be used in addition? I'm having a hard time grasping the concept of a 'fragment shader' procedural texture vs. a fragment shader as the last step of the WebGL pipeline. Thanks.
  23. Is there a way to pass a video texture to a shader in a way similar to the one used to pass an image texture?

```js
shaderMaterial.setTexture("textureSampler", new BABYLON.Texture(imgTexture, scene));
```

    I'm wondering if there are more ways to use video textures in Babylon. I have seen that I can set a BABYLON.VideoTexture as the diffuseTexture of a material, but that seems limiting. What if I want to manipulate an object whose material has a video texture, in the vertex + fragment shaders?
  24. brianzinn

    optic illusions

    I'm messing around with shaders and made an optical illusion sort of by accident: https://www.babylonjs-playground.com/#WX2PRW#4 I'm a fan of M.C. Escher, so maybe one day I will try to build one of his scenes that depend on perspective... Does anybody else have one to share?
  25. I'm experimenting with GLSL shaders in Pixi 4.4, and was trying to make some that would take in two images, the base and an overlay. The shaders would then replace either the Hue, Saturation, or Value of the pixels in the base image with the H/S/V of the corresponding pixel in the overlay image. Transparency on the overlay means "no change." For my tests, I used a 100x100 red square, and the following 100x100 "striped" overlays: That's the Hue overlay, Saturation overlay, and Value overlay respectively. Results were only partially consistent, and all wrong. Here are the results of the Hue and Saturation shaders against a black background. Okay, so the stripes are twice as tall as they should be, and the bottom-most stripe is outright missing, as though either the overlay or the base were resized at some point in the process. But Value takes the cake: not only do we have the same problem as above, but there are "teeth" that have appeared off the edge of the 100x100 image (4px in each direction, making a 108x100 image), there are outlines around every stripe, and if you zoom in especially close you can see that some of the outlines are actually 2 pixels tall, one of near-black and one of another dark colour, none of which is in the original Value overlay! I'm at a loss to tell whether the problem(s) originate in my shader code or in Pixi, especially since tutorials around the net are mum about how to create a second sampler2D uniform in any other setup but Pixi. I do want to work with Pixi for this project, however, so a fix for Pixi would be appreciated if the problem really is from there. Here's the HTML/GLSL code. Please don't mind the if statement, I've already had a few ideas on how to get rid of it:

```html
<html>
<head>
    <meta content="text/html;charset=utf-8" http-equiv="Content-Type">
    <meta content="utf-8" http-equiv="encoding">
    <style>
        body { background-color: black; margin: 0; overflow: hidden; }
        p { color: white; }
    </style>
</head>
<body>
    <script type="text/javascript" src="libs/pixi.js"></script>
    <script id="shader" type="shader">
        #ifdef GL_ES
        precision mediump float;
        #endif
        varying vec2 vTextureCoord;
        uniform sampler2D uSampler; //The base image
        uniform sampler2D overlay;  //The overlay

        vec3 rgb2hsv(vec3 c) {
            vec4 K = vec4(0.0, -1.0 / 3.0, 2.0 / 3.0, -1.0);
            vec4 p = mix(vec4(c.bg, K.wz), vec4(c.gb, K.xy), step(c.b, c.g));
            vec4 q = mix(vec4(p.xyw, c.r), vec4(c.r, p.yzx), step(p.x, c.r));
            float d = q.x - min(q.w, q.y);
            float e = 1.0e-10;
            return vec3(abs(q.z + (q.w - q.y) / (6.0 * d + e)), d / (q.x + e), q.x);
        }

        vec3 hsv2rgb(vec3 c) {
            vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
            vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
            return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
        }

        void main(void) {
            vec4 baseRGB = texture2D(uSampler, vTextureCoord);
            vec4 overlayRGB = texture2D(overlay, vTextureCoord);
            if (overlayRGB.a > 0.0) {
                vec3 baseHSV = rgb2hsv(baseRGB.rgb);
                vec3 overlayHSV = rgb2hsv(overlayRGB.rgb);
                // Hue
                // vec3 resultHSV = vec3(overlayHSV.x, baseHSV.y, baseHSV.z);
                // Saturation
                // vec3 resultHSV = vec3(baseHSV.x, overlayHSV.y, baseHSV.z);
                // Value
                vec3 resultHSV = vec3(baseHSV.x, baseHSV.y, overlayHSV.z);
                vec3 resultRGB = hsv2rgb(resultHSV);
                gl_FragColor = vec4(resultRGB.rgb, baseRGB.a);
            } else {
                gl_FragColor = baseRGB;
            }
        }
    </script>
    <script type="text/javascript" src="replaceTest.js"></script>
</body>
</html>
```

    And here's the JS:

```js
var width = window.innerWidth;
var height = window.innerHeight;
var renderer = new PIXI.WebGLRenderer(width, height);
document.body.appendChild(renderer.view);

var stage = new PIXI.Container();

var sprite = PIXI.Sprite.fromImage('flat.png');
sprite.x = width / 2; // Set it at the center of the screen
sprite.y = height / 2;
sprite.anchor.set(0.5); // Center point of the image at its center, instead of the default top left
stage.addChild(sprite);

// Create a uniforms object to send to the shader
var uniforms = {};
uniforms.overlay = {
    type: 'sampler2D',
    value: PIXI.Texture.fromImage('stripesVal.png') // or stripesSat, stripesHue, etc.
};

// Get shader code as a string
var shaderCode = document.getElementById("shader").innerHTML;

// Create our Pixi filter using our custom shader code
var rasShader = new PIXI.Filter(null, shaderCode, uniforms);
console.log(rasShader.uniforms);
sprite.filters = [rasShader];

function update() {
    requestAnimationFrame(update);
    renderer.render(stage);
}
update();
```

    Any help would be appreciated!
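When an HSV filter like the one above misbehaves, it helps to rule out the color math first by porting it to JS and testing it directly; if these round-trip cleanly, the artifacts are coming from texture sizing/filtering rather than the conversion. Equivalent conversions in plain JS (a standard formulation, not the shader's exact branchless code):

```javascript
// RGB in [0,1] -> [hue, saturation, value], hue normalized to [0,1).
function rgb2hsv(r, g, b) {
    const max = Math.max(r, g, b), min = Math.min(r, g, b);
    const d = max - min;
    let h = 0;
    if (d > 0) {
        if (max === r) h = ((g - b) / d) % 6;
        else if (max === g) h = (b - r) / d + 2;
        else h = (r - g) / d + 4;
        h /= 6;
        if (h < 0) h += 1;
    }
    return [h, max === 0 ? 0 : d / max, max];
}

// [hue, saturation, value] -> RGB in [0,1].
function hsv2rgb(h, s, v) {
    const f = (n) => {
        const k = (n + h * 6) % 6;
        return v - v * s * Math.max(0, Math.min(k, 4 - k, 1));
    };
    return [f(5), f(3), f(1)];
}

// The "replace Value" variant from the shader: keep base hue/sat,
// take value from the overlay.
console.log(hsv2rgb(0, 1, 0.5)); // pure red at half value
```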