Search the Community

Showing results for tags 'shaders'.

Found 47 results

  1. Hi all, I've been messing with something for a couple of days. I'm passing a transparent blurred PNG into a shader, as well as a solid PNG background, and running it through a grayscale shader. If I simply display the pixels of the blurred PNG, it is output on top of the solid background as expected. The grayscale part of the shader works fine. However, the grayscale portion of the photo has a harsh transition to the background. It doesn't fade nicely and follow the alpha of the blurred PNG, like a blendMode would. I realize they are different things, but I feel like I am missing something obvious, and that it should work as I'm expecting. I have a playground set up to demonstrate the issue. Ignore the ugliness of the assets; it gets the point across better. In the photo attached, the desired result is on the left (from Photoshop's Color blend mode). The right is the result from the playground. You can tell that the grayscale area on the right is much larger, since I believe any alpha that is NOT 0 is being set to 1. I would like to try and maintain the alpha from the original blurred PNG. It may not seem like much, but it really kills what I'm going for with an aliased edge like that. Thank you!
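A minimal fragment-shader sketch of one way to keep the source alpha; the uniform and varying names are Pixi-style conventions, not taken from the original playground:

```glsl
precision mediump float;

varying vec2 vTextureCoord;     // assumed varying name (Pixi convention)
uniform sampler2D uSampler;     // the blurred, transparent PNG

void main(void) {
    vec4 src = texture2D(uSampler, vTextureCoord);
    // Luminance-weighted grayscale of the RGB channels.
    float gray = dot(src.rgb, vec3(0.299, 0.587, 0.114));
    // Keep the original alpha; with premultiplied alpha (Pixi's default),
    // the RGB channels must be scaled by alpha as well, or every non-zero
    // alpha texel renders as if fully opaque.
    gl_FragColor = vec4(vec3(gray) * src.a, src.a);
}
```

If the edge still looks hard, check whether the input texture is premultiplied; outputting unscaled RGB with a premultiplied blend state produces exactly the "any alpha above 0 becomes 1" look described above.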
  2. I have a `PIXI.Container()` I apply a shader/filter to using `container.filters = [filter]`. The documentation says that to remove a filter, you just set `container.filters = null`. This works, but it's kind of a hard cut when the image/sprite inside the container is still visible, hence my question: can I remove a filter with some kind of fade/transition?
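A minimal sketch of one approach: animate a uniform that scales the filter's effect down to zero, then do the hard removal once it is invisible. The uniform name "strength" is an assumption; your filter's shader needs an equivalent knob.

```javascript
// Linear interpolation helper (pure, framework-independent).
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Sketch: fade a custom uniform toward 0 over `duration` ms, then drop the
// filter. Assumes the filter's shader exposes a float uniform (here called
// "strength") that scales its effect -- that uniform name is hypothetical.
function fadeOutFilter(app, container, filter, duration) {
  const start = Date.now();
  const initial = filter.uniforms.strength;
  const tick = () => {
    const t = Math.min((Date.now() - start) / duration, 1);
    filter.uniforms.strength = lerp(initial, 0, t);
    if (t >= 1) {
      container.filters = null;   // hard removal is invisible at strength 0
      app.ticker.remove(tick);
    }
  };
  app.ticker.add(tick);
}
```

The same pattern works with any tweening library in place of the hand-rolled ticker callback.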
  3. I'm picking up an older project again, and moving it to the current version of BJS. This is a pain, though, because I'd been extending the StandardMaterial shader within a copy of the BJS framework itself. The project made extensive use of noise and shape functions to simulate textures on the GPU by altering the diffuse, specular and normal values of the StandardMaterial, based on a flag hacked into the framework. The advantage of this method vs. ShaderMaterial was getting unlimited-resolution textures at no bandwidth cost without having to reimplement all the goodies in the StandardMaterial--SSAO2, fog, shadows, etc. The disadvantage: lack of portability, and having to find a way to re-minify everything myself before deployment. (My kingdom for uglify.js to support the `` multiline literal...) Before I start migrating my hacks, I wanted to ask this of the smart people around the water cooler: can anyone suggest a more elegant way to do this, without modifying BJS itself?
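One sketch of a fork-free route: patch the built-in shader source at runtime via `BABYLON.Effect.ShadersStore`, which is a plain object keyed by `"<name>VertexShader"` / `"<name>PixelShader"` (the StandardMaterial uses the name "default"). The marker string and injected GLSL below are assumptions; you would pick a stable anchor in the actual shader source of your BJS version.

```javascript
// Sketch: string-patch a stored shader instead of editing the framework copy.
// Throws if the anchor is missing, so a BJS upgrade that moves the code
// fails loudly instead of silently skipping the injection.
function patchShader(store, key, marker, injectedCode) {
  const src = store[key];
  if (!src || src.indexOf(marker) === -1) {
    throw new Error('marker not found in ' + key);
  }
  // Insert the custom GLSL right after the anchor line.
  return src.replace(marker, marker + '\n' + injectedCode);
}

// Usage sketch (not executed here; marker and GLSL are hypothetical):
// BABYLON.Effect.ShadersStore.defaultPixelShader = patchShader(
//   BABYLON.Effect.ShadersStore, 'defaultPixelShader',
//   'void main(void) {', 'baseColor.rgb *= myProceduralNoise(vPositionW);');
```

Because the patch happens before any effect is compiled, the StandardMaterial's own features (fog, shadows, SSAO2) keep working, which was the point of hacking the framework in the first place.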
  4. So, with a few people making some cool shaders now and the proposed improvements to the CYOS, I figured we should have a thread for shader development to showcase what people are making and talk about different methods and concepts. To kick things off, I figured I'd post a procedural skymap... this is a cleaned-up version of the first one I posted last night and is based on a standard box element. I have not tested it in a scene yet, but the CYOS output is promising. I'll be looking to add volumetric weather here soon and will be making the sun's position dependent on a light in the scene. Anyways, feel free to comment; it is pretty much a direct port of an atmospheric GLSL process I found on GitHub. Does anyone have any good resources for volumetric cloud rendering with a light source? I'm reading up on this first.
  5. Hi, I'm trying to implement the pixi-v4 filters in Phaser 2 (CE version). I'm especially interested in this one: I've noticed this example ( in Phaser 2 is outdated/not working (the link to the pixi filter is wrong). Even when I fix the link (e.g. the filter gives an error because pixi is not included. So I'm wondering what the right method is to do it. I guess the main questions here are: are pixi-v4 filters compatible with Phaser v2's method of importing pixi filters? If yes, where are the new filters (interested in the zoom-blur one: If not, any pointers on how to port them? Thanks!
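A sketch of the usual Phaser 2 route: rather than loading a pixi-v4 filter class (built against a different Pixi core), take its fragment shader source and wrap it in a `Phaser.Filter`. The extra uniform here is a placeholder; you would copy the real uniform names from the pixi-filters source being ported.

```javascript
// Build a Phaser-2-compatible fragment shader source. Phaser.Filter already
// supplies `uSampler`, `vTextureCoord`, `time` and `resolution` uniforms.
function makeFragmentSrc(extraUniforms) {
  return [
    'precision mediump float;',
    'varying vec2 vTextureCoord;',
    'uniform sampler2D uSampler;',
    extraUniforms,                    // e.g. 'uniform float strength;'
    'void main(void) {',
    '  gl_FragColor = texture2D(uSampler, vTextureCoord);',
    '}'
  ].join('\n');
}

// Usage sketch (not executed here; uniform name is hypothetical):
// var filter = new Phaser.Filter(game,
//   { strength: { type: '1f', value: 0.5 } },
//   makeFragmentSrc('uniform float strength;'));
// sprite.filters = [filter];
```

The port then becomes a matter of pasting the filter's `main()` body in place of the pass-through line and translating its uniform declarations.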
  6. Hi, I would like to implement an effect that I found on CodePen; this is the link to the effect: codepen. My question is: is there a way to implement it in pixi.js? I'm new to this framework; I tried to search for something on Google but found nothing. Thanks in advance.
  7. Hi guys, I'm new here on the forum, as you can see from my first post. I'm curious to know: is there any good, sourced tutorial for v4 shader implementation? I have looked, but only find old tutorials, and from what I see it has changed in v4. I've got down all the other stuff for PixiJS and I love it, but the shader implementation for v4 has me a bit lost, lol. What would be great is a demo of the v2 or v3 shader that came packaged, working in v4. Or maybe a version. I'm looking to implement it using the JavaScript tag, encapsulated. Maybe I've overlooked info somewhere?
  8. Hello, is there a guide or some documentation on how to convert shaders from Three.js or Shadertoy? I see that each one uses different uniforms and it's a bit confusing. For example, I wanted to convert this shader from Shadertoy: I added the code to CYOS in the fragment shader box but got many errors, mostly due to the lack of the uniforms that Shadertoy provides by default. It would be nice if a list of equivalents existed; that would make the conversion easier from one tool to another, since all of them use GLSL. And this is the Three.js example that I tried to convert and also failed: Thanks in advance. I know this is a complicated issue, and not so difficult for people with abundant shader knowledge, but we all have to start somewhere.
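A sketch of the common Shadertoy-to-Babylon (CYOS) substitutions. The Babylon names on the right are uniforms you declare and feed yourself each frame (e.g. via `ShaderMaterial.setFloat` / `setVector2`); they are conventions, not built-ins:

```glsl
//   Shadertoy                      Babylon fragment shader
//   ---------                      -----------------------
//   void mainImage(out vec4 fragColor, in vec2 fragCoord)
//                               -> void main(void) { ... gl_FragColor = ...; }
//   fragCoord                   -> gl_FragCoord.xy (or vUV * resolution)
//   iResolution.xy              -> uniform vec2 resolution;
//   iTime / iGlobalTime         -> uniform float time;
//   iMouse                      -> uniform vec2 mouse;
//   iChannel0..3                -> uniform sampler2D textureSampler; (etc.)

precision highp float;
uniform vec2 resolution;
uniform float time;

void main(void) {
    // Shadertoy's fragCoord / iResolution.xy in Babylon terms:
    vec2 uv = gl_FragCoord.xy / resolution;
    gl_FragColor = vec4(uv, 0.5 + 0.5 * sin(time), 1.0);
}
```

Most "not declared" errors from a pasted Shadertoy shader disappear once `mainImage` is rewrapped as `main` and each `i*` uniform it touches gets an explicit declaration plus a per-frame upload from JavaScript.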
  9. Hi, this is my first time using shaders and the new filters in pixi v4, so please bear with me. First of all, I'm a bit confused by the term (custom) filter... is that the same as the use of a shader? I assume it is. Second, I tried using this example. I got it to work in pixi 3.x, but somehow I can't figure out what I'm doing wrong in v4: no error messages, just a black canvas. My goal is to create custom button hovers with PIXI, using shaders, etc. I tried 2 different ways but... alas, same black canvas. HTML: <a href="#" class="btn btn-one"> <canvas></canvas> <span class="title">button effect one</span> </a> shader.frag: precision mediump float; uniform vec2 mouse; uniform vec2 resolution; uniform float time; void main() { gl_FragColor = vec4( sin(time), mouse.x/resolution.x, mouse.y/resolution.y, 1.0); } JS: var btnOne = document.querySelector('.btn-one'); var width = btnOne.clientWidth; var height = btnOne.clientHeight; var app = new PIXI.Application({ width: width, height: height, view: btnOne.querySelector('canvas') }); btnOne.append(app.view); // create rect to fill the button and apply shaders to const rect = new PIXI.Graphics() .beginFill(0x00ff00) .drawRect(0, 0, width, height); app.stage.addChild(rect); // Stop application, wait for load to finish app.stop(); PIXI.loader.add('shader', 'shader.frag') .load(onLoaded); var simpleShader; var uniforms = {}; uniforms.mouse = { type: 'v2', value: { x:0, y:0 } } uniforms.time = { type: 'f', value: 0 } uniforms.resolution = { type: 'v2', value:{ x: width, y: height} } function onLoaded (loader, res) { // Create the new filter, arguments: (vertexShader, fragmentShader, uniforms) simpleShader = new PIXI.Filter(null, res.shader.data, uniforms); rect.filters = [simpleShader]; app.start(); // bind mouse to shader effects btnOne.onmousemove = function(evt) { // Get the mouse position mousePos = { x: evt.clientX, y: evt.clientY } // Assigning a new value lets Pixi know to update the uniform in the shader; but doing something like uniforms.mouse.x = 0 won't update in this current version of Pixi simpleShader.uniforms.mouse.value = mousePos; } // apply the filter rect.filters = [simpleShader]; // Animate the filter app.ticker.add(function(delta) { simpleShader.uniforms.time.value += 0.1 }); } I read the post about creating filters in v4, but to a beginner it's just confusing, as I don't understand a lot of the terminology used. Can anyone (hint how to) fix this so I can continue my explorations programming shaders?
  10. Hey all, can I use any GLSL fragment or vertex shader (including 3D raymarching stuff) as a texture in Babylon.js, including animated ones? I've done some Google searches and I know you can use some, but what are the limitations? For example, could I put any animated texture from GLSL Sandbox onto a Babylon.js plane mesh? Do I need to update the uniform variables in the render loop for animation to work? Super hoping the answer is yes, but any and all info will be helpful!
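A minimal sketch of the usual setup, assuming a shader registered in `BABYLON.Effect.ShadersStore` under the name "sandbox" that declares `uniform float time;` and `uniform vec2 resolution;` (those names are conventions you choose, not built-ins):

```javascript
// Build a ShaderMaterial that a GLSL-Sandbox-style shader can drive.
function makeAnimatedMaterial(scene, engine) {
  var mat = new BABYLON.ShaderMaterial('sandbox', scene,
    { vertex: 'sandbox', fragment: 'sandbox' },
    { attributes: ['position', 'uv'],
      uniforms: ['worldViewProjection', 'time', 'resolution'] });
  mat.setVector2('resolution',
    new BABYLON.Vector2(engine.getRenderWidth(), engine.getRenderHeight()));
  return mat;
}

// The time uniform does need updating every frame -- typically in the render
// loop or a before-render callback -- or the shader stays frozen:
function startClock(scene, mat) {
  var t = 0;
  scene.registerBeforeRender(function () {
    t += 0.016;               // rough per-frame step; could use engine delta
    mat.setFloat('time', t);
  });
}
```

Assign the material to a plane mesh (`plane.material = makeAnimatedMaterial(scene, engine)`) and start the clock; the main limitations are WebGL GLSL ES 1.0 features and fill-rate cost for heavy raymarchers.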
  11. I am new to Babylon and I have some knowledge of Unity shaders, but I am not sure how I can create shaders for Babylon. Can anyone suggest a tool or a link which can help me to design or create some beautiful shaders for Babylon, to use as a BabylonJS object's material? As we have Shader Forge available for Unity, can anyone suggest a comparable WebGL shader creation tool? That would be a great help; I am getting frustrated and need extreme help from the Babylon.js community.
  12. Hi! I started a blog with tutorials about PlayCanvas and WebGL development. This is one of the first tutorials, on how to use shaders to mask and animate textures in real time. Let me know if you find this useful! You can follow these tutorials on Twitter as well:
  13. Good evening, I was watching the tutorials and I was curious what can be done with shaders. I managed to do some things; however, now I have a lot of questions. 1. What is the difference between using BABYLON.Effect.ShadersStore, BABYLON.PostProcess, and BABYLON.ShaderMaterial? 2. How can I apply 2 effects to the same object? For example, I am trying to make a 360-degree view with the wave effect. Here is my test code. Basic HTML: <!DOCTYPE html> <html> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>360 + wave</title> <script src=""></script> <script src=""></script> <script src=""></script> <script src=""></script> <script src="main.js"></script> <style> html, body { width: 100%; height: 100%; padding: 0; margin: 0; overflow: hidden; } #renderCanvas { width: 100%; height: 100%; touch-action: none; -ms-touch-action: none; } </style> </head> <body> <canvas id="renderCanvas"></canvas> </body> </html> Main JS: "use strict"; function create360(sphere, scene) { var sphereMaterial = new BABYLON.StandardMaterial("world", scene); sphereMaterial.emissiveTexture = new BABYLON.Texture("world.jpg", scene); sphereMaterial.emissiveTexture.uScale = -1.0; sphereMaterial.emissiveTexture.vScale = -1.0; sphereMaterial.emissiveTexture.hasAlpha = false; sphereMaterial.backFaceCulling = false; sphere.material = sphereMaterial; sphere.scaling.x = 1000; sphere.scaling.y = 1000; sphere.scaling.z = 1000; } function startGame() { if (BABYLON.Engine.isSupported()) { var canvas = document.getElementById("renderCanvas"); var engine = new BABYLON.Engine(canvas, false); var scene = new BABYLON.Scene(engine); scene.collisionsEnabled = true; var camera = new BABYLON.ArcRotateCamera("camera1", Math.PI/2, Math.PI/2, 90, new BABYLON.Vector3(0, 0, 0), scene); camera.checkCollisions = true; camera.attachControl(canvas); var sphere = BABYLON.Mesh.CreateSphere("Sphere", 16, 10, scene); sphere.checkCollisions = true;
create360(sphere, scene); BABYLON.Effect.ShadersStore["customVertexShader"]= "precision highp float;\r\n"+ "// Attributes\r\n"+ "attribute vec3 position;\r\n"+ "attribute vec3 normal;\r\n"+ "attribute vec2 uv;\r\n"+ "// Uniforms\r\n"+ "uniform mat4 worldViewProjection;\r\n"+ "uniform float time;\r\n"+ "// Varying\r\n"+ "varying vec3 vPosition;\r\n"+ "varying vec3 vNormal;\r\n"+ "varying vec2 vUV;\r\n"+ "void main(void) {\r\n"+ " vec3 v = position;\r\n"+ " v.x += sin(2.0 * position.y + (time)) * 0.5;\r\n"+ " \r\n"+ " gl_Position = worldViewProjection * vec4(v, 1.0);\r\n"+ " \r\n"+ " vPosition = position;\r\n"+ " vNormal = normal;\r\n"+ " vUV = uv;\r\n"+ "}\r\n"; BABYLON.Effect.ShadersStore["customFragmentShader"]= "precision highp float;\r\n"+ "// Varying\r\n"+ "varying vec3 vPosition;\r\n"+ "varying vec3 vNormal;\r\n"+ "varying vec2 vUV;\r\n"+ "// Uniforms\r\n"+ "uniform mat4 world;\r\n"+ "// Refs\r\n"+ "uniform vec3 cameraPosition;\r\n"+ "uniform sampler2D textureSampler;\r\n"+ "void main(void) {\r\n"+ " vec3 vLightPosition = vec3(0,20,10);\r\n"+ " \r\n"+ " // World values\r\n"+ " vec3 vPositionW = vec3(world * vec4(vPosition, 1.0));\r\n"+ " vec3 vNormalW = normalize(vec3(world * vec4(vNormal, 0.0)));\r\n"+ " vec3 viewDirectionW = normalize(cameraPosition - vPositionW);\r\n"+ " \r\n"+ " // Light\r\n"+ " vec3 lightVectorW = normalize(vLightPosition - vPositionW);\r\n"+ " vec3 color = texture2D(textureSampler, vUV).rgb;\r\n"+ " \r\n"+ " // diffuse\r\n"+ " float ndl = max(0., dot(vNormalW, lightVectorW));\r\n"+ " \r\n"+ " // Specular\r\n"+ " vec3 angleW = normalize(viewDirectionW + lightVectorW);\r\n"+ " float specComp = max(0., dot(vNormalW, angleW));\r\n"+ " specComp = pow(specComp, max(1., 64.)) * 2.;\r\n"+ " \r\n"+ " gl_FragColor = vec4(color * ndl + vec3(specComp), 1.);\r\n"+ "}\r\n"; // Compile var shaderMaterial = new BABYLON.ShaderMaterial("shader", scene, { vertex: "custom", fragment: "custom", }, { attributes: ["position", "normal", "uv"], uniforms: 
["world", "worldView", "worldViewProjection", "view", "projection"] }); var refTexture = new BABYLON.Texture("world.jpg", scene); refTexture.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE; refTexture.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE; var mainTexture = new BABYLON.Texture("world.jpg", scene); shaderMaterial.setTexture("textureSampler", mainTexture); shaderMaterial.setTexture("refSampler", refTexture); shaderMaterial.setFloat("time", 0); shaderMaterial.setVector3("cameraPosition", BABYLON.Vector3.Zero()); shaderMaterial.backFaceCulling = false; sphere.material = shaderMaterial; var time = 0; engine.runRenderLoop(function () { var shaderMaterial = scene.getMaterialByName("shader"); shaderMaterial.setFloat("time", time); time += 0.02; shaderMaterial.setVector3("cameraPosition", scene.activeCamera.position); scene.render(); }); } }; document.addEventListener("DOMContentLoaded", startGame, false); I know that in doing this sphere.material = shaderMaterial; I'm over writing the field 360 so try to make the sphere first, render and then apply wave effect, but nothing, some suggestion or example of how to do this? what I really want to do for practice is skydome with the fisheye effect in the camera, that is, everywhere I look goes with that effect in 360 degrees Finally I'm new to this so I'm sorry if the question seems silly.
  14. Hi guys, after some previous attempts to solve some problems in our project, we saw that we were not getting anywhere, so we decided that we needed professional help. This is the offer we published on the Upwork platform, so if anyone is seriously interested in completing the job, please contact us. This is the text of the offer: _______________________________________________________________________________________________________________________________ Representation of a snow-covered mountain scene in the Babylon.js engine. The task consists of representing a snow-covered mountain landscape with forest and water bodies in Babylon.js. The focus has to be on visual attractiveness while still offering high performance. An example of a scene like that would be this one (Scene1), made in Three.js. We are providing a basic scene (Scene2) made in Babylon.js, and the following improvements have to be made to this scene: 1) Implementing a GLSL shader / Babylon.js custom material which meets the following criteria: Terrain texture: - Rocks: should be a procedural texture with a nice natural transition between the snow and the rock texture, like in Scene1. - Water: should be the WaterMaterial provided by the Babylon.js materials library. - Snow: it is already acceptable in Scene2, but any visual improvement is welcome. If needed, the mask image for the texture mapping can also be provided as three separate alpha mask images, one for each type of texture. Furthermore, it has to support a dynamic texture on top of the terrain texture, so that it is possible to draw directly on the ground. An example of this would be this scene (Scene3), implemented with ShaderBuilder: The problem with this scene is that it lacks bump maps, fog and shadow effects. There should be visual consistency across the most popular browsers: Chrome, Firefox, Edge and Safari.
_______________________________________________________________________________________________________________________________ If you have any questions, we're ready to answer them. Thanks!
  15. So, I'm trying to convert a shader from Shadertoy. I'm close, but still can't get it working. Also, in my actual scene it doesn't seem to be working at all, but it's hard to tell if that's related to the issue I'm having with the conversion, since I need to rotate the sphere to get it to show up to begin with. The shader is here: (it appears blank at first, but if you rotate it you'll start to see the fire. The actual effect I am going for you will see only if you rotate it just right, so that you see the fire starting with the white in the middle and filling up the sphere). The source shader is here: So the one place where I was not sure how to proceed was mapping over the iResolution variable (which Shadertoy states is the viewport resolution). I played around with a bunch of different things and ended up trying the camera input, which works, but requires rotating the mesh to see it at all. Anyone know what input would map over to viewport resolution (or how to get it), and/or what I am doing wrong/missing here?
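Shadertoy's iResolution is just the render-target size in pixels, and in Babylon.js that can be read from the engine. A minimal sketch, assuming the fragment shader declares `uniform vec2 resolution;` (the name is a convention you choose):

```javascript
// Upload the current render size as the shader's "resolution" uniform.
function updateResolution(shaderMaterial, engine) {
  shaderMaterial.setVector2('resolution',
    new BABYLON.Vector2(engine.getRenderWidth(), engine.getRenderHeight()));
}

// Typically called in the render loop, before scene.render(), so the value
// tracks canvas resizes:
// engine.runRenderLoop(function () {
//   updateResolution(material, engine);
//   scene.render();
// });
```

Feeding the camera position in place of iResolution explains the rotate-to-see-it behavior: the effect then depends on view direction rather than on screen size.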
  16. I have one shader which is working fine with the Unity engine; now I want the same in a BabylonJS engine application, so can anyone suggest a tool which will help me to resolve this? I am new to shaders and their implementation. I cannot write shaders as readily as they are available in Unity, so I highly require a tool which can convert that shader into JavaScript/WebGL code.
  17. I recently started experimenting with filters (shaders), and while it runs well when using a single filter on the world, applying a filter to my sprites seems to be killing performance. Even a simple filter that does nothing but output the color set, on just 50 sprites, is dropping my framerate from 50 to 18, even when the sprites are offscreen. Is that to be expected? It makes filters virtually unusable unless used very sparingly.
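The drop is largely expected: each filtered object costs an extra render-texture pass, so 50 filtered sprites means 50 passes per frame even offscreen. A sketch of two common mitigations, assuming Pixi v4-style filters:

```javascript
// One shared filter on a parent container: one render-texture pass for all
// children instead of one per sprite.
function applySharedFilter(container, filter) {
  container.filters = [filter];
}

// Setting filterArea skips Pixi's per-frame bounds measurement for the
// filtered object, which also helps when the filtered region is known
// and fixed (width/height in pixels).
function fixFilterArea(displayObject, w, h) {
  displayObject.filterArea = new PIXI.Rectangle(0, 0, w, h);
}
```

When per-sprite variation is still needed, a single filter driven by uniforms (or plain tinting/blend modes) is usually far cheaper than 50 independent filter instances.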
  18. Dear Babylon.js community, we as a company have decided that we want to use Babylon.js for a larger project. For this we have specific requirements for what the shaders have to be able to do. I will first state the problem I am trying to solve and then give the context, for possible alternative solutions. PROBLEMS: For our more complex shader computations we want to integrate shadows from at least one shadow generator in a custom shader. For reasons of confidentiality I cannot submit our current project code, which is why I created this test playground: We want to get the influence of all shadows on a fragment as a float value in the shader for further computations. For this we encountered the following problems: - Mapping to shadow-map coordinates seems to be wrong - Using functions like computeShadow() from #include<shadowsFragmentFunctions> yields a not-declared error - computeShadow() always yields 1.0 as a result COURSE OF EVENTS: We started playing around with the standard material and shadow generators and quickly got them to work. We wrote a small utility function for setting up the shadow generators, which you can find at the top of the linked playground code. After this we played around with uploading textures into our custom shaders and were able to create the desired effects. We looked into uploading the shadow map and the shadow-generator parameters into the shader, which was successful. You can find the uploads at lines 113-115 of the linked playground code. Since we do not want to write the mapping to shadow-map coordinates ourselves, we looked at whether there is already existing code, which we found in the shadowsVertex.fx, shadowsFragment.fx and shadowsFragmentFunctions.fx files. While trying to get the mapping right, we encountered the aforementioned problems. We were not able to get correct results regarding the shadow UV coordinates; shader includes like the above-mentioned #include<shadowsFragmentFunctions> yield a "computeShadow() has not been declared" error when used in the code after the statement, and the code we currently copied from these files seems to always yield 1.0 as a result for the shadow intensity. We are turning to you now because we are at a point where we cannot find the errors in our approach/code anymore. We are required to use Babylon.js version 2.5 in our project; although it didn't seem to make a difference for the shader code we looked through, I wanted to mention it. CONTEXT: Our scene is basically shadeless, with multiple materials per object, distributed via a mask. Therefore we combine a precomputed light texture (for individual objects) with a diffuse texture and multiple material textures blended via a mask texture. Since we require no lighting computation, we just want the shadow values to get some visual depth in the scene. Therefore the standard material seems to be insufficient for our purposes, hence the reliance on a custom shader. I saw code that created a custom material with the standard shaders and then replaced parts of the vertex and fragment code via a function. We would be ready to do this kind of code insertion if it yields correct shadow information. Sadly, I cannot find the example project for this anymore, so if you could provide a link to a similar source it would be much appreciated. Thank you sincerely for your time and help. With best regards from the green heart of Germany, The Mainequin Team
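A sketch of sampling the shadow map by hand, as an alternative to the shadowsFragmentFunctions include (which depends on preprocessor defines such as the per-light SHADOW* flags that a bare ShaderMaterial does not set up, hence the not-declared error). The uniform names below are assumptions; the matrix is the one returned by ShadowGenerator.getTransformMatrix(), uploaded from JS each frame:

```glsl
uniform sampler2D shadowSampler;   // ShadowGenerator.getShadowMap()
uniform mat4 lightTransform;       // light view-projection matrix
varying vec4 vPositionW;           // world-space position from the vertex shader

float computeShadowFactor(void) {
    vec4 clip = lightTransform * vPositionW;
    // Perspective divide, then map NDC [-1, 1] to UV/depth [0, 1].
    vec3 uvd = clip.xyz / clip.w * 0.5 + 0.5;
    if (uvd.x < 0.0 || uvd.x > 1.0 || uvd.y < 0.0 || uvd.y > 1.0) {
        return 1.0;                // outside the shadow map: treat as lit
    }
    float mapDepth = texture2D(shadowSampler, uvd.xy).x;
    float bias = 0.0005;           // assumed bias; tune against shadow acne
    return (uvd.z - bias > mapDepth) ? 0.0 : 1.0;
}
```

A constant 1.0 result usually means the transform matrix or the varying is wrong (every fragment lands outside the map, or compares against an empty depth), so verifying uvd lands in [0, 1] is a good first debugging step.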
  19. Hi everybody! While I was writing my latest shader, I just came up against annoying errors like this one: Guess what? No, my typo is NOT at line 85. And I wondered why we couldn't have the real source code: the one after conversion, addition of defines and insertion of includes, and therefore the one where the typo actually IS at line 85. I know that the getVertexShaderSource() function exists, but it doesn't work without a successfully compiled program (and on success, there's no need for debugging). Guess what? It's easy to get! (Open your console.) I would like to know, @Deltakosh, would you be interested in a PR for that, or is it already implemented somewhere/useless? Because in my PR I had to compute the migratedVertexCode and migratedFragmentCode values again, but these are available when entering the Effect.noError callback in the _prepareEffect() function... If you're interested, I see two options: - Just return the two values in the onError callback and let the user make something of that. - Or print the source code with numbered lines by default, exactly as you print the defines. Just let me know what you think is better. PeapBoy
  20. When I try using shaders with textures, I end up with: WARNING: there is no texture bound to the unit 0. Sometimes I get so many that it just stops, because there are too many WebGL errors. I think the texture is not fully loaded when it's being accessed.
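A sketch of deferring first use until the texture has loaded, assuming Babylon.js (a Texture accepts an onLoad callback, and a material can report whether its effect, samplers included, is ready for a given mesh):

```javascript
// Create a texture and run a callback only once its pixels are available,
// so the shader never samples an unbound/empty texture unit.
function whenTextureReady(scene, url, onReady) {
  var tex = new BABYLON.Texture(url, scene,
    false,   // noMipmap
    true,    // invertY
    BABYLON.Texture.TRILINEAR_SAMPLINGMODE,
    function () { onReady(tex); });   // onLoad: safe to bind to the shader now
  return tex;
}

// Alternatively, skip drawing until the material reports readiness:
// if (shaderMaterial.isReady(mesh)) { /* safe to render */ }
```

Binding the texture to the material inside onReady (rather than immediately after `new BABYLON.Texture(...)`) is usually enough to silence the "no texture bound to the unit 0" warnings.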
  21. We use material.effect.setFloat and setTexture and so on and so forth... Is there a way to GET a value from a shader? Unless I am just stupid and don't see it, again. How do we, or can we even, getFloat on a value that we calculate in the shader... or even some preset GLSL output property we can read???
  22. What are all the possible Babylon shader attributes and uniforms: // Attributes attribute vec3 position; attribute vec3 normal; attribute vec2 uv; // Uniforms uniform mat4 worldViewProjection; I assume they will all be equivalent to some WebGL attribute (if you had to do it in straight WebGL). So I assume: position = gl_Vertex -> to vec3, normal = gl_Normal -> to vec3, uv = gl_MultiTexCoord0 -> to vec2, and worldViewProjection = gl_ProjectionMatrix * gl_ModelViewMatrix, and so on... What are all the other possible attributes and uniforms, and most importantly, what do they equal in regular GLSL? A... I just would like to really understand where each attribute and uniform comes from. B... I am trying to make a 'Universal Unity Babylon GLSL Shader Template' for use directly in Unity for creating Babylon shaders. Minimal example shader for Unity: Shader "BabylonJS/Sample basic shader" { // defines the name of the shader SubShader { // Unity chooses the subshader that fits the GPU best Pass { // some shaders require multiple passes GLSLPROGRAM // here begins the part in Unity's GLSL #ifdef VERTEX // here begins the vertex shader void main() // all vertex shaders define a main() function { gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; // this line transforms the predefined attribute // gl_Vertex of type vec4 with the predefined // uniform gl_ModelViewProjectionMatrix of type mat4 // and stores the result in the predefined output // variable gl_Position of type vec4. } #endif // here ends the definition of the vertex shader #ifdef FRAGMENT // here begins the fragment shader void main() // all fragment shaders define a main() function { gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // this fragment shader just sets the output color // to opaque green (red = 0.0, green = 1.0, blue = 0.0, // alpha = 1.0) } #endif // here ends the definition of the fragment shader ENDGLSL // here ends the part in GLSL } } } My intent was to parse the shader text blocks for the vertex and fragment sections during export, then do some kind of 'key text replace' in the vertex part that reads all the attributes and uniforms and replaces them with the babylon.js equivalents at export. I would also love to base64-encode these vertex and fragment programs right in the ShaderMaterial section of the .babylon JSON file. I posted a topic on this; hopefully others can see the benefits as well. Anyways... any info on all the BUILT-IN attributes and uniforms that babylon.js exposes (and how they are calculated, so I can duplicate that calculation when running in the Unity editor) would be great. THANK YOU VERY MUCH
  23. Hi, I noticed that Babylon's implementation of the VR camera rig is such that it calculates distortion correction inside the fragment shader. Since the calculations are done per pixel, this results in a steep performance drop, especially on high-density screens, which renders the rig unusable on any mobile phone. On the simplest of scenes, I get only 30fps on a Google Pixel. I wonder why this particular method was chosen over, say, displaying the rendered texture on a dense plane (20x20) and then performing all calculations per vertex of that plane. With this method we would be performing calculations some 400 times per eye (on a 20x20 mesh), versus over 900,000 times (for each pixel on a QHD screen, for example). What I am referring to is the 2nd approach described here: Both the WebVR polyfill and Google VR View use this method, and I notice no performance drop AT ALL when running their examples. The reason I ask is because I am thinking of developing this method for Babylon, simply because the current pixel-based implementation is unfortunately completely unusable. But before I start, I'd like to know: is there some underlying problem, inherent to Babylon, with implementing this method? Thanks
  24. Hi everyone ! Being a newbie to a lot of things dev-wise, I only have the 'creative' approach, then learn what I can about how to do it, but some things are obviously out of my reach so far ... But I keep learning little bits by little bits, and hopefully someday I'll be able to contribute more deeply ! I have understood that Shaders can be powerful, and ran across that awesome site that some of you know : It has lots of amazing examples like those : Fog example : Clouds example : So my question is quite simple, would Babylon.js be able to render something like those examples above ? They look amazing !
  25. Do you know of any active shader forums with shader experts, where I can learn shaders properly, ask for examples, etc.? It's not that the shader experts on this forum are not good, but there are very few of them and they have no time to answer my countless questions. Hopefully you don't mind me asking here about other forums; if it's a problem, you can delete this topic.