Showing results for tags 'glsl'.

  1. I think I have seen this question asked before, but I have not been able to implement a working solution in pixi.js v4.5.0. I am new to this, so bear with me... I have a custom GLSL filter (the Game of Life, for now) and I want to continually apply it to a sprite in a feedback loop. That is: apply the filter to the sprite image, capture the result, set the sprite image to the result, apply the filter again, and loop. I have attempted this with PIXI.Texture.fromCanvas(app.view) -> texture.update(), but that resulted in a black screen. I think using RenderTextu…
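     A common way to get such a feedback loop is to ping-pong between two render textures, so the filter always reads the previous frame's output. A minimal sketch (PixiJS v4 API; lifeFilter stands in for the poster's custom filter):

         var rtA = PIXI.RenderTexture.create(app.renderer.width, app.renderer.height);
         var rtB = PIXI.RenderTexture.create(app.renderer.width, app.renderer.height);
         var sprite = new PIXI.Sprite(rtA);
         sprite.filters = [lifeFilter]; // the custom Game of Life filter (hypothetical name)

         app.ticker.add(function () {
             // Render the filtered sprite into the *other* texture, then swap,
             // so next frame the sprite samples this frame's result.
             app.renderer.render(sprite, rtB);
             var tmp = rtA; rtA = rtB; rtB = tmp;
             sprite.texture = rtA;
         });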
  2. Hello, friends. I had the following task: send a single wave over a sphere which looks like a map of the planet; the map, in turn, is generated from PlaneBufferGeometry, and these geometries are then combined using THREE.BufferGeometryUtils.mergeBufferGeometries(geometries, false). Code of the vertex shader:

         varying vec2 vUv;
         uniform float time;
         void main() {
             vUv = uv;
             vec3 newposition = position + position * sin(position.z * 12.) * 0.03;
             gl_Position = projectionMatrix * modelViewMatrix * vec4(newposition, 1.);…
  3. Our game has mostly interior environments, and we need reasonably correct reflections for the floor. We are already using box-projected cubemaps (i.e. parallax envMap), but as each mesh can only have one envMap, we have to split the floor into multiple parts according to the local cubemap position, which is unreasonable for our use case. We need someone to implement the POI-based cubemap blending method described in detail here: https://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/ References: https://seblagarde…
  4. A part of my game's post-process render pipeline: downscale the render to 25% size; do some post-processing on the downscaled image; pass both the image from before the downscale and the processed image into a GLSL fragment shader with effect.setTextureFromPostProcessOutput(...); the fragment shader outputs the low-res processed image overlaid on top of the original high-res render. Problem: the final render is pixelated. I guess the initial downscale made it so the shader doesn't use the higher-res input texture as the "base resolution"? What's going on here? How do I properly set…
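     A frequent cause is the final pass inheriting the 25% ratio from the pass before it. A hedged sketch of a full-resolution combine pass (Babylon.js; the sampler names, the shader path, and the passBefore/downscaled post-process variables are assumptions):

         // Create the combine pass at ratio 1.0 so it renders at full resolution.
         var combine = new BABYLON.PostProcess("combine", "./shaders/combine",
             [], ["sceneSampler", "lowResSampler"], 1.0, camera);
         combine.onApply = function (effect) {
             effect.setTextureFromPostProcess("sceneSampler", passBefore);        // full-res input
             effect.setTextureFromPostProcessOutput("lowResSampler", downscaled); // 25% output
         };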
  5. So I figured, with a few people making some cool shaders now and the proposed improvements to the CYOS, we should have a thread for shader development to showcase what people are making and talk about different methods and concepts. To kick things off I figured I'd post a procedural skymap... this is a cleaned-up version of the first one I posted last night and is based off a standard box element. I have not tested it in scene yet, but the CYOS output is promising. I'll be looking to add volumetric weather here soon and will be making the sun's position dependent on a light on the sc…
  6. I am trying to implement an "Additive Shader" (from a space shooter tutorial) where BLACK pixels are transparent (or do NOT ADD) and the rest of the colors add on top... Do we (the BabylonJS community) have a shader already that does something like that??? If not, I will have to make one... I tried to start off by just returning a transparent color:

         void main(void) {
             gl_FragColor = vec4(0.0, 0.0, 0.0, 0.0);
         }

     I have "needsAlphaBlending = true" in the shader material options object, BUT I STILL SEE A BLACK SQUARE (a little less bright, but still there)... I would assume that setting a…
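     With additive blending the alpha trick is unnecessary: the source color is added to the destination, so black contributes nothing. A hedged sketch (Babylon.js; the shader path is an assumption):

         var mat = new BABYLON.ShaderMaterial("additive", scene, "./shaders/additive",
             { needAlphaBlending: true });
         // With gl_FragColor alpha at 1.0, ALPHA_ADD gives src + dst,
         // so a black fragment leaves the background untouched.
         mat.alphaMode = BABYLON.Engine.ALPHA_ADD;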
  7. I took a quick look at the source code and it seems that we have no way to update only one element of a uniform array? Just like below...

         // in JavaScript at init time
         var someVec2Element0Loc = gl.getUniformLocation(someProgram, "u_someVec2[0]");
         var someVec2Element1Loc = gl.getUniformLocation(someProgram, "u_someVec2[1]");
         var someVec2Element2Loc = gl.getUniformLocation(someProgram, "u_someVec2[2]");

         // at render time
         gl.uniform2fv(someVec2Element0Loc, [1, 2]); // set element 0
         gl.uniform2fv(someVec2Element1Loc, [3, 4]); // set elem…
  8. Hey all, can I use any GLSL fragment or vertex shader (including 3D raymarching stuff) as a texture in Babylon.js, including animated ones? I've done some Google searches and I know you can use some, but what are the limitations? For example, could I put any animated texture from GLSL Sandbox (http://glslsandbox.com/) onto a Babylon.js plane mesh? Do I need to put the uniform variables in the render loop for the animation to work? Super hoping the answer is yes, but any and all info will be helpful.
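     Broadly, this works by wrapping the fragment shader in a ShaderMaterial and feeding it the uniforms the sandbox expects (time, resolution). A hedged sketch, assuming the shader is stored at ./shaders/sandbox:

         var mat = new BABYLON.ShaderMaterial("sandbox", scene, "./shaders/sandbox", {
             attributes: ["position", "uv"],
             uniforms: ["worldViewProjection", "time", "resolution"]
         });
         plane.material = mat;

         var t = 0;
         scene.registerBeforeRender(function () {
             t += scene.getEngine().getDeltaTime() / 1000;
             mat.setFloat("time", t); // yes: animated uniforms must be updated each frame
         });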
  9. Hey folks, first of all: thank you for this great forum. It has come to the rescue a few times now. So, thanks to everyone who's participating. I'm currently working on an idea where I would like to project a spherical panorama texture onto a mesh from inside (meaning from the viewpoint), similar to a standard VR viewer, where the texture is mapped onto a sphere from inside. But in my case I would like to map it onto the actual scene mesh which I get from 3ds Max. Now, I know that I could create the UVs or bake the texture in 3ds Max, but I want to switch between two camera positions and theref…
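     The core of such a projection is computing the direction from the projection point to each fragment and converting it to equirectangular UVs. A hedged GLSL fragment sketch (projectorPosition, vWorldPos and panorama are assumed names; vWorldPos would come from the vertex shader):

         vec3 dir = normalize(vWorldPos - projectorPosition);
         // direction -> equirectangular (longitude/latitude) texture coordinates
         float u = atan(dir.z, dir.x) / (2.0 * 3.14159265) + 0.5;
         float v = asin(clamp(dir.y, -1.0, 1.0)) / 3.14159265 + 0.5;
         gl_FragColor = texture2D(panorama, vec2(u, v));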
  10. Do we have now, or are we going to support, the WebGL (version 2, I think) sampler2DArray? I know we have the sampler2D[x] approach that actually takes an array of separate Babylon textures... but I think each texture STILL counts against MAX_TEXTURE_IMAGE_UNITS, whereas the sampler2DArray approach counts against MAX_COMBINED_TEXTURE_IMAGE_UNITS instead, and also allows for tiling in your texture atlas... Here is sample WebGL texture array code:

          uniform sampler2DArray myTextureSampler;
          in vec2 UV;
          in int index;
          out vec3 out_Color;
          void main(void) {…
  11. I am trying to create a fragment shader via a PIXI.AbstractFilter to create a wave-rippling effect to be applied to a background texture. I have already worked out the algorithm for the wave effect in JavaScript. What I am having difficulty doing is getting the data I need into the shader through PIXI. For my effect to work, I need to have a large Float32Array to keep track of wave heights, and a texture containing the original, unaltered contents of the background image to read from in order to apply the effect of pixel displacement (light refraction). I've been doing a lot of searching a…
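      For reference, Pixi's v3-era filters accept typed uniforms in the constructor, which would cover both needs. A hedged sketch, assuming the '1fv' float-array uniform type and these made-up names (uHeights, uOriginal, backgroundTexture):

          var waveFilter = new PIXI.AbstractFilter(null, fragmentSrc, {
              uHeights:  { type: '1fv', value: new Float32Array(512) },   // wave heights
              uOriginal: { type: 'sampler2D', value: backgroundTexture }  // untouched backdrop
          });
          // per frame, after the JS simulation step:
          waveFilter.uniforms.uHeights.value = heights;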
  12. I'm morphing an object's vertices using a vertex shader in Babylon.js. The morphed object looks great, but I can't figure out a way to cause the object's shadow to update as well. I know in Three.js there is a customDepthMaterial for a mesh where you can pass in the same custom vertex shader and correctly update the object's shadow, but is there something similar in Babylon.js? Thanks!
  13. While trying to make my own shader, I seem to be stuck getting shadows to work for a directional light. I copy/pasted what I believe are all the relevant parts of the Babylon standard shaders into my own vertex and fragment shaders, and while it compiles and light/textures work just fine, shadows do not appear. Could anyone point me to the issue? GL throws a warning about a lack of textures, but I don't believe that affects the outcome, as my complete shader with textures has the same issue. http://www.babylonjs-playground.com/#1JFVDG#0 Uncomment line 124 to apply the shader to th…
  14. I am not very familiar with GLSL. I am trying to integrate an ambient occlusion shader with Babylon using ShaderMaterial. The shader source can be found at: https://github.com/mikolalysenko/ao-shader Vertex shader: https://github.com/mikolalysenko/ao-shader/blob/master/lib/ao.vsh Fragment shader: https://github.com/mikolalysenko/ao-shader/blob/master/lib/ao.fsh I've set up the shader in CYOS at: http://www.babylonjs.com/cyos/#1F1POU CYOS is throwing some errors: [.Offscreen-For-WebGL-0x7f9aaa872c00]PERFORMANCE WARNING: Attribute 0 is disabled. This has significant performa…
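      That particular warning usually means the shader declares an attribute the material never binds. A hedged ShaderMaterial sketch; the attribute list must name everything ao.vsh actually reads (the names and shader path here are assumptions):

          var mat = new BABYLON.ShaderMaterial("ao", scene, "./shaders/ao", {
              attributes: ["position", "normal", "uv"], // every attribute the vertex shader uses
              uniforms: ["worldViewProjection"]
          });
          mesh.material = mat;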
  15. I really need help understanding the GLSL equivalents of the provided BABYLON uniforms. I have seen them called so many different names depending on whose docs you're reading. The ONLY ONE I know for sure is that in Babylon.js 'worldViewProjection' is the GLSL computation 'gl_ProjectionMatrix * gl_ModelViewMatrix', or you can use the built-in shortcut 'gl_ModelViewProjectionMatrix'. I need someone who knows the Babylon.js GLSL stuff to PLEASE PLEASE PLEASE tell me what the others equal in GLSL terms: view = gl_??? projection = gl_??? (maybe it's: gl_ProjectionMatrix)…
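      For what it's worth, a hedged mapping onto the legacy GLSL built-ins, assuming Babylon's world matrix plays the role of the model transform:

          // world               -> the model matrix (the model half of gl_ModelViewMatrix;
          //                        no standalone legacy built-in exists for it)
          // view                -> the camera matrix (the view half of gl_ModelViewMatrix)
          // worldView           -> gl_ModelViewMatrix
          // projection          -> gl_ProjectionMatrix
          // worldViewProjection -> gl_ModelViewProjectionMatrix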
  16. Hi! I'm trying to apply one filter to a large number (~256) of small (32x32 px) sprites. Within the filter, I'm using vTextureCoord to get the current sprite's coordinates, to draw borders on it. vTextureCoord breaks, apparently referring to the containing canvas's coordinates instead of the individual sprites' coordinates. BUT if I apply that same filter twice (two elements in the .filters[] array), in one of the copies vTextureCoord actually does point to the sprite coordinates, and borders are drawn correctly. The other copy still points to the canvas coordinates, and the whole thing…
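      This generally happens because filters render into a shared temporary target, so vTextureCoord spans that target rather than the sprite. A hedged GLSL sketch of recovering sprite-local coordinates via the filterArea uniform Pixi supplies to filters (the dimensions uniform is a custom addition you would upload yourself):

          uniform vec4 filterArea;  // width, height, x, y of the temporary render target
          uniform vec2 dimensions;  // sprite size in pixels, uploaded from JS
          vec2 pixelCoord = vTextureCoord * filterArea.xy; // pixels within the target
          vec2 spriteCoord = pixelCoord / dimensions;      // 0..1 within the sprite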
  17. Hi guys. I was working on some shading and discovered a strange behavior. I'm new to all this, so maybe I just don't understand something; perhaps you could help. I localized that behavior in this shader:

          fragmentSrc = [
              'precision mediump float;',
              'varying vec2 vTextureCoord;',
              'uniform sampler2D texture;',
              'void main(void) {',
              '    vec4 ownColor = texture2D(texture, vTextureCoord);',
              '    gl_FragColor = vec4(1.0, 0.0, 0.0, ownColor.a);',
              '}'
          ];

      Now it looks to me like my shader should tint a sprite red, but instead it sorta does, but then it also sn…
  18. Hi! I have a small quest for BJS Jedi Knights. If we need to get the screen position of a point in 3D, we do something like this: gl_Position = worldViewProjection * vec4(vPosition, 1.0); but what do we do if we need the reverse operation (getting the 3D point), when we know the screen coords and have the 3D object's geometry? In BABYLON the Scene.pick() method helps here, but what does its analogue look like in GLSL? What's the algorithm? p.s. I could call Scene.pick and pass the result into the shader, but that would not be as fast as computing it in GLSL, I think. p.p.s. May the BJS-Force be with you
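      The reverse mapping is an unprojection: go back from normalized device coordinates through the inverse matrix, with the caveat that without a depth value you only get a ray, which is why Scene.pick has to intersect geometry. A hedged GLSL sketch (inverseWorldViewProjection and ndc are assumed to be supplied from JS and a depth sample):

          // ndc.xy from screen pixels mapped to -1..1, ndc.z from a depth sample
          vec4 world = inverseWorldViewProjection * vec4(ndc, 1.0);
          world /= world.w; // perspective divide restores the 3D position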
  19. Hello, I'm having some issues working with aTextureCoord when writing custom filters for PixiJS (https://github.com/pixijs/pixi.js/issues/2142). Do any of you happen to know how I can make the transforms so that I can work with the textureCoord as in a clean WebGL environment?
  20. Hey people! I want to implement this shader (https://www.shadertoy.com/view/MslGWN#) in a Phaser state. So I transformed the shader to WebGL style and added a patch to Phaser to update the iChannel uniforms on update:

          Phaser.Filter.prototype.update = function (pointer, uniforms) {
              if (typeof pointer !== 'undefined') {
                  var x = pointer.x / this.game.width;
                  var y = 1 - pointer.y / this.game.height;
                  if (x !== this.prevPoint.x || y !== this.prevPoint.y) {
                      this.uniforms.mouse.value.x = x.toFixed(2);
                      this.uniforms.mouse.value.y = y.toFixed(2);…
  21. Hello! I'm currently working on a "fog of war" material: the standard material plus a texture to keep track of once-lit areas, and display them later even if they are not illuminated (because of the standard lighting model or shadow generator). It would produce roughly the same effect as the FoW of classical real-time strategy games (with a moving light revealing the model), but based on actual lighting and on arbitrary UV-unwrapped models. However, I can't figure out how to write to a texture: when the shader's gl_FragColor rgb component would be different from (0,0,0), it must modify the corresponding po…
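      One way to persist lit areas is to accumulate them into a RenderTargetTexture that is never cleared, rendering the mesh in UV space with a shader that writes its lit mask. A hedged Babylon.js sketch (the UV-space shader itself, which would output uv * 2.0 - 1.0 as gl_Position, is omitted):

          var fow = new BABYLON.RenderTargetTexture("fow", 1024, scene, false);
          fow.renderList.push(mesh);           // draw the mesh into the FoW map each frame
          fow.onClearObservable.add(function () {
              // skip the default clear so previously revealed texels persist
          });
          scene.customRenderTargets.push(fow);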
  22. Hey there, I'm just getting started with Phaser, and am looking at GLSL shaders (also for the first time) to see how they can be applied to sprites for effects on characters, backgrounds, etc. I ran into behaviour today that's likely a function of how GLSL works, rather than Phaser itself, and probably not the ideal approach. In any case, I'm hoping that someone can confirm one way or the other. Currently, my game has: a TileMap background using two layers with different values for scrollFactorX/Y; a Sprite for the player with a filter applied, which is being rendered correctly (although t…
  23. Hi, I'm trying to create a shader and I'm having a really hard time finding a good tutorial on GLSL shaders for WebGL. I have never written a shader before, so I really need to learn the basics. Does anyone know a good resource for this, or could anyone offer some help with it? Here are some of the more basic questions I have: - What editor would you recommend for shader language? - Where can I look up types and functions (documentation)? - Tutorials for WebGL? Thanks for any help and recommendations, Dinkelborg
  24. Hello there! I'm writing a custom lighting shader that takes a low-frequency lightmap and applies it to a sprite/image with the 'hard light' algorithm. Here is the relevant code:

          "void main (void)", "{",
          // sampling sprite albedo
          "    vec4 albedo = texture2D(uAlbedo, vTextureCoord.xy);",
          // calculating the current fragment's position in lightmap space
          // ................... (irrelevant code)
          // sampling the lightmap with the calculated coords
          "    vec4 lightmap = texture2D(uLightmap, lightmapCoord.xy);",
          // per-component 'hard light' blending of albedo with lightmap
          "    vec3 A = step(vec3(0.5, 0.5, 0.5), light…
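      For reference, the standard per-channel 'hard light' blend the excerpt is building toward looks like this (a hedged sketch, not the poster's exact code):

          vec3 A = step(vec3(0.5), lightmap.rgb);            // 1.0 where light > 0.5
          vec3 multiplied = 2.0 * albedo.rgb * lightmap.rgb;  // dark half: multiply
          vec3 screened = 1.0 - 2.0 * (1.0 - albedo.rgb) * (1.0 - lightmap.rgb); // bright half
          vec3 blended = mix(multiplied, screened, A);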
  25. I'm working on a soft particle shader, which of course needs a depth texture for the scene. Is there any way to generate that easily with Babylon so it can be passed as a texture to a shader?
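      Babylon ships a depth renderer for exactly this; a hedged sketch (the sampler name and particleMaterial are assumptions):

          var depthRenderer = scene.enableDepthRenderer();  // renders scene depth to a render target
          var depthTexture = depthRenderer.getDepthMap();
          particleMaterial.setTexture("depthSampler", depthTexture);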