Alpha in Shader not working correctly?


winty

Hi all, I have a problem with the shader I'm trying to use in Pixi.js. Not sure if I'm doing something wrong or if this is a bug, but any help would be appreciated.

 

The problem occurs when I use a shader to set gl_FragColor to a color with an alpha of 0. Since this is fully transparent, I would expect the shader to show the area behind it, with nothing drawn in front. Instead I get what looks like some kind of blending, as if the alpha were not 0. Here is some example code. It is just the "bunny" example with some changes: one bunny sits behind and is drawn normally, and the bunny in front is drawn with a shader that simply sets the color to (1, 0, 0, 0). But instead of being drawn transparently, a yellow rectangle appears. It seems like the color from the shader is being ADDED to the screen without regard for the alpha.
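For reference, a minimal sketch of the fragment shader being described (the varying/uniform names follow Pixi's default texture shader; treat the exact declarations as assumptions, since the full code is in the attached index.html):

```glsl
// Sketch of the constant-color fragment shader described above.
// vTextureCoord and uSampler are the names used by Pixi's default texture shader.
precision mediump float;

varying vec2 vTextureCoord;
uniform sampler2D uSampler;

void main(void) {
    // Alpha of 0.0 -- naively this should be fully transparent,
    // yet the red channel still shows up on screen.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 0.0);
}
```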

 

If, however, I change the line in the shader from 'gl_FragColor = vec4(1.0, 0.0, 0.0, 0.0);' to 'gl_FragColor = texture2D(uSampler, vTextureCoord);' so that it simply passes the bunny texture through, the transparent part of the bunny is rendered correctly!

 

So my question is, is this a bug, or am I misunderstanding this? And is there any way to generate a varying alpha in a shader?

Attachments: index.html, post-14037-0-10154800-1428511690.png


  • 8 months later...

Hi,

 

I ran into the same issue as above. I made a test script with the default TextureShader copied as a new shader, and another with just the gl_FragColor changed to vec4(1.0, 1.0, 1.0, 0.0);

The script can be seen here: https://dl.dropboxusercontent.com/u/8932415/pixi_shader_alpha/shader_alpha.html
TestShader1 is the one with the constant color. TestShader2 is an exact clone of the default TextureShader.

I also tried doing:
vec4 c = texture2D(uSampler, vTextureCoord)*vColor;
c.a = 0.0;
gl_FragColor = c;

But that still keeps the graphic in view, with a slightly changed color. Multiplying each color channel by 0 produces the wanted result.
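This behavior can be reproduced with plain arithmetic. Pixi's default blend mode for textures assumes premultiplied alpha, i.e. out = src.rgb + dst.rgb * (1 - src.a). The following is just a sketch of that equation, not Pixi code, and the function name is made up:

```javascript
// Sketch of premultiplied-alpha blending:
//   out = src + dst * (1 - src.a)
// (the gl.ONE, gl.ONE_MINUS_SRC_ALPHA blend function)
function blendPremultiplied(src, dst) {
  // src and dst are [r, g, b, a] arrays with components in 0..1
  const inv = 1 - src[3];
  return [
    src[0] + dst[0] * inv,
    src[1] + dst[1] * inv,
    src[2] + dst[2] * inv,
    src[3] + dst[3] * inv,
  ];
}

// Zeroing only the alpha: the RGB is still added, so the graphic stays visible.
console.log(blendPremultiplied([1, 0, 0, 0], [0, 0, 1, 1])); // [1, 0, 1, 1]

// Zeroing every channel (RGB premultiplied by alpha = 0): truly invisible.
console.log(blendPremultiplied([0, 0, 0, 0], [0, 0, 1, 1])); // [0, 0, 1, 1]
```

With src.a = 0 the destination passes through unchanged, but the source RGB is added on top regardless, which matches the "color being ADDED to the screen" observation above.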

The question for me is the same as above: is this intended, or a bug? If intended, is there a way to change the alpha per pixel?


Got the shader working correctly by multiplying the color values by the alpha. So it most likely has something to do with premultiplied alpha.

According to webglfundamentals.org, one solution could be to change the blend function used to gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA. Will test that at some point.
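For comparison, that straight-alpha blend function computes out.rgb = src.rgb * src.a + dst.rgb * (1 - src.a), so an alpha of 0 becomes fully transparent without premultiplying in the shader. Again, a sketch of the arithmetic only, not a working Pixi or WebGL setup:

```javascript
// Sketch of straight-alpha blending:
//   out.rgb = src.rgb * src.a + dst.rgb * (1 - src.a)
// (the gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA blend function)
function blendStraight(src, dst) {
  // src and dst are [r, g, b, a] arrays with components in 0..1
  const a = src[3];
  const inv = 1 - a;
  return [
    src[0] * a + dst[0] * inv,
    src[1] * a + dst[1] * inv,
    src[2] * a + dst[2] * inv,
    a + dst[3] * inv,
  ];
}

// With this blend function, gl_FragColor = vec4(1, 0, 0, 0) really is invisible:
console.log(blendStraight([1, 0, 0, 0], [0, 0, 1, 1])); // [0, 0, 1, 1]
```

Since the source RGB is scaled by src.a before being added, an alpha of 0 suppresses the color entirely, which is why the blend-function change is a plausible alternative to premultiplying inside the shader.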

