Hello!

As WebGL2 comes with new texture formats, I decided to play a bit with them, and it seems to work well in pure WebGL2:
https://playground.babylonjs.com/#RBQYSP

(If it renders red, that means the RGB texture worked 🙂)

I saw that a texture format parameter has been added to the createRenderTargetTexture function, so I wanted to try it out.
But whatever I do, I can't manage to create an RGB render target texture. 😥

This code works to create a RGBA RenderTarget: https://playground.babylonjs.com/#RBQYSP#5
This code fails to create a RGB RenderTarget: https://playground.babylonjs.com/#RBQYSP#6

The framebuffer is reported as incomplete.

I already pulled the latest version of BJS and added gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1) everywhere, but it doesn't help.

I'm struggling with this; I don't see where it differs from the pure WebGL2 version. I checked InternalSizedFormat, InternalFormat, and TextureType, and they look OK.
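For reference, here is a minimal sketch of the raw WebGL2 path I'm trying to reproduce. The `createRgbRenderTarget` helper name is mine, and `gl` is assumed to be a WebGL2RenderingContext obtained elsewhere; the status decoder uses the standard GL framebuffer status constant values.

```javascript
// Sketch: build an RGB8 render target and report the framebuffer status.
// Assumption: `gl` is a WebGL2RenderingContext (e.g. canvas.getContext('webgl2')).
function createRgbRenderTarget(gl, width, height) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Tightly packed RGB rows need byte alignment 1.
  gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB8, width, height, 0,
                gl.RGB, gl.UNSIGNED_BYTE, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);
  return { texture, fbo, status: gl.checkFramebufferStatus(gl.FRAMEBUFFER) };
}

// Decode the standard GL framebuffer status codes to readable names.
function statusName(status) {
  return {
    0x8CD5: 'FRAMEBUFFER_COMPLETE',
    0x8CD6: 'FRAMEBUFFER_INCOMPLETE_ATTACHMENT',
    0x8CD7: 'FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT',
    0x8CD9: 'FRAMEBUFFER_INCOMPLETE_DIMENSIONS',
    0x8CDD: 'FRAMEBUFFER_UNSUPPORTED',
  }[status] || 'UNKNOWN_STATUS_0x' + status.toString(16).toUpperCase();
}
```

Logging `statusName(result.status)` shows which completeness error the RGB8 attachment triggers.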

If anybody has an idea... Thanks in advance 😊

PeapBoy


Hello, in your WebGL2 example you are not rendering to the texture, so I'm not sure it proves anything.

I remember having a LOT of issues with the RGB format, and this is why we only offer RGBA so far.

Let me know if you manage to render to an RGB texture in WebGL2; I will then try to understand what we are doing wrong.


Hi!

As we can see here, the RGB format is not required to be renderable by the OpenGL specification (and therefore by WebGL). That means it's supported for textures but not always for renderbuffers, which is why it worked in my PG, which didn't render to a target.
I'm sorry for wasting your time; I didn't know this. It's just not possible for now. 🙂
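Since RGBA render targets are the reliable path, one practical workaround is to pad existing RGB pixel data to RGBA before upload. A small sketch (plain JS, no GL calls; the helper name is mine):

```javascript
// Expand tightly packed RGB bytes to RGBA by appending an alpha value,
// so existing RGB data can be uploaded to an RGBA8 texture, which is
// reliably usable as a render target in WebGL2.
function rgbToRgba(rgb, alpha = 255) {
  const pixelCount = rgb.length / 3;
  const rgba = new Uint8Array(pixelCount * 4);
  for (let i = 0; i < pixelCount; i++) {
    rgba[i * 4 + 0] = rgb[i * 3 + 0];
    rgba[i * 4 + 1] = rgb[i * 3 + 1];
    rgba[i * 4 + 2] = rgb[i * 3 + 2];
    rgba[i * 4 + 3] = alpha; // opaque by default
  }
  return rgba;
}

// rgbToRgba(new Uint8Array([255, 0, 0])) → Uint8Array [255, 0, 0, 255]
```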


This answer is marked as solved, but I found something interesting: float texture formats are not color-renderable by default, yet all over the Web people use formats such as RGBA32F for fluid animations or to pass large arrays of data as a texture.

With internalformat RGB, an upload like myimage = new Uint8Array([255, 0, 0]) expects type UNSIGNED_BYTE, while a float internalformat expects a Float32Array, so mixing them leads to a conversion error. And what does gl_FragColor = 1.0 even mean once it's read back through a (u)sampler2D RGBA texture? hahah

Here is a good explanation
https://stackoverflow.com/a/45573301/7332242
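The pairing rule can be sketched as a small lookup table. This is a memo of the internalformat → (format, type, typed array) combinations I believe are valid in WebGL2, not an exhaustive table, and the helper name is mine:

```javascript
// Memo of some valid WebGL2 texImage2D upload combinations:
// sized internalformat → expected format, type, and JS typed-array class.
const UPLOAD_COMBOS = {
  RGB8:    { format: 'RGB',  type: 'UNSIGNED_BYTE', array: Uint8Array },
  RGBA8:   { format: 'RGBA', type: 'UNSIGNED_BYTE', array: Uint8Array },
  R32F:    { format: 'RED',  type: 'FLOAT',         array: Float32Array },
  RGBA16F: { format: 'RGBA', type: 'FLOAT',         array: Float32Array },
  RGBA32F: { format: 'RGBA', type: 'FLOAT',         array: Float32Array },
};

// Which typed array should hold the pixel data for a given internalformat?
function expectedArrayType(internalformat) {
  const combo = UPLOAD_COMBOS[internalformat];
  if (!combo) throw new Error('unlisted internalformat: ' + internalformat);
  return combo.array;
}
```

So passing a Uint8Array where RGBA32F expects a Float32Array (or vice versa) is exactly the conversion error described above.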

Playing around, for fun 🙂
https://playground.babylonjs.com/#RBQYSP#8


Hi Nabroski!

Thanks for the valuable info.

Rendering to an RGB16F texture was possible in WebGL1 with the EXT_color_buffer_half_float extension, but it isn't possible in WebGL2, where that extension no longer exists.
With the EXT_color_buffer_float extension available in WebGL2, though, it's possible to render to R16F, RG16F, RGBA16F, R32F, RG32F, RGBA32F, and R11F_G11F_B10F textures.

I didn't know about the uintBitsToFloat() function; very handy!
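For anyone curious, the same bit-reinterpretation trick as GLSL's uintBitsToFloat()/floatBitsToUint() can be sketched in plain JS with two typed-array views over one buffer:

```javascript
// JS analog of GLSL's uintBitsToFloat()/floatBitsToUint():
// reinterpret the same 32 bits through two views of a shared buffer.
const _buf = new ArrayBuffer(4);
const _f32 = new Float32Array(_buf);
const _u32 = new Uint32Array(_buf);

function uintBitsToFloat(u) {
  _u32[0] = u;
  return _f32[0];
}

function floatBitsToUint(f) {
  _f32[0] = f;
  return _u32[0];
}

// uintBitsToFloat(0x3F800000) → 1.0 (IEEE 754 bit pattern of 1.0)
```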

Now I should weigh the precision loss against the performance gain of using an RGB8 texture (RGB textures are not necessarily faster to read).

