How to use WebGL2 RGB Texture Format



Hello!

As WebGL2 comes with new texture formats, I decided to play a bit with them, and it seems to work well in pure WebGL2:

(If it prints red, that means the RGB texture did work!)
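For reference, here is a minimal sketch of the pure-WebGL2 path (the helper name is mine; it assumes a WebGL2 context `gl` obtained from a canvas in the browser):

```javascript
// Sketch: create a 1x1 RGB8 texture from tightly packed UNSIGNED_BYTE data.
// Browser only -- `gl` must be a WebGL2RenderingContext.
function createRGB8Texture(gl) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // RGB rows are 3 bytes per pixel, so relax the default 4-byte row alignment.
  gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB8, 1, 1, 0,
                gl.RGB, gl.UNSIGNED_BYTE, new Uint8Array([255, 0, 0]));
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  return tex;
}
```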

I saw that a texture format parameter has been added to the createRenderTargetTexture function, so I wanted to try it out.
But whatever I do, I never manage to create an RGB render target texture.

This code works to create a RGBA RenderTarget: https://playground.babylonjs.com/#RBQYSP#5
This code fails to create a RGB RenderTarget: https://playground.babylonjs.com/#RBQYSP#6

Framebuffer is incomplete.

I already pulled the latest version of BJS and added gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1) everywhere, but it doesn't help much.
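For anyone wondering why UNPACK_ALIGNMENT matters here: GL pads each pixel row to a multiple of the alignment (default 4), and a 3-bytes-per-pixel RGB row usually isn't a multiple of 4. A small sketch of the padding arithmetic (the helper name is mine):

```javascript
// Compute the row stride GL expects for a given UNPACK_ALIGNMENT.
// With the default alignment of 4, a width-3 RGB row (9 bytes) is padded
// to 12 bytes, so a tightly packed Uint8Array upload goes wrong unless
// the alignment is set to 1.
function rowStride(widthPx, bytesPerPixel, alignment) {
  const unpadded = widthPx * bytesPerPixel;
  return Math.ceil(unpadded / alignment) * alignment;
}

console.log(rowStride(3, 3, 4)); // 12 -> GL expects 3 padding bytes per row
console.log(rowStride(3, 3, 1)); // 9  -> tightly packed, matches the data
```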

I'm struggling with this; I don't understand where it differs from the pure WebGL2 version. I verified InternalSizedFormat, InternalFormat and TextureType and they're OK.
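When debugging "Framebuffer is incomplete" errors, it helps to print which completeness status the driver actually returned. A small helper (the name and map are mine; the numeric values are the standard WebGL framebuffer status enums):

```javascript
// Map gl.checkFramebufferStatus() results to readable names.
const FBO_STATUS = {
  0x8CD5: 'FRAMEBUFFER_COMPLETE',
  0x8CD6: 'FRAMEBUFFER_INCOMPLETE_ATTACHMENT',
  0x8CD7: 'FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT',
  0x8CD9: 'FRAMEBUFFER_INCOMPLETE_DIMENSIONS',
  0x8CDD: 'FRAMEBUFFER_UNSUPPORTED',
};

function framebufferStatusName(status) {
  return FBO_STATUS[status] || '0x' + status.toString(16);
}

// In the browser: framebufferStatusName(gl.checkFramebufferStatus(gl.FRAMEBUFFER))
```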

If anybody has an idea... Thanks in advance!



Hello, in your WebGL2 example you are not rendering to the texture, so I'm not sure it is a proof.


I remember having a LOT of issues with the RGB format, and this is why we only offer RGBA so far.


Let me know if you manage to render to an RGB texture in WebGL2; I will then try to understand what we are doing wrong.


Hi!

As we can see here, the RGB format is not required to be renderable by the OpenGL specification (and therefore by WebGL). That means it's supported for textures but not always for renderbuffers. That's why it worked in my PG, which didn't render to a target.
I'm sorry for wasting your time, I didn't know this. It's just not possible for now.
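Since renderability is implementation-dependent, one way to find out at runtime is to attach a texture of the format in question to an FBO and ask the driver. A sketch (browser only; the helper name is mine, and it assumes a WebGL2 context `gl`):

```javascript
// Probe whether a sized internal format is color-renderable on this driver.
// RGB8 may report FRAMEBUFFER_UNSUPPORTED even though plain texturing
// with it works fine.
function isColorRenderable(gl, internalFormat, format, type) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, internalFormat, 1, 1, 0, format, type, null);
  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);
  const ok = gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.deleteFramebuffer(fbo);
  gl.deleteTexture(tex);
  return ok;
}

// In the browser: isColorRenderable(gl, gl.RGB8, gl.RGB, gl.UNSIGNED_BYTE)
```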


This answer is marked as solved, but I found this interesting:
Float texture formats are not color-renderable by default. What I see on the web is people using formats such as RGBA32F for fluid simulations, or to pass large arrays of data as a texture.

With internalformat RGB, myimage = new Uint8Array([255, 0, 0]) expects UNSIGNED_BYTE,
while a FLOAT internalformat expects a Float32Array, so you run into a conversion error. gl_FragColor = 1.0 as a float: what does that mean once it's read back through a (u)sampler2D from an RGB texture?
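The core of that mismatch is normalization: UNSIGNED_BYTE channel data is normalized to [0, 1] when sampled, while FLOAT data is taken as-is. A tiny sketch of the two directions (the helper names are mine):

```javascript
// UNSIGNED_BYTE channels are normalized to [0, 1] when sampled as floats.
function byteToNormalizedFloat(b) {
  return b / 255;
}

// Going the other way, a float in [0, 1] maps back to a byte (clamped).
function normalizedFloatToByte(f) {
  return Math.round(Math.min(Math.max(f, 0), 1) * 255);
}

console.log(byteToNormalizedFloat(255)); // 1
console.log(normalizedFloatToByte(1.0)); // 255
```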

Here is a good explanation

playing around, for fun : )



Hi Nabroski!

Thanks for the precious info.

Rendering to an RGB16F texture was possible in WebGL1 with the EXT_color_buffer_half_float extension, but it isn't possible in WebGL2 anymore, as that extension no longer exists.
With the EXT_color_buffer_float extension available in WebGL2, though, it's possible to render to R16F, RG16F, RGBA16F, R32F, RG32F, RGBA32F and R11F_G11F_B10F textures.
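A sketch of that WebGL2 path (browser only; the helper name is mine, and it assumes a WebGL2 context `gl`):

```javascript
// Create an RGBA16F render target, which requires EXT_color_buffer_float
// to be enabled before the framebuffer can be complete.
function createHalfFloatTarget(gl, width, height) {
  if (!gl.getExtension('EXT_color_buffer_float')) {
    throw new Error('EXT_color_buffer_float not supported');
  }
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA16F, width, height, 0,
                gl.RGBA, gl.HALF_FLOAT, null);
  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);
  return { tex, fbo };
}
```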

I didn't know about the function uintBitsToFloat(), very handy!
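For anyone curious, GLSL's uintBitsToFloat()/floatBitsToUint() just reinterpret the same 32 bits; the equivalent in JS can be done with two typed-array views over one buffer:

```javascript
// Reinterpret 32 bits between float and uint via a shared ArrayBuffer,
// mimicking GLSL's floatBitsToUint() and uintBitsToFloat().
const _buf = new ArrayBuffer(4);
const _f32 = new Float32Array(_buf);
const _u32 = new Uint32Array(_buf);

function floatBitsToUint(f) {
  _f32[0] = f;
  return _u32[0];
}

function uintBitsToFloat(u) {
  _u32[0] = u;
  return _f32[0];
}

console.log(floatBitsToUint(1.0).toString(16)); // "3f800000"
console.log(uintBitsToFloat(0x3f800000));       // 1
```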

Now, I should pay attention to the precision loss and to the actual performance gain of using an RGB8 texture (RGB textures are not necessarily faster to read).

