Showing results for tags 'cubemap'.

Found 4 results

  1. Hello BJS community! I just began to understand HDR textures, gamma correction and so on in order to learn how to do IBL. In this process, I used BABYLON.HDRCubeTexture to convert my equirectangular HDR texture to a usable environment HDR cubemap, as explained here. Then I need to apply a convolution to this cubemap to obtain the final irradiance cubemap that I will sample during IBL. To compute the irradiance cubemap from the environment cubemap, I use a RenderTargetTexture. Up to there, everything works fine!

     In the tutorial linked above, the author uses OpenGL and doesn't worry about output colors exceeding the [0..1] range: it's useful to keep textures in HDR until the last step, where he tone maps the result. I learned the hard way that it's not as simple with WebGL. When I store colors outside the [0..1] range and then sample the result in another shader, the values have been clamped to [0..1]. This Stack Overflow question taught me that not only do I need to use a floating-point texture, I also have to render to a floating-point framebuffer, or something like that; I never dived into pure WebGL code. To render to a floating-point framebuffer, I need to enable the EXT_color_buffer_float extension (only available with WebGL 2), but that doesn't seem to be enough. I think I also need to configure the framebuffer with pure WebGL code.

     So, my question is: is it possible to render colors outside the [0..1] range using Babylon.js at this time? How? (See the first sketch after this list.) If it is not ready yet, I'll normalize and denormalize the data at each step, of course. But I would love to know whether doing it the ideal way is possible. Thank you a lot in advance!
  2. Hello everyone, I'm using this material and would love to be able to use it as a cubemap with the PBR shaders, in the reflectionTexture channel. I have looked at what is available with the shader but didn't find a way to get that result. Does anyone know if that's possible? (See the probe sketch after this list.)
  3. If you specify texture customType = "BABYLON.HDRCubeTexture" and the other HDRCubeTexture properties, with a .hdr file as the name, it will still be treated as a regular CubeTexture and look for _px and _nx cube texture files. I think the problem is the Parse function in babylon.texture.ts. It now reads:

         if (parsedTexture.isCube) {
             return CubeTexture.Parse(parsedTexture, scene, rootUrl);
         }

     and should be something like:

         if (parsedTexture.isCube) {
             if (parsedTexture.customType === "BABYLON.HDRCubeTexture") {
                 return HDRCubeTexture.Parse(parsedTexture, scene, rootUrl);
             }
             return CubeTexture.Parse(parsedTexture, scene, rootUrl);
         }

     Or is there more needed to make HDRCubeTextures load from a serialized .babylon file instead of only hand coding them?
  4. Hi all! My scene is an apartment, and I want one cubemap for each room. I could generate them with a precalculated render engine (V-Ray, Cycles, etc.), but that forces me to reset and redo all the materials of the scene just to generate six textures... So I think it's easier to generate the cubemaps once at application launch. I suppose I just have to create one probe per room, at its center, with its refresh rate set to render once; this also allows regenerating on demand if the user changes material settings (changing a wood floor to tiles, for example). Here are my questions: how can I easily push the whole scene into each probe during generation? And is it possible to push meshes which already use the probe as their reflection texture (will this generate a conflict)? (See the per-room probe sketch after this list.) Thanks
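For question 1, a minimal sketch of the floating-point render-target route in Babylon.js, not verified against every Babylon.js version. It assumes a WebGL 2 context (so EXT_color_buffer_float can back the framebuffer); the texture name and face size are placeholders. The capability check and the TEXTURETYPE_FLOAT argument are the relevant parts: when Babylon.js creates a RenderTargetTexture with a float type, it should also allocate a float framebuffer attachment, so no hand-written WebGL ought to be needed.

    var engine = scene.getEngine();
    var caps = engine.getCaps();

    if (caps.textureFloatRender) {
        // Ask for full-float storage so values outside [0..1] survive
        // between the convolution pass and the IBL sampling pass.
        var irradianceRTT = new BABYLON.RenderTargetTexture(
            "irradiance",                      // placeholder name
            128,                               // placeholder face size
            scene,
            false,                             // no mipmaps
            true,                              // doNotChangeAspectRatio
            BABYLON.Engine.TEXTURETYPE_FLOAT,  // float pixel storage
            true                               // isCube
        );
    } else if (caps.textureHalfFloatRender) {
        // Half float is a common fallback on devices without full-float
        // render support; pass TEXTURETYPE_HALF_FLOAT instead.
    } else {
        // No float rendering at all: fall back to manually encoding HDR
        // values into [0..1] (the normalize/denormalize plan above).
    }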
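For question 2, one route that should work (a sketch, assuming the material in question can be rendered onto an ordinary mesh) is to render a mesh carrying that material into a ReflectionProbe and plug the probe's cube texture into the PBR material; the probe name, size, and the sphereWithThatMaterial mesh are placeholders.

    // Bake an arbitrary material into a cubemap via a probe, then feed
    // the result to a PBR material's reflection channel.
    var probe = new BABYLON.ReflectionProbe("materialProbe", 512, scene);
    probe.renderList.push(sphereWithThatMaterial);  // placeholder mesh
    probe.refreshRate =
        BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;

    var pbr = new BABYLON.PBRMaterial("pbr", scene);
    pbr.reflectionTexture = probe.cubeTexture;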
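For question 4, a sketch of the one-probe-per-room setup described there; the roomCenters array and the 512 face size are assumptions. Pushing every mesh of the scene into each probe's render list is the "push the whole scene" part; whether a mesh that samples the probe may also appear in that probe's own render list is exactly the open question, so it is left out here.

    // One render-once probe per room, rebuilt on demand.
    var probes = roomCenters.map(function (center, i) {  // assumed array
        var probe = new BABYLON.ReflectionProbe("room" + i, 512, scene);
        probe.position = center;
        scene.meshes.forEach(function (mesh) {
            probe.renderList.push(mesh);
        });
        // Render once at launch instead of every frame.
        probe.refreshRate =
            BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;
        return probe;
    });

    // After a material change, setting the refresh rate again resets
    // the probe's refresh counter, so it renders one more time.
    function rebakeRoom(i) {
        probes[i].refreshRate =
            BABYLON.RenderTargetTexture.REFRESHRATE_RENDER_ONCE;
    }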