mr_pinc

Members
  • Content count: 25

  1. Reusing LensRenderingPipeline

    I've created this playground: https://playground.babylonjs.com/#63QN3J The weird thing is, you need to click the toggle button a couple of times - detaching does not seem to work the first time, even though it works in my project.
  2. Scene Environment Texture

    It's a property shared by all materials: you load that one texture, set it on the scene, and all meshes in the scene will use it as their environment or reflection texture.
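A minimal sketch of that pattern, assuming a prefiltered DDS file (the path and variable names are placeholders):

```javascript
// Sketch: one prefiltered environment texture shared by the whole scene.
// "environment.dds" is a placeholder path for your own prefiltered DDS file.
var envTexture = BABYLON.CubeTexture.CreateFromPrefilteredData("environment.dds", scene);
scene.environmentTexture = envTexture;
// PBR materials that do not set their own reflectionTexture will now fall
// back to this scene-level texture for environment/reflection lighting.
```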
  3. I'm trying to leverage the built-in depth-of-field effect. When I attach it to my camera I get an error from attachCamerasToRenderPipeline(): "You're trying to reuse a post process not defined as reusable." I can still toggle the effect on and off. I looked at the TypeScript code and can see that the internal effects of the LensRenderingPipeline class are indeed set to be non-reusable. The effect does seem to be behaving the way we expect, but is there anything I should be concerned about regarding this error?
  4. Pretty sure you need to use https://github.com/crosswalk-project/cordova-plugin-crosswalk-webview - I have not tried it myself yet, but that's what I found while researching an upcoming project.
  5. PBR, cameraContrast, cameraExposure

    imageProcessingConfiguration is now an object that is a member of the 'scene' object. You can access contrast and exposure there.
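A short sketch of where those properties now live (the values here are illustrative, not recommendations):

```javascript
// Sketch: contrast and exposure moved to the scene-level
// imageProcessingConfiguration object.
scene.imageProcessingConfiguration.contrast = 1.6;
scene.imageProcessingConfiguration.exposure = 1.0;
```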
  6. PBR in 3d max

    Sebavan's suggestion helped a lot with Substance - thanks. Those settings were definitely necessary for the PBRs to look good. In regards to Max: I don't know how specifically the materials are set up, but that's a bit beside the point. Right now I tell my team members that Max can export the PBRs. They say... ok, show me... and that's where we are stuck. https://doc.babylonjs.com/exporters/3dsmax is a useful guide, but it doesn't include any info on PBRs. Like I said, we're going to figure it out one way or another, but having a proper guide would have saved us (and could still save us) a bunch of time.
  7. From my technical artist: Our typical project showcases a very complex mechanical object, or set of objects. We are constrained by draw calls and by content size in megabytes, because our app needs to download quickly even on a mediocre connection. Since the number of pieces in our objects is high, the triangle count we can afford per piece has to be minimal. What makes matters worse, most pieces often have unique shapes, so instancing does not completely solve the problem.

    We are using normal maps to fake some details, but we run into the common issue of low-poly geometry smoothing, which produces visual artifacts. We can provide examples later if required, but just think of your keyboard: it has no definite smoothing seams, but lots of smooth angles and bevels, many of which have to have a geometrical representation in addition to pixels. Using the normal-compensation approach we can bake the geometry smoothness data, invert it, and combine it with the normal map to restore the visual smoothness where required (check handplane.com for some nice examples). We are also averaging the geometrical normals to reduce the artifacts. But this can only take us halfway, since this approach does not always work 100%.

    In such cases, using an object-space normal map instead would give us full control over the look of our assets - and those are always either static or animated at the object level. Of course the same could be achieved with world-space normal maps, but this would result in wrong visuals for instanced objects (think of two instanced bolts facing different directions); thus we would have to revert to unique meshes with unique UVs and correct world-space normal data in the texture. The cost of one extra channel does not seem too high (meaning we could use the B channel of a tangent-space normal map for storing something useful, which of course would not be possible with object-space or world-space maps), and the precision difference would be negligible as well.

    We are also not using any detail normals currently, so combining normals would not be an issue. Now, we understand that adding the object-space option could lead to refactoring in many unexpected places, but we keep our hopes up. The attached image is for illustration purposes only - an example of smoothness artifacts resulting from geometry; adding a tangent-space normal map here would not hide the issues completely. This asset also illustrates how the artist had to introduce numerous hard edges across bevels to preserve the smoothness. This adds a lot of sharp lines where things need to be smooth, and extra cost to geometry, multiplying the number of verts to be exported.
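To illustrate the difference being described: a tangent-space texel has to be rotated through the per-vertex TBN basis before lighting, while an object-space texel is already in the mesh's own frame. A small self-contained sketch of that math (not Babylon.js API, just plain functions with illustrative names):

```javascript
// Decode an 8-bit RGB texel into a [-1, 1] normal vector.
function decodeNormal(rgb) {
  return rgb.map(c => (c / 255) * 2 - 1);
}

// Rotate a tangent-space normal into object space with the TBN basis.
// An object-space normal map would skip this step entirely: its texels
// are already expressed in the mesh's own frame.
function tangentToObjectSpace(n, tangent, bitangent, normal) {
  return [
    n[0] * tangent[0] + n[1] * bitangent[0] + n[2] * normal[0],
    n[0] * tangent[1] + n[1] * bitangent[1] + n[2] * normal[1],
    n[0] * tangent[2] + n[1] * bitangent[2] + n[2] * normal[2],
  ];
}

// A flat "up" texel (128, 128, 255) leaves the surface normal unchanged:
const flat = decodeNormal([128, 128, 255]);
const objectNormal = tangentToObjectSpace(flat, [1, 0, 0], [0, 1, 0], [0, 0, 1]);
```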
  8. PBR in 3d max

    For example, if you look at this demo - https://www.babylonjs.com/demos/pbrglossy/ - the model just works with a simple load. But where did the model come from? Was it Max, Substance, or Blender? How were the materials set up for it?
  9. PBR in 3d max

    I think this is the core of the issue. My team has been working on trying to export PBRs from max and from Substance (via both GLTF format and .Babylon format) but we're not getting all the maps exported correctly, or some files aren't linked. We're going through a process of trial and error right now to figure out how to resolve it which is fine but I imagine a lot of our experimenting would be eliminated if there were some proper guides.
  10. Is there a way to use object-space normal maps instead of tangent-space normal maps? If not do you plan to support object-space normal maps, or world-space normal maps, in the future builds?
  11. Reflection Probes with PBRs and Max

    Perfect. Thank you for the awesome help.
  12. Reflection Probes with PBRs and Max

    I understand now, thank you. I set up a playground doing what you describe. The only thing is, I don't know how to omit the skybox from the main camera while keeping it in the reflection capture. https://www.babylonjs-playground.com/#28G6UT#66
  13. Reflection Probes with PBRs and Max

    Yes, that was a great explanation, thank you. The thing is, though, I am only using the environment texture of a skybox - I'm not actually rendering a skybox. So I don't really understand how the reflection capture could see it.
  14. Reflection Probes with PBRs and Max

    I don't quite understand. I have a scene with 10 meshes and all 10 have the same environment texture, a DDS that is effectively a skybox (the actual skybox is not visible). I want 3 of those 10 meshes to also reflect each other along with that original environment map. How do I combine the two?
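One approach that may work here (a sketch under assumptions, not a confirmed recipe): render into the probe both a skybox mesh carrying the environment DDS and the meshes that should inter-reflect, so the probe's cube texture contains both, then have the reflecting materials sample the probe instead of the raw DDS. The mesh and material names below are placeholders:

```javascript
// Sketch: folding the environment map and live reflections into one probe.
var probe = new BABYLON.ReflectionProbe("probe", 512, scene);
probe.renderList.push(skybox);  // skybox mesh textured with the environment DDS
probe.renderList.push(meshA);   // meshes that should reflect each other
probe.renderList.push(meshB);
// Reflecting materials sample the probe's cube texture instead of the DDS:
reflectingMaterial.reflectionTexture = probe.cubeTexture;
```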
  15. Reflection Probes with PBRs and Max

    I'm already using an environment texture on the material - is there a way to composite the reflection capture with the environment texture?