
About dsman

  1. @trevordev The clothing on the character in the PG you linked is simpler than what I mean. I am talking about clothing like the Fabric Motion demo on the babylonjs.com home page. Besides, I want to understand how to change clothes on an avatar and still have them dynamically animated like the Fabric Motion demo.
  2. What is the top-level logic for creating avatars where the user can change some elements (clothes, accessories, etc.)? Characters can be made easily with skeletons/bones and animations, but how do you layer clothes that move/stretch with the animations?
  3. How to integrate this with Babylon?
  4. @Deltakosh Ok. We will certainly contribute once we implement this. @Nabroski That is what we have considered: user-orientation-based adaptive streaming for 360. We will have to explore HLS and DASH to see whether we can customize playback to keep the streams in sync.
  5. @Deltakosh I checked the 360VideoDome earlier, but it says the video must be equirectangular. I think when you said "we definitely have it", you meant one can use an equirectangular video projection on a cube. What I am wondering is whether a cubemap-formatted video (6 faces arranged in a certain way) can be projected onto a cube. If not off-the-shelf, is UV mapping also supported for VideoTexture? Equirectangular mapping is not the best option for live 360 video because it may require a 10 Mbps download speed for reasonable quality (4K).
  6. I have some questions about VideoTexture and 360 video. 1. Is there any off-the-shelf way to use cubemap video for 360? If not, can we map a video containing 6 faces onto a cube mesh with UVs, as we do with a simple image texture? 2. Does VideoTexture support HLS/MPEG-DASH/Microsoft Smooth Streaming and their adaptive-bit-rate features? 3. Is it practical to stream the 6 faces separately, apply them as VideoTextures on the six faces of a cube, and keep them in sync with the video DOM element's controls (and the streaming protocol's manifests)?
  7. Ok, got it. onBeforeCameraRenderObservable should work, I think; I will check. And yes, not anaglyph but real stereo rendering in split screen (just like vrDeviceOrientationCamera), meant for a smartphone VR box.
  8. @Rodrix3 It doesn't matter how you bring the scene into Unity, via FBX or directly from Max. I am not sure whether lights are exported from Max to Unity, but it is advisable to set up lighting in Unity. You will have to set all lights to "Baked" for them to be considered in lightmap generation (except Area lights, which are "Baked Only" by default). @MackeyK24 As I have been saying all along, I am eagerly waiting for the next release of the Unity Toolkit. And I still request that you commit whatever you have to the Babylon exporter repository on GitHub.
  9. @Rodrix3 Yes, it does. You just have to select "Generate Lightmapping UVs" in the FBX properties (they appear in the "Inspector" tab after selecting the FBX file in the project explorer). AO is baked along with lighting and global illumination (GI takes effect only if you enable the Final Gather pass in lightmapping). @MackeyK24 has contributed an entire Unity exporter to Babylon. Just download it from the Babylon GitHub repo and place it in the Assets folder of your Unity project, and a Babylon menu will appear in the main menu bar. You can export a Babylon scene from there.
  10. @Rodrix3 You can take your scene into Unity and bake the lightmap there, then use the Unity Toolkit (Babylon) to export your model. This process is less painful because Unity's lightmapping is easy and automated.
  11. How do we set up the camera if we want to display a stereo cubemap? Many renderers can render stereo cubemaps, which means two separate images, one per eye. How do we display that with the deviceOrientation camera?
  12. After we select a Physical Material in 3ds Max for any mesh, it exports as "PBRMetallicRoughnessMaterial". How can we make sure it exports as PBRMaterial rather than PBRMetallicRoughnessMaterial?
  13. Also, I think the image above also has those sun rays ("god rays" in 3D graphics terminology, I believe). Do we do that in Babylon using volumetric light scattering?
  14. Thank you all for the inputs. I just saw the Three.js bloom post-process (which they say is inspired by UE): https://threejs.org/examples/webgl_postprocessing_unreal_bloom.html In that example, the bloom is applied to the material, not to the whole frame, which is more realistic. Most quality Unreal Engine visualizations use it heavily, I think. As far as I can tell, Babylon doesn't have it. Right?
  15. This is a general guidance question: how do you build detailed landscapes like the image below? Trees can be modeled within a low poly count, but what about grass and micro plants? I can see that bloom, fog, and lens flares are essential to create such a scene.