  1. I have a car moving based on some custom inputs. I want to put the WebVRCamera in the driver's position so users can look around with their headset. I have tried several ways to attach it to my car, with no success: `helper.webVRCamera.parent = content.headContainer` seems to have no effect, and setting `helper.webVRCamera.position = content.boardFloor.position` plus `helper.webVRCamera.rotation = content.boardFloor.rotation` breaks as soon as the car rotates.
  2. @Deltakosh It is worth noting that this trick with `_autoLaunch` will not work from TypeScript, since the property is private. That is part of why I created my own VideoTexture, which does not take control of playback and relies on events instead. Another reason is that I needed to prevent `updateVideoTexture` calls for textures that are not visible. It requires more wiring to detect visible textures through visible meshes, so it is not currently in shape to replace the current VideoTexture, but would you be interested in discussing ways to integrate this into the engine?
  3. @JCPalmer Thanks for the detailed insights. I will pass them on to my colleagues who are creating the models.
  4. Thanks! I will give it a go in the hope that the binary format reduces transport and decoding time; my `.babylon` file currently weighs 13 MB.
  5. I am using the Blender exporter to the `.babylon` file format, but lately I am noticing more mentions of `.gltf` (and `.glb`) among the formats you support. What is your recommendation for my use case explained below, or a more general rule of thumb on when to use which format? My context: we make scenes in Blender with more than 100 smaller pieces with textures and expose them with BJS. We allow users to change textures and materials. We allow users to move objects. We do not have any animations.
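
Regarding the VR camera question in item 1: copying `position` and `rotation` separately fails because the seat offset never gets rotated by the car's orientation, which is exactly the composition that parenting would do for you. A minimal, yaw-only sketch of that missing step (names like `car` and `seatOffset` are illustrative, not Babylon.js API):

```javascript
// Where the driver's seat ends up in world space once the car's yaw is
// applied to the seat offset -- the step a plain position copy skips.
function seatWorldPosition(car, seatOffset) {
  const c = Math.cos(car.yaw), s = Math.sin(car.yaw);
  return {
    x: car.x + c * seatOffset.x - s * seatOffset.z,
    z: car.z + s * seatOffset.x + c * seatOffset.z,
  };
}

// A car at the origin turned 90 degrees: the seat lands on z, not x.
const seat = seatWorldPosition({ x: 0, z: 0, yaw: Math.PI / 2 }, { x: 1, z: 0 });
console.log(seat.z.toFixed(2)); // "1.00"
```

In Babylon.js this composition is what parenting provides, so if parenting the WebVR camera directly has no effect in your version, one avenue to try (an assumption to verify, not a confirmed fix) is an intermediate `TransformNode` parented to the car at the seat offset, with the camera parented to that node.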
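
The event-driven, visibility-gated VideoTexture described in item 2 can be sketched with plain objects. Everything here (`LazyVideoTexture`, `FakeVideo`, the `uploads` counter standing in for the GPU upload) is illustrative, not the poster's actual implementation:

```javascript
// The video drives updates via its own events; the texture never controls
// playback and uploads a frame only when it is both visible and dirty.
class LazyVideoTexture {
  constructor(video) {
    this.visible = false;   // set by whatever detects visible meshes
    this.dirty = false;     // a new frame is available but not uploaded
    this.uploads = 0;       // stands in for the actual GPU upload
    for (const ev of ["timeupdate", "seeked", "loadeddata"]) {
      video.addEventListener(ev, () => { this.dirty = true; });
    }
  }
  // Called once per render loop iteration.
  update() {
    if (this.visible && this.dirty) {
      this.uploads++;       // real code would copy the current video frame
      this.dirty = false;
    }
  }
}

// Tiny stand-in for an HTMLVideoElement, just enough to drive the sketch.
class FakeVideo {
  constructor() { this.handlers = {}; }
  addEventListener(ev, fn) {
    (this.handlers[ev] = this.handlers[ev] || []).push(fn);
  }
  emit(ev) { (this.handlers[ev] || []).forEach((fn) => fn()); }
}

const video = new FakeVideo();
const tex = new LazyVideoTexture(video);

video.emit("timeupdate"); // a frame arrives while the texture is hidden
tex.update();             // not visible: no upload
tex.visible = true;
tex.update();             // visible and dirty: one upload
tex.update();             // visible but clean: still one upload
console.log(tex.uploads); // 1
```

The design choice this illustrates is that hidden textures cost nothing per frame; the open wiring question from the post is how visibility gets set reliably from visible meshes.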