PatrickRyan last won the day on November 21 2018

PatrickRyan had the most liked content!

About PatrickRyan

  • Rank
    Advanced Member

Profile Information

  • Location
    Redmond, WA


  1. @JCPalmer a normal map isn't a new concept, nor is it something that doesn't apply to Babylon.js; we support normal maps. There isn't a lot of information on creating them on this doc page, mostly because the information can easily be found elsewhere, and every mesh baker will give you the option of baking them if you can supply a high-poly mesh to project from. The difference between a bump map, which your exporter supports, and a tangent-space normal map is the amount of detail the map contains. Bump maps are black-and-white images that convey, for each texel, an offset along the face normal in the 0-1 range. This means every texel of your texture has a normal parallel to the face normal, and light will bounce in the same direction across the face. A normal texture instead stores a Vector3 per texel (the R, G, and B values), so you can calculate a distinct vector at that texel rather than just an offset, giving a more realistic and detailed microsurface representation of your model. Because of the extra information you get from a tangent-space normal, you won't find many 3D artists opting for a bump map instead of a normal map anymore.
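To make that encoding concrete, here is a small plain-JavaScript sketch (the helper name is hypothetical, not a Babylon.js API) of how a tangent-space normal-map texel decodes from RGB into a unit Vector3:

```javascript
// Hypothetical helper: decodes one 8-bit RGB normal-map texel into a
// unit vector. Each channel is remapped from [0, 255] to [-1, 1], which
// is why the "flat" texel color is the familiar lavender (128, 128, 255):
// it decodes to approximately (0, 0, 1), a normal aligned with the face.
function decodeNormalTexel(r, g, b) {
  const x = (r / 255) * 2 - 1;
  const y = (g / 255) * 2 - 1;
  const z = (b / 255) * 2 - 1;
  const len = Math.sqrt(x * x + y * y + z * z);
  return { x: x / len, y: y / len, z: z / len };
}
```

In Babylon.js itself you never decode texels by hand; you assign the map to the material's bumpTexture property and the shader does this per fragment.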
  2. @Madclaws can you post a playground for us to look at? Even a partial one with only the particle system would help. Taking wild guesses at what is happening will just waste your time, so looking at the code will produce the fastest solution. Thanks!
  3. @Madclaws, the most likely issue here, without seeing a playground, is that you did not set a blendMode on your particle system. If a blendMode is not set, the default is BLENDMODE_ONEONE, which ignores the alpha of your texture for better performance since there is no overdraw. To get alpha you need to use: particleSystem.blendMode = BABYLON.ParticleSystem.BLENDMODE_STANDARD; You can learn more about the available blendMode options in the docs. To help you out a bit, I converted the particle demos from our release video into a playground so you can dig in and play with the code to see how we normally set up particle systems. Make sure you focus on the canvas (click on it) and press the space bar to iterate through the systems. I usually set up demos like this so I have control over repeating the system and can evaluate the interactions of the particles and tune the variables. Let me know if you have more questions.
  4. @Madclaws I took a look at your playground and added a couple of things for debugging. The first was to play the particle system on key press (space bar) so that I don't have to rerun the scene every time to see what is happening. The second was to comment out your manualEmitCount line, as that caps the system at the particles it spawns in one go. It essentially says "emit only X particles, all at once," which is why nothing happens when you call start again: you've already hit the particle limit on that system. Because of that limitation, I normally only use manualEmitCount when I am instantiating a system to be destroyed right after it's done. If you want a system that hangs around and that you can turn on and off, you want to couple a high emitRate with a low targetStopDuration. This will simulate a burst, but you can tune the emitRate to give more pleasing gradients of speed than you get with a one-shot burst. Now your particle system plays when you reach each point or when you hit space, but it looks like your code needs a little massaging, because the second collision continues to fire even after the cube is gone. To fix this I added a second isDestroyed boolean to prevent the particle system from running again, like you do with your first collision. Let me know if you have more questions! Take care.
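To make that tuning concrete, here is a quick plain-JavaScript sketch (hypothetical helper names, not Babylon.js properties) of the burst arithmetic: a system emitting at emitRate particles per second that stops after targetStopDuration seconds produces roughly their product, so you can also solve for the emitRate that yields a desired burst size:

```javascript
// Hypothetical helpers illustrating burst tuning, not Babylon.js API.
// A system with emitRate particles/second that stops after
// targetStopDuration seconds emits roughly emitRate * targetStopDuration.
function approxBurstSize(emitRate, targetStopDuration) {
  return emitRate * targetStopDuration;
}

// Solve for the emitRate that yields a desired burst size.
function emitRateFor(burstSize, targetStopDuration) {
  return burstSize / targetStopDuration;
}
```

For example, a 100-particle burst over 0.2 seconds needs an emitRate of 500; stretching the same burst over a longer duration is what gives you the gentler speed gradient.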
  5. @Ulfheden I dug into the exporters to the Babylon format for Blender and Maya and am finding some strange behavior that may point to bugs. I don't know if it's directly related to the normals being sharpened, because the normal data is present in the Babylon file. However, I noticed that while the materials export with the mesh coming out of Blender using the Babylon 5.6.4 exporter, the textures aren't being read correctly. I am pinging @kcoley to take a look, as he is the one working on our exporters. We've been focused on glTF export lately, so neither of us is sure whether all the new exporter features are pathing correctly to the Babylon format.
  6. @Ulfheden can you walk me through how you are creating the Babylon file of your mesh from Blender? I can't figure out why parenting the mesh would affect its normals, since you have normals written into the file, so there must be something else going on. What exporter are you using, and how are you setting up the export? Also, why aren't you getting a materialId in your Babylon file? Are you exporting the mesh with no material? If so, that makes it harder to debug with the sandbox inspector, because I don't have access to some of the normal display options when there isn't a material in the file. I'd like to figure out whether the issue lies in the file setup, the exporter, or the export process so that we can address any bugs or communicate broadly about the process.
  7. @Ulfheden I looked into your mesh, and the only thing I saw wrong was that your mesh had hard vertex normals, so when it exports you will see faceting in the mesh. The thing to do is select your mesh and soften your faces/edges. I also exported a glb (the binary version of a glTF) with the Khronos glTF exporter and used that to test; you can get the plugin and docs from Khronos. You can drag and drop the glb right into the sandbox to see how it renders with Babylon, and other apps such as 3D Viewer for Windows or Sketchfab can handle it as well. You can see that smoothing the normals renders correctly. I did make the material metallic just so you could see the normal map more easily without building a whole lighting rig. I also noticed that your rig breaks on export, so I would look into your skeleton creation and how the bone hierarchy is parented, because it looks like there is a bind pose rotation on the jaw that is causing it to rotate into the skull. Hope this helps!
  8. @Ulfheden would you be able to share the Blender file as well? I can't debug the mesh as easily with a Babylon file as I have to convert back to a DCC format to be able to see what may be going on. Thanks!
  9. @timetocode that model can be downloaded as an FBX with animations. You should be able to import it into Blender, but I have no idea what the original authoring program was.
  10. @JohnyCage at the moment, we don't have the feature implemented into our Max exporter that allows you to set clips from your timeline. We have this available in the Maya exporter and it is on the roadmap for Max, but we haven't gotten it in yet. What @Dad72 mentioned is the best way to do this until we are able to update the Max exporter with clip functionality.
  11. @Ulfheden, would you be able to share your mesh and textures with us so we can take a look at what is happening? I have used XNormal in the past, but it is not currently part of my pipeline; I usually bake normals in either Substance or ZBrush. We can successfully use normal maps in Babylon.js, as can be seen in this demo asset I created. Happy to take a look at the files and debug what is going on.
  12. I'm looking at your one-arm animation file, and the reason you don't see animation on that file is that there are two validation errors with the following information:

    {
      "uri": "DemonOneArm_Max.gltf",
      "mimeType": "model/gltf+json",
      "validatorVersion": "2.0.0-dev.2.5",
      "validatedAt": "2018-11-21T23:13:19.321Z",
      "issues": {
        "numErrors": 2, "numWarnings": 0, "numInfos": 3, "numHints": 0,
        "messages": [
          { "code": "MESH_PRIMITIVE_UNUSED_TEXCOORD", "message": "Material does not use texture coordinates sets with indices (0, 1).", "severity": 2, "pointer": "/meshes/0/primitives/0/material" },
          { "code": "MESH_PRIMITIVE_UNUSED_TEXCOORD", "message": "Material does not use texture coordinates sets with indices (0, 1).", "severity": 2, "pointer": "/meshes/1/primitives/0/material" },
          { "code": "NODE_EMPTY", "message": "Empty node encountered.", "severity": 2, "pointer": "/nodes/60" },
          { "code": "ACCESSOR_INDECOMPOSABLE_MATRIX", "message": "Matrix element at index 831 is not decomposable to TRS.", "severity": 0, "pointer": "/accessors/16" },
          { "code": "ACCESSOR_INDECOMPOSABLE_MATRIX", "message": "Matrix element at index 831 is not decomposable to TRS.", "severity": 0, "pointer": "/accessors/17" }
        ],
        "truncated": false
      },
      "info": {
        "version": "2.0",
        "generator": "babylon.js glTF exporter for 3ds max 2019 v1.3.9",
        "resources": [
          { "pointer": "/buffers/0", "mimeType": "application/gltf-buffer", "storage": "external", "uri": "DemonOneArm_Max.bin", "byteLength": 601992 }
        ],
        "hasAnimations": true, "hasMaterials": true, "hasMorphTargets": false, "hasSkins": true, "hasTextures": false, "hasDefaultScene": true, "primitivesCount": 2, "maxAttributesUsed": 7
      }
    }

If I export just the body of the character, the animation plays, but we still get one error in the validator:

    {
      "uri": "DemonOneArmBody_Max.gltf",
      "mimeType": "model/gltf+json",
      "validatorVersion": "2.0.0-dev.2.5",
      "validatedAt": "2018-11-21T23:36:54.474Z",
      "issues": {
        "numErrors": 1, "numWarnings": 0, "numInfos": 2, "numHints": 0,
        "messages": [
          { "code": "MESH_PRIMITIVE_UNUSED_TEXCOORD", "message": "Material does not use texture coordinates sets with indices (1).", "severity": 2, "pointer": "/meshes/0/primitives/0/material" },
          { "code": "NODE_EMPTY", "message": "Empty node encountered.", "severity": 2, "pointer": "/nodes/59" },
          { "code": "ACCESSOR_INDECOMPOSABLE_MATRIX", "message": "Matrix element at index 831 is not decomposable to TRS.", "severity": 0, "pointer": "/accessors/7" }
        ],
        "truncated": false
      },
      "info": {
        "version": "2.0",
        "generator": "babylon.js glTF exporter for 3ds max 2019 v1.3.9",
        "resources": [
          { "pointer": "/buffers/0", "mimeType": "application/gltf-buffer", "storage": "external", "uri": "DemonOneArmBody_Max.bin", "byteLength": 466064 },
          { "pointer": "/images/0", "mimeType": "image/png", "storage": "external", "uri": "UVmap2017255_baseColor.png", "image": { "width": 1024, "height": 1024, "format": "RGBA", "bits": 8 } }
        ],
        "hasAnimations": true, "hasMaterials": true, "hasMorphTargets": false, "hasSkins": true, "hasTextures": true, "hasDefaultScene": true, "primitivesCount": 1, "maxAttributesUsed": 6
      }
    }

Interestingly, the FBX I ran through Maya for export results in no errors and only one warning, and the animation plays correctly. The difference is that the Maya export does not have the NODE_EMPTY error, which I am sure is the problem here. Exporting the body only and the cloth only from the one-arm animation file results in a NODE_EMPTY warning in each case. Unfortunately, with the cloth-only export I can't see if the animation is working.

    {
      "uri": "DemonTestMaya2019.gltf",
      "mimeType": "model/gltf+json",
      "validatorVersion": "2.0.0-dev.2.5",
      "validatedAt": "2018-11-21T23:41:03.318Z",
      "issues": {
        "numErrors": 0, "numWarnings": 1, "numInfos": 2, "numHints": 0,
        "messages": [
          { "code": "UNSUPPORTED_EXTENSION", "message": "Unsupported extension encountered: 'KHR_lights'.", "severity": 1, "pointer": "/extensionsUsed/0" },
          { "code": "MESH_PRIMITIVE_UNUSED_TEXCOORD", "message": "Material does not use texture coordinates sets with indices (1).", "severity": 2, "pointer": "/meshes/0/primitives/0/material" },
          { "code": "MESH_PRIMITIVE_UNUSED_TEXCOORD", "message": "Material does not use texture coordinates sets with indices (1).", "severity": 2, "pointer": "/meshes/1/primitives/0/material" }
        ],
        "truncated": false
      },
      "info": {
        "version": "2.0",
        "generator": "babylon.js glTF exporter for maya 2018 v1.2.6",
        "extensionsUsed": [ "KHR_lights" ],
        "resources": [
          { "pointer": "/buffers/0", "mimeType": "application/gltf-buffer", "storage": "external", "uri": "DemonTestMaya2019.bin", "byteLength": 9452252 },
          { "pointer": "/images/0", "mimeType": "image/png", "storage": "external", "uri": "UVmap2017.png", "image": { "width": 1024, "height": 1024, "format": "RGBA", "bits": 8 } }
        ],
        "hasAnimations": true, "hasMaterials": true, "hasMorphTargets": false, "hasSkins": true, "hasTextures": true, "hasDefaultScene": true, "primitivesCount": 2, "maxAttributesUsed": 7
      }
    }

At this point I think we need to call in our glTF expert @bghgary, as he will be able to make more sense of the errors than I can.
  13. Our exporters should be kicking out warnings about non-normalized skin weights, so if that is truly what's going on, it is still likely a bug. Though I am unsure why the same FBX sent through Maya would succeed, because exporting with non-normalized weights results in an invalid glTF. I'm guessing that @kcoley will know what is going on.
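For reference, this is all that weight normalization means; here is a minimal plain-JavaScript sketch (hypothetical helper, not exporter code) that rescales one vertex's joint weights so they sum to 1, as a valid glTF skin requires:

```javascript
// Hypothetical helper, not part of any exporter: rescales one vertex's
// joint weights so they sum to 1, which is what a valid glTF skin expects.
function normalizeWeights(weights) {
  const sum = weights.reduce((total, w) => total + w, 0);
  if (sum === 0) return weights.slice(); // degenerate vertex: leave untouched
  return weights.map((w) => w / sum);
}
```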
  14. @jsunandmax it looks like there may be a bug in the Max exporter as the Maya exporter works correctly. I will ping @kcoley as he is working on the DCC exporters and give him my files to look at. Demon_MaxMayaComparison.mp4
  15. You are going along the right path so far, and I see just one issue from your last post. It sounds like you are skinning the white hitbox to the skeleton, which puts you in the same situation as before, where it will only hit at the box's bind position. This is because the vertices take their final position from the translation of the joints they are skinned to, interpolating a position based on an offset between them, weighted by the skin. As an example, say you have a vertex that is skinned to two joints with a 0.7 weight to one and a 0.3 weight to the other. All skin weights must be normalized (add up to 1), and you can have up to 4 joint influences per vertex in Babylon.js. When you move that sample skeleton, the vertex takes its final position as a linear interpolation between the two joints: not midway between them, but 20% closer to the 0.7-weighted joint, including the offsets from the bones. What you want for your hitboxes is not to calculate the vertices of the box as you do for skinning, but rather to take the translation, rotation, and scale from a joint and apply them to the transform of the hitbox. The vertices of the hitbox do not change at all and simply take their positions from the triangle list of the mesh. To do this in Blender, you are looking for a parent relationship. Setting a parent on an object confers no skinning to it; instead, the object's transform takes its translation, rotation, and scale from the parent node. In essence, it is a separate mesh with the properties you need for a ray cast, but it follows a joint. It's similar to skinning to the skeleton, but the difference is that the mesh won't deform and you can only take the properties of your single parent node. That means if you have a leg mesh parented to the leg joint and the knee bends, your leg mesh won't follow the bend. This could be useful for a simple Minecraft-type character, but again, you would need to carry one mesh per body part rather than the single mesh for the whole body that skinning allows. For attachments, however, parenting is the best way to go... that could be accessories, hitboxes, or even attaching a character to another character, like a mount.
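The two-joint example above can be sketched in plain JavaScript (hypothetical helper, not Babylon.js API): the vertex's final position is the weighted sum of the positions each joint's transform would carry it to.

```javascript
// Hypothetical sketch of linear-blend skinning for one vertex with two
// joint influences. With weights 0.7 and 0.3 the result is not midway
// between the two joint-driven positions but 20% closer to the
// 0.7-weighted one, matching the example in the post.
function skinVertex(posFromJointA, posFromJointB, weightA, weightB) {
  return {
    x: posFromJointA.x * weightA + posFromJointB.x * weightB,
    y: posFromJointA.y * weightA + posFromJointB.y * weightB,
    z: posFromJointA.z * weightA + posFromJointB.z * weightB,
  };
}
```

A parented hitbox, by contrast, skips this per-vertex math entirely: the whole mesh simply inherits the parent joint's transform.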