About focomoso

  1. Recalculate mesh shading

    Thanks Pryme8, but the idea here is to find a solution that doesn't require baking the transform because it takes too long and destroys the local bounding box.
  2. Recalculate mesh shading

    Here's the issue laid out as best I can. In this playground there are 4 spheres. The underlying geometry for the one on the left is a regular sphere, so no scaling is applied and it looks as expected. The other three are flat and have been scaled up 4x in the y to become spherical. Sphere 2 uses VertexData.ComputeNormals to explicitly calculate the normals. I would expect this method to produce a sphere with exactly the same normals as sphere 1, but it doesn't: the normals are scaled 2x more in the y than they should be (which is why the top and bottom of the sphere look darker). With sphere 3 I calculate the normals explicitly, but again they seem to be over-scaled in the y, this time by the square of the scale factor. With sphere 4, I calculate the normals by hand, then apply the "descaling", and it looks as expected. The results are the same if you translate or rotate the spheres; it is only scaling that gives strange results. It's possible that there's something wrong with my normal calculations, but I've been banging my head against this for a while, and it seems that the way the normals are applied to the mesh is the culprit.
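    For context on why scaling behaves differently from translation and rotation: to stay perpendicular to a surface, a normal has to be transformed by the inverse-transpose of the model matrix, not by the matrix itself. For a pure scale that means dividing each component by the corresponding scale factor and renormalizing. Here is a minimal sketch in plain JavaScript (no BABYLON; the function name is illustrative, not part of any API):

```javascript
// Transform a normal [nx, ny, nz] under a pure scale [sx, sy, sz].
// The inverse-transpose of diag(sx, sy, sz) is diag(1/sx, 1/sy, 1/sz),
// so each component is DIVIDED by its scale, then the result is renormalized.
function transformNormalByScale(n, s) {
  const out = [n[0] / s[0], n[1] / s[1], n[2] / s[2]];
  const len = Math.hypot(out[0], out[1], out[2]);
  return out.map(c => c / len);
}

// A 45-degree surface normal on the flat sphere, after scaling 4x in y:
const n45 = [Math.SQRT1_2, Math.SQRT1_2, 0];
const corrected = transformNormalByScale(n45, [1, 4, 1]);
console.log(corrected); // the normal leans toward x; the y component shrinks
```

    Multiplying the normal by the scale (as if it were a position) tilts it the opposite way, toward y, which matches the "darker top and bottom" symptom described above.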
  3. Recalculate mesh shading

    Because if you don't scale the normals, they are scaled "elsewhere" and are incorrect. This is the bug I've been dealing with for days now.
  4. Recalculate mesh shading

    Yes - that's the whole point. Without this, the normals are not correct: even if you explicitly recalculate them, they are exaggerated in the direction of the scale. This solution is the only way I have come up with to get the normals to be correct, which is why I think this is actually an underlying bug. I'll make a playground to show the problem.
  5. Recalculate mesh shading

    I'll make the PR, then. It just seems strange that we have to "unscale" the normals before we move them to the mesh.
  6. Recalculate mesh shading

    By the way, here's a playground of it working:
  7. Recalculate mesh shading

    So, it took me way too long to figure this out, but there's something strange going on with the way mesh.setVerticesData(...NormalKind...) works. It exaggerates the normals along scaled axes, which is the root cause of the problem. The workaround is to calculate the normals by hand and then divide each component of the vector by the square of the scale in that direction (which is why it took me so long to figure out). This makes our normal-updating function look like:

```javascript
export function updateNormals(mesh, scene) {
    const worldMatrix = mesh.computeWorldMatrix(true);
    const scale = BABYLON.Vector3.Zero();
    worldMatrix.decompose(scale, new BABYLON.Quaternion(), new BABYLON.Vector3());

    const positions = mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind, false, true);
    const normals = [];
    let v1 = BABYLON.Vector3.Zero();
    let v2 = BABYLON.Vector3.Zero();
    let v3 = BABYLON.Vector3.Zero();
    let normal = BABYLON.Vector3.Zero();

    // 9 floats per triangle (3 verts x 3 components)
    for (let i = 0; i < positions.length / 9; i++) {
        v1 = BABYLON.Vector3.FromArray(positions, i * 9);
        v2 = BABYLON.Vector3.FromArray(positions, i * 9 + 6); // flipped
        v3 = BABYLON.Vector3.FromArray(positions, i * 9 + 3);
        normal = BABYLON.Vector3.Cross(v1.subtract(v2), v1.subtract(v3));

        // "descale" each component by the square of the scale in that direction
        normal.x /= scale.x ** 2;
        normal.y /= scale.y ** 2;
        normal.z /= scale.z ** 2;
        normal = normal.normalize();

        // each normal pushed 3 times, once for each vert
        for (let j = 0; j < 3; j++) {
            normals.push(normal.x, normal.y, normal.z);
        }
    }
    mesh.setVerticesData(BABYLON.VertexBuffer.NormalKind, normals, true);
}
```

    I'm happy to do a PR, but this probably isn't the best way to handle this. To me, this is a bug in the Babylon code. Somewhere, when it scales a mesh, it's multiplying the normals by the scale twice (or three times), which is what's making scaled meshes look so strange.
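    One possible reading of why the square works (this is an assumption about the pipeline, not confirmed Babylon internals): if the renderer multiplies whatever normal we store by the scale S, the same way it transforms positions, while the geometrically correct world normal is the local normal divided by S (the inverse-transpose), then storing n / S² makes the renderer's multiplication land exactly on the correct value. A minimal numeric check of that arithmetic in plain JavaScript:

```javascript
// Assumption: the pipeline multiplies the stored normal component by the
// scale S. The correct result divides the local normal by S. Pre-dividing
// by S^2 makes the two cancel: (n / S^2) * S === n / S.
const nLocal = 0.5;
const S = 4;
const stored = nLocal / (S * S);        // what the workaround writes to the buffer
const pipelineOutput = stored * S;      // what a scale-multiplying pipeline yields
const correctOutput = nLocal / S;       // inverse-transpose result
console.log(pipelineOutput === correctOutput); // true (exact: powers of two)
```

    This would also explain why translation and rotation are unaffected: for rotations the inverse-transpose equals the matrix itself, so multiplying and "correctly transforming" coincide.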
  8. Recalculate mesh shading

    I think I have a solution. You just need to transform the vertices by the mesh's world matrix like so:

```javascript
var worldMatrix = mesh.computeWorldMatrix();
var positions = mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind, false, true);
for (var i = 0; i < positions.length / 3; i++) {
    var idx = i * 3;
    var vertex = BABYLON.Vector3.TransformCoordinates(
        BABYLON.Vector3.FromArray(positions, idx), worldMatrix);
    positions[idx] = vertex.x;
    positions[idx + 1] = vertex.y;
    positions[idx + 2] = vertex.z;
}
var indices = mesh.getIndices();
var normals = [];
BABYLON.VertexData.ComputeNormals(positions, indices, normals);
mesh.setVerticesData(BABYLON.VertexBuffer.NormalKind, normals, true);
```

    It might be good to add an "updateNormals" to the Mesh class that does this so you don't have to bake the transform every time.
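    For reference, the core of a ComputeNormals-style pass can be sketched in a few lines of plain JavaScript: accumulate each face's cross product into its three vertices, then normalize. This is only an illustrative sketch of the general technique under indexed geometry, not Babylon's actual implementation:

```javascript
// Smooth per-vertex normals for an indexed triangle mesh.
// positions: flat [x, y, z, ...]; indices: 3 vertex indices per triangle.
function computeSmoothNormals(positions, indices) {
  const normals = new Array(positions.length).fill(0);
  for (let i = 0; i < indices.length; i += 3) {
    const a = indices[i] * 3, b = indices[i + 1] * 3, c = indices[i + 2] * 3;
    const e1 = [0, 1, 2].map(k => positions[b + k] - positions[a + k]);
    const e2 = [0, 1, 2].map(k => positions[c + k] - positions[a + k]);
    // Unnormalized cross product: area-weighted, so big faces count more.
    const n = [
      e1[1] * e2[2] - e1[2] * e2[1],
      e1[2] * e2[0] - e1[0] * e2[2],
      e1[0] * e2[1] - e1[1] * e2[0],
    ];
    for (const v of [a, b, c]) {
      for (let k = 0; k < 3; k++) normals[v + k] += n[k];
    }
  }
  // Normalize the accumulated sums per vertex.
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.hypot(normals[i], normals[i + 1], normals[i + 2]) || 1;
    normals[i] /= len; normals[i + 1] /= len; normals[i + 2] /= len;
  }
  return normals;
}
```

    Because the cross product is taken on whatever coordinates you pass in, feeding it world-space positions (as in the post above) bakes the scale into the result correctly, whereas feeding it local positions leaves the scale problem unsolved.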
  9. Recalculate mesh shading

    Is there a way to recalculate the normals without baking the transform in? The baking process can be very slow on complex meshes (2 to 3 seconds) and it remakes the bounding box which is not ideal.
  10. Recalculate mesh shading

    Thank you, thank you, thank you... I knew it had to be something relatively simple.
  11. Recalculate mesh shading

    And here's one that's even more "apples to apples". Both spheres here were created in Tinkercad. They are identical except that sphere-flat.stl was scaled down to 25% in the y. As you can see, when the flattened sphere is scaled back up again, its shading is very different from the non-scaled sphere. I wonder if the stl import is doing something to the normals? I'm not sure what exactly is going on here.
  12. Recalculate mesh shading

    I'm not sure that's what's going on. Take a look at this playground. A regular sphere vs an imported flat sphere that's then scaled up to look regular: The shading looks very different between the two despite them both having flat shading. ps - this one is a little more apples to apples:
  13. Recalculate mesh shading

    How do we get the normals to update, then? That would solve this. I've implemented the updateFacetData() mentioned in a comment to one of the threads above, but it doesn't seem to take scaling into account.
  14. I've searched the forums and found these threads: But neither is helping me with my issue. If you take a look at the playground here: I'm importing an STL. It looks great on import, but when scaled, the shading doesn't seem to update. Is there a way to tell a mesh to update its shading? Thanks
  15. Is there a complement to getAbsolutePivotPoint()? I need to set the pivot point of an object that has gone through complex translations to its lowest point in the world y axis. Just noticed the "global" option... Thanks