
Image gets darkened after resizing the object


flyingbee

Hi folks,

I am a beginner with Babylon.js and Blender. I imported a cube with one textured face from Blender into Babylon.js using BABYLON.SceneLoader.Load.
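(For context, the load call looks roughly like the sketch below; the folder and file names are placeholders rather than my real project structure.)

    // Minimal sketch of loading a Blender-exported .babylon scene (names are placeholders).
    var canvas = document.getElementById("renderCanvas");
    var engine = new BABYLON.Engine(canvas, true);

    BABYLON.SceneLoader.Load("scenes/", "home.babylon", engine, function (scene) {
        scene.executeWhenReady(function () {
            scene.activeCamera.attachControl(canvas);
            engine.runRenderLoop(function () {
                scene.render();
            });
        });
    });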

The scene has the default point light from Blender and a HemisphericLight I created in Babylon.js. Everything looks good (please see 2.jpg in the attachment), but after I resize the cube to make it flat (using cube.scaling.y = 0.1), the lower part of the textured face is always shadowed, no matter how I adjust the light or move the cube (please see 1.jpg).

I already made the scene as simple as possible to narrow down the cause of the issue, but I still cannot figure out why the lighting is not rendered properly. Any help would be much appreciated!

1.jpg

2.jpg


Hi Flyingbee... welcome to the forum.  Sorry to hear you are having problems, and thanks for reporting and testing.

I made a BabylonJS playground scene... that is similar to your issue.

https://www.babylonjs-playground.com/#13XGJV#2

It is something to experiment-with.  I was not able to reproduce the problem that you see.  FB or anyone, feel free to make edits and do more RUNs and SAVEs.  Tell us what you learn.  I will continue testing, too.

FB... can you publish/share your .babylon file and perhaps your .blend file, too? 

When I need to share/publish BabylonJS-related files, I use my free Github account... and then I can use those files in playgrounds... like this

You could do the same.  You can actually drag'n'drop files into GitHub folders (then scroll to bottom and press COMMIT button after the drag).  Quite handy, if you don't have a CORS-cleared web server of your own.

All in all, I think there is an issue with the Blender pointLight (perhaps a range setting), or possibly something odd with the lighting NORMALS on the Blender cube.  We can learn more... if you can provide .babylon and .blend files.  I'm going to ping @gryff and see if he has some ideas (thx Gryff).  Gryff is quite experienced with Blender and imports from it (and he's a darned nice guy, too)  :)

Also, we can examine .babylon files with Windows Notepad or many other online/offline JSON-file viewers.  Perhaps the Blender exporter is doing something unusual. We might be able to see/find it... by examining the .babylon file carefully.
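For example, since a .babylon file is plain JSON, a few lines of Node.js can summarize each mesh (a rough sketch, assuming the usual meshes/positions/normals/indices layout):

    // Quick-and-dirty .babylon inspector (Node.js); the file name is a placeholder.
    var fs = require("fs");
    var data = JSON.parse(fs.readFileSync("home.babylon", "utf8"));

    (data.meshes || []).forEach(function (m) {
        console.log(m.name,
            "verts:", m.positions ? m.positions.length / 3 : 0,
            "normals:", m.normals ? m.normals.length / 3 : 0,
            "tris:", m.indices ? m.indices.length / 3 : 0);
    });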

I assume you are using BabylonJS 3.0 preview?  And you are using newer versions of Blender and its exporter?

Anyway, we WILL find out WHY this is happening... if we do enough experimenting.  Comments from everyone... welcome.


Hi @Wingnut and @gryff,

Thank you very much for all your help! After getting frustrated with this issue, your detailed answers encouraged me a lot. Sorry it took me a while to manage to upload my project to GitHub. @Wingnut thanks for running the same experiment for me - the textured effect is exactly what I am trying to achieve. I tried to put my code in the BabylonJS playground, but it looks like I need to populate the createScene() function by building the scene from scratch, instead of importing a scene from Blender... sorry, I still have not figured out how to use BABYLON.SceneLoader.Load() in the BabylonJS playground to load an existing scene. If that is possible, can you please let me know how? ~

So I created a new GitHub account and uploaded my simple project there. The webpage with the flattened cube that refuses to be lit properly is served here:

https://babylonbee.github.io/  

In the runRenderLoop function, I increase the scaling.y value by 0.001 each frame to make the cube fatter and fatter, so we can easily see how the cube gets rendered as it resizes. When the cube reaches its full size (scaling.y == 1), the texture is rendered perfectly with the point-light effect. Also, the textured face gets closer to the light as it resizes, which makes me wonder if it is just a matter of position... but even when I moved the flattened cube closer, it still was not rendered properly, so that is what has frustrated me these days. All the source files are here: https://github.com/babylonbee/babylonbee.github.io.git  Thanks @gryff and @Wingnut if you can take a look!!! I might have more questions to bug you with, @gryff - I just like Blender, lol.
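(The growth loop itself is roughly like this sketch; the mesh name is a placeholder for whatever the loader gives back in my scene.)

    // Rough sketch of the grow-the-cube test (mesh name is a placeholder).
    var cube = scene.getMeshByName("Cube");
    cube.scaling.y = 0.1;                      // start flattened

    engine.runRenderLoop(function () {
        if (cube.scaling.y < 1) {
            cube.scaling.y += 0.001;           // fatten the cube a little each frame
        }
        scene.render();
    });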

@Wingnut sorry, I forgot to check... I am using BabylonJS 2.5 - I imported this script in my project: http://cdn.babylonjs.com/2-5/babylon.max.js, so it is version 2.5, I guess?

I downloaded the latest exporter from GitHub here: https://github.com/BabylonJS/Babylon.js/tree/master/Exporters/Blender

My Blender version is 2.78, which is also the latest version. I will do some more experiments to see if I can figure out the cause of this issue by any chance :-)

Have a great day~

 

 


@flyingbee : TY for posting your files :)

I notice from the home.babylon file that you are using version 4.6.1 of the Blender exporter, even though you downloaded the latest version :) Do you get the same results with the latest Blender exporter?

The one file I did not find was your .blend file. Can you post that so I can check some settings?

2 hours ago, flyingbee said:

i just like blender

Nothing wrong with that - it is a nice piece of software and it is free!!!

And @JCPalmer and @Deltakosh have developed a great exporter for it.

cheers, gryff :)


Thanks @gryff for looking into this for me!

Yep, I am using version 4.6.1 of the Blender exporter. It looks like this is the latest version, but I still get that result. Everything looks good in Blender, but the result is different in BabylonJS. I used the default camera and point light in Blender, added a monkey, resized the cube and textured it, and finally exported it to BabylonJS. I uploaded the .blend file here: https://github.com/babylonbee/babylonbee.github.io/blob/master/Espilit/home.blend (also in the attached files). Thanks @gryff for taking a look !! :D

 

 

home.blend


@flyingbee : TY for the blend file - I will take a look at it. The latest version of the Blender Babylon Exporter is v5.3. It comes as a zip file.

Download that zip file, uninstall the old exporter, then install the new version using the "Install From File" option in the Preferences panel. No need to unzip the file; just point Blender to the zip file you downloaded and Blender will handle it.

Activate the addon, then hit the "Save User Settings" button.

I will take a look at your file.

cheers, gryff :)


@flyingbee: I looked at your blend file and have this suggestion - see the image below:

1. Select the mesh object with the photo texture on it

2. In the Properties Panel, select the Mesh Properties - the triangle highlighted in blue in the image below.

3. Now under the Babylon Export Properties check the box marked "Use Flat Shading"

Now export as usual to check the result.

And just so you can see what is happening, look at Suzanne in your Blender file. Suzanne is flat shaded - you can see all the individual mesh faces. However, when you view her in your browser, the mesh has been smoothed. That is the default option for the exporter.

cheers, gryff :)

blender2.png


Hi @gryff, I tried your solution and it is working perfectly!! You are awesome!!!:D

So I installed Blender Exporter v5.3 and tried the flat shading option. Now the lighting and shadowing of the objects rendered in the webpage are almost identical to what is rendered in Blender. That's just amazing! I also checked the .babylon file: it seems that the indices, normals and positions of the object changed after we used the flat shading property, which would affect the lighting - is that correct? So based on my understanding, if we want to apply a texture onto a flat surface (e.g. a wall or a floor), we want to check this option; otherwise the surface is exported with smoothed normals, which causes the lighting issue once it reaches BabylonJS? Thank you @gryff! I appreciate your help and all of you! I will do some more experiments and keep learning about Blender modeling and BabylonJS.

Have a good night:)

 

 


Nice work, you guys!  Yep, BJS boxes are set flat-shaded by default, and that is why I could not reproduce the issue when using a standard BJS box.

An interesting test would be... UN-check the "Use Flat Shading" box in the Blender export, but then do...

cube.convertToFlatShadedMesh();    ...in BJS scene code.  I think that would work, too.

Forum hero @JohnK is a teacher extraordinaire, and he has written "The BabylonJS Guide" and stores it off-site (it is a multi-layer site, based upon reader skill levels - just fantastic).  One of his docs... "Facet Normals"... does a good job at describing the differences between flat shading, and other shading methods.

Look at the difference between his picture #2 and picture #4.  Notice how many MORE white lines in picture #4.... and how many MORE vertices are used... when a cube is converted to flat-shaded.  In John's example, the cube changes from 8 vertices and lighting-normals... to 24 vertices/normals.... when converted.  Why, you ask?

In all of BJS land, only ONE lighting normal is allowed for EACH vert of a model.  But for flat-shaded cube, we need THREE lighting normals (white lines) at each corner of the cube (one for each of the faces that intersect at that corner position).  So... somemesh.convertToFlatShadedMesh()... adds many (repeat-positioned) vertices to a mesh, which makes the mesh bigger (vertices-count-wise).  Thus, checking that "Use Flat Shading" checkbox will INDEED change the vertexData seen in the .babylon file.  It probably triples the amount of cube verts/normals, etc.
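A quick sketch to see that vertex-count jump for yourself (using a sphere here, because a default BJS box is already flat-shaded):

    // Watch the vertex count grow when a smooth mesh is converted to flat shading.
    var sphere = BABYLON.MeshBuilder.CreateSphere("s", { segments: 8, diameter: 1 }, scene);
    console.log("before:", sphere.getTotalVertices());   // shared vertices, one averaged normal each

    sphere.convertToFlatShadedMesh();                     // duplicates verts so each facet owns its normals
    console.log("after:", sphere.getTotalVertices());     // noticeably larger count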

There was nothing "wrong" with your earlier import.  Perhaps if you had used different lighting - a directionalLight aiming STRAIGHT-AT the cube image... from the side... then the dark area might disappear (without needing to set flat-shaded). 

So it is not really important to remember the flat-shading checkbox.... but to understand WHY the lighting acted that way... when the cube was Phong-rendered or Blinn-rendered, or whatever method is the standard.  Those rendering algorithms are SUPPOSED-TO do that lighting fall-off... that darkening.  That lighting fall-off was quite normal... for above-cube lighting upon a non-flat-shaded cube imported from Blender.

Wingnut should have discovered... "Oh, this is an 8-vertex box, and not a 24-vertex BJS default box"... but I was too stupid.  I get that way, sometimes.  :)

Anyway, I'll let you guys get back to business.  Thanks @gryff!  Congrats, FB!   Good to hear your happiness.  Your friendly words and enthusiasm... are enjoyed by everyone.  You seem to be very kind and appreciative... thanks for that.  Good to have you with us.


Hi @Wingnut,

Thanks for your detailed explanation! I read the Facet Normals doc, which reminded me of the computer graphics lessons I took in college and enlightened me a lot. So here is my understanding - please help to correct me if any part is wrong ^_^

So when we export the textured object from Blender with "Use Flat Shading" checked, the four normals of the textured facet are perpendicular to the facet, facing the point light. Ignoring specular and ambient lighting, the final color of each point on the facet is calculated as below:

point_color = max(n · l, 0) * diffuse_material(r, g, b) * diffuse_light(r, g, b)

where "n" is the normal of the vertex, and "l" is the normal of the light to the vertex. So when we use flat shading, all the points on the textured facet have the same normal after interpolation, which is perpendicular to the facet, and hence produces the maximum value from max(n, l, 0), so that's why we are seeing a much brighter surface using flat shading, since flat shading treats each facet individually while rendering the object. See the effect below (i disabled ambient and specular lighting):

flatshading.jpg

(As we can see, the lighting is rendered evenly across the textured facet, since every vertex has the same normal, perpendicular to the facet.)

However, if we don't use flat shading in Blender and export the object to BJS, BJS will render the object using the "Normals and Minimum Vertices" method mentioned in Facet Normals, which calculates the normal of a vertex by averaging the normals of all the facets that share it (not sure if that is Phong shading?). And when we flatten the cube (object.scaling.y gets smaller and smaller), the four normals of the textured facet are affected: the thinner the cube, the further those normals point away from the light, which makes the y component of each normal smaller and smaller, and consequently the interpolated normals of the points on the facet also get a smaller y component. The final result is that max(n · l, 0) produces far too small a value, and we end up seeing that darkened image! See the effect below (ambient and specular lighting disabled here as well):

phong_shading.png
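Here is a tiny numeric sketch of that max(n · l, 0) term - just an illustration, not the actual BJS shader code:

    // Illustration of the diffuse term discussed above (not the real shader).
    function diffuseFactor(normal, toLight) {
        return Math.max(BABYLON.Vector3.Dot(normal.normalize(), toLight.normalize()), 0);
    }

    var flatNormal     = new BABYLON.Vector3(0, 1, 0);   // flat-shaded facet normal
    var averagedNormal = new BABYLON.Vector3(1, 1, 1);   // smoothed corner normal shared by 3 faces
    var toLight        = new BABYLON.Vector3(0, 1, 0);   // light straight above the facet

    console.log(diffuseFactor(flatNormal, toLight));     // 1.0   -> the bright, flat-shaded picture
    console.log(diffuseFactor(averagedNormal, toLight)); // ~0.58 -> the dimmer, smoothed picture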

 

So what keeps me learning is whether we want to use "flat shading" or other shading methods when it comes to realistic rendering. This is so interesting, especially since we can directly see the result. I will dig into it more.  :)

Lastly, do you mind me asking one more thing... if we want to develop a 3D roaming system (roaming inside/outside of a building) using BJS, do we often leverage other tools like Blender, 3ds Max, etc. for the modeling and animation instead of creating a complicated scene purely in BJS?

Thanks @Wingnut! and all of you! 


Excellent analysis, FB!  I understood most of it, but hopefully, smarter people than I... will comment.

Once upon a time (recently), I had a theory (and still do)... that there was an issue with the brightness of scaled mesh.  It is a SEE-able phenomenon, and it is related-to facet data.  What I noticed... is that when some mesh were scaled, they got darker, and lost much of their specular "shine".  Here is a test scene...

https://www.babylonjs-playground.com/#11YE88#18

Based-upon some advice/code from @adam, I built a function called "yaymaneuver()".  I really never understood what it does, but I DID see scaled mesh get brighter and get improved specular shine... once I ran them through the yaymaneuver() machine.  :)

The above playground is a test.  SOME mesh are sent thru the yaymaneuver(), some not.  Play with this playground... see the differences, tell me your thoughts, if you please.  A quick and easy way to see some yaymaneuver() differences... is to switch between #18 and #19 playground.  #19 has a disabled yaymaneuver() func.  #18 has a yaymaneuver() active on SOME of the mesh.  Goof around, have fun.

And, of course, borrow the yaymaneuver() function to see what it does to your scene.  I would be REAL interested in what you find.
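For anyone reading along without opening the playgrounds: the whole yaymaneuver() boils down to the four lines quoted further down this thread, roughly wrapped like this (a sketch - the playground may differ in details):

    // Rough wrap-up of yaymaneuver(), based on the snippet quoted later in this thread.
    function yaymaneuver(mesh) {
        mesh.updateFacetData();                                            // refresh facet positions/normals
        var myVertexData = BABYLON.VertexData.ExtractFromMesh(mesh, true);
        myVertexData.transform(mesh.getWorldMatrix());                     // bake the world transform into the verts
        myVertexData.applyToMesh(mesh);                                    // write the transformed data back
        return mesh;
    }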

Your analysis is quite good, and I think... on-target.  I believe that when a mesh is scaled, the "magnitude" of the lighting normals... are scaled, too.  I don't know if that is a wise move, and it MIGHT be a bug in the framework.  About 3 months ago is when I first saw this, while I was trying to build a bobsled model, dynamically.  Some of the bobsled's parts were BRIGHT green and nicely specular shiny.  Other parts were darker green and had crappy shine. I talked about all of it in The Wingnut Chronicles, so, forum search for 'bobsled' should find much too much info about that mess:)

But, #18 playground above... is the "test scene" to prove to myself and anyone who cared... that some scaled mesh were losing their brightness and shine... and I never noticed such things... before BJS 3.0.

Nobody seemed to agree with me... that something was broken.  I am not a pro at BJS or JS.  In fact, I struggle to get to "half-assed" level.  :D  So, sometimes I just say things like "Does anyone else think this is broken?" and if nobody replies, I just shut my yap.  I have been wrong many many times, so I don't trust myself when I think something is broken in the framework.  I leave that to the big dogs.  @adam is one of those big dogs.  He's an excellent coder and forum helper.  I KNOW he saw the yaymaneuver() and how it repaired the colors on my bobsled.  He's the one that gave me the code.  :)

If HE didn't say "Hey, something is wrong with the framework" when he saw my bobsled colors/brightness get fixed, then there is likely nothing wrong with the framework.  The guy is really sharp, and has done many contributions of code... to the framework.

But you, FB... are just as scary.  You're not supposed to know about facet data THIS early in your BJS learning.  That's... like... year #2 stuff.  :)  I'm thinkin' YOU are pretty much a genius, too.  Cooooooool! 

Anyway, I thought I would tell you the story of the bobsled colors... and about my friend... the yaymaneuver().  It was named that... because... you know... I was pretty happy when I first saw the bobsled in ONE COLOR.  :)  I guess it all started here... Adam says it all "Calling updateFacetData updates the normals".  Geo-superhero Jerome liked it... so... that's like getting the Good Meshkeeping Seal of Approval. 

But what does "updates the normals" mean?  HOW does it update them?  Only targeting?  Or magnitude, as well?  Normals ARE direction-vectors, so I THINK their magnitude is important. (With directionVectors, 5,0,5 is the same direction as .2,0,.2, but 5,0,5 is higher magnitude.  You probably know this already.)  But I really don't know if the magnitude of normals... are used in the rendering. I don't think so.  I think most lighting normals are "normalized"... which means... um... all values are between -1 and +1, I think.  :o

When I used updateFacetData() on the bobsled... I used the handy showNormals func to show my bobsled normals, and they had been re-aimed.  Aimed much better than before... and it fixed my problem.  BUT... our handy showNormals() func is NOT magnitude-sensing.  It draws all the normals... the same length.  SO, showNormals() func does not "draw" lighting normals magnitudes.  It only shows aiming/direction of the normals.  (exciting story, eh?  snore)  :)

ONE WOULD THINK... that updating the normals is not necessary after a simple mesh scaling.  But my #18 test scene... says it IS necessary... perhaps.

*sigh*  I'm scared.  :D  I'm in over my head.  Still, I smell a framework bug, OR... I smell something that happened during the inventing of the facetData system.  Perhaps, some... special vitamins that @jerome and/or @adam are injecting into the framework, in some kind of mad scientist experiment.  :)  Not sure.  I never am.  Be well, talk soon.


On 2017-06-27 at 9:50 PM, flyingbee said:

Lastly, do you mind me asking one more thing...so if we want to develop a 3D roaming system (roaming inside / outside of a building) using BJS, do we often leverage other tools like Blender, 3ds Max and etc. for the modeling and animation instead of creating a complicated scene purely in BJS?

@flyingbee : An interesting question, FB :) @JohnK created a little demo for building a house purely with BJS.

@JohnK is a very good coder but I am not, so I would have created the house in Blender, and perhaps done a number of additional things like adding furniture :o and left my coding to the "roaming" aspect of things.

You can see an example of my approach here

I have recently watched some YouTube videos from Tim Schafer's Double Fine Productions - the Amnesia Fortnight contest that they run. Teams of 6-7 compete, with each team member contributing different skills. I suspect the answer to your question is both coding and tools. :)

cheers, gryff :)


Isn't that "roaming" thing called "path-finding"?  I think so.  http://doc.babylonjs.com/playground?code=pathfinding

That third one is kind of cool.

Sam Girardin's Crowd Thing is still online, on a friend's server, with Sam's permission (I like scene 11 most).  Zip available too. BJS 1.14 I believe.  Read all about it.

There was a playground I once saw... where rays/lines were being shot-around in a maze... attempting to find various paths between point A and point B... but I can't find it. 

Besides, I think it was a "seeking enemy" project.  A seeking-enemy pathfinder... you know... is for a smart enemy... who can FIND you, no matter where you hide.  (But it takes him search-time).

Wandering AI "little computer people"... that's a different thing, I suppose.  All in all, I think it's been tried both ways, FB.  I think SOME have used 3rd party pathfinders, and some have tried home-brew.  Perhaps seeking-enemy will work fine for wandering, too.  Just place an invisible seek-target in the corners of various rooms, and whenever wanderbot touches it, move it to another corner of another room.

But, all the pathfinders I have seen or heard about... are doing flat dungeons.  A 3-story house or multi-apartment skyscraper?  Phew.  Ow!  :)

I think the smart-robot algorithms exist, but I think they are rather "hot" code for REAL robots.  So... maybe not open-source code.  Similar code might be used on smart missiles.  $$$  Mapping.

Smarter people than I... are sure to comment soon.


On 6/28/2017 at 1:03 AM, Wingnut said:

"Does anyone else think this is broken?"

This could be considered an issue.  When a mesh is squashed, the normals need to be updated.  This function is costly though.  If a user is scaling their mesh non-uniformly every frame this could cause performance issues.  One possible solution is to just do this operation once before the mesh has been rendered.  That would probably solve most of these issues. 
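One way to picture the "do it once" idea (a sketch, using the bake call that the rest of this thread discusses):

    // Squash once, then fold the transform into the vertex data a single time
    // (instead of refreshing normals every frame).
    mesh.scaling.y = 0.1;                     // non-uniform squash
    mesh.bakeCurrentTransformIntoVertices();  // positions AND normals are updated once;
                                              // .scaling is reset to (1,1,1) afterwards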


*nod*.  Thx for the words, @adam (and thx for doing basic .showPhysicsImpostors, very cool.). 

Costly or not, a programmer would not expect a material to change colors... when a mesh is scaled, right?  (unless it changes its orientation with the lights).

A reminder to others, we are looking-at differences between  #18 PG (yaymaneuver() used on some mesh to improve brightness/shine) and #19 PG (yaymaneuver disabled).

I think the issue is easiest to see... when a sphere is squashed (down-scaled) to pancake-shaped.  :)

I suppose programmers could create their own mesh.setScaling([vec3]: value, [Bool]: update normals?)  That would give them choice.  *shrug*  All comments welcome.
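Purely as an illustration of that proposal (no such method exists in BJS today), it might look something like:

    // Hypothetical helper sketching the setScaling(value, updateNormals) idea above.
    function setScaling(mesh, value, updateNormals) {
        mesh.scaling.copyFrom(value);
        if (updateNormals) {
            // costly, so only on request: bake the new scale into positions/normals
            // (note this also resets mesh.scaling back to 1,1,1)
            mesh.bakeCurrentTransformIntoVertices();
        }
        return mesh;
    }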

 


ah haaaa!  Wingnut was being fooled (quite normal).

https://www.babylonjs-playground.com/#11YE88#20

The material "appears- to" change color.... because Mister Wingnut was not using a .groundColor on his Hemi-light.

Turn-on line 17, and it fixes itself.  I'll be darned.  Yep... the amount of photons that travel UP the sides of the mesh (from hemi.groundColor)... is summed with the photons that travel DOWN the sides (from hemi).  That creates the mesh side colors.  The "sides" of a flattened sphere... move-to the top and bottom.  
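In code, the fix is one extra line on the hemispheric light (colors here are just examples):

    // Give the HemisphericLight a groundColor so the "up-traveling" photons exist too.
    var hemi = new BABYLON.HemisphericLight("hemi", new BABYLON.Vector3(0, 1, 0), scene);
    hemi.diffuse = new BABYLON.Color3(1, 1, 1);
    hemi.groundColor = new BABYLON.Color3(0.5, 0.5, 0.5);   // light gray bounce from below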

So I am noticing the hemisphericLight... acting totally normal, while Wingnut acts totally uneducated.  :D  (no photons involved except those that leave our display screens)  :)

Thx guys.  I'll probably forget this in a year, and go through this hell again.  :)  I think I will have this info tattoo'd somewhere.  "Scaled mesh going dark?  Got .groundColor?"


Hi @Wingnut @gryff,

Sorry for my late response. I have been working hard on something else recently :( but I am keeping an eye on this thread. Thanks for all your replies. While I still know so little about BJS, I played with the yaymaneuver() in playgrounds #18 and #19, and it is interesting. I simplified the code a little to narrow down the cause of the lighting difference; now it looks like a pill in the playground I created: https://www.babylonjs-playground.com/#WB2I9E. It looks like, other than yaymaneuver(), the lighting is also affected by this function: bakeCurrentTransformIntoVertices. If we comment out that function, the shadow comes back, with or without yaymaneuver(). Here yaymaneuver() just applies the world matrix to the vertices of the object, and it doesn't even make a difference if I comment out myVertexData.transform(mesh.getWorldMatrix()), which makes me wonder what bakeCurrentTransformIntoVertices is doing. I think I am still too new to BJS to understand that, so I will dig more into it and share any findings. This is really interesting. I was recently frustrated with a lighting/normal issue on an object imported from Blender: the normals seemed inverted when the object was imported into BJS and it turned all transparently dark, but after I used bakeCurrentTransformIntoVertices, the surface got shiny just as it appeared in Blender. Thanks for your sharing @Wingnut :D I think I have to figure that out sooner or later.

Thanks @gryff for sharing your approach... the Christmas scene you made is so amazing! I like the snow and the smoke from the chimney. :) I am also trying to create a beautiful scene like that, so I am learning how to do modelling in Blender and some coding in BJS for the roaming and controlling. I am just kind of slow at it :D Sometimes when I get stuck, I stop and think about whether I should give up... but eventually I end up continuing and learning from you guys. Just slow, lol, but I will keep learning and share any of my findings and progress with you. Thanks, all of you! Talk to you soon!


Hi FB... sounds good, and thanks for the kind words.

Just a little reminder:  I eventually discovered that the "darkening" I experienced... when scaling spheres flat... was caused by not having a light.groundColor on my hemisphericLight.  What happened... was that dark(er) sides of the sphere... moved to the top and bottom of the sphere... when it was downscaled on Y axis.  Those dark(er) sides of ANY mesh... are often caused by having no .groundColor (usually set light gray) on hemispheric lights.  It is normal. 

When hemi-lit mesh are squashed-down flat, those dark side areas "roll-up-onto" the top and bottom of the mesh.  This makes it appear that they are getting darker-colored.

It fooled me.  :)  I really didn't need the yaymaneuver()... or certainly LESS yaymaneuver()... but still... the yaymaneuver IS interesting and there are things to learn, there... mostly about normals.  I like learning.  I wish I could remember things, though.  :)

Bake... really shouldn't change shine.  But those last lines... those are the interesting lines... to me.

    mesh.updateFacetData();                                              // refresh the per-facet positions/normals data
    var myVertexData = BABYLON.VertexData.ExtractFromMesh(mesh, true);   // copy out the mesh's vertex data
    myVertexData.transform(mesh.getWorldMatrix());                       // bake the current world transform into that copy
    myVertexData.applyToMesh(mesh);                                      // push the transformed data back onto the mesh


First, .extractFromMesh is COOOOOL.  Who invented that?  Nice.  "Gimme the guts of this mesh... RIGHT NOW"  :)  Love it.

Then, transform all those positions? PER the mesh's worldMatrix?  Woah.  Heavy.  :) Then put 'em back from where ya got 'em.

What kind of idiot would steal code from parts of BJS mergeMeshes()... and put it into a stupidly-name func like 'yaymaneuver'?

Oh yeah, that was idiot me.  ahem.  It's all part of  "Wingnut's School for Blindly Trying to Learn Matrix Transformations Without Needing to Read About Them"  heh.  It actually happened because I saw the parts of my bobsled... get better color... when all the meshes were merged together.  I went searching for WHY.

All in all, FB... don't follow me too closely, or look to me for divine guidance.  I'm often lost or bewildered.

But baking, I understand that.  It, essentially, zeroes-out the .position, .rotation, and .scaling of ANY mesh... no matter HOW the mesh is currently positioned, rotated, or scaled.  It sets all the meters back to 0,0,0... but doesn't move/change the mesh.  It "naturalizes" the current pos, scale, and rot... of the mesh.... making the current "state"... its natural state.  With me? 

Position, scale, and rotate a mesh in some odd ways.  Now bake it.   Its .scaling, .position, and .rotation... are all 0,0,0 after the bake (actually, scaling goes to 1,1,1, I think.  Default state.). 
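A quick sketch of that (the numbers are only for illustration):

    // "Naturalize" the current transform: the mesh stays put, the meters reset.
    mesh.position.x = 3;
    mesh.rotation.y = Math.PI / 4;
    mesh.scaling.y = 0.1;

    mesh.bakeCurrentTransformIntoVertices();

    console.log(mesh.position);   // back to (0, 0, 0)
    console.log(mesh.rotation);   // back to (0, 0, 0)
    console.log(mesh.scaling);    // back to (1, 1, 1) - the default state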

But... perhaps do a little rotating of that mesh after the bake... and check where the pivot point / origin is located.  Did IT get re-located/re-oriented when the bake happened?  Curious minds want to know.  :)  Baking techniques can be used in-place-of the wonderful setPivotMatrix() func. 

But setPivotMatrix is one line, and shifting a pivot point using a bake... takes 3 lines of code, I think.  SO, most folk use setPivotMatrix to slide-around pivot points.

I can't remember if mesh need to be updatable for bakes.  I don't think so.  Okay, that's enough rambling for this session.  Party on.

