JCPalmer

Particle Hair

37 posts in this topic

I have just reverse engineered what happens to a Hair particle system when it is converted to a mesh.  No matter how many "segments" you say to use when it is a particle system, you get a block of 65 vertices (for 64 segments) for each strand.

In this picture of a system with 2 strands:

[Image: straightHairs.jpg — the 2-strand system]

The output looks like this (already converted to left-handed):

vert: 0 location: -.7644,1.2222,-.8473
vert: 1 location: -.7644,1.2549,-.8473
vert: 2 location: -.7644,1.2904,-.8473
vert: 3 location: -.7644,1.3285,-.8473
vert: 4 location: -.7644,1.3692,-.8473
vert: 5 location: -.7644,1.4122,-.8473
...
vert: 64 location: -.7644,4.7778,-.8473
=========================================
vert: 65 location: 1.2222,.1286,.33
vert: 66 location: 1.2549,.1286,.33
vert: 67 location: 1.2904,.1286,.33
vert: 68 location: 1.3285,.1286,.33
vert: 69 location: 1.3692,.1286,.33
vert: 70 location: 1.4122,.1286,.33
...
vert: 129 location: 4.7778,.1286,.33

Clearly, 65 data points per strand for 25,000 hairs is not going to be practical for a line system.  In this example, it only really needs 2 points per strand.  Any ideas on a way to reduce the data points between the first and last, based on some kind of straightness tolerance between sets of 3 points?

When the strands are not straight, pulling out data points is going to make them look jagged, but with enough hairs, might that not be obscured?  Especially since you are probably not going to get this close.
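For what it's worth, a straightness-tolerance pass like the one asked about could be sketched in plain Python along these lines (`decimate_strand` and its parameters are made-up names, and Blender's real Limited Dissolve is surely smarter):

```python
import math

def decimate_strand(points, angle_limit_deg=5.0):
    """Drop interior points of one strand where it is nearly straight.

    points: ordered list of (x, y, z) tuples for a single strand.
    A middle point is kept only when the bend between the segment
    entering it and the segment leaving it exceeds angle_limit_deg.
    """
    if len(points) <= 2:
        return list(points)

    def direction(a, b):
        d = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
        n = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
        if n == 0:
            return (0.0, 0.0, 0.0)  # duplicate point; treated as a bend
        return (d[0] / n, d[1] / n, d[2] / n)

    limit = math.cos(math.radians(angle_limit_deg))
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        d_in = direction(kept[-1], points[i])
        d_out = direction(points[i], points[i + 1])
        cos_angle = sum(a * b for a, b in zip(d_in, d_out))
        if cos_angle < limit:  # bend sharper than the tolerance: keep it
            kept.append(points[i])
    kept.append(points[-1])
    return kept
```

On the straight strands above, everything between the first and last point would go; a real curl keeps its corners.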

[Image: curls.jpg — curled strands]

Like most of my topics, I'll probably be talking to myself once again.


Think the Blender operation Limited Dissolve on the converted mesh might do a nice job.  Am starting to work on a LinesMesh sub-class called Hair.  It does not strictly need to be a sub-class, but it is cleaner.

In order to build a LinesMesh with multiple lines, you need to know how many vertices are in each line to build the vertexData.indices.  I only know where each strand begins when it is first converted (every 65 points).  Limited dissolve might do all the "mathy stuff" to remove many points and still look good, but then I will not know how many points are in each strand.
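As a sketch of what that index building amounts to (hypothetical helper, not the actual exporter code): given the vertex count of each strand, every consecutive vertex pair within a strand becomes one line segment, and no segment is emitted across a strand boundary.

```python
def build_line_indices(verts_per_strand):
    """Build a flat index list for a multi-line mesh.

    verts_per_strand: e.g. [65, 65] for two freshly converted strands.
    Each pair in the result is one line segment; the running base
    offset skips the jump between strands.
    """
    indices = []
    base = 0
    for count in verts_per_strand:
        for i in range(count - 1):
            indices.extend([base + i, base + i + 1])
        base += count
    return indices
```

So two strands of 3 and 2 vertices yield segments (0,1), (1,2), (3,4), with nothing connecting vertex 2 to vertex 3.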

Fortunately, I was already planning to start my export process while it is still a particle system, and do the convert-to-mesh operation right in the script.  The original reason I was going to do that is so that you can still edit / comb it after export.  All that is required is to push an undo state onto the stack before the conversion, then do an undo after all the data is sucked out.

bpy.ops.ed.undo_push()

# find the particle system modifier name & temporarily convert it
for mod in [m for m in mesh.modifiers if m.type == 'PARTICLE_SYSTEM']:
    bpy.ops.object.modifier_convert(modifier=mod.name)
    break

bmesh.ops.dissolve_limit(args)

bpy.ops.ed.undo()

If I get all the beginning points of each strand BEFORE the dissolve, I can hopefully still find them afterward (and thus know the number of points in each line).


@JCPalmer; I read most of your posts - but I know that if you are asking a question, there is a very good chance it will be beyond my abilities :o

Not sure what you are planning here, but I have never liked hair creation with the particle system - all that "brushing" seems rather fussy.

cheers, gryff :)


Well, I'm through.  There is a lot of overhead to get through when adding an entire new class of geometry across systems, especially when using source code generation.  In this 20 strand example, the 1,300 vertices (20 * 65) reduce down to 418 after a built-in limited dissolve with a 5 degree angle limit (hard coded).  That is a 68% reduction, and I am very pleased with the result.  Of course, 65 verts per strand is a pretty low bar, so it is easy to improve when you suck.

This might actually work.  I am 3 days into a 5 day test.  I have already coded a JS routine to generate single-bone skeleton matrix weights.  Have not tested that, and am probably going to spend the rest of the time seeing what it really can do at scale, and how good it looks.

@gryff, I have basically closed code on TOB 5.3, for QI 1.1, and am doing an experiment before moving on.  The helmet hair specimens for MakeHuman are limited, and most are not really that good.  I saw a tutorial on particle hair I thought was pretty good.  It spent a lot of time on what not to do, perhaps your fussing.

I do not know how far I am going to get.  For sure, it might only make it into TOB.

 


I dunno, guys.  @JohnK's fur is pretty nice.  Yeah, I'm sure it is a post-processing effect, so it has limitations, and is a bit off-topic, here. 

Still, I think it needs to be "considered" when studying feasibility, plausibility, practicality and maybe some other 'alities' and 'ilities', too.  :)

As soon as you start down-scaling the number of verts in the strand, it loses its ability to "flow in the wind" and "swish with nice bendings" during fast head-turns.

Sucks, eh?  *nod*.  Depending upon the length of the strands, I think 64 path-points is actually not near enough.  All your girl NPC's are going to expect 1024 path-points per strand... or else they won't be able to "do their hair" in the latest fashions.  :)  Women be some hair-bendin' fools, they be.  :)

6 hours ago, JCPalmer said:

It spent a lot of time on what not to do, perhaps your fussing.

@JCPalmer ; well, Jeff, if you go to the third tutorial in that series - where he is "brushing" the hair - he admits several times that it can be "frustrating", as the hair shoots through the body. :o

Short haircuts may not be too bad - with long hair you need a diploma in hair styling :lol:

Still not sure where you are going - 20 strands?  What happens with 1000 strands, even with Limited Dissolve?

cheers, gryff :)


Of the 20, I was only really doing 2 strands, with 10 children each.  Need to get a working process before trying a scaled test.

Speaking of which, the scene now has 1538 strands.  I do not know how it got that number.  Limited dissolve did very little, so it ended up with 98,768 vertices.  That stray coming out the front is not in Blender.  This is what stuff looks like when you are bootstrapping.  I'll shortly know more.  I am only going to have one page, so it can change at any time.

I actually do not think long hair (beyond shoulder length) is any good for games, due to head turns with hair cutting through the body.  MakeHuman only has 3 stock male heads of hair (black, brown, & Afro).  What about bald, or an old guy with just a little combed over, or facial hair?  To get more believable characters you need more than 3.

Wingy, I looked at fur, but not for very long.  I am doing people, not dogs.  When I ran your page on an A8 iPad Air 2, my fps dropped to the high 20's.  I have never seen a single mesh do that before.  Do not know or care what a girl NPC is.

30 minutes ago, JCPalmer said:

I actually do not think long hair (beyond shoulder length) is any good for games, due to head turns with hair cutting through the body.  MakeHuman only has 3 stock male heads of hair (black, brown, & Afro).  What about bald, or an old guy with just a little combed over, or facial hair?  To get more believable characters you need more than 3.

Ohh I agree that long hair presents issues with head turning, Jeff.

As for MakeHuman and hair, there are some contributed community assets, including a beard and moustache.  And as for bald - see image below of my old friend Sholto (created a few years back for Second Life).  The hard one would be the "combover" case.

I will follow this thread with interest :)

cheers, gryff :)

 

[Image: sholto1.png — Sholto, a bald character]


The saga continues.  In this episode, I found that the 65-vertex-per-strand conversion rate only holds when the emitter mesh is a cube.  Have to find a way that always works.  Seems like when I started shape keys.  Now, after 5 generations of dev, they are really solid.

Do not think it is going to take that amount of effort here.  I have already found, by interrogating the mesh's edges array, that the edges are in order of the lines.  Each edge has 2 indices into the vertices array (which are also in order, though not always 65 per strand).  Since both are in order, there is a jump where the 2nd vert index of the previous edge is not equal to the starting vert index of the next edge.  Do not think I even need the vertices to be in order.
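That jump test could be sketched like this (made-up helper name; in Blender the pairs would presumably come from each edge's pair of vertex indices):

```python
def split_strands(edges):
    """Split an ordered edge list into per-strand runs of edges.

    edges: list of (v0, v1) vertex-index pairs, in strand order.
    A new strand starts wherever the first index of an edge does not
    continue from the second index of the previous edge.
    The number of points in a strand is len(run) + 1.
    """
    strands = []
    current = []
    prev_end = None
    for v0, v1 in edges:
        if prev_end is not None and v0 != prev_end:
            strands.append(current)  # jump found: close this strand
            current = []
        current.append((v0, v1))
        prev_end = v1
    if current:
        strands.append(current)
    return strands
```

Notice this works even after a limited dissolve has removed an unknown number of points per strand, since only the ordering matters.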

[Image: edges.jpg — the edges array, showing the jump between strands]

This is about a 3 hour rework, but I like it.  It means the clunky method of finding all the starts before the dissolve does not need to be done.  It can open up multiple work flow possibilities, if this gets that far.  My test for success will be that the stray strand in the face is gone.

Assuming export gets straightened out to always work, the next hurdle I think is going to be the fragment shader.  When you get enough strands, things are just solid color.  I know you need a face / 3 points to get a normal, but could not a direction / 2 points be a way to somehow differentiate the color slightly?  @Deltakosh?
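To make that question concrete, here is one guess at deriving a per-vertex shade from a 2-point direction instead of a face normal (pure sketch with made-up names; the light direction and the lower bound are assumptions, not anything BJS does):

```python
import math

def shade_from_direction(p0, p1, light_dir=(0.0, 1.0, 0.0), floor=0.25):
    """Fake a diffuse term for a line segment from its direction alone.

    Treats the component of the segment direction perpendicular to the
    light as the "normal-ness": a strand running across the light
    catches more of it than one running along it.  floor is a lower
    bound so no strand goes fully black.
    """
    d = (p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2])
    n = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2) or 1.0
    d = (d[0] / n, d[1] / n, d[2] / n)
    along = abs(sum(a * b for a, b in zip(d, light_dir)))
    return max(floor, math.sqrt(max(0.0, 1.0 - along * along)))
```

Under this guess, a strand hanging straight down the light direction only gets the floor value, while one running sideways gets full brightness.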


Ok, re-coded.  Since I am only using one scene, here is the full scale scene using the original method.  The stray is highlighted in white:

[Image: strayhair.jpg — the full scale scene, stray strand highlighted in white]

Again, this is the url for the scene.

The color shader used by the LinesMesh class really looks like the next problem to solve.  FYI, though Blender has a really fancy method inside the Cycles renderer for hair, here is what the same scene looks like in Blender using just a diffuse color.  Still problems, but better than in BJS.

[Image: fromBlender.jpg — the same scene rendered in Blender with a diffuse color]

A normal is a direction.  The normal of a line might be the inverse of its direction, maybe?  Seems like there should also be a "lower bound", to avoid the black on top.  Maybe making a LinesMesh subclass instead of a static method was more than just a convenience.


Thinking about it, I can computer-generate vertex colors (actually, I already have).  For now, I was not changing the value of the color for the individual vertices of a strand, but adding or subtracting -.1 to +.1 on each channel.  This way the lines would not all be the same color.  It would not increase the export size, since it is done on load.  If this test works, I could play with it to dial in the right amounts for hair.
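A load-time jitter along those lines might look like this (sketch only; `strand_colors` and its parameters are made up, and the real code runs in JS at load):

```python
import random

def strand_colors(base_rgb, verts_per_strand, spread=0.1, seed=None):
    """Return one RGB tuple per vertex: base color plus per-strand jitter.

    Every vertex of a strand gets the same offset, drawn uniformly from
    [-spread, +spread] per channel and clamped to [0, 1], so strands
    differ from each other but stay uniform along their length.
    """
    rng = random.Random(seed)
    colors = []
    for count in verts_per_strand:
        offset = [rng.uniform(-spread, spread) for _ in range(3)]
        tinted = tuple(min(1.0, max(0.0, c + o))
                       for c, o in zip(base_rgb, offset))
        colors.extend([tinted] * count)
    return colors
```

Since the offsets are generated at load, nothing extra needs to travel in the export file, as noted above.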

LinesMesh does not load that attribute, though (only positions).  Wondering if I might try to modify LinesMesh directly for a PR?  Might this fail for some reason I do not know about yet?  I remember @jerome had even mentioned he wanted to do this.


Yep, it's on my (long) todo list, but I'm having a BJS break for a while currently.


I take the silence as no objection.  I guess LinesMesh would need to:

  • Add an arg to the LinesMesh constructor, say useVertexColor?: boolean
  • When true, switch the shaderPath arg to the ShaderMaterial constructor from "color" to something not yet known (may need to write a fragment shader that uses a varying vColor)
  • Also remove the color uniform when using vertex color
  • Maybe need to add a _colorBuffer property, and handle it in _bind()

In ShaderMaterial, it looks like it checks the mesh for a skeleton and conditionally adds it now, so that is probably set.  Need to also:

  • Add a test for vertex colors in mesh, similar to what is done in MaterialHelper

Am I close?  I am at the end of my 5 day limit, but have to stop now to take my little killer to his annual hair cut (almost on-topic).  Maybe I'll add a little on to this project tomorrow to make up.  His pic from this morning's walk:

[Image: 20170426_093123.jpg]

 


"Little killer" knows how to grow good hair. :)  Thanks for the briefings and demos/pics, Jeff.  Great reading for us.  It's fun watching your mind work.


I wish there was a dedicated chip for hair, also for pixel perfect collision.  Hair and collision are always the most problematic and processor intensive areas.

14 hours ago, ozRocker said:

I wish there was a dedicated chip for hair, also for pixel perfect collision.  Hair and collision are always the most problematic and processor intensive areas.

Oh, wow.  I had heard that nVidia had a new GPU, code-named Grecian, in dev.  Now it all makes sense!

17 minutes ago, JCPalmer said:

Oh, wow.  I had heard that nVidia had a new GPU, code-named Grecian, in dev.  Now it all makes sense!

Sorry for my comment.  I guess I was just thinking out loud.


Nothing to be sorry about.  I was making a joke.  My style is more mock-serious, so I did not put some little face next to it.  Does not really translate on paper, I guess.


Based on your todo list, I think it is good.  I can offer my help to do the required change in ShaderMaterial to support vertex color if you want.


I compared the binding of Mesh with LinesMesh's override.  Since it will now potentially need positions, vertex color, & the 4 matrix weight indices, I switched to using geometry's _bind().  Now I also get vertex object arrays, so I actually deleted the positions buffer in LinesMesh.  This is now _bind():

public _bind(subMesh: SubMesh, effect: Effect, fillMode: number): LinesMesh {
    // VBOs
    this._geometry._bind(this._colorShader.getEffect());

    // Color
    this._colorShader.setColor4("color", this.color.toColor4(this.alpha));
    return this;
}

The constructor for LinesMesh is now:

constructor(name: string, scene: Scene, parent: Node = null, source?: LinesMesh, doNotCloneChildren?: boolean, public useVertexColor? : boolean) {
    super(name, scene, parent, source, doNotCloneChildren);

    if (source) {
        this.color = source.color.clone();
        this.alpha = source.alpha;
        this.useVertexColor = source.useVertexColor;
    }

    this._intersectionThreshold = 0.1;
    
    var options = {
        attributes: [VertexBuffer.PositionKind],
        uniforms: ["world", "viewProjection"],
        needAlphaBlending: false,
        defines: []
    };
    
    if (useVertexColor) {
        options.defines = ["VERTEXCOLOR"];
    } else {
        options.uniforms.push("color");
        options.needAlphaBlending = true;
    }

    this._colorShader = new ShaderMaterial("colorShader", scene, "color", options);
}

I did not actually make any changes to ShaderMaterial for vertex color.  I put the defines in the constructor above.  Might be better to put it in ShaderMaterial though, so any shader material could use it.

The color shaders were so small that making completely separate ones seemed overkill.  I added the vertex color attribute, bones declarations, and split up viewProjection & world, so the bonesVertex code would work.

// Attributes
attribute vec3 position;

#ifdef VERTEXCOLOR
attribute vec4 color;
#endif

#include<bonesDeclaration>

// Uniforms
uniform mat4 viewProjection;
uniform mat4 world;

// Output
#ifdef VERTEXCOLOR
varying vec4 vColor;
#endif

void main(void) {
    mat4 finalWorld = world;
#include<bonesVertex>
    gl_Position = viewProjection * finalWorld * vec4(position, 1.0);

#ifdef VERTEXCOLOR
    // Vertex color
    vColor = color;
#endif
}

The fragment shader now either uses a varying color or a uniform color.


#ifdef VERTEXCOLOR
varying vec4 vColor;
#else
uniform vec4 color;
#endif

void main(void) {
#ifdef VERTEXCOLOR
    gl_FragColor = vColor;
#else
    gl_FragColor = color;
#endif
}

When I run my scene, regardless of whether I say yes to the new useVertexColor constructor arg, it fails silently.  Eventually, ArcRotate's _getViewMatrix() fails, but I am kind of discounting that.  The scene runs fine with 2.5.  Must have missed something.


Thanks, still the same error in ArcRotateCamera._getViewMatrix().  I even put a syntax error into the vertex shader; nothing.  Do not think I am getting that far.

Starting to pay attention to ArcRotate.  It is trying to call getViewMatrix in the constructor.  It has this fairly new _targetHost thing that gets checked for in the call to _getTargetPosition().  It is not there, so target as passed in the constructor is returned.  My target, the red cube, has no "addToRef" method.  I am ditching ArcRotateCamera.

