The Blue Lady - an experiment with animation


gryff

I wanted to test babylon.js, Blender export and animations. This is the result:

 

The Blue Lady

 

The animation is one of the Carnegie Mellon University bvh files - 61_01.bvh - that I reduced from 120fps to 30fps. The animation was imported into Blender and then modified; the final result is 928 frames of animation to fit the music file length of ~31 secs.

 

I created all the mesh geometry except for the hair which is from :

 

'Koz's Short bob' - free to download and use for personal and commercial use.

 

I reduced the mesh manually and removed vertices covered by the beret.

 

Royalty-free music by Ken Mcleod.

 

Couple of points I don't understand:

 

1. Blender tells me that the meshes have 5,791 verts making up 11,074 tris. The babylon sandbox tells me that the total is 33,222 verts. I'm curious about the difference.

 

2. I tried to add a second camera in Blender, which did export, but the export always seemed to set this camera as the active camera. Even changing the babylon file so that the other camera was active had no effect.

 

Depending on your internet connection, it may take a few moments to load as the babylon file is 7+ MB.

 

cheers, gryff :)

 


1. 5,791 vertices (for position), but because each face has its own texture coordinates and blend indices we have to duplicate vertices: 11,074 x 3 = 33,222 vertices.
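The arithmetic can be sketched in a few lines: when every triangle corner needs its own UVs and blend indices, the exporter has to emit one GPU vertex per corner instead of sharing the position-only vertices. The numbers come from the posts above; the code is illustrative, not the exporter's actual implementation:

```python
# Why 5,791 Blender vertices become 33,222 exported vertices:
# positions can be shared between faces, but per-corner texture
# coordinates and blend indices cannot.
shared_position_verts = 5_791    # what Blender reports
tris = 11_074                    # triangles in the meshes
corners_per_tri = 3

gpu_vertices = tris * corners_per_tri
assert gpu_vertices == 33_222                  # matches the Babylon sandbox
assert gpu_vertices > shared_position_verts    # the duplication cost
```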

2. What about changing the current camera directly in Blender? Here is the code I use in Blender:

# Active camera
if scene.camera != None:
    Export_babylon.write_string(file_handler, "activeCamera", scene.camera.name)
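One workaround this snippet suggests (a sketch, not the exporter's real API): since the exporter writes whatever `scene.camera` holds at export time, switch `scene.camera` to the camera you want active before running the export. Simulated here without Blender, with `Camera`/`Scene` as stand-ins for the `bpy` types:

```python
# Stand-ins for Blender's scene and camera objects (hypothetical names;
# the real bpy types differ, but scene.camera works the same way).
class Camera:
    def __init__(self, name):
        self.name = name

class Scene:
    def __init__(self, cameras):
        self.cameras = {c.name: c for c in cameras}
        self.camera = cameras[0]          # Blender's "active camera"

def export_active_camera(scene):
    # Mirrors the exporter logic quoted above: the exported active
    # camera is whatever scene.camera holds at export time.
    return None if scene.camera is None else scene.camera.name

scene = Scene([Camera("MainCam"), Camera("CloseUpCam")])
assert export_active_camera(scene) == "MainCam"

# To export with the second camera active, switch before exporting:
scene.camera = scene.cameras["CloseUpCam"]
assert export_active_camera(scene) == "CloseUpCam"
```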

By the way I love your demo, do you want me to add it on our homepage ?


By the way I love your demo, do you want me to add it on our homepage ?

 

Well not so much a 'demo' - more a bit of a stress test for large animations.

 

Sure you can. Anything you need - or just the link and maybe a thumbnail?

 

You are welcome to all the files if you need them.

 

Thanks for the explanation of the vertex count difference. I will look into the exporter code for cameras.

 

cheers, gryff :)


I think I used to date her grandmother!  :)  Nice work, gryffster!  When you say you created the geometry, did you umm, ya know, hand sculpt it in a NURBS editor, err, Poser, err...  did ya use tools?  Which tools?  (thx)

 

You definitely have a newer version of Cones Maker Pro... than I do.  ;)


When you say you created the geometry, did you umm, ya know, hand sculpt it in a NURBS editor, err, Poser, err...  did ya use tools?  Which tools?  (thx)

 

 

My workflow and toolchain are graphed out in the image below. For the "Blue Lady", I used MakeHuman, bvhacker and a graphics program (in my case an old version of Photoshop that I got with a contract a few years ago). At the centre of it all is Blender. If you use Gimp as your graphics software, everything is then free.

 

1. Make Human

 

This open source software is designed to create humanoid characters by taking a standard mesh and applying morphs to create the features you want - including body size and shape, gender, age, etc. Your final character can be exported in the .mhx format, which can be imported into Blender. The MakeHuman group also provides a number of plugins for Blender, including the .mhx importer, MakeWalk and MakeClothes. These you use in Blender.

The MakeClothes addon allows you to create clothes for your figure (who would have guessed :o ). I created the dress using this addon. The MakeWalk addon allows you to import .bvh animation files and retarget them to the rig that came in with the .mhx file. It is not perfect, but it has a number of tools to fix glitches in individual bones (I fixed problems with hands passing through the body mesh this way by tweaking the collar bones). I also tweaked the weight painting on the dress I had created. The MakeHuman group works closely with the Blender folks.

 

As for Poser or Daz, there are two issues: the models are very high poly, and there are licence issues for use in games. Some folks using the Unity game engine do try to decimate the models, but such a process can destroy good mesh topology, which is important for animation. The MakeHuman models have good topology. A Daz game licence is $500 (I think), but that does not include any items (clothes, armour, weapons etc.) you purchase from their store from 3rd party vendors.

 

I could have modeled the figure using a combination of box/edge modeling, which I have done in the past, but the newest version of MakeHuman makes it so easy and quick.

 

There is other free software for creating animated figures, e.g. Fusion from Mixamo - but then you are stuck, as far as I can tell, with using their motion capture files (~$20+ per file).

 

2. Bvhacker

 

A small program created by Dave Wooldridge that allows you to do some clean-up of .bvh mocap files: repositioning the file to the 0,0,0 mark, reducing the fps, adding a T-pose, renaming bones, merging bones, deleting bones etc. Working with a free Carnegie Mellon University .bvh file, I reduced the fps (120 -> 30) and repositioned it to the 0,0,0 mark.
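Dropping a capture from 120 fps to 30 fps amounts to keeping every 4th frame and quadrupling the frame time. A rough sketch of that downsampling (my own illustration, not bvhacker's actual code):

```python
def downsample_bvh_frames(frames, frame_time, src_fps=120, dst_fps=30):
    """Keep every (src_fps // dst_fps)-th frame and stretch frame time."""
    step = src_fps // dst_fps          # 4 when going 120 -> 30
    kept = frames[::step]
    return kept, frame_time * step

# Toy example: 928 * 4 source frames at 1/120 s per frame, matching the
# 928 frames of animation mentioned in the first post.
frames = list(range(3712))
kept, new_time = downsample_bvh_frames(frames, 1 / 120)
assert len(kept) == 928
assert abs(new_time - 1 / 30) < 1e-9
```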

 

3. Blender

 

I used Blender with the MakeHuman plugins installed to import the figure, make the dress, import the edited .bvh file and load in the music I was going to use. Using the Action Editor, I then played with the frames in the .bvh animation, duplicated and mirrored them, and adjusted the length to fit the length of the music so that the animation would loop seamlessly. I also tweaked the weight painting and the mesh vertices to eliminate any body mesh "poke through" of the dress during the animation.

 

While the model was UVMapped, there was no texture, so the first thing I did was export the UVMap as a guide for creating the texture in Photoshop. Next, I baked a "dirty vertex" texture from the mesh and saved it to be used later in Photoshop.

 

4. Texture Creation.

 

I started by picking a single skin colour in a base layer, then dropping the dirty vertex map on top of it and setting the blend mode to colour burn. The dirty vertex texture is essentially a shading map of the mesh - highlight/shadow areas - something like an ambient occlusion map, but cleaner. Then, using the UVMap as a guide, I painted the texture for the lips, eyes, boots etc.
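For reference, the standard colour-burn blend darkens the base layer by the inverse of the blend layer: per channel (values in 0..1), result = 1 - (1 - base) / blend, clamped to [0, 1]. A quick sketch of why the dirty-vertex shading darkens the skin colour where the mesh is shadowed:

```python
def colour_burn(base, blend):
    """Standard colour-burn blend for one channel, values in 0..1."""
    if blend == 0.0:
        return 0.0                      # full burn where the blend is black
    return max(0.0, min(1.0, 1.0 - (1.0 - base) / blend))

# A white blend layer leaves the base colour untouched...
assert abs(colour_burn(0.6, 1.0) - 0.6) < 1e-12
# ...while darker blend values (shadowed areas of the dirty vertex
# map) darken the skin tone underneath.
assert colour_burn(0.6, 0.5) < 0.6
```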

 

 

5. Conclusion.

 

Finally, I exported everything from Blender using the babylon exporter.

 

What I really like about babylon.js is that I can make Blender the heart of scene construction. I can create terrains, caves, buildings, tunnels as well as humanoid figures, even monsters like dragons and trolls. I can control the scene as I build - lay it all out, create my textures then use the babylon exporter to export it all ready for use - and I don't have to worry about missing ";" if I had to code each object by hand :o  Blender becomes my "Unity Editor" for WebGL.

 

Now all I need is some javascript tool to give me functions and blocks of code at the click of a button ;)

 

Hope the above is not overkill - but you did ask, Wingy!

 

Ohh ... and thanks for all the likes and nice comments people!

 

cheers, gryff :)

post-7026-0-13852300-1398443059.png


  • 2 years later...

Gryff,

I am trying to do something similar by creating a model using MakeHuman and then exporting to Blender.  I can see the visemes in Blender and they seem to work.  What I would like to do is export the model from Blender to BabylonJS, keep a good FPS, and also trigger individual viseme expressions from BabylonJS.  Is this possible?  I am able to export the model from Blender to BabylonJS but I am only getting 9 fps right now.  Also, I believe I read somewhere else that shape keys are not supported.  If you have any suggestions or advice I would greatly appreciate it.


@Matt Duffield: Hi Matt - welcome to the forum :)

There have been a lot of changes to BJS and to the Blender Exporter(BE) - since I first made this post . A real "blast from the past" :lol:

I have not tried to do anything with the viseme expression stuff, but @JCPalmer is working on this through his Tower Of Babel (TOB) extension, which has its own exporter. He has a setup for speech, but I can't find the link. Hopefully Jeff will see this and advise. You can also follow what Jeff is doing on the IK Workflow thread - though it has now become a general discussion of his work.

As for fps - try this file - the fps is in the top left corner. Player 5

What FPS do you get? I've never had fps values of 9 with any of my animations :o

It was made with BE 4.5.0, which gives much smaller files than 2 years ago - the "Blue Lady" babylon file drops to about 1/3 of its former size. See here for some file size comparisons.

cheers, gryff :)

 


@gryff,

Thanks for your response.  I am using BE 4.6.1 and Blender 2.76.  I am sure that it is the process of exporting from MakeHuman to Blender, and all the selections I make, that leaves the file so large and the frame rate at only 9 fps.  I am testing it in the Sandbox on the BabylonJS site.  I am really trying to do something like the following link that I played with in ThreeJS, but I would much rather use BabylonJS.

Here is a link to what I am playing with:

http://jsdo.it/matt.duffield/qqAF

I would love to do the same as this in BabylonJS and get a solid workflow with Blender and facial expressions.

Thanks,

Matt

 


@Matt Duffield : And that, Matt,  is exactly what Jeff Palmer has been working on. :)

I wish I could find the link. That is the best I can do for now.

He will get a notification from my above post - so hopefully he can help tomorrow. He lives in the US and it is July 4th. ...;)

What fps did you get on that test? What rig did you use for MH export?

cheers, gryff :)

 


@gryff: I am using the Game engine rig when I export from MakeHuman and I am using the .mhx2 format.  I get 9 fps when I test in the BabylonJS Sandbox.

I am looking forward to what @JCPalmer is working on and learning more how to produce the same thing.  I am in the US on the West coast.  

Thanks again for your quick response,

Matt


Well, I have never used the visemes from MHX2.  They are just combinations of the 50 shape keys for the face that the importer adds when you indicate to.  I had my own combinations, which I mix on the Javascript side.

I am not even using MHX2's shape keys anymore.  I am really trying to limit MHX2 import to NO overrides.  Problems and a complicated workflow occur because expressions have to be imported using the 137-bone "Default No toes" skeleton.  There are also too many options in MHX2 which do not work in combination.

There is a project to sync meshes from MakeHuman to Blender using a client/server setup.  I added the skeleton-pose syncing part.  (I just saw that there is a demo video coming out tomorrow.)  Now I can make my own expressions for speech & blinking using MH's Expression Mixer, then save them.

Using the sync server, I move them over along with the 10 expressions I decided to use.  I made a Pose Library with everything, so now there is a single source for speech / expression / blinking.  Here is the workflow:

  • Make your character.
  • Export using "Default No Toes" skeleton.
  • Import in Blender, no overrides required.
  • Append "BEING-support" Pose Library from .blend
  • Click "Pose lib to Shape keys" button
  • Select meshes, and click "Archive Shapekeys" to save them to a .TOB file.

Selection_220.png

  • Now, export from Make Human a 2nd time using the skeleton you really want to use this time. 
  • If you did any overrides on import the first time, be sure to do the EXACT same thing again. 
  • Select the meshes, and Click "Restore Shapekeys", picking the file you saved before.

Sounds like work, but you had to do all of this anyway to get expressions; now this is all you have to do.  Nothing is done specifically for MakeHuman in the actual export.  TOB now has all these generic utilities, which are difficult to do in Blender, & saves intermediate files.  In fact, speech / expressions on the Javascript side have been pulled out of the QueueInterpolation extension.  It will be called BEING.  I hope QI will be out by month-end.  BEING will be much later, as I continue to polish.
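The archive/restore steps above can be pictured as serializing each mesh's shape keys to a sidecar file and merging them back after the second import. This is only a guessed sketch of the idea using JSON; TOB's actual .TOB format, button names and API are different:

```python
import json
import os
import tempfile

# Hypothetical stand-in for a mesh's shape keys: name -> vertex offsets.
def archive_shapekeys(meshes, path):
    """Save every mesh's shape keys so they survive a re-export."""
    with open(path, "w") as f:
        json.dump(meshes, f)

def restore_shapekeys(meshes, path):
    """Merge archived shape keys back onto freshly imported meshes."""
    with open(path) as f:
        archived = json.load(f)
    for name, keys in archived.items():
        meshes.setdefault(name, {}).update(keys)
    return meshes

path = os.path.join(tempfile.gettempdir(), "being_support.json")

# First import (with the expression-capable skeleton): shape keys exist.
before = {"Body": {"viseme_AA": [[0, 0.1, 0]], "blink": [[0, -0.2, 0]]}}
archive_shapekeys(before, path)

# Second import (with the skeleton you actually want): no shape keys yet.
after = {"Body": {}}
restore_shapekeys(after, path)
assert after["Body"].keys() == before["Body"].keys()
```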


Hi @JCPalmer,

Thanks for a thorough explanation. I am very interested in trying out your workflow.  I am very new to this and want to be sure that I follow your steps exactly. 

Do I export to .mhx from MakeHuman?

Do I need to prep the model in MakeHuman prior to exporting, besides what you indicated, to make it web friendly?

Is it possible to have a video walk-through of your workflow? I am not sure where BEING comes from, and I don't know what I need to have installed as dependencies for both Blender and MakeHuman. 

Finally, with the 10 expressions you mentioned, is it possible to represent most of the phonemes for speech programmatically?

Thanks again for all of your help.  I am super excited that what I am trying to do is achievable.

I really appreciate it,

Matt


Matt,

I've seen you are using mhx2 in your export. I was doing the same in my pipeline. I don't know the purpose of what you are building, but according to the license, what you build must be made publicly available under the AGPL3 open source license (which is an evil license, in my opinion). In Blender, when you try to import the mhx2 format, you can see the license set to AGPL3.

I personally changed to using MakeHuman v1.0.2 and the mhx export.

Here is a post where the guy responsible for the license answered my questions about it: 

http://www.makehumancommunity.org/forum/viewtopic.php?f=7&t=13362&start=10

Some license explanations:

http://www.makehuman.org/license_explanation.php

Thomas (the guy who created the mhx2 exporter, and mhx by the way) answered me here about whether it is possible to change the license:

https://thomasmakehuman.wordpress.com/mhx2-documentation/import-into-blender/#comment-373

 

