benoit-1842 Posted February 13, 2015
Hi everybody! Before moving on, I would like to say that Babylon.js has the best community out there and I will always be faithful... But today I found a library called jThree.js and it looks very interesting, because they are doing exactly what I am trying to do. Could a WebGL guru give me some pointers on how this setup works: http://editor.jthree.jp/?id=VCpG2w
Thanks, Benoit
jerome Posted February 13, 2015
Everything is clearly detailed here: http://jthree.jp/
RaananW Posted February 13, 2015
Nihongo o hanashimasu ka? (Do you speak Japanese?)
jerome Posted February 13, 2015
Just a small, tiny, microscopic little bit. I can only count from one to ten and know a few common words: thank you, sorry, hello, go on, stop, goodbye, etc. (years of martial arts help). Not enough to make bad jokes on a 3D framework forum.
Dad72 Posted February 13, 2015
Meh.
benoit-1842 Posted February 13, 2015 (author)
Yeah, I know! But maybe there are Japanese users of Babylon.js! My Japanese is a little bit rusty too... This library looks powerful though! Happy sushi!
Benoit
dbawel Posted February 14, 2015
Hi, I use the Kinect v2 daily and animate characters using the Babylon.js framework. I can see what you're doing, but I have no idea why so much is displayed, or whether all of the displayed elements are essential to animating with the Kinect. My pipeline is 5 minutes of capture and maybe 30 (often 5) minutes to publish online using existing software. Can you provide more info on your process? Cheers.
benoit-1842 Posted February 14, 2015 (author)
Hi, thanks for answering! What I am trying to do: I capture the joint data of a skeleton with the Kinect v2, and I want to apply that animation data to an avatar in Babylon.js. But I don't know how to read the data from a JSON file and apply it to my model... Any pointer will be appreciated. Here's a JSON file of the data I am receiving: https://drive.google...iew?usp=sharing
Thanks for responding, Benoit
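For anyone reading along, here is a minimal sketch of what "putting the JSON data into the model" can look like in Babylon.js. It assumes the capture file holds an array of frames, each with per-joint orientation quaternions; the property names (joints, jointName, orientation), the bone-name mapping, and the file name capture.json are placeholder assumptions about the capture file, not the actual Kinect output format.

// Sketch only: map Kinect joint names to the rig's bone names and apply one
// captured frame per render tick. Adjust the mapping and axes for your rig.
const kinectToBone: { [joint: string]: string } = {
    SpineBase: "hips",
    SpineMid: "spine",
    Head: "head",
    ShoulderLeft: "shoulder_L",
    ElbowLeft: "elbow_L"
    // ...one entry per joint you captured
};

function applyFrame(skeleton: BABYLON.Skeleton, frame: any): void {
    for (const joint of frame.joints) {
        const boneName = kinectToBone[joint.jointName];
        const bone = skeleton.bones.find(b => b.name === boneName);
        if (!bone) continue;
        // Kinect orientations usually need an axis remap into the bone's
        // local space before the pose looks right; this applies them raw.
        const q = new BABYLON.Quaternion(joint.orientation.x, joint.orientation.y,
                                         joint.orientation.z, joint.orientation.w);
        bone.setRotationQuaternion(q, BABYLON.Space.LOCAL);
    }
}

function playCapture(scene: BABYLON.Scene, skeleton: BABYLON.Skeleton, url: string): void {
    fetch(url)
        .then(response => response.json())
        .then((frames: any[]) => {
            if (frames.length === 0) return;
            let i = 0;
            scene.registerBeforeRender(() => {
                applyFrame(skeleton, frames[i]);
                i = (i + 1) % frames.length;   // loop the clip
            });
        });
}

// Usage, with scene and skeleton coming from your existing setup:
// playCapture(scene, skeleton, "capture.json");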
dbawel Posted February 14, 2015
Hi, I completely separate the building and rigging of any character: the character is its own aesthetic model, and the rigging (skeleton) only has to work within the 3D software, which has to have an FBX (or .babylon) exporter. Once this is accomplished and the character is deforming the way it was designed, I import the character into MotionBuilder and characterize it (if you need any help with this, I can certainly provide the steps). Once I have my character in MotionBuilder, I use software made by my long-time friends in Amsterdam, named Brekel Pro Body; here is the link to their site: http://brekel.com/ You can use either Pro Body 1 or Pro Body 2 based on your version of the Kinect. I prefer the Kinect v2, as it has increased the resolution by a factor of approximately 4x, so the results are far greater than the Kinect v1. But it matters far less than you might think for the motion capture. They are both very responsive, and Brekel will provide you with a temporary, fully functional license upon request so you can test. Once I have a character in MotionBuilder, I can do a great deal with it: I can capture motion and within a few minutes export a .babylon file and publish it on our server. Again, I'm very happy to walk you through the steps. Please keep in mind there are strict vertex and polygon restrictions for rendering in WebGL when building character models and their children, but otherwise it's a fairly simple task. Once you capture the motion using the Brekel plugin for MotionBuilder, and once the FBX file is written out and converted to .babylon, I am able to produce and publish online as much animation as I can capture; very little other work is involved once your pipeline is in place. I hope this helps, but please ask for any assistance you might need. I'm a little busy right now, but I will check daily if you would like to try this pipeline and want help. Cheers.
benoit-1842 Posted February 14, 2015 (author)
Hi, thank you for your interest... Yes, I have used Brekel and the results are great! But you are right, there's not a lot of difference in mocap between the Kinect v1 and v2... I have used software called MikuMikuCapture and it works well. But that's not my problem. Right now we are using a skeleton capture in JSON (like the file I gave you in one of my replies). My challenge is to map that data to the right bones in the Babylon.js code, and then to be able to take a new set of bone positions and play a different animation. So I have a model in Babylon.js, and I want that model to play animation 1, and then animation 2, using only JSON files, without having to go through Blender or MotionBuilder to build the animations. I need to target the JSON file to the correct bones inside Babylon.js... Hope it makes sense.
Benoit
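Staying with the earlier sketch, switching between "animation 1" and "animation 2" from raw JSON captures can be as simple as swapping which frame array the render loop samples. Again a hedged sketch, not a definitive implementation: the clip names and file names are invented, and it reuses the hypothetical applyFrame() helper from the example above.

// Sketch: treat each JSON capture as a named clip and switch between them at
// runtime, without baking anything in Blender or MotionBuilder first.
class JsonClipPlayer {
    private clips: { [name: string]: any[] } = {};
    private current: any[] | null = null;
    private frame = 0;

    constructor(scene: BABYLON.Scene, private skeleton: BABYLON.Skeleton) {
        // Advance and apply the active clip once per rendered frame.
        scene.registerBeforeRender(() => {
            if (!this.current || this.current.length === 0) return;
            applyFrame(this.skeleton, this.current[this.frame]);
            this.frame = (this.frame + 1) % this.current.length;
        });
    }

    async load(name: string, url: string): Promise<void> {
        this.clips[name] = await (await fetch(url)).json();
    }

    play(name: string): void {
        this.current = this.clips[name] || null;
        this.frame = 0;
    }
}

// Usage: load two captures and switch between them on demand.
// const player = new JsonClipPlayer(scene, skeleton);
// await player.load("walk", "walk.json");
// await player.load("wave", "wave.json");
// player.play("walk");   // later: player.play("wave");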
dbawel Posted February 14, 2015
Hi, I have only looked at the JSON format to evaluate it, and have not produced any animation with it that I continued to use in Babylon. This is because so many attributes have to be declared, and there is no layer of association (compiled code) to simplify the process. After more than 20 years of working with motion capture and all of the issues in re-targeting motion, I quickly decided not to explore the JSON format further. From my experience, you are attempting to animate using a format that will never give consistent quality, especially when re-targeting a person's motion to a skeleton of different proportions, which will always be the case. I wish I could advise a course for your current pipeline, but I would advise against the direction you are currently following. Using MotionBuilder will save considerable time, which almost always directly saves cost, and the goal, as far as I can tell, is to publish animated characters. Using the Kinect v2, I can scan, build and bind to a skeleton, retarget, animate, and publish animated characters online within a day. And if I already have a character built, the process currently takes 1 or 2 hours. When I purchased the first license of Filmbox (MotionBuilder) way back in 1996, it was quite difficult to re-target animation with their software, as it was built to animate light shows, not characters. However, I would still choose that first version over the JSON format. The people who contribute to the JSON format are quite thorough, but I choose to spend my time on aesthetics rather than on an ever-changing process of coding my captured motion to a skeleton of considerably different proportions, as will always be the case. I wish I could offer more encouragement, but I must tell you of the great pain I have experienced over the years using mocap, and of the various options now available to make the process much simpler. I wish you all the best. Cheers.
benoit-1842 Posted February 14, 2015 (author)
I know exactly what you mean. For my personal work, I use almost the same pipeline as you: MikuMikuCapture (way better than Brekel, trust me), then the motion capture goes into Blender or MotionBuilder for retargeting, and I export that to Babylon.js with no problem. The problem is that sometimes you have to work with company constraints, and one of them is that the animation has to be driven by a JSON file captured with a Kinect v2. I don't need a retargeting system in Babylon.js, but I do need the animation to be accurate. So I see it as a challenge...
Thank you very much for your insight, Benoit Gauthier
dbawel Posted February 15, 2015
Hi, thanks for referring me to MMC. I took a look, and I would like to know if the software supports the Kinect v2, as the MMC build appears quite dated on the English site I found to download it from. If there is an updated version in English, please provide a link, as I'd like to compare it to Brekel Pro Body. Good luck with the work using JSON. Cheers.
benoit-1842 Posted February 15, 2015 (author)
No, it's Kinect v1-only software, but wow, the motion capture in BVH is excellent! My personal workflow is MMC (export BVH) -> Blender for retargeting (MakeWalk plugin) and for making my model a beauty -> export to Babylon.js. You were right: with that workflow, I spend less time on motion capture technicalities and more time on aesthetics. And the beauty of my workflow is that it's totally free and the results are awesome... If you have any info you can give me about the JSON file, I will be very happy to listen to your advice.
Have a good day, Benoit
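For reference, the last step of that BVH-to-Babylon.js pipeline (playing the baked clip after export) only takes a few lines. A sketch, assuming the exporter wrote a file named character.babylon into a scenes/ folder and the clip covers frames 0 to 100; all three are placeholders for whatever your exporter actually produces.

// Load the exported .babylon and loop its baked skeletal animation.
BABYLON.SceneLoader.ImportMesh("", "scenes/", "character.babylon", scene,
    (meshes, particleSystems, skeletons) => {
        if (skeletons.length > 0) {
            // scene.beginAnimation on a skeleton plays the bone animation
            // baked by the exporter; adjust the frame range to your clip.
            scene.beginAnimation(skeletons[0], 0, 100, true, 1.0);
        }
    });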
dbawel Posted February 15, 2015
I'll take a look at the work I did using JSON and see if there might be anything useful beyond what is readily available online. Thanks for the insight into MMC. I'm not a huge fan of .bvh for my own use, but it has certainly worked for decades and is still used in production with the old Activision system (Biomechanics Corp., now Giant). I'll test the Kinect v1 in MMC vs. the Kinect v2 using Brekel and see if there are substantial differences these days. Also, Brekel is still working on newer versions and is also a friend, so I can ask for "favors" from time to time. Cheers.
benoit-1842 Posted July 7, 2015 (author)
Hi dbawel! I must say that for a couple of months I have been following the wisdom of this thread, and it's true that not using the JSON format is the way to go! I am using the Kinect v1 and my mocap is fairly good. I think the Kinect v2 is good, but you need a very fast computer, USB 3, etc. I love my modest laptop and I am doing great with it and the Kinect v1, with Brekel or MMC. So I am working on a project (kind of) to make it possible for high school students to do mocap on a very limited budget. I would like your thoughts on this, and if possible some advice on how to get good mocap and where to go from there: animation, games, etc.
Thanks, Ben
dbawel Posted July 8, 2015
Hi Benoit, revisiting the JSON format, it is more than difficult to use. I believe you are already producing good mocap, and if you use the tools in MotionBuilder, you can refine it further. As for where to go with this knowledge: good question. If you want to pursue mocap as a vocation, you'll need to move on to the passive optical systems such as Motion Analysis and Vicon. These have expanded beyond capturing biomechanical data into other production areas such as camera tracking. To get into these areas, you must know MotionBuilder (I also recommend Blade), and take a job as an intern at a mocap facility, one that is actually doing work daily, as most systems simply sit waiting for work. Cheers.
fenomas Posted July 8, 2015
Jumping back a moment to jThree, it seems to be more aimed at non-programmers. The getting-started article shows setting up a scene declaratively in an HTML-like language:

<goml>
  <head>
    <txr id="txr1" src="img/earth.jpg" />
    <geo id="geo1" type="Sphere" param="10 64 64" />
    <mtl id="mtl1" type="MeshPhong" param="map: #txr1; color: #0ff; specular: #fff;" />
    <rdr frame="body" camera="camera:first" param="antialias: true; clearColor: #fff;" />
  </head>
  <body>
    <scene>
      <mmd model="model/miku/index.pmx" />
      <mesh geo="#geo1" mtl="#mtl1" style="positionY: -10;"></mesh>
      ...

It looks like it's a project with quite different goals from BJS.
benoit-1842 Posted July 9, 2015 (author)
Yeah! Because the goal I am working toward is to have an avatar in Babylon.js animated by the Kinect in real time. I am investigating a couple of routes here...
Thanks, Benoit
benoit-1842 Posted July 9, 2015 (author)
Something a little bit like this is interesting: https://www.youtube.com/watch?v=tDlSlu8IxuQ
That's what I want to achieve, but I'm missing a walkthrough or the experience needed to build that code.
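One possible shape for the real-time case, sketched under heavy assumptions: a small native app reads the Kinect SDK on the PC and pushes one JSON frame of joint orientations per update over a WebSocket (the port 8181 and the message format are invented for the example), and the browser applies the newest frame each render tick using the same hypothetical applyFrame() bone-mapping helper sketched earlier.

// Browser side of a hypothetical Kinect-to-WebSocket relay.
const socket = new WebSocket("ws://localhost:8181");
let latestFrame: any = null;

socket.onmessage = (event: MessageEvent) => {
    // Keep only the newest frame; dropping stale ones avoids lag buildup.
    latestFrame = JSON.parse(event.data);
};

// scene and skeleton come from your existing Babylon.js setup.
scene.registerBeforeRender(() => {
    if (latestFrame) {
        applyFrame(skeleton, latestFrame);
    }
});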
dbawel Posted July 9, 2015
I wish I had time now, but I wrote the code to run real-time mocap from MotionBuilder to most any other package. Ask me in 1 month, and I'm sure I can help you with this.
benoit-1842 Posted July 9, 2015 (author)
Thank you very much for this future help :-) For now, is it possible to have a little bit of code that I can get my hands on? And does your work use the web, or is it all non-web stuff?
Thank you, Benoit
dbawel Posted July 9, 2015
I have no code to parse real-time mocap to a browser. It is all in different plugins, each specific to a rendering package. I could send you a plugin for MotionBuilder, written in C++, that runs peer to peer from one MotionBuilder session to another remotely, but I doubt many people would be able to read the code. If you're up for it, I'm sure I have it on a disk. Let me know, but beware: it took 8 months to write and QA.
dbawel Posted July 11, 2015
If you can understand this, then I can post more. Good info though; it was written 15 years ago, but the math is as valid now as it was then.

/*****************************************************************************
 * Sources:
 *   Shoemake, Ken, "Animating Rotations with Quaternion Curves"
 *     Computer Graphics 85, pp. 245-254
 *   Watt and Watt, Advanced Animation and Rendering Techniques
 *     Addison Wesley, pp. 360-368
 *   Shoemake, Graphic Gems II.
 *****************************************************************************/
/*****************************************************************************
 * Copyright 1997 Jeff Lander, All Rights Reserved.
 * For educational purposes only.
 * Please do not republish in electronic or print form without permission
 * Thanks - [email protected]
 *****************************************************************************/

#include "stdafx.h"
#include <math.h>
#include "amQuaternion.h"

#define M_PI    3.14159265358979323846
#define HALF_PI 1.57079632679489661923

void CopyVector(tVector *dest, tVector *src)
{
    dest->x = src->x;
    dest->y = src->y;
    dest->z = src->z;
}

void ScaleVector(tVector *vect, float scale)
{
    vect->x *= scale;
    vect->y *= scale;
    vect->z *= scale;
}

void AddVectors(tVector *vect1, tVector *vect2, tVector *dest)
{
    dest->x = vect1->x + vect2->x;
    dest->y = vect1->y + vect2->y;
    dest->z = vect1->z + vect2->z;
}

float DotVectors(tVector *vect1, tVector *vect2)
{
    return (vect1->x * vect2->x) + (vect1->y * vect2->y) + (vect1->z * vect2->z);
}

void CrossVectors(tVector *vect1, tVector *vect2, tVector *dest)
{
    // COMPUTE THE CROSS PRODUCT
    dest->x = (vect1->y * vect2->z) - (vect1->z * vect2->y);
    dest->y = (vect1->z * vect2->x) - (vect1->x * vect2->z);
    dest->z = (vect1->x * vect2->y) - (vect1->y * vect2->x);
}

void MultQuaternions(tQuaternion *quat1, tQuaternion *quat2, tQuaternion *dest)
{
    tQuaternion v1, v2, v3, vf;

    CopyVector((tVector *)&v1, (tVector *)quat1);   // COPY OFF THE VECTOR PART OF QUAT1
    ScaleVector((tVector *)&v1, quat2->w);          // MULTIPLY IT BY THE SCALAR PART OF QUAT2

    CopyVector((tVector *)&v2, (tVector *)quat2);   // COPY OFF THE VECTOR PART OF QUAT2
    ScaleVector((tVector *)&v2, quat1->w);          // MULTIPLY IT BY THE SCALAR PART OF QUAT1

    CrossVectors((tVector *)quat2, (tVector *)quat1, (tVector *)&v3);

    AddVectors((tVector *)&v1, (tVector *)&v2, (tVector *)&vf);
    AddVectors((tVector *)&v3, (tVector *)&vf, (tVector *)&vf);

    vf.w = (quat1->w * quat2->w) - DotVectors((tVector *)quat1, (tVector *)quat2);

    dest->x = vf.x;
    dest->y = vf.y;
    dest->z = vf.z;
    dest->w = vf.w;
}

/*
    AN OPTIMIZATION/REORGANIZATION OF ABOVE CODE - NOT AS CLEAR
    I THINK THIS IS SIMILAR TO GRAPHIC GEMS THOUGH I DON'T HAVE THE REF HANDY
    THE MATH CHECKS OUT THOUGH
*/
void MultQuaternions2(tQuaternion *quat1, tQuaternion *quat2, tQuaternion *dest)
{
    tQuaternion tmp;

    tmp.x = quat2->w * quat1->x + quat2->x * quat1->w +
            quat2->y * quat1->z - quat2->z * quat1->y;
    tmp.y = quat2->w * quat1->y + quat2->y * quat1->w +
            quat2->z * quat1->x - quat2->x * quat1->z;
    tmp.z = quat2->w * quat1->z + quat2->z * quat1->w +
            quat2->x * quat1->y - quat2->y * quat1->x;
    tmp.w = quat2->w * quat1->w - quat2->x * quat1->x -
            quat2->y * quat1->y - quat2->z * quat1->z;

    dest->x = tmp.x;
    dest->y = tmp.y;
    dest->z = tmp.z;
    dest->w = tmp.w;
}

// Discussion: Quaternions must follow the rule x^2 + y^2 + z^2 + w^2 = 1
// This function ensures this
void NormalizeQuaternion(tQuaternion *quat)
{
    float magnitude;

    // FIRST STEP, FIND THE MAGNITUDE
    // (NOTE: the sqrt was missing in the text as posted; without it the
    //  quaternion is divided by the squared magnitude and is not unit length)
    magnitude = (float)sqrt((quat->x * quat->x) + (quat->y * quat->y) +
                            (quat->z * quat->z) + (quat->w * quat->w));

    // DIVIDE BY THE MAGNITUDE TO NORMALIZE
    quat->x = quat->x / magnitude;
    quat->y = quat->y / magnitude;
    quat->z = quat->z / magnitude;
    quat->w = quat->w / magnitude;
}

// Uses (X,Y,Z) order
void EulerToQuaternion(tVector *rot, tQuaternion *quat)
{
    float rx, ry, rz,
          tx, ty, tz,
          cx, cy, cz,
          sx, sy, sz,
          cc, cs, sc, ss;

    // FIRST STEP, CONVERT ANGLES TO RADIANS
    rx = (rot->x * (float)M_PI) / (360 / 2);
    ry = (rot->y * (float)M_PI) / (360 / 2);
    rz = (rot->z * (float)M_PI) / (360 / 2);

    // GET THE HALF ANGLES
    tx = rx * (float)0.5;
    ty = ry * (float)0.5;
    tz = rz * (float)0.5;

    cx = (float)cos(tx);
    cy = (float)cos(ty);
    cz = (float)cos(tz);
    sx = (float)sin(tx);
    sy = (float)sin(ty);
    sz = (float)sin(tz);

    cc = cx * cz;
    cs = cx * sz;
    sc = sx * cz;
    ss = sx * sz;

    quat->x = (cy * sc) - (sy * cs);
    quat->y = (cy * ss) + (sy * cc);
    quat->z = (cy * cs) - (sy * sc);
    quat->w = (cy * cc) + (sy * ss);

    // ENSURE THE QUATERNION IS NORMALIZED
    // PROBABLY NOT NECESSARY IN MOST CASES
    NormalizeQuaternion(quat);
}

// A second variation. Creates a series of quaternions and multiplies
// them together. Would be easier to extend this for other rotation orders
void EulerToQuaternion2(tVector *rot, tQuaternion *quat)
{
    float rx, ry, rz, ti, tj, tk;
    tQuaternion qx, qy, qz, qf;

    // FIRST STEP, CONVERT ANGLES TO RADIANS
    rx = (rot->x * (float)M_PI) / (360 / 2);
    ry = (rot->y * (float)M_PI) / (360 / 2);
    rz = (rot->z * (float)M_PI) / (360 / 2);

    // GET THE HALF ANGLES
    ti = rx * (float)0.5;
    tj = ry * (float)0.5;
    tk = rz * (float)0.5;

    qx.x = (float)sin(ti); qx.y = 0.0; qx.z = 0.0; qx.w = (float)cos(ti);
    qy.x = 0.0; qy.y = (float)sin(tj); qy.z = 0.0; qy.w = (float)cos(tj);
    qz.x = 0.0; qz.y = 0.0; qz.z = (float)sin(tk); qz.w = (float)cos(tk);

    MultQuaternions(&qx, &qy, &qf);
    MultQuaternions(&qf, &qz, &qf);

    // ANOTHER TEST VARIATION
    // MultQuaternions2(&qx, &qy, &qf);
    // MultQuaternions2(&qf, &qz, &qf);

    // ENSURE THE QUATERNION IS NORMALIZED
    // PROBABLY NOT NECESSARY IN MOST CASES
    NormalizeQuaternion(&qf);

    quat->x = qf.x;
    quat->y = qf.y;
    quat->z = qf.z;
    quat->w = qf.w;
}

void QuatToAxisAngle(tQuaternion *quat, tQuaternion *axisAngle)
{
    float scale, tw;

    tw = (float)acos(quat->w) * 2;
    scale = (float)sin(tw / 2.0);
    axisAngle->x = quat->x / scale;
    axisAngle->y = quat->y / scale;
    axisAngle->z = quat->z / scale;

    // NOW CONVERT THE ANGLE OF ROTATION BACK TO DEGREES
    axisAngle->w = (tw * (360 / 2)) / (float)M_PI;
}

#define DELTA 0.0001    // DIFFERENCE AT WHICH TO LERP INSTEAD OF SLERP

void SlerpQuat2(tQuaternion *quat1, tQuaternion *quat2, float slerp, tQuaternion *result)
{
    tQuaternion quat1b;
    double omega, cosom, sinom, scale0, scale1;

    // USE THE DOT PRODUCT TO GET THE COSINE OF THE ANGLE BETWEEN THE
    // QUATERNIONS
    cosom = quat1->x * quat2->x + quat1->y * quat2->y +
            quat1->z * quat2->z + quat1->w * quat2->w;

    // MAKE SURE WE ARE TRAVELING ALONG THE SHORTER PATH
    if (cosom < 0.0)
    {
        // IF WE ARE NOT, REVERSE ONE OF THE QUATERNIONS
        cosom = -cosom;
        quat1b.x = -quat1->x;
        quat1b.y = -quat1->y;
        quat1b.z = -quat1->z;
        quat1b.w = -quat1->w;
    }
    else
    {
        quat1b.x = quat1->x;
        quat1b.y = quat1->y;
        quat1b.z = quat1->z;
        quat1b.w = quat1->w;
    }

    if ((1.0 - cosom) > DELTA)
    {
        omega = acos(cosom);
        sinom = sin(omega);
        scale0 = sin((1 - slerp) * omega) / sinom;
        scale1 = sin(slerp * omega) / sinom;
    }
    else
    {
        scale0 = 1.0 - slerp;
        scale1 = slerp;
    }

    // NOTE: the version as posted blended quat1 here, which discards the
    // shortest-path reversal above; quat1b is what should be blended.
    result->x = (float)(scale0 * quat1b.x + scale1 * quat2->x);
    result->y = (float)(scale0 * quat1b.y + scale1 * quat2->y);
    result->z = (float)(scale0 * quat1b.z + scale1 * quat2->z);
    result->w = (float)(scale0 * quat1b.w + scale1 * quat2->w);
}

void SlerpQuat(tQuaternion *quat1, tQuaternion *quat2, float slerp, tQuaternion *result)
{
    double omega, cosom, sinom, scale0, scale1;

    // USE THE DOT PRODUCT TO GET THE COSINE OF THE ANGLE BETWEEN THE
    // QUATERNIONS
    cosom = quat1->x * quat2->x + quat1->y * quat2->y +
            quat1->z * quat2->z + quat1->w * quat2->w;

    // CHECK A COUPLE OF SPECIAL CASES.
    // MAKE SURE THE TWO QUATERNIONS ARE NOT EXACTLY OPPOSITE (WITHIN A LITTLE SLOP)
    if ((1.0 + cosom) > DELTA)
    {
        // ARE THEY MORE THAN A LITTLE BIT DIFFERENT? AVOID A DIVIDE BY ZERO AND LERP IF NOT
        if ((1.0 - cosom) > DELTA)
        {
            // YES, DO A SLERP
            omega = acos(cosom);
            sinom = sin(omega);
            scale0 = sin((1.0 - slerp) * omega) / sinom;
            scale1 = sin(slerp * omega) / sinom;
        }
        else
        {
            // NOT A VERY BIG DIFFERENCE, DO A LERP
            scale0 = 1.0 - slerp;
            scale1 = slerp;
        }
        result->x = (float)(scale0 * quat1->x + scale1 * quat2->x);
        result->y = (float)(scale0 * quat1->y + scale1 * quat2->y);
        result->z = (float)(scale0 * quat1->z + scale1 * quat2->z);
        result->w = (float)(scale0 * quat1->w + scale1 * quat2->w);
    }
    else
    {
        // THE QUATERNIONS ARE NEARLY OPPOSITE, SO TO AVOID A DIVIDE BY ZERO ERROR
        // CALCULATE A PERPENDICULAR QUATERNION AND SLERP THAT DIRECTION
        result->x = -quat2->y;
        result->y = quat2->x;
        result->z = -quat2->w;
        result->w = quat2->z;
        scale0 = sin((1.0 - slerp) * (float)HALF_PI);
        scale1 = sin(slerp * (float)HALF_PI);
        result->x = (float)(scale0 * quat1->x + scale1 * result->x);
        result->y = (float)(scale0 * quat1->y + scale1 * result->y);
        result->z = (float)(scale0 * quat1->z + scale1 * result->z);
        result->w = (float)(scale0 * quat1->w + scale1 * result->w);
    }
}
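For anyone wanting to use this math from Babylon.js rather than C++, the same building blocks already exist on BABYLON.Quaternion, so there is usually no need to hand-roll them when smoothing JSON joint data. A small illustrative sketch; the angle values are arbitrary.

// Euler angles (yaw, pitch, roll, in radians) to quaternions, then slerp.
const yawA = 0.3, pitchA = 0.1, rollA = 0.0;
const yawB = 1.2, pitchB = 0.0, rollB = 0.4;

const qa = BABYLON.Quaternion.RotationYawPitchRoll(yawA, pitchA, rollA);
const qb = BABYLON.Quaternion.RotationYawPitchRoll(yawB, pitchB, rollB);

const halfway = BABYLON.Quaternion.Slerp(qa, qb, 0.5);  // t in [0, 1]
halfway.normalize();                                    // keep it unit length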
benoit-1842 Posted July 13, 2015 (author)
Thanks for all that data, it means a lot! Now it's time to figure out which part goes where.
Thanks, Benoit