jthree.jp, kinect v2, socket etc......


benoit-1842

Hi everybody !!!!  Before moving on, I would like to say that Babylon.js is the best community out there and I will always be faithful...  But today I found a library called jThree.js, and it looks very interesting because they are doing exactly what I am trying to do...  Could a WebGL guru give me some pointers on how this setup works: http://editor.jthree.jp/?id=VCpG2w

 

Thanx,

 

Benoit

 

 


Hi,

 

I use the Kinect v2 daily, and animate characters using the BabylonJS framework.  I can see what you're doing, but have no idea why so much is displayed, or whether all of the elements displayed are essential to animating with the Kinect.  My pipeline is 5 minutes of capture and maybe 30 (often 5) minutes to publish online using existing software.  Can you provide more info on your process?  Cheers.


Hi, thanx for answering !!!!  What I am trying to do: I capture the joint data of a skeleton with the Kinect v2, and I want to apply this animation data to an avatar in Babylon.js.  But I don't know how to grab my data from a JSON file and put it into my model....  Any pointer will be appreciated...  Here's a JSON file of the data I am receiving...
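Without seeing the exact file, here's a minimal sketch of pulling one frame of joint data out of a parsed JSON object. The frame layout, joint names, and field names below are assumptions only - adjust them to match whatever your Kinect capture actually writes:

```javascript
// Hypothetical layout for one captured frame -- in practice you would get
// this object from JSON.parse() on the text of your capture file.
const frame = {
  timestamp: 0.033,
  joints: [
    { name: "SpineBase", x: 0.01, y: 0.92, z: 2.10 },
    { name: "Head",      x: 0.02, y: 1.55, z: 2.05 }
  ]
};

// Index the joints by name so each bone in the model can look up its
// matching Kinect joint in constant time.
function indexJoints(frame) {
  const byName = {};
  for (const j of frame.joints) {
    byName[j.name] = { x: j.x, y: j.y, z: j.z };
  }
  return byName;
}

const joints = indexJoints(frame);
console.log(joints.Head.y); // 1.55
```

From a map like this, the remaining work is deciding which joint drives which bone in your rig - that mapping is yours to define, since the Kinect's joint names won't match your model's bone names.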

 

 

https://drive.google...iew?usp=sharing

 

Thanx for responding,

 

Benoit


Hi,

 

I completely separate the building and rigging of any character - the character is its own aesthetic model, and the rigging (skeleton) only has to work within the 3D software, which has to have an FBX (or .babylon) exporter.  Once this is accomplished and the character is deforming the way it was designed, I import my character into MotionBuilder and characterize it (if you need any help with this, I can certainly provide the steps to do so.)  Once I have my character in MotionBuilder, I then use software made by my long-time friends in Amsterdam - the software is named Brekel Pro Body - below is the link to their site:

 

http://brekel.com/

 

You can use either Pro Body 1 or Pro Body 2, based upon your version of the Kinect.  I prefer the Kinect v2 as it has increased the resolution by a factor of approximately 4x - so the results are far greater than the Kinect v1.  But it matters less than you might think for the motion capture.  They are both very responsive, and Brekel will provide you with a fully functional temporary license upon request for you to test.  Once I have a character in MotionBuilder, I can do so much with it.  I can capture motion and within a few minutes export a .babylon file and publish on our server.  Again, I'm very happy to walk you through the steps to do this.

 

Please keep in mind there are strict vertex and polygon restrictions for rendering in WebGL when building character models and their children, but otherwise it's a fairly simple task.  Once you capture the motion using the Brekel plugin for MotionBuilder, and once the FBX file is written out and converted to .babylon, I am able to produce and publish online as much animation as I can capture - very little other work is involved once your pipeline is in place.  I hope this helps, but please ask for any assistance you might need.  I'm a little busy right now, but will check daily in case you'd like to try this pipeline and need any assistance.  Cheers.


Hi, thank you for your interest... Yes, I have used Brekel and the results are great !!!! But you are right, there's not a lot of difference in mocap between the K1 and K2... I have used a software called MikuMikuCapture and it's working well. But that's not my problem... Right now we are using a skeleton capture in JSON (like the file I gave you in one of the replies). My challenge is to put that data on the right bones in the Babylon.js code, and afterwards to be able to have a new set of bone positions and animation and be able to change it... So I have a model in Babylon.js, and with that model I want to play animation 1; after that I need to play animation 2 - but using only the JSON file, and not having to go through Blender or MotionBuilder to build the animation. I need to target the JSON file to the correct bones inside Babylon.js... Hope it makes sense,
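To make the "target the JSON to the correct bones" idea concrete, here is a rough sketch. Everything here is made up for illustration - the joint names, bone names, clip layout, and the plain objects standing in for real bones. On an actual Babylon.js skeleton you would look the bone up in `skeleton.bones` and set its rotation quaternion instead:

```javascript
// Two hypothetical "clips": arrays of frames, each frame mapping a Kinect
// joint name to a rotation quaternion. Values are illustrative only.
const clips = {
  wave: [ { Head: { x: 0, y: 0,     z: 0, w: 1     } } ],
  walk: [ { Head: { x: 0, y: 0.707, z: 0, w: 0.707 } } ]
};

// Your mapping from Kinect joint names to the bone names in the rigged model.
const jointToBone = { Head: "head_bone" };

// Apply one frame of a clip to a skeleton-like object (bone name -> bone).
// With a real Babylon skeleton, this is where you'd set the bone's rotation.
function applyFrame(skeleton, frame) {
  for (const joint in frame) {
    const boneName = jointToBone[joint];
    if (!boneName || !skeleton[boneName]) continue;
    skeleton[boneName].rotationQuaternion = frame[joint];
  }
}

// Minimal stand-in for a skeleton.
const skeleton = { head_bone: { rotationQuaternion: null } };

// Playing "animation 2" instead of "animation 1" is then just a matter of
// which clip you step through, frame by frame, in your render loop.
applyFrame(skeleton, clips.walk[0]);
```

The hard part in practice is that raw Kinect joint data is positions in camera space, not per-bone local rotations, so some conversion/retargeting step is still needed before a frame looks like the clips above.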

Benoit


Hi,

 

I have only looked at using the JSON format to evaluate it, and have not produced any animation with it that I have continued to work with in Babylon.  This is because there are so many attributes that are required to be declared, and there is no layer of association using software (compiled code) to simplify this process.  After more than 20 years of working with motion capture and all of the issues in re-targeting motion, I quickly determined not to further explore the JSON format.  From my experience, you are attempting to animate using a format which will provide a quality of animation which will never be consistent - especially when attempting to re-target a person's motion to a skeleton of different proportions, which will always be the case.

 

I wish I could advise a course for your current pipeline, but I would advise against the direction you are currently following.  Using MotionBuilder will save considerable time, which almost always directly results in saving cost, and the goal is to publish animated characters as far as I can tell.  Using the Kinect v2, I can scan, build and bind to a skeleton, retarget, animate, and publish animated characters online within a day.  And if I already have a character built, the process currently takes 1 or 2 hours.

 

When I purchased the first license of Filmbox (MotionBuilder) way back in 1996, it was quite difficult to re-target animation using their software, as it was built to animate light shows - not characters.  However, I would still choose that first version over the JSON format.  The people who contribute to the JSON format are quite thorough, but I choose to spend my time on aesthetics rather than an ever-changing process of coding my captured motion to a skeleton of considerably different proportions, as will always be the case.

 

I wish I could offer more encouragement, but I must tell you of the great pain I have experienced over the years using mocap, and the various options now available to make the process much simpler.  I wish you all the best.  Cheers.


I know exactly what you mean.  For my personal work, I use almost the same pipeline as you......  MikuMikuCapture (way better than Brekel... trust me), the motion capture goes into Blender or MotionBuilder for retargeting, and I export that to Babylon.js with no problem....  The problem is, sometimes you have to work with company constraints, and one of them is that the animation has to be done with a JSON file captured with a Kinect 2...  I don't need a retargeting system in Babylon.js, but I do need the animation to be accurate.  So I see it as a challenge :).....

 

Thank you very much for your insight,

 

Benoit Gauthier


Hi,

 

Thanks for referring me to MMC.  I took a look, and would like to know if the software supports the Kinect v2 - as the MMC build appears quite dated on the English site I found to download it from.  If there is an updated version in English, please provide a link, as I'd like to compare it to Brekel Pro Body.  Good luck with the work using JSON.  Cheers. :)


No, it's a Kinect 1 only software, but wow !!!  The motion capture in BVH is excellent !!!  My personal workflow is: MMC (export BVH) ---> Blender for retargeting (MakeWalk plugin) and making my model a beauty ---> export to Babylon.js.  You were right: with that workflow, I spend less time on the motion capture technicalities and more time on the aesthetics...    And the beauty of my workflow is that it's totally free and the results are awesome.... If you have any info you can give me about the JSON file, I will be very happy to listen to your advice...   Have a good day,

 

Benoit


I'll take a look at the work I did using JSON, and see if there might be anything useful beyond what is readily online.  Thanks for the insight into MMC - I'm not a huge fan of .bvh for my own use, but it has certainly worked for decades and is still used in production with the old Activision system (Biomechanics Corp. and now Giant).  I'll test the Kinect 1 in MMC vs. the Kinect v2 using Brekel, and see if there are substantial differences these days.  Also, Brekel is still working on newer versions, and is also a friend - so I can ask for "favors" from time to time.  Cheers.


  • 4 months later...

Hi dbawel !!!  I must say that for a couple of months now I have been using the wisdom of this post, and it's true that not using the JSON format is the way to go !!!!!  I am using the Kinect 1, and it's true that my mocap is fairly good.  I think the Kinect 2 is good, but you need a very fast computer, USB 3, etc.  I love my modest laptop, and I am doing great with it and the Kinect 1, with Brekel or MMC.  So I am working on a project (kind of) to make it possible for high school students to do mocap on a very limited budget.......  I would like your thoughts on this, or if possible, advice on how to get good mocap and where to go from there - animation, games, etc.....

 

Thanx,

 

Ben


Hi benoit,

 

In revisiting the JSON format, it is more than difficult to use. I believe you are already producing good mocap, and if you use the tools in MotionBuilder, you can refine this further.  As for where to go with this knowledge - good question.  However, if you want to pursue mocap as a vocation, then you'll need to move on to the passive optical systems such as Motion Analysis and Vicon.  These have expanded beyond capturing biomechanical data, and into other production areas such as camera tracking.  In order to get into these areas, you must know MotionBuilder (and I also recommend Blade), and take a job as an intern at a mocap facility - one that is actually doing work daily, as most systems simply sit waiting for work.

 

Cheers,


Jumping back a moment to jthree, it seems to be more aimed at non-programmers. The getting-started article shows setting up a scene declaratively in an HTML-like language:

<goml>
    <head>
        <txr id="txr1" src="img/earth.jpg" />
        <geo id="geo1" type="Sphere" param="10 64 64" />
        <mtl id="mtl1" type="MeshPhong" param="map: #txr1; color: #0ff; specular: #fff;" />
        <rdr frame="body" camera="camera:first" param="antialias: true; clearColor: #fff;" />
    </head>
    <body>
        <scene>
            <mmd model="model/miku/index.pmx" />
            <mesh geo="#geo1" mtl="#mtl1" style="positionY: -10;"></mesh>
            ...

It looks like it's a project with quite different goals from BJS.


I have no code to parse real-time mocap to a browser.  It is all in different plugins, each specific to a rendering package.  I could send you a plugin for MotionBuilder, written in C++, that runs peer-to-peer from one MotionBuilder session to another remotely, but I doubt many people would be able to read the code.  If you're up for it, I'm sure I have it on a disk.  Let me know - but beware, it took 8 months to write and QA.


If you can understand this, then I can post more.  Good info though - written 15 years ago, but the math is as valid now as it was then.

 

/*****************************************************************************
 * Sources:
 * Shoemake, Ken, "Animating Rotations with Quaternion Curves"
 * Computer Graphics 85, pp. 245-254
 * Watt and Watt, Advanced Animation and Rendering Techniques
 * Addison Wesley, pp. 360-368
 *  Shoemake, Graphic Gems II.
 *
 *****************************************************************************/
 
/*****************************************************************************
 * Copyright 1997 Jeff Lander, All Rights Reserved.
 *  For educational purposes only.
 *  Please do not republish in electronic or print form without permission
 *  Thanks - [email protected]
 *****************************************************************************/
 
#include "stdafx.h"
#include <math.h>
#include "amQuaternion.h"
 
#define M_PI        3.14159265358979323846
#define HALF_PI    1.57079632679489661923
 
void CopyVector(tVector *dest, tVector *src)
{
dest->x = src->x;
dest->y = src->y;
dest->z = src->z;
}
 
void ScaleVector(tVector *vect, float scale)
{
vect->x *= scale;
vect->y *= scale;
vect->z *= scale;
}
 
void AddVectors(tVector *vect1, tVector *vect2, tVector *dest)
{
dest->x = vect1->x + vect2->x;
dest->y = vect1->y + vect2->y;
dest->z = vect1->z + vect2->z;
}
 
float DotVectors(tVector *vect1, tVector *vect2)
{
return (vect1->x * vect2->x) + 
(vect1->y * vect2->y) + 
(vect1->z * vect2->z);
}
 
void CrossVectors(tVector *vect1, tVector *vect2, tVector *dest)
{
// COMPUTE THE CROSS PRODUCT
dest->x = (vect1->y * vect2->z) - (vect1->z * vect2->y);
dest->y = (vect1->z * vect2->x) - (vect1->x * vect2->z);
dest->z = (vect1->x * vect2->y) - (vect1->y * vect2->x);
}
 
void MultQuaternions(tQuaternion *quat1, tQuaternion *quat2, tQuaternion *dest)
{
tQuaternion v1,v2,v3,vf;
 
CopyVector((tVector *)&v1, (tVector *)quat1); // COPY OFF THE VECTOR PART OF THE QUAT1
ScaleVector((tVector *)&v1,quat2->w); // MULTIPLY IT BY THE SCALAR PART OF QUAT2
 
CopyVector((tVector *)&v2, (tVector *)quat2); // COPY OFF THE VECTOR PART OF THE QUAT1
ScaleVector((tVector *)&v2,quat1->w); // MULTIPLY IT BY THE SCALAR PART OF QUAT2
 
CrossVectors((tVector *)quat2,(tVector *)quat1,(tVector *)&v3);
 
AddVectors((tVector *)&v1, (tVector *)&v2, (tVector *)&vf);
AddVectors((tVector *)&v3, (tVector *)&vf, (tVector *)&vf);
 
vf.w = (quat1->w * quat2->w) - DotVectors((tVector *)quat1,(tVector *)quat2);
 
dest->x = vf.x;
dest->y = vf.y;
dest->z = vf.z;
dest->w = vf.w;
}
 
/* AN OPTIMIZATION/REORGANIZATION OF ABOVE CODE - NOT AS CLEAR 
   I THINK THIS IS SIMILAR TO GRAPHIC GEMS THOUGH I DON'T HAVE THE REF HANDY
   THE MATH CHECKS OUT THOUGH */
void MultQuaternions2(tQuaternion *quat1, tQuaternion *quat2, tQuaternion *dest)
{
    tQuaternion tmp;
    tmp.x = quat2->w * quat1->x + quat2->x * quat1->w +
quat2->y * quat1->z - quat2->z * quat1->y;
    tmp.y  = quat2->w * quat1->y + quat2->y * quat1->w +
quat2->z * quat1->x - quat2->x * quat1->z;
    tmp.z  = quat2->w * quat1->z + quat2->z * quat1->w +
quat2->x * quat1->y - quat2->y * quat1->x;
    tmp.w  = quat2->w * quat1->w - quat2->x * quat1->x -
quat2->y * quat1->y - quat2->z * quat1->z;
    dest->x = tmp.x; dest->y = tmp.y;
    dest->z = tmp.z; dest->w = tmp.w;
}
 
// Discussion:  A unit quaternion must follow the rule x^2 + y^2 + z^2 + w^2 = 1
// This function ensures this
void NormalizeQuaternion(tQuaternion *quat)
{
float magnitude;
 
// FIRST STEP, FIND THE MAGNITUDE (NOTE THE SQUARE ROOT - WITHOUT IT
// WE WOULD DIVIDE BY THE SQUARED LENGTH AND NOT ACTUALLY NORMALIZE)
magnitude = (float)sqrt((quat->x * quat->x) + 
(quat->y * quat->y) + 
(quat->z * quat->z) + 
(quat->w * quat->w));
 
// DIVIDE BY THE MAGNITUDE TO NORMALIZE
quat->x = quat->x / magnitude;
quat->y = quat->y / magnitude;
quat->z = quat->z / magnitude;
quat->w = quat->w / magnitude;
}
 
// Uses (X,Y,Z) order
 
void EulerToQuaternion(tVector *rot, tQuaternion *quat)
{
float rx, ry, rz,
tx, ty, tz,
cx, cy, cz,
sx, sy, sz,
cc, cs, sc, ss;
 
// FIRST STEP, CONVERT ANGLES TO RADIANS
rx =  (rot->x * (float)M_PI) / (360 / 2);
ry =  (rot->y * (float)M_PI) / (360 / 2);
rz =  (rot->z * (float)M_PI) / (360 / 2);
 
// GET THE HALF ANGLES
tx = rx * (float)0.5;
ty = ry * (float)0.5;
tz = rz * (float)0.5;
 
cx = (float)cos(tx);
cy = (float)cos(ty);
cz = (float)cos(tz);
sx = (float)sin(tx);
sy = (float)sin(ty);
sz = (float)sin(tz);
 
cc = cx * cz;
cs = cx * sz;
sc = sx * cz;
ss = sx * sz;
 
quat->x = (cy * sc) - (sy * cs);
quat->y = (cy * ss) + (sy * cc);
quat->z = (cy * cs) - (sy * sc);
quat->w = (cy * cc) + (sy * ss);
 
// INSURE THE QUATERNION IS NORMALIZED
// PROBABLY NOT NECESSARY IN MOST CASES
NormalizeQuaternion(quat);
}
 
// A second variation.  Creates a series of quaternions and multiplies
// them together. Would be easier to extend this for other rotation orders
 
void EulerToQuaternion2(tVector *rot, tQuaternion *quat)
{
float rx,ry,rz,ti,tj,tk;
tQuaternion qx,qy,qz,qf;
 
// FIRST STEP, CONVERT ANGLES TO RADIANS
rx =  (rot->x * (float)M_PI) / (360 / 2);
ry =  (rot->y * (float)M_PI) / (360 / 2);
rz =  (rot->z * (float)M_PI) / (360 / 2);
// GET THE HALF ANGLES
ti = rx * (float)0.5;
tj = ry * (float)0.5;
tk = rz * (float)0.5;
 
qx.x = (float)sin(ti); qx.y = 0.0; qx.z = 0.0; qx.w = (float)cos(ti);
qy.x = 0.0; qy.y = (float)sin(tj); qy.z = 0.0; qy.w = (float)cos(tj);
qz.x = 0.0; qz.y = 0.0; qz.z = (float)sin(tk); qz.w = (float)cos(tk);
 
MultQuaternions(&qx,&qy,&qf);
MultQuaternions(&qf,&qz,&qf);
 
// ANOTHER TEST VARIATION
// MultQuaternions2(&qx,&qy,&qf);
// MultQuaternions2(&qf,&qz,&qf);
 
// INSURE THE QUATERNION IS NORMALIZED
// PROBABLY NOT NECESSARY IN MOST CASES
NormalizeQuaternion(&qf);
 
quat->x = qf.x;
quat->y = qf.y;
quat->z = qf.z;
quat->w = qf.w;
 
}
 
void QuatToAxisAngle(tQuaternion *quat, tQuaternion *axisAngle)
{
float scale,tw;
 
tw = (float)acos(quat->w) * 2;
scale = (float)sin(tw / 2.0);
 
// GUARD AGAINST A DIVIDE BY ZERO FOR THE IDENTITY ROTATION (ANGLE = 0,
// WHERE THE AXIS IS ARBITRARY - PICK THE X AXIS)
if (scale == 0.0f)
{
axisAngle->x = 1.0f;
axisAngle->y = 0.0f;
axisAngle->z = 0.0f;
}
else
{
axisAngle->x = quat->x / scale;
axisAngle->y = quat->y / scale;
axisAngle->z = quat->z / scale;
}
 
// NOW CONVERT THE ANGLE OF ROTATION BACK TO DEGREES
axisAngle->w = (tw * (360 / 2)) / (float)M_PI;
}
 
#define DELTA 0.0001 // DIFFERENCE AT WHICH TO LERP INSTEAD OF SLERP
 
void SlerpQuat2(tQuaternion *quat1, tQuaternion *quat2, float slerp, tQuaternion *result)
{
tQuaternion quat1b;
double omega, cosom, sinom, scale0, scale1;
 
// USE THE DOT PRODUCT TO GET THE COSINE OF THE ANGLE BETWEEN THE
// QUATERNIONS
cosom = quat1->x * quat2->x + 
quat1->y * quat2->y + 
quat1->z * quat2->z + 
quat1->w * quat2->w; 
 
// MAKE SURE WE ARE TRAVELING ALONG THE SHORTER PATH
if (cosom < 0.0)
{
// IF WE ARE NOT, REVERSE ONE OF THE QUATERNIONS
cosom = -cosom;
quat1b.x = - quat1->x;
quat1b.y = - quat1->y;
quat1b.z = - quat1->z;
quat1b.w = - quat1->w;
} else {
quat1b.x = quat1->x;
quat1b.y = quat1->y;
quat1b.z = quat1->z;
quat1b.w = quat1->w;
}
 
 
if ((1.0 - cosom) > DELTA) {
omega = acos(cosom);
sinom = sin(omega);
scale0 = sin((1 - slerp) * omega) / sinom;
scale1 = sin(slerp * omega) / sinom;
} else {
scale0 = 1.0 - slerp;
scale1 = slerp;
}
 
// USE quat1b (THE POSSIBLY FLIPPED COPY) SO THE SHORT-PATH FIX ABOVE TAKES EFFECT
result->x = (float)(scale0 * quat1b.x + scale1 * quat2->x);
result->y = (float)(scale0 * quat1b.y + scale1 * quat2->y);
result->z = (float)(scale0 * quat1b.z + scale1 * quat2->z);
result->w = (float)(scale0 * quat1b.w + scale1 * quat2->w);
}
 
void SlerpQuat(tQuaternion *quat1, tQuaternion *quat2, float slerp, tQuaternion *result)
{
double omega, cosom, sinom, scale0, scale1;
 
// USE THE DOT PRODUCT TO GET THE COSINE OF THE ANGLE BETWEEN THE
// QUATERNIONS
cosom = quat1->x * quat2->x + 
quat1->y * quat2->y + 
quat1->z * quat2->z + 
quat1->w * quat2->w; 
 
// CHECK A COUPLE OF SPECIAL CASES. 
// MAKE SURE THE TWO QUATERNIONS ARE NOT EXACTLY OPPOSITE? (WITHIN A LITTLE SLOP)
if ((1.0 + cosom) > DELTA)
{
// ARE THEY MORE THAN A LITTLE BIT DIFFERENT? AVOID A DIVIDED BY ZERO AND LERP IF NOT
if ((1.0 - cosom) > DELTA) {
// YES, DO A SLERP
omega = acos(cosom);
sinom = sin(omega);
scale0 = sin((1.0 - slerp) * omega) / sinom;
scale1 = sin(slerp * omega) / sinom;
} else {
// NOT A VERY BIG DIFFERENCE, DO A LERP
scale0 = 1.0 - slerp;
scale1 = slerp;
}
result->x = (float)(scale0 * quat1->x + scale1 * quat2->x);
result->y = (float)(scale0 * quat1->y + scale1 * quat2->y);
result->z = (float)(scale0 * quat1->z + scale1 * quat2->z);
result->w = (float)(scale0 * quat1->w + scale1 * quat2->w);
} else {
// THE QUATERNIONS ARE NEARLY OPPOSITE SO TO AVOID A DIVIDED BY ZERO ERROR
// CALCULATE A PERPENDICULAR QUATERNION AND SLERP THAT DIRECTION
result->x = -quat2->y;
result->y = quat2->x;
result->z = -quat2->w;
result->w = quat2->z;
scale0 = sin((1.0 - slerp) * (float)HALF_PI);
scale1 = sin(slerp * (float)HALF_PI);
result->x = (float)(scale0 * quat1->x + scale1 * result->x);
result->y = (float)(scale0 * quat1->y + scale1 * result->y);
result->z = (float)(scale0 * quat1->z + scale1 * result->z);
result->w = (float)(scale0 * quat1->w + scale1 * result->w);
}
 
}
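For anyone who wants the same shortest-path interpolation in the browser, here is a direct JavaScript port of the SlerpQuat2 routine above (note that Babylon.js also ships its own built-in Quaternion slerp, which you would normally use instead of rolling your own):

```javascript
// Spherical linear interpolation between two quaternions {x, y, z, w},
// ported from the C SlerpQuat2 routine in the post above.
function slerp(q1, q2, t) {
  const DELTA = 0.0001; // threshold below which we fall back to a lerp

  let cosom = q1.x * q2.x + q1.y * q2.y + q1.z * q2.z + q1.w * q2.w;
  let a = q1;
  // Travel the shorter arc: flip one quaternion if the dot product is negative.
  if (cosom < 0) {
    cosom = -cosom;
    a = { x: -q1.x, y: -q1.y, z: -q1.z, w: -q1.w };
  }

  let s0, s1;
  if (1 - cosom > DELTA) {
    const omega = Math.acos(cosom);
    const sinom = Math.sin(omega);
    s0 = Math.sin((1 - t) * omega) / sinom;
    s1 = Math.sin(t * omega) / sinom;
  } else {
    // Nearly identical orientations: a plain lerp avoids dividing by ~0.
    s0 = 1 - t;
    s1 = t;
  }

  return {
    x: s0 * a.x + s1 * q2.x,
    y: s0 * a.y + s1 * q2.y,
    z: s0 * a.z + s1 * q2.z,
    w: s0 * a.w + s1 * q2.w
  };
}

// Halfway between identity and a 180-degree turn about Y:
const mid = slerp({ x: 0, y: 0, z: 0, w: 1 }, { x: 0, y: 1, z: 0, w: 0 }, 0.5);
console.log(mid.y, mid.w); // both ≈ 0.7071
```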