MRI data in Babylon

Hi @Deltakosh  @Wingnut,

        Is it possible to load MRI data as a 3D model using BJS? Suppose I have MRI data of a brain as a sequence of images; can I place all those slices together and create a model? Also, I want to be able to select individual slices separately after the model is created.




-Raghavender Mylagary



Hi again RM.  I don't think that is possible yet, with BJS core code.  You would need to write your own code/extension.

I think Blender CAN turn MRI slices into a single mesh, but, when you import it into BJS, it will still be a single mesh, and you will not be able to slice the mesh very easily... while live in a BJS scene.

Let's pretend that you COULD remove 50% of the from-Blender full-skull mesh (the camera-facing half).  Let's also pretend you have 100 frames of MRI scans.

After the user halved the mesh, you would need to apply the 51st frame of the MRI image data... as a texture... on that new flat side of the skull.

Similarly, if the user removed 25% of the Blender-created whole-mesh (somehow, maybe with BJS CSG)... then you would need to apply the 26th frame of the MRI data... to that flat face of the skull mesh.  This is because the single skull mesh from Blender... is likely hollow.  No internals to the skull.
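The cut-fraction-to-frame arithmetic above (50% removed → 51st frame, 25% removed → 26th frame) can be sketched as a small helper. This is a hypothetical function, not a BJS API; the result would feed whatever texture-assignment code paints the exposed flat face:

```javascript
// Hedged sketch: given the fraction of the skull the user has cut away
// (0.25 = camera-facing quarter removed) and the number of MRI frames
// in the stack, pick which frame to paint on the newly exposed face.
// Frame numbering is 1-based, matching the post above.
function frameForCut(cutFraction, totalFrames) {
  const index = Math.floor(cutFraction * totalFrames) + 1;
  // Clamp so a 100% cut still returns the last valid frame.
  return Math.min(index, totalFrames);
}
```

For example, `frameForCut(0.5, 100)` gives frame 51, and `frameForCut(0.25, 100)` gives frame 26, matching the two cases described above.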

@Nesh108 is working on a "sprite thickener" which uses edge-detection to find edges, and then... makes a real mesh... from a transparent-background or black-background image/sprite.  This seems somewhat applicable to YOUR project.

Do you see the "artifacts" (gray debris) that happen in space... left of the eyeballs... about 2/3 into your animation?  I think that might screw-up Nesh's edge finder.  I think Nesh's edge-finder might need VERY clean edges... and I don't know if that MRI data has edges that are clean enough.

And Nesh's thing definitely takes some code.  I'm not sure if this is the correct approach for your cross-sectioned visible-skull project.

Thickness of each slice-mesh is very important.  Nesh's thing ONLY thickens sprites.  I don't think he ever planned on "stacking" these extrudes... in an attempt to recreate a skull's roundness.  Maybe his code is ready for that challenge, but, I know it was never intended to do such things as MRI slices.  Perhaps he will comment.

I did some web reading about this, looking at folks/companies who are building MRI-to-3D viewers.  That is when I saw someone's video... using Blender to create a model from MRI data. 

BUT... it wasn't ONE mesh-per-slice... it was one mesh for ALL slices, I think.  Ideally, you would need one slightly-extruded mesh per slice.  So, for 100 frames of MRI data, Blender would return 100 flat meshes, which could then be z-scaled/thickened (if necessary) and stacked/unstacked in a BJS scene.  Quite a long procedure, and not ideal.

Perhaps others will have more ideas.  Sorry that I only have bad news.  I will keep thinking.


Having got to a difficult place with my own project, I thought yours was interesting. Converting each slice to meshes representing parts of the head is way, way beyond me. However, like @Wingnut, I too thought about the project Nesh108 was working on, so I had a go, albeit a simplified version, using just one plane per slice. I used Adobe ImageReady to separate out the slices and realised that the 'black' areas would need to be transparent. As loading images into a canvas creates CORS problems, I have had to host my attempt on GitHub. It's a starter rather than any finished project, but I present it for what it's worth.


When the head (and I use the term loosely) has loaded, type a number from 0 to 127 in the box in the top left and that slice will pop out.

Lots of rough edges (and they are very noticeable).
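A minimal sketch of the arithmetic behind this plane-per-slice layout, with a selected slice offset so it "pops out". The function name and the pop-out-along-x choice are assumptions for illustration, not anything from the demo itself; the resulting values would be assigned to each plane's `position` in BJS:

```javascript
// Hedged sketch: compute the z position of each slice plane (stack
// centred on z = 0) and shift the selected slice sideways so it pops
// out of the stack. In BJS you would then do something like
//   plane.position.z = positions[i].z; plane.position.x = positions[i].x;
function slicePositions(sliceCount, spacing, poppedIndex, popOffset) {
  const positions = [];
  for (let i = 0; i < sliceCount; i++) {
    const z = (i - (sliceCount - 1) / 2) * spacing;
    positions.push({ z: z, x: i === poppedIndex ? popOffset : 0 });
  }
  return positions;
}
```

With 128 slices (indices 0 to 127, as in the demo), `slicePositions(128, 0.05, 10, 2)` leaves every plane at x = 0 except slice 10, which sits 2 units out of the stack.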



Coooooool!  Not bad at all.  In fact, I pretty much love it!  Way nice.

Remember that damned bird that's been flyin' around here?


That's something.  It almost feels like the MRI data should be converted to JSON... with middleware... and not "live".  Then import the .json (just like the bird shape).

Or... hmm... bring it in as "morph targets"!  (woah, I just blew my own mind!)  :wacko:

MorphTargets are new to BJS, and it would be very fun to play with those.  hmm.  This would be a mis-use or bastardization of morphTarget data, but I don't see any cops around.  ;)

For each frame of a "morph animation", the "delta" is applied.  (What?)  At NASA and maybe other places, a "delta" is a "tweak"... a change-from-normal or change-since-last-change.  Often, at bedtime aboard the shuttle, ground control would say "Okay, we have no deltas for you, so, goodnight".  It meant... "we have no switches, knobs or other settings/tasks for you to perform."  :)  No deltas.  This would also imply that a "delta kosh" is a kosh of change-since-last-time.  An adjusting-Kosh.  A tweaker Kosh.  (What a boring story, Wingy, geez)

This is what morphTargets are, as best I understand them.  Each frame has the same number of points, and each morphTarget is the amount-of-change-in-vertex-position... since the last frame.  The "delta"... the change amount and direction of the vertex (since last time).  In a way, a vert scaling (I think).  I think the BJS particleSystem uses a scalePerTime thing... for flying the particles.
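The delta idea above can be sketched in a few lines. These are hypothetical helper functions, not the actual BJS morphTarget API; they just show that a delta is the per-component change between two frames of vertex positions, and that applying it to one frame recovers the next:

```javascript
// Hedged sketch: frames are flat [x, y, z, x, y, z, ...] arrays of
// equal length. The delta between frame A and frame B is the
// per-component difference; applying that delta to A gives back B.
function computeDeltas(frameA, frameB) {
  return frameA.map((v, i) => frameB[i] - v);
}
function applyDeltas(frame, deltas) {
  return frame.map((v, i) => v + deltas[i]);
}
```

So a "no deltas" frame is just an all-zero array, and a stored animation only needs the changes, not every full frame.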

Picture this.  Spray particles for awhile, then freeze 'em.  Now make a mesh from all the outer-edge-particle-positions of the particle-cloud.  Enclose and volumize the cloud.  Wow!  Cooooool.  The water fountain freezes solid... instantly!  It's so... DISNEY!!!  heh

Anyway, I hope my use of the term "delta" doesn't collide with any other usages of that term.  Other terms... wholeheartedly welcomed.

And... you know... using morphTarget-like data packed into a JSON file...and then treated like morphTarget data by BJS... is not necessary.  But... it seems to be an interesting workflow and allows us to play-with morphTargets.  Maybe fun!

But again, I think MRI-to-JSON-BirdDataFrames ... should be done with middleware... pre-processing.  Get it to json before the import to BJS... perhaps.

John, if you needed to run "cleanup" software on your model(s)... you would want it running at full machine-language speeds, I would think.  Cleaning that with smart JS... might take about a week before observing scene.isReady.  :)


@Raghavender Mylagary: You might want to look at 3D Slicer. You can load images and create a 3D model, which can be automatically cleaned up and exported in .stl format.

3DSlicer and stl export

@JohnK: Yes, the images can be noisy, which means clean-up can take some time.

Been a long time since I used magnetic resonance - first thing I ever saw was a magnetic resonance image of a lemon, and wondered where it would go.

cheers, gryff :)


@Raghavender Mylagary heya! (thanks for the ping @Wingnut)

You know, I actually worked for a company using MRI scans back in the day, so I was going to suggest some sort of solution but @JohnK's example is already pretty much there! Good job, it's a pretty sweet "starter" :D 

@Raghavender Mylagary starting from JohnK's example, adding a slider (or even an automatic play forward/backward) would be trivial and then you just need to start tweaking the background, add some GUI and you are done!

