Viewing a z-stack and surfaces


ulisse

Dear all,
I am new to this forum and to this topic so I apologize in advance for any wrong terminology.
I am writing an HTML5+JS application displaying a 3D image that has been acquired as a series of 2D slices, each at a different depth, from a microscope.
On top of that "z-stack" I would also like to add some meshes/rendered surfaces (computer-created geometries).
Do you think this is possible using Babylon.js?
The idea is to show something like the image below, where the red part is the z-stack and the green parts are rendered meshes.

[attached image: the z-stack (red) with rendered surfaces (green)]

Many thanks and best regards,
Ulisse.


Hello Deltakosh, and thank you for your reply.
Actually, I already have an algorithm to generate surfaces (as a set of triangles) from 2D slices by applying segmentation and object reconstruction.
The most difficult part for me seems to be visualizing the stack of 2D slices, for which I cannot find an example.
Can you suggest something to look at for this purpose?

There are a number of commercial packages such as Imaris, Volocity or Metamorph (all very expensive) and open-source alternatives like FIJI, ICY and ImageJ, all of them based on VTK and OpenGL and distributed as classical desktop applications.
I am using some of them (both commercial and open source), but I would like to provide this functionality in an HTML5 application, and for this reason I am looking for a way to visualize my raw data with superimposed surfaces using Babylon.js.


Hi Ulisse, welcome to the forum.  I'm no expert, but I want to tell you that BJS basic shapes... are generated-by and/or stored-upon a VertexData object.  There are many methods on a BABYLON.Mesh class... to work-with (get/set)... a mesh's associated VertexData object (its data arrays).  Often, programmers start with a blank mesh, then "stock" a vertexData object with data, and then apply the vertexData to the blank mesh.

Here is a playground scene that does some basic mesh plotting (with little helper boxes turned-on).

http://www.babylonjs-playground.com/#1UHFAP#67

You can see the blank mesh created in line 5, and a vertexData object being applied to that blank mesh... in line 101.
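The general pattern (just a minimal sketch, not the exact playground code; the positions/indices arrays are placeholder data, and an existing BABYLON scene is assumed) looks something like this:

    // Rough sketch of the blank-mesh + vertexData pattern (assumes a scene already exists).
    var mesh = new BABYLON.Mesh("surface", scene);        // blank mesh, no geometry yet

    var positions = [0, 0, 0,   1, 0, 0,   0, 1, 0];      // x,y,z per vertex (placeholder data)
    var indices   = [0, 1, 2];                            // one triangle referencing those vertices
    var normals   = [];
    BABYLON.VertexData.ComputeNormals(positions, indices, normals);

    var vertexData = new BABYLON.VertexData();
    vertexData.positions = positions;
    vertexData.indices   = indices;
    vertexData.normals   = normals;

    vertexData.applyToMesh(mesh);                         // "stock" the blank mesh with the data

Your own triangles from the reconstruction algorithm would go into those positions/indices arrays.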

Feel free to edit, run, save, get zip, anything you like... in the Babylon Playground.  You cannot hurt anything... have fun.

-------------------------

Another thing to look-at... might be Extrusion.

Can you answer this?  Will the number of vertex points around the outer edge of each slice... remain the same... for all cross-section slices (of a single object)?

In other words, can/will the surface resolution (sampling detail?) change... slice-to-slice?

As far as I know, if the number of vertices remains the same... slice-after-slice, then the project will be a bit easier.  Extrusion is an option, perhaps.

Again, I'm no expert.  Just thinking about things.  :)


On 10/21/2016 at 6:04 PM, Wingnut said:

 

Many thanks for your reply!
The example in the playground is perfect!! I have the position of each vertex and the connections between them (triangles).

For the second part, I think the surface resolution can be the same for all the objects in the volume, but they have strange shapes, so in different slices (or planes) I have a different number of vertices.
I am working with two-photon microscopy data, which is nothing more than a set of parallel images, each acquired at a different depth.
Imagine an apple sliced along many different planes: each plane will have a different shape.

Best regards,
Diego Ulisse Pizzagalli.


My pleasure.

Can I ask some questions?  I hope so.

When these images are "gathered", is the distance between each "sample" (image)... adjustable?  Let's use the apple example again.  Let's pretend the slices travel along the Z-axis (depth).  Can you set the z-resolution when making the slices?  (How many images per inch of z-travel into the apple).

This value... this z-axis granularity or resolution... would determine the "thickness" of each slice... once in 3D land, yes?

If not-calibrated, our 3D representation of the apple... might appear to be 16 inches thick.  :)

Ok, this takes me to another question.  Do the images support transparency/alpha (transparent background)? 

(I'm really really not experienced with textures AT ALL, sorry).  I have not used BJS extrusion much/any, either.  But... I have done a little playing with heightMaps and displaceMaps... which are high-subdiv planes that get their elevations... from gray-scale images.

http://www.babylonjs-playground.com/#1CCIP1#4

That is a simple displacement map, with some code "borrowed" from BJS framework... and moved into the playground editor... to be hacked-on.  :)
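If you want to experiment with that, the stock helpers look roughly like this (a sketch only; "slice.png" is a hypothetical gray-scale image, and an existing scene is assumed):

    // Rough sketch: a high-subdiv plane whose elevations come from a gray-scale image.
    // Bright pixels end up high, dark pixels low.
    var ground = BABYLON.Mesh.CreateGroundFromHeightMap(
        "slice", "slice.png",
        10, 10,          // width, depth of the plane
        250,             // subdivisions (the "high-subdiv" part)
        0, 1,            // minHeight, maxHeight... i.e. the slice "thickness"
        scene
    );

    // Same idea applied to an existing mesh via displacement:
    // existingPlane.applyDisplacementMap("slice.png", 0, 1);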

I guess displacing is somewhat similar to extruding.  How to clip-off the "slag", though, huh?  How do we make the displaceMap/heightMap... no longer be "square"... but instead have vertices mapped along the edges of the non-displaced part of the sample?  hmm.

https://www.youtube.com/watch?v=jYbrQ-0djt0

The Red Green Show... look at the section from 0:25 - 0:40 (the show's opening animation).  See those "cutouts"?  I love them!  Those "cutouts" could be done with image textures mapped-onto BJS planes (with alpha/transparent backgrounds)... but... they are cooler IF they have SOME thickness.  That thickness... is the same value as your z-resolution per slice.  (slice thickness).  Silhouettes! 
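Just to sketch that idea (not tested; sliceUrls and sliceThickness are made-up names, a scene is assumed, and each image would need a transparent background):

    // Rough sketch: stack the slice images as textured planes, spaced by the z-resolution.
    var sliceThickness = 0.3;                                  // scene units per slice (your z-granularity)
    sliceUrls.forEach(function (url, i) {
        var plane = BABYLON.MeshBuilder.CreatePlane("slice" + i, { size: 5 }, scene);
        var mat = new BABYLON.StandardMaterial("sliceMat" + i, scene);
        mat.diffuseTexture = new BABYLON.Texture(url, scene);
        mat.diffuseTexture.hasAlpha = true;                    // cut away the transparent background
        mat.backFaceCulling = false;                           // visible from both sides
        plane.material = mat;
        plane.position.z = i * sliceThickness;                 // depth = slice index * thickness
    });

That only gives paper-thin silhouettes, though... the "thickness" part would still be missing.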

"Took a walk down past your house... late last night.  All the shades were pulled and drawn... way down tight.  From within a dim light cast... two silhouettes on the shade.  Oh what a lovely... couple they made."

Sorry, I accidentally broke-into singing an old 50's song, there.  :) 

Can you get, or do you have... the... umm... "alpha channel" for each slice... so we can make a black and white silhouette image from each slice?  (Sorry for my likely-wrong terminology, here)

Ideally, just two colors... black and white.  No grays.  Now we have an image that we could do some displaceMap or heightMap experiments-with.

Also, you might hate to hear this, but this might be a job for a custom shader. 

What if...  hmm.  What if you never really made any mesh... never plotted any vertices... but the GPU made it LOOK LIKE you did?  Have you looked around on the web... for an "image thickener" shader?  hehe.  I wonder if such a thing exists.  I bet it does.  You didn't plan on shooting photon torpedoes at these composited mesh-slices, did you?  :)  You won't be needing intersectsMesh, or other video game mesh-collision things, right?

So, hmm.  Perhaps you could FOOL the user into thinking each slice-image has some thickness... with a shader.  hmm.  (PS.  I don't code shaders... they scare me.)  The slice images would still be best... if the background was all-white or all-black.  Then the shader could avoid thickening (or displaying) that portion of the image.

I'm just talking (out my butt, as usual).  :D  Thinkin'.  I love that Red Green Show intro... with the silhouette image "cutouts" or whatever they are called.  I wonder if such things... have an "official" name.  hmm.  Comments welcome from all... if that's okay-with ulisse.  I should go learn about Extrusion... see how it works.  Talk soon... party-on.  One more goofy displaceMap playground?  Okay.  http://www.babylonjs-playground.com/#P9UZG#9  Found via our trusty playground searcher.  Yay!


9 hours ago, Wingnut said:

 

Hi, and thanks again for your detailed post. I will try to reply to your points:
1) Yes, all the slices (or cutouts) have the same thickness, and I know it. The voxel size is the same for each point in the 3D volume (say each voxel is like a parallelepiped of size 0.5um x 0.5um x 3um).
2) Our microscope acquires images as grayscale, and it has different acquisition channels, so I have a grayscale image for red, a grayscale image for blue, a grayscale image for green, a grayscale image for infrared, and so on.
An alpha map is usually defined by asking the user to set a threshold on the background, so that all the voxels with an intensity below that threshold become transparent and all the voxels above it are colored (see the sketch after this list).
I am visualizing these images in Matlab, where I combine the channels into an aRGB 3D image and use a 3D volume viewer function (vol3d) to display it, but I would like to go for a web-based approach.
3) If I understood correctly, the idea would be to have some "cutouts" and then extrude them to give them a thickness, isn't it? If that is possible, I think it is really great!!
4) About shaders and meshes: it sounds good to use a shader, but I really have zero expertise in that. I need to visualize some computer-generated geometries on top of "image points" because such geometries are created by the commonly used cell detection and surface reconstruction algorithms.
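As a rough illustration of point 2 (plain canvas JS; thresholdToAlpha and its arguments are made-up names for the sketch):

    // Rough sketch: turn one gray-scale slice (drawn on a canvas) into an RGBA image
    // where every pixel below the chosen threshold becomes fully transparent.
    function thresholdToAlpha(sliceCanvas, threshold) {
        var ctx = sliceCanvas.getContext("2d");
        var img = ctx.getImageData(0, 0, sliceCanvas.width, sliceCanvas.height);
        var d = img.data;                                  // [r, g, b, a, r, g, b, a, ...]
        for (var i = 0; i < d.length; i += 4) {
            d[i + 3] = d[i] < threshold ? 0 : 255;         // alpha from the gray value (red channel)
        }
        ctx.putImageData(img, 0, 0);
        return sliceCanvas;                                // can then be used as a texture source
    }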

As you can see, I really have zero expertise in 3D computer graphics and game development; I am more on the algorithmic side, related to image processing, machine learning and data mining.
For this reason I am using commercial imaging software to display the data (raw 3D images) and the results (computer-generated geometries), which however lags like crazy despite using a resolution of 512px x 512px x 30 slices on a workstation with 256GB of RAM, 24 Xeon cores and an nVidia Quadro card with 8GB of GDDR5 graphics memory and 2880 CUDA cores!! (I haven't said before that our data are also acquired over time, but that would be another story; let's start from static images first.)
For this reason I was thinking that, from a game developer community, it may be possible to get something to improve the visualization of these biomedical data, which have far simpler geometries and textures than a 3D game, in my opinion.
Thanks again and best regards,
Ulisse.

