JCPalmer

Members
  • Content count

    2,367
  • Joined

  • Last visited

  • Days Won

    9

Everything posted by JCPalmer

  1. JCPalmer

    Blender > .babylon Vertex Groups

    In the middle of something today. Will look at this soon.
  2. JCPalmer

    Blender > .babylon Vertex Groups

    Saw the pg, and downloaded the last file, but did not diff it against the repo. As it is the largest source file in the add-on, I would need to diff to find the lines added / changed. Not a big deal in Netbeans. I do not have a use for it right now in my own work, but now that I know it is an option, uses might come up. If you are thinking about a PR, a checkbox in the custom properties (default false) would definitely be needed, since if you are not using it, it could really increase the size of the export file.
  3. JCPalmer

    Status of Camera rig VR

    I am not sure there is really a need to be able to record (webm) when using the VR rig. One of the few 3D videos on YouTube has gotten 2.8 million views, so I would definitely want to get some of that. I have not seen any videos with a barrel distortion. I do not have the hardware. Can you even watch stuff with a "phone on your head"? I recall something during the winter Olympics. If trying to record with the VR rig is a waste of time, please let me know. If this is desirable output, basically nothing comes out when canvas.toDataURL('image/webp', quality); is called on Chrome when this rig is used. I suspect that it is due to the fact that each of the sub-cameras writes to its own viewport, and that is not taken into account. Really glad this is not a problem for the stereo rigs I added, or I might have screwed myself. When I tried the newer WebVR rig, it did output, but only a single screen. Is the VR rig just around for backward compatibility? Might it be ok to mess with the old VR rig, maybe add another post process, if I can think of a way? FYI, @Wingnut & @Deltakosh, I cringe every time RIG_MODE_STEREOSCOPIC_SIDEBYSIDE_PARALLEL is mentioned as a solution when some device does not support WebVR. The stereo rigs are interleaving, meaning that in the doubled dimension only every other pixel is printed. It gives each side a squished look, but on a 3D TV it widens out to the original dimensions. Something for these devices would need to crop the left & right sides of each image to really work.
  4. JCPalmer

    Status of Camera rig VR

    Thanks, but no, I had not. I can say that I am all about control & that looks to have none (no frame rate, no quality, no resolution). I just completed using toDataURL(). It is the only one in which you can control quality. In the 1.6 sec clip below, the .webm + .wav files' combined size is a MASSIVE 8,858 kb. That is a lot for so small a clip, but when the multi-pass VP9 codec convert & sound track merge is done by ffmpeg, it is only 277 kb. As I am merging the consolidated soundtrack afterward anyway, giving ffmpeg the most crisp frames as a source to encode as VP9 or H264 is very desirable. It takes a lot of RAM, but I have 16 gb & room for 16 more. The annotations in the cropped black bars were supposed to be just a joke, but it is really helpful to bake settings right into the video during dev. You can easily mix up your files without knowing. Am now starting to work on a clip with actual talking; work on the recording code is done, unless I find something. The alpha for VR is probably in the cameras, not the background, thinking about it. Going to throw VR under the bus. Actually, YouTube can show 360 videos. Not going to attempt this right now, but I wonder about having a rig with say 300 cameras & viewports. The VR distortion on the combined output is probably wrong for this, though. side-by-side-vp9.webm
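    The convert & merge step described above could be sketched roughly like the following two-pass ffmpeg invocation. The input file names and the rate-control numbers (-crf 30, -b:v 0 for constant-quality mode) are illustrative assumptions, not the actual commands used; only the output name comes from the attachment above.

```shell
# Pass 1: analysis only; discard output, skip audio
ffmpeg -i capture.webm -c:v libvpx-vp9 -b:v 0 -crf 30 -pass 1 -an -f null /dev/null

# Pass 2: real VP9 encode, muxing the consolidated WAV soundtrack in as Opus
ffmpeg -i capture.webm -i soundtrack.wav -c:v libvpx-vp9 -b:v 0 -crf 30 -pass 2 \
       -c:a libopus -shortest side-by-side-vp9.webm
```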
  5. JCPalmer

    Render 8192 x8192 px one frame and save

    Maybe try something like this. I do something like this to make WEBM videos of arbitrary resolution via canvas.toDataURL('image/webp', quality). toDataURL works with 'image/jpeg' too. In the playground, I could not get my canvas sizing to obey, but this does work outside of the PG. I also do not know how to write the .jpg file correctly. It is commented out. If you un-comment the after-render registration, it takes the capture & puts it on a new page.
  6. JCPalmer

    Status of Camera rig VR

    I came so close to getting a completely successful test of Canvas.captureStream on Firefox. Whether on Chrome or Firefox, the VR rig worked fine. In either case though, you cannot specify a codec. Firefox puts out VP8, but Chrome does not even put out a true WEBM file; it has an MP4 codec. The killer is you cannot set the size of the capture in code. It is whatever the physical size of the canvas is on the screen. It makes sense, but that is a problem which cannot really be overlooked. Am going to stick with the toDataURL() method, and table the VR rig for now, unless someone knows how to size a physical canvas (probably need to create the canvas in code). I have a 30" high res display (2560 x 1600), so could not do UHD (3840 x 2160). Do not know if that is a real problem or just imagined. Code I use to size the canvas:

```javascript
// make videos of an exact size, regardless of whether it looks weird on screen
function sizeSurface(width, height) {
    const canvas = engine.getRenderingCanvas();
    canvas.width = width;
    canvas.height = height;
    // engine may not have auto resize; if it does, no harm doing it again
    engine.setSize(width, height);
}
```
  7. JCPalmer

    Status of Camera rig VR

    I am going to do a quick test on Chrome or Firefox using the HTML Canvas.captureStream() instead of toDataURL(). I tried it earlier, but got strange results. If this method does give good results for the VR rig, I think I have found a way to get around the issue I have with this method. That issue is it is realtime-based. It is much faster than toDataURL(), because it just passes a memory pointer of the canvas to a browser background thread. But, you cannot use it to directly render at a true, dependable, settable frame rate. Also, that frame rate might be greater than what your scene can be rendered at, at a given resolution on your machine. An example is a complicated scene with many meshes, using 2 sub-cameras & post-processing for 3D, say @ 1080 or Ultra-HD resolution. I think the command-line program ffmpeg has an option which allows you to overwrite whatever the capture said the time was with a fixed increment. Then you can capture perfectly timed frames at given points in time, regardless of when they actually render. I need to merge the final consolidated audio file with the video file anyway. The option is:
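    The fixed-increment idea above can be sketched without any browser API: compute up front the timestamp each frame should carry, then drive rendering frame-by-frame at those times instead of trusting the wall clock. A minimal illustration (the function name and parameters are mine, not from any library):

```javascript
// Build the list of timestamps (in ms) that a clip's frames should carry,
// given a target frame rate and clip length. Rendering can then be driven
// frame-by-frame at these fixed times, however long each render takes.
function frameTimestamps(fps, durationSecs) {
    const frameCount = Math.round(fps * durationSecs);
    const msPerFrame = 1000 / fps;
    const times = [];
    for (let i = 0; i < frameCount; i++) {
        times.push(i * msPerFrame);
    }
    return times;
}
```

    For example, a 1.6 sec clip at 25 fps, frameTimestamps(25, 1.6), yields 40 frames spaced exactly 40 ms apart.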
  8. JCPalmer

    Playground saved scenes hosed

    I wanted to go back to a scene, & was drinking some on the 4th, but the URL reverted back to one without the stuff at the end. At first, I thought: who deleted my fuckin scene? After I calmed down, I tried others, ones in the documentation, all with the same result. I think there is a problem not specific to me. Can anyone get a saved scene? Time for a nap.
  9. JCPalmer

    Status of Camera rig VR

    Hold on, something just happened. I changed my test to find the start of data from 'VP8' to 'VP8 '. let keyframeStartIndex = webP.indexOf('VP8 '); The video is not all black now. The area in the background is black & jagged, but that's a start. I am really thinking this has to do with the alpha of the clear color. My checks for 'VP8X' are still successful, so BOTH must be in the file when using the VR rig. (Am going to have to check the quality 1.0 thing again too). Still a similar question: can the alpha be taken out of clear, or is it also being used by the edges of what is rendered, so it will not matter?
  10. JCPalmer

    Status of Camera rig VR

    Same question as before about app or framework code? FYI, in a WEBP image, there can be 3 different codecs: VP8, VP8L, & VP8X. VP8L is for lossless. That is what I get when I set the quality to 1.0. Not sure what VP8X is, but when I change to the VR rig, that is what is coming out, not 'VP8 ' like all the other rigs! One of these 3 strings is always found in the dataURL; 8 bytes later the data starts. I am currently just checking for 'VP8' in the data. The WEBM video format accepts 2 codecs, 'VP8' & 'VP9'. Something is causing a 'VP8X' to be generated. At least now I think I know the reason the video is black. The border seems to be the only difference.
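    The 3-way codec check described above can be sketched as a small, framework-free helper. The function name and the synthetic header bytes are mine, for illustration only; a real buffer would come from decoding the dataURL.

```javascript
// Identify which codec variant a WebP buffer holds by its first chunk
// FourCC: 'VP8 ' (lossy), 'VP8L' (lossless), or 'VP8X' (extended).
// In the RIFF container the FourCC sits at byte offset 12, right after
// 'RIFF' + file size + 'WEBP'; the chunk data starts 8 bytes past that.
function webpVariant(bytes) {
    const ascii = (start, len) =>
        String.fromCharCode(...bytes.slice(start, start + len));
    if (ascii(0, 4) !== 'RIFF' || ascii(8, 4) !== 'WEBP') return null;
    const fourCC = ascii(12, 4);
    return ['VP8 ', 'VP8L', 'VP8X'].includes(fourCC) ? fourCC : null;
}

// Synthetic 20-byte header, for demonstration only:
const lossless = [...'RIFF\0\0\0\0WEBPVP8L\0\0\0\0'].map(c => c.charCodeAt(0));
// webpVariant(lossless) → 'VP8L'
```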
  11. JCPalmer

    Status of Camera rig VR

    As a test, yes, anything without any alpha would do. Can this be made in application code, or only in the framework?
  12. JCPalmer

    Status of Camera rig VR

    Ok, I looked at the webm file through a webm viewer program. All the frames are there. The byte size of the frames is what I am expecting. I modified my test scene to do the same trick as the playground, namely take the last frame of the video, make a document out of it, & just write over the web page. It is coming out fine. Doing some completely random changing just to get something "to pop", I changed the scene clear color to grey from black, but when the video is played all the frames are still just black. Got me thinking though: just what exactly IS that white stuff that surrounds? Is it even stuff? Can we make it actual stuff in dev to see if a webp image gets generated, so a webm video understands it? I can do a few more tests, like looking at the file through a hex file reader. Doing that helped me see that setting the quality to 1.0 changed the format from VP8 to VP8L. This did not work; got all black there too. That is why my quality slider only goes to 0.99. Hmm? I'll look at this area again too.
  13. The animations show as exported in the log file. There are 9 Blender Actions, which convert to BABYLON.AnimationRanges:

    processing action Action_01_Armature.001: in[0 - 40], out[0 - 40]
    processing action Idle_Armature: in[-54 - 114], out[50 - 164]
    processing action Idle_Foot_R.png: in[0 - 1], out[170 - 171]
    processing action Idle_UpperBody.png: in[0 - 1], out[180 - 181]
    processing action NewCollection_Armature.001: in[0 - 20], out[190 - 210]
    processing action Restpose_Foot_R.png: in[0 - 1], out[220 - 221]
    processing action Run_Armature: in[-20 - 38], out[230 - 268]
    processing action Run_Foot_R.png: in[0 - 1], out[280 - 281]
    processing action Run_UpperBody.png: in[0 - 1], out[290 - 291]

    I am not sure how this all fits together. One thing is, in Blender no object actually "owns" an action. The only way you set an action to only export for a specific mesh is to name the action in the format 'mesh-name'. I have never tried this on a scene where both meshes & an armature had actions, so all actions are going to export for an armature. Not really a big problem to ignore them, as long as you start the action for the armature. I notice that all the actions not on the armature are only 2 frames, so maybe delete them till you figure out which of the 3 armature actions you wish to run.
  14. JCPalmer

    Status of Camera rig VR

    Demo made. It only runs on Chrome. I tried to simplify to the maximum, so no webm is made, just a single 'image/webp', which is displayed on a new page. So you can see the source code, I commented out the after-render registration. The upshot is the demo worked, which is pretty good to me. It did not solve the problem of why this rig is not generating, but I now know I was looking in the wrong place. Assumptions reset now in progress. FYI, if you replace the rig with RIG_MODE_STEREOSCOPIC_SIDEBYSIDE_PARALLEL, then an error is generated: "e.StereoscopicInterlacePostProcess is not a constructor". This works in the last stable version, so the version in dev has broken it.
  15. JCPalmer

    Playground saved scenes hosed

    Burp.. I am assuming that you dressed yourself as well before going to the school, or this day may still have complications in store! Thanks!
  16. JCPalmer

    Status of Camera rig VR

    I think I'll postpone any demos till the PG can save them. I am using a scene level After-render. The stereo rigs also use a post process, and they work. That is why I zeroed in on the different part, the viewport feature.
  17. I have made a form for inputting parameters for webm audio/video recordings. I have broken out the top level StackPanel apart from the full screen AdvancedDynamicTexture. When either the "Rehearse" or "Record" buttons are pressed, you cannot still have the form visible, or this happens: Clearly not acceptable to be in the video itself, & a little annoying for rehearsing. The only way I found to temporarily get rid of it was to dispose the texture. After the scene is done, I took the StackPanel and put it in a new AdvancedDynamicTexture, just 2 lines. The buttons are connected to NOTHING now though. Using a Mesh for the texture would allow it to be hidden, but then I need a second camera in ortho mode & to set up layer masks. Is there any way to avoid going that route?
  18. JCPalmer

    Button dis-connected

    @dad72, always good to have a number of ways of equal effort. From a tie-breaker standpoint, the code reads slightly more straightforward by setting the visibility of the top level StackPanel. Though I suppose the downside of that is you have to have a top level control. I definitely need one anyway, since I need vertical stacking with some horizontal.
  19. JCPalmer

    Button dis-connected

    Yep, it is completely gone when hiding the StackPanel, and when it is visible again, the buttons still work!
  20. JCPalmer

    Button dis-connected

    AdvancedDynamicTexture does not have an isVisible, but StackPanel does. Will have to see what happens if I hide the top level StackPanel.
  21. It errors when you are building a radio button, & possibly other controls, and want to pre-assign whether it is checked, before showing it or connecting / adding it to a container (_host). You get an 'executeOnAllControls' of undefined error, here. I am getting around this by directly assigning _isChecked. Since this is being written in TypeScript & that member is private, it must be: radio["_isChecked"] = true; Thinking about it, checking that _host is not null might be a good idea. True, you could screw it up by checking multiples of the same group in initialization, but it should be obvious that this is developer error.
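    The bracket-access workaround relies on TypeScript's private being a compile-time check only; at runtime the field is plainly reachable. A stand-in sketch (RadioStandIn is a mock for illustration, not the real BABYLON.GUI control):

```javascript
// Mock of a control with a TypeScript-private field. Bracket (index)
// access skips the compile-time visibility check, so this also compiles
// in TypeScript, where `radio._isChecked = true` would be rejected.
class RadioStandIn {
    constructor() {
        this._isChecked = false; // declared private in the real class
    }
}

const radio = new RadioStandIn();
radio["_isChecked"] = true; // the workaround from the post
```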
  22. I have made a GUI "form" to exercise a webm video & sound recording process I have made. I had just written it as a 180 line StackPanel right in a javascript file specific to the directory of the test scene. If this were bundled as TypeScript right with the other code, then it could be injected into any scene I write with very little code & screwing around. I added the GUI.d.ts file, but you get "error TS2694: Namespace 'BABYLON' has no exported member 'GUI'" when you assign the type BABYLON.GUI.StackPanel to an object. Any ideas on how to fix this?
  23. JCPalmer

    BABYLON.GUI in Typescript

    Actually, neither, but you got me to think of the answer. I was just putting all d.ts files in a source sub-directory. I only used /// references for other source files, to help it figure out how to order them. For most projects I just use an include: with "src/**/*.ts" in the tsconfig.json. The QI extension, where this is going, is too complicated for an include. There I use a files: array that lists all the files in an order that works. I added the d.ts file there & problem solved.
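    A minimal sketch of that tsconfig.json arrangement, with hypothetical source file names (only GUI.d.ts comes from the post); the point is that the d.ts file just needs an entry in the files array, placed before anything that uses the types, since files is compiled in listed order:

```json
{
  "files": [
    "src/GUI.d.ts",
    "src/base-module.ts",
    "src/dependent-module.ts"
  ]
}
```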
  24. JCPalmer

    Flat Mesh from Blender

    Related to doing it in Javascript, there is only the overhead during the conversion. After that it is just another mesh. It will have more vertices, but that is what the exporter used to do. @V!nc3r, didn't you mention a way before to add this modifier on all meshes?
  25. JCPalmer

    Flat Mesh from Blender

    The explicit flat shading by the exporter was retired in 5.6 in favor of placing an Edge Split modifier on the desired meshes, as was said in the post you reference. Did you leave the split angle at the default 30 degrees? What happens when you drop it down to 0?