JCPalmer


Everything posted by JCPalmer

  1. This sounds like an error message being generated off of the producer tag in a .babylon file when no producer tag is found. glTF probably does not have a producer tag. https://github.com/BabylonJS/Babylon.js/blob/master/src/Loading/Plugins/babylon.babylonFileLoader.ts#L26
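     For reference, the producer tag is just a small JSON block near the top of a .babylon file written by the exporters. A minimal sketch; the values below are examples, not from any particular export:

         "producer": {
             "name": "Blender",
             "version": "2.79",
             "exporter_version": "5.6",
             "file": "scene.babylon"
         }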
  2. First, to be clear, the animations are on each of the bones, not the skeleton. I would check the .log files to see whether either of the skeletons has an action assigned / exported; an action named RUN would show up there. If this export had an AnimationRange copied to it, the FPS in the bones would come from the avatar .blend. I manually changed the FPS of the first bone of each. Looks like that synced the root bone. Either delete the stray actions or make sure the avatar .blends match on FPS.
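     If fixing the .blends is not an option, the rates could probably also be forced to match after load on the JS side. A sketch, assuming the skeleton is already loaded; targetFps is an assumption you would match to the source .blend:

         // force every bone animation on the skeleton to one frame rate
         const targetFps = 30; // assumption; use the rate the source .blend was set to
         skeleton.bones.forEach(bone => {
             bone.animations.forEach(anim => {
                 anim.framePerSecond = targetFps;
             });
         });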
  3. Find where on the hard drive a file named AlbedoTransparency.png actually is. If you do not have one, why do you expect to export it? If it is not in the model's directory, then put it there.
  4. Ok, I thought I had tried that. See it now. Have thought some more. Since both EEVEE & Cycles use the same nodes, I do not really need both to bake. When there is something to cause baking to be required, I can just always switch to Cycles, if not already. EEVEE is just designed to render much faster, but perhaps that is accomplished in part by not having the overhead which would be required to bake. On the question of PBR materials that need to be baked: if you look inside the "Bake Type" dropdown, there are now roughness & environment types. That means if Diffuse were selected, it would not have roughness or the environment included.
  5. Post your log file. Is this your .blend, or did you download it from somewhere? If the source file does not exist, then get it in the place Blender says it should be, or change where the model looks.
  6. If there are shape keys, then morph targets are exported. Check your log file.
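     You can also verify on the JS side that the targets made it over; a sketch, where the mesh name is a placeholder:

         // a mesh exported with shape keys should come in with a morphTargetManager
         const mesh = scene.getMeshByName("myMesh"); // placeholder name
         if (mesh.morphTargetManager) {
             console.log("targets:", mesh.morphTargetManager.numTargets);
             // drive the first shape key from 0 (basis) to 1 (fully applied)
             mesh.morphTargetManager.getTarget(0).influence = 1;
         }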
  7. Sorry, I do not know what a Blendshape is, so unless this is a shapekey, it is definitely not being exported.
  8. I am thinking that I might make an AbstractMaterial class, which would handle baking, assuming that the API is the same regardless of whether EEVEE or Cycles is the renderer. Fingers crossed, since baking was renderer dependent before, which sucked. Then make 2 sub-classes, STDMaterial & PBRMaterial, assuming a PBR can use baked textures. The big problem in trying to detect if baking is the same across renderers is I cannot find any user interface on baking, just output. Anyone have a clue? The Render tab is where baking used to be. FYI, this is not an old build. I just got tired of looking at those awful new default colors, so I set the theme back to 2.79.
  9. copyAnimationRange() basically just copies from one Animation to another. It does NOT change the framePerSecond member of those Animation objects; they are probably not the same. If these skeletons were exported from Blender, then make sure that they have this set the same.
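     A sketch of checking the rates before copying; the skeleton variables and the range name "run" are placeholders:

         // the rates live on the individual Animation objects of each bone
         const srcFps = srcSkeleton.bones[0].animations[0].framePerSecond;
         const dstFps = dstSkeleton.bones[0].animations[0].framePerSecond;
         if (srcFps !== dstFps) {
             console.warn("frame rates differ:", srcFps, "vs", dstFps);
         }
         // copyAnimationRange itself will not touch framePerSecond
         dstSkeleton.copyAnimationRange(srcSkeleton, "run", true);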
  10. Just an update as to how the removal of the game engine was resolved. 2 new custom properties were added to mesh. This actually makes them easier to find, rather than switching renderer; it always was kind of a scavenger hunt. Also, they used to be defined for a material, not a mesh. There will be minor differences from before, limited to multi-material meshes only. Before you could turn off culling at a sub-mesh level, but that will no longer be possible. Doubt people are doing this, though.
  11. Depends. Do you mean the dimensions are too large for the hardware? In that case, a tool like GIMP to reduce the dimensions is probably the only option. If you mean they technically would load, but the sum of them demands too much RAM, then you could compress them. See multi-platform compressed textures to reduce memory footprint.
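      The JS side of the compressed-texture route looks roughly like this sketch; the extension list is an example, use whatever formats you actually generated .ktx files for:

          // tell the engine which pre-compressed texture containers exist on the server;
          // it picks the first one the GPU supports and swaps the file extension on load
          const available = ["-astc.ktx", "-dxt.ktx", "-pvrtc.ktx", "-etc2.ktx"];
          const formatUsed = engine.setTextureFormatToUse(available);
          console.log("GPU compressed format chosen:", formatUsed);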
  12. Unless that consistency is to NOT be all PBRMaterials. Personally, I see no need for PBRMaterials in my work. I am not to this point yet, but am thinking of a scene level "no PBR materials" check box. Principled is set up to do "everything"; if you are not doing PBR, you should not have to pay for it. I am successfully "reading" the currently active node tree recursively. Am doing this in the TOB code base on 2.79. The 2.80 conversion work is being done on the EEVEE branch of the JSON exporter. At some point they will "merge". Neither is ever going to be published with PBR for 2.79; this is just a vehicle to work in parallel.
  13. @V!nc3r, what if both the metallic & roughness are zero? Shouldn't a standard material be created? If yes, would the specular field then also be consulted?
  14. I see there can be a projection texture. Not sure that is what Blender is really doing at this time, though.
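      For context, the BJS feature being referred to here is, I believe, the spot light's projection texture; a minimal sketch with a placeholder image file:

          // project an image through a spot light, like a slide projector
          const spot = new BABYLON.SpotLight("spot", new BABYLON.Vector3(0, 10, 0),
              new BABYLON.Vector3(0, -1, 0), Math.PI / 3, 2, scene);
          spot.projectionTexture = new BABYLON.Texture("slide.jpg", scene); // placeholder file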
  15. There are multiple ways, I think. The way to do it on the JS side is sketched below. A Blender way would be to position the sword where it is to be held. Make sure the origin matches the character mesh. Parent the sword to the armature, or add a modifier. Weight paint the entire sword to the desired bone.
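      A hedged JS-side sketch using attachToBone; the mesh, skeleton, and bone names are placeholders:

          // glue the sword mesh to a hand bone; it follows the animation from then on
          const idx = characterSkeleton.getBoneIndexByName("hand.R"); // placeholder bone name
          sword.attachToBone(characterSkeleton.bones[idx], characterMesh);
          // local offset relative to the bone, if needed
          sword.position = new BABYLON.Vector3(0, 0.1, 0);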
  16. In the process of mapping Blender shader nodes, 83 of them. Not sure if I should throw away 'ShaderNodeOutputLamp', or just put it aside.
  17. BJS does not have constraints. The trackTo constraint has a couple of applications with exported cameras, but it is converted, not supported as a constraint.
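      My understanding, as an assumption, is that the conversion amounts to a locked target on the camera; a sketch with a placeholder mesh name:

          // Blender's TrackTo on a camera roughly becomes this after export
          camera.lockedTarget = scene.getMeshByName("trackedMesh"); // placeholder name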
  18. Maybe it is just not going to work, or you are doing something wrong. Maybe try this pg; if it works, then perhaps it is something you are doing. Just giving a point of reference to get some clue.
  19. Yes, what makes the b & w interesting is that there is a degree arg, where 1 is full: var postProcess = new BABYLON.BlackAndWhitePostProcess("WelcomeToWonderLand", 1.0, camera); I actually use it in a transition of a scene, right after load, from b & w to color by animating that property, like the Wizard of Oz. It works well. Maybe first try just getting things a little dull. It should be easy. Being a post process means it would work with ANY scene without modification, other than adding that single line. It is also a really small shader. You might copy it & change the property to a vector; that way you could change the degree by color channel. Might look weird to some, but if, say, you have a problem with red, just change that channel's degree to 1 and leave green & blue at zero.
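      The transition described there could be sketched like this; the 3 second duration is an assumption, and degree is the post process property being animated:

          // Wizard of Oz style: start fully black & white, fade to color after load
          const pp = new BABYLON.BlackAndWhitePostProcess("WelcomeToWonderLand", 1.0, camera);
          pp.degree = 1; // full b & w
          let t = 0;
          scene.registerBeforeRender(() => {
              if (t < 1) {
                  t += scene.getEngine().getDeltaTime() / 3000; // ~3 second fade, an assumption
                  pp.degree = Math.max(0, 1 - t);
              }
          });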
  20. perhaps use the B & W post process
  21. Using the concept of layerMask, you can also have at least 4 simultaneous scenes in the same place at the same time. A camera has a layerMask, as do meshes. You could set all the meshes for one scene to a mask. The meshes for "another scene" can have another mask. If you set the mask of the camera to one or the other, then a different "scene" will be shown. If you set the camera's mask to a bitwise or of both, then both will show. If you are familiar with Blender, same concept. One Blend file / scene. If you wish to show only certain meshes, put them on the same layer. If multiple layers are selected, meshes for more than one layer will display.
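      A minimal sketch of the two-"scene" switch; the mask values are arbitrary bit flags and the mesh arrays are placeholders:

          // two groups of meshes in one scene, distinguished by bit masks
          sceneAMeshes.forEach(m => m.layerMask = 0x1);
          sceneBMeshes.forEach(m => m.layerMask = 0x2);

          camera.layerMask = 0x1;          // show only "scene" A
          // camera.layerMask = 0x2;       // show only "scene" B
          // camera.layerMask = 0x1 | 0x2; // show both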
  22. As far as clones go, these are just meshes sharing geometry. Disabling one has no effect on the others. Also, there is no such thing as a MASTER clone, which means the first mesh can also be deleted with no effect on the others. In general, these "tricks" as you call them are due to people perceiving that they are operating on meshes. In the GPU / reality, however, the primary thing being operated on is materials, which translate into a vertex / fragment shader pair. This difference between what is really being done and how you THINK you are operating is what causes your disconnect over the way "things should be". For instance, having vertices in the CPU does not mean anything to the GPU until a material is created for them to be used as data. This compiling of shaders / GPU programs is what causes the latency. As @JohnK says, adding a mesh & disabling it gets the shader programs compiled, so everything is ready when you want that mesh (really material) to be seen. Not straightforward, unless you look at it from the GPU's point of view.
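      To make the clone point concrete, a small sketch; the names are placeholders:

          // clones share geometry; each is an independent mesh otherwise
          const original = BABYLON.MeshBuilder.CreateBox("box", {}, scene);
          const copy = original.clone("boxClone");

          copy.setEnabled(false); // disabling one clone has no effect on the other
          original.dispose();     // also fine; there is no "master" clone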
  23. I know @Sebavan was trying to do something with angular in this regard, and stopped. Think the difficulty factor is going to be way up there.
  24. Though a webgl framework might help you display your findings or take actions with those findings, getting the findings through photo analysis is the really hard part. This has little, if not nothing, to do with webgl, though webgl might benefit you once you have them. BabylonJS does zero photo analysis, I think. Getting data from WebRTC and then passing it around is going to be slow. If you are going to use native capabilities anyway, you should probably follow their examples of retrieving the camera data. I will say using Cordova is probably not going to help you without a lot of work. When you access the camera in Cordova using the common plug-ins, those just call either the videocam or photo app for a given OS. When you close those, control returns to your javascript. If you write your own plug-in to access the hardware directly, which I am doing, then you have 2 new problems: it is platform dependent, so you will have to code for each OS; and you are probably going to have to do your analysis in the plug-in itself. The reason is that Cordova plug-ins can only return strings. Turning a frame to base64, passing it back, and converting back to an image will slow you down to about 5 fps. This is even before you start trying to work with the data. You might write a Cordova plug-in which accesses each platform's native AR offerings, or scour the net for a plug-in which already does. The same problem of passing the camera data back in string format is still going to be the bottleneck. Using WebRTC in a browser, plus a second native app which also accesses the camera, is probably the only realistic way today.
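      For the browser / WebRTC side mentioned at the end, camera access looks roughly like this; the constraints are an example:

          // grab the device camera in a browser; frames can then be drawn to a
          // canvas for pixel-level analysis (the slow part warned about above)
          navigator.mediaDevices.getUserMedia({ video: { facingMode: "environment" } })
              .then(stream => {
                  const video = document.createElement("video");
                  video.srcObject = stream;
                  return video.play();
              })
              .catch(err => console.error("camera access failed:", err));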
  25. scene.executeWhenReady( () => { console.log('blah'); } ); is also available. I think it works better with Append() rather than ImportMesh(). This leaves all the accounting to the framework.

      Edit: If you are having problems with memory and want to serialize, then you would want to put the next Append inside of the callback:

          // "background-site" plus section-1 .. section-28
          const sections = ["background-site"];
          for (let n = 1; n <= 28; n++) sections.push("section-" + n);

          let i = 0;
          function loadSection() {
              if (i === sections.length) allLoaded();
              else {
                  BABYLON.SceneLoader.Append("", "assets/site/", sections[i] + ".babylon", scene, () => {
                      i += 1;
                      loadSection();
                  });
              }
          }

          function allLoaded() {
              // normal post loading
          }

          // actually start it
          loadSection();