Everything posted by JCPalmer

  1. Mesh is a subclass of AbstractMesh, which is a subclass of TransformNode, but no, TransformNode is not directly instanced. Meshes without geometry do not cause any draw calls, so performance & memory are almost the same. TransformNodes are a recent refactoring of the code base, done in a backwardly compatible way. I see no benefit for me to change anything, but I can change the exporter, which does. See the sketch just below.
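     A minimal sketch of the two options, assuming current Babylon.js naming; the node and mesh names here are only for illustration:

     // A TransformNode carries position / rotation / scaling but can never be rendered.
     const pivot = new BABYLON.TransformNode("pivot", scene);

     // A Mesh created with no geometry behaves almost the same way: full transform,
     // but with no vertices it produces no draw call.
     const emptyParent = new BABYLON.Mesh("emptyParent", scene);

     // Either one can serve as a parent for real geometry.
     const box = BABYLON.MeshBuilder.CreateBox("box", { size: 1 }, scene);
     box.parent = pivot;          // or: box.parent = emptyParent;
     pivot.rotation.y = Math.PI / 4;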
  2. For your playground, I am pretty sure you must have exported without shape key groups, or you would not have been able to use the mesh clone factory. You cannot share geometry across mesh clones and morph the vertices on the CPU. Plus, based on your url, I kind of cheated & checked. The mesh extends BABYLON.Mesh, not QI.Mesh. Wow, this is not even ES6 classes being generated for meshes yet. The TOB in the repo is older than I thought. While my long talk about shape keys still creating normal lines of their own stands, that is not the issue here, which means nothing I am planning would help. For meshes with multi-materials, Blender vertices are duplicated on export (.babylon and .js exporters) when a Blender mesh face is right next to another face with a different material. Blender assigns materials by face, and the 2 adjacent faces can share the vertex; BJS requires multiple vertices in that case. So this is not an exporter problem. I am not sure why you need to compute normals, but if you separate your mesh by materials, then parenting them to a mesh might fix it. This would not increase overhead much, since each material is still going to be a draw call. Either that, or bake the different materials, so that it is not a multi-material mesh.
  3. Ah yes, I am well aware of the lines coming from the borders of Tower of Babel shape key groups. See the line under the chin, here. As you might know, unlike BJS morph targets, which cover the entire mesh, you can have multiple groups on the same mesh: face, left / right hands, etc. In addition to being much smaller than the whole mesh, being on the CPU vs the GPU also allows many more targets, due to vertex shader parameter limits. I kind of doubt that you could have 24 GPU based "targets" like the Automaton tester scene has.

     What you are using is pretty old. I have not posted in about a year. I am actively working on QI 2.0 & TOB 6.0. TOB 6.0 will require QI, which makes the generated code clearer, and pushes some code out of the exporter & into QI. TOB 6 will also be for Blender 2.80, where materials with textures must be made using nodes. Unlike BJS, I will be breaking a lot of compatibility. Forcing all meshes to be QI.Mesh sub-classes means I can redo how mesh clone factories are implemented with almost none of the code living in the generated output; more like generated code calling a QI runtime.

     The problem here, I believe, is that for each state of a shape key group, ComputeNormals is called to get the corresponding end point normals. So as not to interfere with other shape key groups, the same vertices isolated for the positions are used to cull the vertices used for the normal state. Then every frame of the morph, the normals of the 2 involved states (prior & next) are interpolated, pasted into the normals of the entire mesh, & sent up to the GPU (see the sketch below). I think the list of affected vertices for a group needs to be expanded slightly, to also include vertices shared by faces adjacent to the ones which have a position change. This list of vertices is determined by the exporter, and I am looking to see if this fixes the problem for TOB 6.0.

     I do not know when this will be published. I am also splitting time with a Kinect data capture feature for a Blender add-in for the next MakeHuman release. If you only have the one group, and just a few keys, you might just use the .babylon exporter / morph targets. The export will be bigger, but should work for a low number of keys.
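     A rough sketch of that per-frame normal paste, just to illustrate the idea; the function, argument names, and the way the ratio is obtained are assumptions, not the actual QI code:

     // Assumed inputs: the vertex indices this shape key group owns, plus the normals
     // captured for the prior and next key states (3 floats per group vertex).
     function pasteGroupNormals(
         mesh: BABYLON.Mesh,
         groupVertices: number[],
         priorNormals: Float32Array,
         nextNormals: Float32Array,
         ratio: number               // 0..1, how far the morph has progressed this frame
     ): void {
         // Pull the whole-mesh normals, overwrite only the group's entries, push back up.
         const all = mesh.getVerticesData(BABYLON.VertexBuffer.NormalKind) as BABYLON.FloatArray;
         for (let i = 0; i < groupVertices.length; i++) {
             const dest = groupVertices[i] * 3;
             const src = i * 3;
             for (let c = 0; c < 3; c++) {
                 // linear interpolation between the prior & next states
                 all[dest + c] = priorNormals[src + c] + (nextNormals[src + c] - priorNormals[src + c]) * ratio;
             }
         }
         mesh.updateVerticesData(BABYLON.VertexBuffer.NormalKind, all);
     }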
  4. Well, the PR for adding isPBR as a parsed value for setting up a skybox is complete. But I am currently working on the Javascript exporter, so I needed to look up the constructor for a HDRCubeTexture, and I just saw the gammaSpace arg. This is to be false for PBR & true for STD. Think I need another PR to promote where the parsed parameter is defined, so it can be used in both. Something like:

     if (parsedData.environmentTexture !== undefined && parsedData.environmentTexture !== null) {
         // PBR needed for both HDR texture (gamma space) & a sky box
         var isPBR = parsedData.isPBR !== undefined ? parsedData.isPBR : true;
         if (parsedData.environmentTextureType && parsedData.environmentTextureType === "BABYLON.HDRCubeTexture") {
             var hdrSize: number = (parsedData.environmentTextureSize) ? parsedData.environmentTextureSize : 128;
             var hdrTexture = new HDRCubeTexture(rootUrl + parsedData.environmentTexture, scene, hdrSize, true, !isPBR);
             if (parsedData.environmentTextureRotationY) {
                 hdrTexture.rotationY = parsedData.environmentTextureRotationY;
             }
             scene.environmentTexture = hdrTexture;
         } else {
             var cubeTexture = CubeTexture.CreateFromPrefilteredData(rootUrl + parsedData.environmentTexture, scene);
             if (parsedData.environmentTextureRotationY) {
                 cubeTexture.rotationY = parsedData.environmentTextureRotationY;
             }
             scene.environmentTexture = cubeTexture;
         }
         if (parsedData.createDefaultSkybox === true) {
             var skyboxScale = (scene.activeCamera !== undefined && scene.activeCamera !== null) ? (scene.activeCamera.maxZ - scene.activeCamera.minZ) / 2 : 1000;
             var skyboxBlurLevel = parsedData.skyboxBlurLevel || 0;
             scene.createDefaultSkybox(scene.environmentTexture, isPBR, skyboxScale, skyboxBlurLevel);
         }
     }

     I see that the cubeTexture route also has a gamma space notion, but it depends on the file extension. For my exporters, I am just always going to force the HDRCubeTexture route. Does this look right? Should I modify the What's New again, or is the last entry still going to cover this?
  5. Ok, starting to generate output (javascript), at least for the environment texture & sky box. Everything is not right yet, but it is still significant. I am looking at your principled picture, @V!nc3r. You seem to have a single texture for multiple channels (occlusion, roughness, & metallic). I understand that these occupy different color channels of the texture, but not how you would actually get such data into one texture (see the sketch below for the BJS side). Moreover, that multiple-to-one relationship was not in the prior exporters. I am in the process of that final piece for the Principled node, and generation of JS from that. I just want to make sure, before the JSON exporter starts to get modified, that this can be expressed there too. Do you know, @Deltakosh? Also, occlusion does not seem possible in Cycles, so that probably cannot be done.
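     For what it is worth, on the BJS side one packed texture can feed those channels through the metallic workflow of PBRMaterial. This is only a sketch of the consumption side; the file name and the R = occlusion, G = roughness, B = metallic packing are assumptions based on the glTF convention:

     const mat = new BABYLON.PBRMaterial("packed", scene);

     // One texture, three channels (hypothetical file name).
     mat.metallicTexture = new BABYLON.Texture("./textures/orm_packed.png", scene);

     // Tell the material which channel holds which value.
     mat.useAmbientOcclusionFromMetallicTextureRed = true;  // R = occlusion
     mat.useRoughnessFromMetallicTextureGreen = true;       // G = roughness
     mat.useRoughnessFromMetallicTextureAlpha = false;      // not using alpha for roughness
     mat.useMetallnessFromMetallicTextureBlue = true;       // B = metallic (BJS spelling)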
  6. Well, glad your actions are coming through now. I kind of doubt whether multiple AnimationRanges can be blended inside of BJS. Remember, an AnimationRange is just a named frame range inside of the one animation that can be exported. I actually use my own exporter connected to my own JS animation system, so I do not do this. I might not be the person to ask what can be done with this stuff upon import. You can call each AnimationRange on demand (sketch below). There is also something called an AnimationGroup in BJS. I have no idea what it is for.
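     Not authoritative, since I do not use this path myself, but calling a named range on demand typically looks something like this; the skeleton and range names are assumptions:

     // Skeleton exported with a named AnimationRange, e.g. "walk".
     const skeleton = scene.getSkeletonByName("Armature");
     if (skeleton) {
         // loop = true, speedRatio = 1.0
         skeleton.beginAnimation("walk", true, 1.0);
     }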
  7. The exporter exports Actions. If you combine actions using the NLA Editor, I see another action being created. Is 'combined.walk.punch.action' not being exported? Look at & post your export .log file to see. I see a lot of other actions. Are you set to export 'Only Currently Assigned Action' or all? If doing all & not starting via the named range, all this other stuff is going to get run too.
  8. I can answer the Cordova part. Fullscreen should work if you add preferences in the config.xml file, like:

     <?xml version='1.0' encoding='utf-8'?>
     <widget ...
         <platform name="android">
             ...
         </platform>
         <platform name="ios">
             ...
         </platform>
         ...
         <!-- added manually -->
         <preference name="Orientation" value="landscape" />
         <preference name="Fullscreen" value="true" />
     </widget>

     You can, of course, choose to give some space to elements other than a canvas. On Android, I seem to remember the first-time user gets a choice.
  9. I believe that this only works for "html buttons", not clicks in a canvas, which is what the GUI is. I wanted to do that too, but do not think it is going to work. If you say this works on desktop, it might be browser dependent. Are you trying this on iOS?
  10. Couple of things: if you are going to develop, I assume your hardware is up to the job; RAM is cheap. Next, probably stick with whatever IDE you are using already, the reason being the very low learning curve. Typescript is now available as a plug-in all over the place. I have used Eclipse with Python & Typescript plugins. Right now I am using Netbeans, which has a very nice Typescript editor plugin. It also has good html, javascript, NPM & GIT integration. I also write a python plug-in. What I especially like is that I can have multiple projects open at once, with the Blender exporter & the loading typescript code in side by side views. First thing to check: does your current IDE just need a Typescript plugin?
  11. I know little about collisions, but for reference, this is how it is supported in the Blender exporter. When you select a mesh, the last property tab, Physics, needs some settings. Rigid Body must be checked, then one of the impostor shapes must be checked (Box, Sphere, Mesh, Capsule, Cone, Cylinder, or Convex Hull). Maybe try 'Mesh'. This probably does not solve whatever your problem is, but at least now you know that you can actually put the impostor into the .babylon file. A runtime sketch follows below.
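      For context, the runtime equivalent of what ends up in the .babylon would be something like this. Only a sketch; it assumes a physics engine plugin (CannonJS here) has already been enabled on the scene, and the mesh name is made up:

      // Assumes: scene.enablePhysics(new BABYLON.Vector3(0, -9.81, 0), new BABYLON.CannonJSPlugin());
      const ground = scene.getMeshByName("ground");
      if (ground) {
          ground.physicsImpostor = new BABYLON.PhysicsImpostor(
              ground,
              BABYLON.PhysicsImpostor.MeshImpostor,   // matches the 'Mesh' shape choice in Blender
              { mass: 0, restitution: 0.5 },
              scene
          );
      }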
  12. Also, I found that this code probably is not in use for the skybox part, anyway. The lines in FileLoader are:

      if (parsedData.createDefaultSkybox === true) {
          var skyboxScale = (scene.activeCamera !== undefined && scene.activeCamera !== null) ? (scene.activeCamera.maxZ - scene.activeCamera.minZ) / 2 : 1000;
          var skyboxBlurLevel = parsedData.skyboxBlurLevel || 0;
          scene.createDefaultSkybox(undefined, true, skyboxScale, skyboxBlurLevel);
      }

      The environmentTexture arg of createDefaultSkybox is checked for being valid, which undefined is not:

      Scene.prototype.createDefaultSkybox = function(environmentTexture?: BaseTexture, pbr = false, scale = 1000, blur = 0, setGlobalEnvTexture = true): Nullable<Mesh> {
          if (!environmentTexture) {
              Tools.Warn("Can not create default skybox without environment texture.");
              return null;
          }

      It seems that scene.environmentTexture, from a few lines above, could be passed. Hard coding pbr to true is also unfortunate. Wish there was a parsed value for that.
  13. Ok, been playing some more. Does it make sense that the equivalent of the HDRCubeTexture vs createFromPrefilteredData choice in Blender / Cycles is the color_space parameter of the Environment Texture node? In the picture above, it is set to Color, but here is the dropdown list: Should it be createFromPrefilteredData when Non-Color Data?
  14. Well, re-activating this topic since I found new stuff. I am currently reading node trees, and this is what it looks like from Blender 2.79 / Cycles. For .babylon (JSON), there are actually facilities for both environment textures & a skybox right in babylonFileLoader.ts. Sorry, but I do not know what some of this stuff is. I just got this hiker cave file from https://hdrihaven.com/hdris/. Blender works with this. Is this a BABYLON.HDRCubeTexture? If so, what is the hdrSize, an arbitrary number that is a power of 2, or the size of the texture? I am not sure how, or if, I can get a texture dimension from Blender. Next would be: what is environmentTextureRotationY? Would anything need to be different when using PBR vs not? I hope not, since that is all that is serializable.
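      My current reading of how those parsed values get used, stated as a hedged sketch rather than fact; the 512 size and the rotation value are just placeholders:

      // hdrSize appears to be the size each generated cube face is baked to, not the source image size.
      const hdr = new BABYLON.HDRCubeTexture("./textures/hiker_cave.hdr", scene, 512);

      // environmentTextureRotationY looks like a rotation, in radians, of the environment around Y.
      hdr.rotationY = Math.PI / 2;

      scene.environmentTexture = hdr;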
  15. No clue what this is. A submesh is what you get when you have multiple materials on a single mesh. If Mesh.Merge does that now, I think it must be downstream of export.
  16. Thanks. Something weird though, I cannot find "createDefaultEnvironment" in the scene.ts file. I tried the one in Github. Finally found it in sceneHelpers. I see EnvironmentHelper has no serialized options, or a parse method. That would mean the JSON exporter would not be able to set the options (overrides actually) of a default environment. Think I'll just try to see if the environment texture comes through for now. Possibly add Blender custom properties later.
  17. While I still want to know what that is, I found it because I was looking at the scene level environment texture. Is there functionality to cause a sky box to be added without actually creating the mesh yourself? Could this maybe be added, where the programmer is still responsible for the radius? Blender can do this, but I do not think it actually needs a box.
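      For reference, scene.createDefaultSkybox, quoted in the loader code in earlier items, appears to be exactly that; a minimal call, with the scale value only an example:

      // Needs scene.environmentTexture to already be set.
      // pbr = true, scale (skybox size) = 1000, blur = 0
      const skybox = scene.createDefaultSkybox(scene.environmentTexture, true, 1000, 0);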
  18. scene has a member, _environmentBRDFTexture, which is not referenced in scene.ts, but it is public. Is this a typo?
  19. LoadFile is what gets run for things like a .babylon or sound file. The second arg is a success callback function. You cannot really do any synchronous file loading, except for scripts & other files listed in an html file. Even then, it is not really synchronous, just performed before control is passed. Do something like:

      BABYLON.Tools.LoadFile("./assets/data/dictionary.txt", (data) => {
          this.dictionary = data;
          console.log(data);
      });
  20. Now at 128 by 128 webm. The conversion F'ing worked! Thanks again.
  21. Well, I got the scene generating a vp8 codec WEBM. Below in HD in 2.39:1 aspect ratio. I need to add another resolution (128x128), or just temporarily change the code for 480, then I will have a good candidate: avatarHD.webm
  22. If you look at the repo history, I had an initial commit with a blue background. It did not work, but thanks for thinking about it. I have seen a .gif that worked which had transparency. The thing in common with those which have worked is that they all have very few colors. Think I will try to generate a .webm of an unpublished scene, based on my Blow Me Baby scene. Then try to follow your converter process. It is Saturday, cold & rainy (grabbing beer).
  23. Thanks! I had tried doing this earlier, but my .gif would never animate on this site as an avatar. I had seen others who had a .gif. What did you use to go from .webm to .gif?
  24. A .fbx is a data & directions file, just like a .babylon or .gltf. The directions in the .fbx must have said that the texture is in a file named AlbedoTransparency.png. So, the fbx importer made the appropriate setting in the Blender scene / unsaved .blend. I do not know if a fbx can even have textures embedded inside it. Go back to where, or whom, you got the .fbx from, and get or ask for the file. If they do not have it, then this question is probably solved with the answer that you are screwed.