jdurrant

  1. Works great, @RaananW! Thanks for your help. If I'm not mistaken, camera.getTarget() is a defined function. Why not have it return the average of the two getTarget() values? In case it helps others, here's a working example: https://www.babylonjs-playground.com/#7DYN70#12 All the best.
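A minimal sketch of the averaging idea above, using plain {x, y, z} objects in place of BABYLON.Vector3; averageTargets is a hypothetical helper name, not part of the Babylon.js API:

```javascript
// A WebVR rig has two sub-cameras (one per eye), each with its own target.
// Averaging the two, component-wise, yields a single "look at" point.
// Plain objects stand in for BABYLON.Vector3 here.
function averageTargets(leftTarget, rightTarget) {
    return {
        x: (leftTarget.x + rightTarget.x) / 2,
        y: (leftTarget.y + rightTarget.y) / 2,
        z: (leftTarget.z + rightTarget.z) / 2
    };
}

// Two eye targets slightly offset horizontally:
var mid = averageTargets({x: -0.03, y: 1.6, z: 5}, {x: 0.03, y: 1.6, z: 5});
// mid is {x: 0, y: 1.6, z: 5}, the point midway between the two targets.
```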
  2. I've used camera.getTarget() to advance a character forward, as described in an earlier thread. It's worked well with a FreeCamera, but I can't get it to work with the WebVR camera. camera.getTarget().subtract(camera.position) always returns the same vector regardless of what I'm looking at through the VR headset. It was a bit tricky to demonstrate in the playground, but I managed: https://www.babylonjs-playground.com/#7DYN70#11 That demo starts with a FreeCamera attached to the canvas. Open up the JavaScript console, and you'll see that it logs the camera.getTarget().subtract(camera.position) vector every second. If you move the camera's looking direction with your mouse, you'll see that the "looking vector" changes in the console. Now the same code, but with a WebVR camera (HTC Vive): https://www.babylonjs-playground.com/#7DYN70#10 In the console, the "looking vector" is always the same, regardless of what I'm seeing in the HTC Vive headset. As always, thanks for your help!
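For reference, the "looking vector" computation being logged is just a component-wise subtraction. This sketch uses plain {x, y, z} objects in place of BABYLON.Vector3, whose subtract() performs the same operation:

```javascript
// The "looking vector" is the camera target minus the camera position.
// With a correctly updating camera, this vector changes as the view turns;
// the bug described above is that it stays constant under WebVR.
function lookingVector(target, position) {
    return {
        x: target.x - position.x,
        y: target.y - position.y,
        z: target.z - position.z
    };
}

// Camera at the origin looking at a point down the +z axis:
var look = lookingVector({x: 0, y: 0, z: 10}, {x: 0, y: 0, z: 0});
// look is {x: 0, y: 0, z: 10}
```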
  3. Thanks, all. It seems I was using a slightly out-of-date version of babylon.js. Both the online playground and my local copy were 3.1-alpha, but apparently not the same 3.1-alpha. Using BABYLON.SceneLoader.Append, rather than BABYLON.SceneLoader.Load, also seemed to be important. Hope all this helps someone else. All the best.
  4. Thanks for your help, @Deltakosh . The playground scene you posted works perfectly. (Remarkable scene, BTW, to view in the HTC Vive!) I see both the scene and the controllers. But I continue to have trouble. Here's what I've done to debug so far:

     STEP 1: I took your exact code, except I swapped in my .babylon file: http://playground.babylonjs.com/#E0WY4U#16 Success! I could see my (unlit) scene as well as the Vive controllers.

     STEP 2: I created my own local playground. My server is on localhost:8000, started via Python:

         python -m SimpleHTTPServer 8000

     Here's my playground framework, which I assume is similar to yours:

         var canvas = document.getElementById('renderCanvas');
         var engine = new BABYLON.Engine(canvas, true);
         jQuery(document).ready(() => {
             var scene = createScene();
             engine.runRenderLoop(function () {
                 scene.render();
             });
         });

     I then copied the EXACT same code for the createScene function into this playground environment. The 3D scene appears in the Vive, but no controllers!

     STEP 3: As a sanity check, I then changed my local-playground createScene function only slightly, to load your sponza scene into my local playground. The 3D scene appeared, but again no controllers.

     STEP 4: Could there be something wrong with my server that's getting in the way? I uploaded my code to a publicly available server. See https://durrantlab.com/tmp/babylon_test/ Great 3D scene in the Vive, but no controllers.

     TENTATIVE CONCLUSION: There's something you're doing at the level of the playground environment that makes the controllers appear in your playground but not in mine. You can see my code here (which, again, does not show Vive controllers, even though it's identical to what I posted at http://playground.babylonjs.com/#E0WY4U#16): https://durrantlab.com/tmp/babylon_test/main2.js

     Thanks so much for your help, guys! I've long been a huge babylon.js fan. Both your library and your amazing, supportive community are just fantastic. All the best.
  5. Thanks for your help with this, @dbawel . But I haven't experienced the same problem you describe with the Vive controllers. They have always connected for me in other applications, including web-based apps such as those posted at https://mozvr.com/ . They have also always connected in babylon.js apps when I create the scene (including the camera) from scratch. It seems to be something specific to loading an external .babylon file and then trying to use the controllers. I wonder if I'm replacing the .babylon-file camera with the WebVR camera incorrectly... Thanks.
  6. I recently got my hands on an HTC Vive, and I can't even express how excited I am about this technology! Babylon.js works well with the device too. I'm just thrilled!

     I have run into what I think might be a bug, though. When I load a .babylon file that I exported from Blender and try to hook in a WebVR camera, the controllers don't show up. I can't figure out how to open an external .babylon file in the Playground, unfortunately, but here's my code:

         function makeWebVRCamera(scene, position) {
             var metrics = BABYLON.VRCameraMetrics.GetDefault();
             var camera = new BABYLON.WebVRFreeCamera(
                 "deviceOrientationCamera",
                 position,
                 scene,
                 false,  // compensate distortion
                 metrics
             );
             window.scrollTo(0, 1);  // supposed to autohide scroll bar.
             return camera;
         }

         function startLoop(engine, scene) {
             engine.runRenderLoop(function () {
                 scene.render();
             });
         }

         function addLight(scene) {
             var light = new BABYLON.HemisphericLight("light1", new BABYLON.Vector3(0, 1, 0), scene);
             light.intensity = 0.5;
         }

         function createSceneFromBabylonFile(canvas, engine) {
             BABYLON.SceneLoader.Load("", "babylon.babylon", engine, (newScene) => {
                 var webVRCamera = makeWebVRCamera(newScene, newScene.activeCamera.position);

                 // Wait for textures and shaders to be ready
                 newScene.executeWhenReady(() => {
                     jQuery("#renderCanvas").click(() => {
                         // Now remove the original camera
                         newScene.activeCamera.detachControl(canvas);
                         if (newScene.activeCamera.dispose) {
                             newScene.activeCamera.dispose();
                         }

                         // Set the new (VR) camera to be active
                         newScene.activeCamera = webVRCamera;

                         // Attach that camera to the canvas. Attaching on page load
                         // won't work with desktop-based VR like the HTC Vive, so
                         // this command is also run on a play-button click.
                         newScene.activeCamera.attachControl(canvas);
                     });

                     addLight(newScene);
                     startLoop(engine, newScene);
                 });
             });
         }

         function createSceneFromScratch(canvas, engine) {
             window.scrollTo(0, 1);  // supposed to autohide scroll bar.
             var scene = new BABYLON.Scene(engine);
             var webVRCamera = makeWebVRCamera(scene, new BABYLON.Vector3(1.8756, 3.4648, 8.517));

             jQuery("#renderCanvas").click(() => {
                 // Set the new (VR) camera to be active
                 scene.activeCamera = webVRCamera;

                 // Attach that camera to the canvas. Attaching on page load
                 // won't work with desktop-based VR like the HTC Vive, so
                 // this command is also run on a play-button click.
                 scene.activeCamera.attachControl(canvas);
             });

             addLight(scene);
             var box = BABYLON.Mesh.CreateBox("Box", 4.0, scene);
             startLoop(engine, scene);
         }

         jQuery(document).ready(() => {
             var canvas = document.getElementById('renderCanvas');
             var engine = new BABYLON.Engine(canvas, true);
             // createSceneFromBabylonFile(canvas, engine);
             createSceneFromScratch(canvas, engine);
         });

     There are two functions for creating a scene: "createSceneFromBabylonFile" and "createSceneFromScratch". createSceneFromScratch (not commented out in the code above) works great. I can see my scene in the HTC Vive, and the Vive's controllers are also visible. createSceneFromBabylonFile (commented out in the code above) also works well: the 3D scene appears in the Vive correctly. But there are no controllers visible in the scene, even when I'm holding them in my hands. I haven't for the life of me been able to figure out how to make those controllers appear when I load an external .babylon file. (Note: putting newScene.activeCamera.initControllers() after the camera is attached to the canvas was not effective.) Thanks for your help!
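The camera-swap sequence used in the post above can be distilled into one helper. The scene and camera objects below are minimal stubs for illustration only; with real Babylon.js objects the same detach/dispose/assign/attach order would apply:

```javascript
// Swap a scene's active camera for a VR camera:
// detach the old camera, dispose it, then activate and attach the new one.
function swapToVRCamera(scene, canvas, vrCamera) {
    var oldCamera = scene.activeCamera;
    oldCamera.detachControl(canvas);           // stop the old camera listening to input
    if (oldCamera.dispose) {
        oldCamera.dispose();                   // free the camera from the .babylon file
    }
    scene.activeCamera = vrCamera;             // make the VR camera active...
    scene.activeCamera.attachControl(canvas);  // ...and hook it to the canvas
}

// Stub objects standing in for a loaded scene, recording the call order:
var log = [];
var scene = {
    activeCamera: {
        detachControl: function () { log.push("detach"); },
        dispose: function () { log.push("dispose"); }
    }
};
var vrCamera = { attachControl: function () { log.push("attach"); } };

swapToVRCamera(scene, "canvas", vrCamera);
// log is now ["detach", "dispose", "attach"], and scene.activeCamera is vrCamera.
```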
  7. Texture caching

    Thanks, @RaananW ! Very helpful.
  8. Texture caching

    I realize this question is very old, but I'm interested in doing the same thing. Did you ever find a solution? Thanks!
  9. @JohnK , you had the right idea. @RaananW , I see now that visibility wasn't the right tool. I was thinking that two meshes, each with 0.5 transparency, should not let any background through when viewed aligned, since 0.5 + 0.5 = 1.0. But it makes sense that it would be multiplicative rather than additive. 0.5 * 0.5 = 0.25, so 25% of the background should go through. Perhaps the effect I was looking for is like Photoshop's "overlay" filter. Not sure... @JohnK , your solution wasn't perfect for my actual case because, unlike the playground scene I created, my actual scene involves transparent meshes. The mesh in front has a transparent region through which you can see the mesh in back. Inspired by your idea, though, I ended up sequentially fading the mesh in back completely in, and then fading the mesh in front completely out. I'd hoped to do the fading of both meshes simultaneously, but that's pretty close to what I was looking for. Thanks, guys! Great game engine, by the way!
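The multiplicative reasoning in the post above can be checked with a few lines of plain JavaScript; nothing here is Babylon-specific:

```javascript
// Stacked transparent surfaces multiply rather than add: if each mesh lets
// through a fraction t of the light behind it, aligned meshes let through
// the product of their t values. Two half-transparent meshes therefore
// still leak a quarter of the background.
function backgroundShowThrough(transparencies) {
    return transparencies.reduce(function (product, t) {
        return product * t;
    }, 1.0);
}

var leak = backgroundShowThrough([0.5, 0.5]);
// leak is 0.25: 25% of the background remains visible, matching 0.5 * 0.5.
```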
  10. I want to seamlessly fade between two meshes with very similar materials. My thinking was that if the sum of the visibility values on the two meshes was always 1.0, no transparency would be visible through both of them together. But in practice that's not the case. I made this playground to illustrate: https://playground.babylonjs.com/#69K17Z#2 Note that the visibility on one of the grass planes goes down exactly as the visibility on the other plane goes up, such that the two visibilities always sum to 1.0. However, during the transition, the fire plane in the background can briefly be seen through the two grass textures. Does anyone know how to fade between two meshes without briefly revealing what's in the background? Thanks! ~Jacob
  11. Video textures not respecting UVs?

    I was too quick to post my question... I found the answer here: For reasons I don't understand, BABYLON.VideoTexture includes a parameter to flip the video across the Y axis. I had that value set to true. So my line:

        let videoTexture = new BABYLON.VideoTexture("video", ["baked.mp4"], scene, true, true);

    should have been:

        let videoTexture = new BABYLON.VideoTexture("video", ["baked.mp4"], scene, true);

    Hope this helps someone else!
  12. I'm struggling with video textures. I created a simple icosphere in Blender, unwrapped it, and exported it to a .babylon file. I then loaded that .babylon file in the browser and positioned the camera directly at the center of the sphere. I added a video texture to the sphere's emissive texture, but it doesn't render correctly. It's almost as if it's not respecting the UVs. I naturally thought the UVs must have gotten messed up somehow, but when I create a regular texture using a single frame from the video, it looks perfect. Here's my TypeScript code:

         declare var BABYLON;
         declare var jQuery;

         if (BABYLON.Engine.isSupported()) {
             // Get the canvas element from our HTML above
             var canvas = document.getElementById("renderCanvas");

             // Load the BABYLON 3D engine
             var engine = new BABYLON.Engine(canvas, true);

             BABYLON.SceneLoader.Load("scene/", "babylon.babylon", engine, function (scene) {
                 // Wait for textures and shaders to be ready
                 scene.executeWhenReady(function () {
                     // Attach camera to canvas inputs
                     scene.activeCamera.attachControl(canvas);

                     // Start rendering video texture
                     let sphere = scene.meshes[0];
                     scene.activeCamera.position = sphere.position;
                     scene.activeCamera.rotation = {x: 0.312831706486495, y: -1.1043217385267734, z: 0.0142};

                     var mat = new BABYLON.StandardMaterial("mat", scene);
                     mat.diffuseColor = new BABYLON.Color3(0, 0, 0);
                     mat.specularColor = new BABYLON.Color3(0, 0, 0);
                     mat.backFaceCulling = true;
                     mat.diffuseTexture = null;

                     let videoTexture = new BABYLON.VideoTexture("video", ["baked.mp4"], scene, true, true);
                     let nonVideoTexture = new BABYLON.Texture("baked_texture34.png", scene);
                     let useVideoTexture = true;

                     mat.emissiveTexture = videoTexture;
                     sphere.material = mat;  // Because sphere is only thing in scene.

                     // Once the scene is loaded, just register a render loop to render it
                     engine.runRenderLoop(function () {
                         scene.render();
                     });

                     // On click, switch textures
                     jQuery("body").click(function () {
                         this.useVideoTexture = !this.useVideoTexture;
                         if (this.useVideoTexture) {
                             mat.emissiveTexture = this.videoTexture;
                         } else {
                             mat.emissiveTexture = this.nonVideoTexture;
                         }
                         this.sphere.material = mat;
                     }.bind({
                         sphere: sphere,
                         mat: mat,
                         videoTexture: videoTexture,
                         nonVideoTexture: nonVideoTexture,
                         useVideoTexture: useVideoTexture
                     }));
                 });
             }, function (progress) {
                 // To do: give progress feedback to user
             });

             // Watch for browser/canvas resize events
             window.addEventListener("resize", function () {
                 engine.resize();
             });
         }

     Thanks for your help with this!
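The click handler in the post above threads its state through .bind(); the same texture toggle can be written as a closure. The material and textures here are plain stub values, not real Babylon.js objects:

```javascript
// Return a click handler that alternates a material's emissive texture
// between a video texture and a still texture, tracking state in a closure.
function makeTextureToggler(mat, videoTexture, nonVideoTexture) {
    var useVideoTexture = true;  // matches the initial state in the post
    return function () {
        useVideoTexture = !useVideoTexture;
        mat.emissiveTexture = useVideoTexture ? videoTexture : nonVideoTexture;
    };
}

// Stub material with string stand-ins for the two textures:
var mat = { emissiveTexture: "video" };
var toggle = makeTextureToggler(mat, "video", "still");

toggle();  // mat.emissiveTexture is now "still"
toggle();  // back to "video"
```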
  13. Blender shape keys and MorphTargetManager

    Thanks for your help, JCPalmer. Just to clarify: you mean I should save the different shape keys as separate meshes (in Blender) and then programmatically create a MorphTargetManager? Can you provide a code outline? Also, would you mind pointing me in the direction of your implementation of shape keys? It seems the one I'm using now isn't ready for prime time. Thanks.
  14. I'm using the preview release of Babylon (3.0 alpha) and the 5.3.-1 exporter in Blender. The Blender scene includes shape keys attached to a mesh called "Cloth". When I export to a .babylon file, this information appears to be saved correctly. There is a "MorphTargetManager" entry in the file JSON that looks like this:

         "MorphTargetManager": {
             "id": "Cloth",
             "targets": [
                 {
                     "name": "Draped",
                     "position": [lots of numbers...],
                     "influence": 0
                 },
                 {
                     "name": "testtest",
                     "position": [lots of numbers...],
                     "influence": 0
                 }
             ]
         }

     I then load the .babylon file with BABYLON in the browser, but I can't for the life of me figure out how to access this saved MorphTargetManager. Here are some things I tried:

         BABYLON.scene.morphTargetManagers;                 // ==> []
         BABYLON.scene.getMorphTargetManagerById("Cloth");  // ==> null
         BABYLON.scene.getMorphTargetManagerById(0);        // ==> null

         // (After setting the "Cloth" mesh to clothMesh...)
         clothMesh.morphTargetManager;                      // ==> undefined (but present when I inspect the mesh object...)

     I feel like all the parts are there; I just can't figure out how to access the MorphTargetManager that I think should be present when I import the .babylon file. Any help you can offer would be appreciated! Thanks!
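For illustration, here is what driving a shape key would look like once the manager is reachable, sketched against stub objects that mirror the JSON above (setInfluence is a hypothetical helper, and these are not real Babylon.js objects):

```javascript
// Find a morph target by name on a manager-shaped object and set its
// influence, clamped to the valid [0, 1] range.
function setInfluence(manager, targetName, influence) {
    var target = manager.targets.find(function (t) {
        return t.name === targetName;
    });
    if (target) {
        target.influence = Math.min(1, Math.max(0, influence));
    }
    return target;
}

// Stub mirroring the exported MorphTargetManager JSON:
var manager = {
    id: "Cloth",
    targets: [
        { name: "Draped", influence: 0 },
        { name: "testtest", influence: 0 }
    ]
};

setInfluence(manager, "Draped", 0.75);  // blend 75% toward the "Draped" shape
```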
  15. Very handy function! But it seems it has to be called early in the load process. See https://www.babylonjs-playground.com/#H52JTC#7 In that example, the first resize works great, but when I try to resize the textures again later (on click), it doesn't work. I show a settings page when the user starts my app. The scene begins to load immediately in the background. One of the options on the settings page is to use low-res textures. But by the time the user presses "Start" on that page, the load process in the background has progressed to the point that the texture resize doesn't work. Does that make sense? Hopefully the PG example shows what I mean... Thanks!