Everything posted by brianzinn

  1. I have a playground to reproduce: what I want to do is take a photo by generating a texture from a sprite, but with these lines uncommented (it works without them, but then not like a "mirror"): `videoSprite.anchor.x = 1; videoSprite.scale.x = -1;` (line 74 in the playground). With them in, I just get a blank screen texture. I have tried reverting the scale/anchor before calling `app.renderer.generateTexture(videoSprite)` without success (maybe I can manually trigger a frame render?). If you are trying out the playground it will ask for video permissions - the video is just put into a texture and nothing further. Thanks.
  2. You can also pass an onAnimationEnd callback: `scene.beginAnimation(vehicle, 0, length, false, 10, () => { heartFlight(vehicle, move_trajectory, index + 1); });` If you add any points during the animation, they get appended at the end. You would need code to restart a completed animation, or perhaps a queue push/pop as @bghgary suggests. I added some smoothing, so the speed is consistent across path segments of different lengths. If you don't want a steady speed then you can look at slerp on total length, or rebuilding the animation when points are added (starting at the current position) and using the built-in easing functions. Cheers.
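A minimal sketch of the smoothing idea mentioned above (my own helper, not code from the post): distribute a fixed total frame budget across path segments in proportion to each segment's length, so the animated speed stays constant regardless of segment size. The function name and signature are assumptions for illustration.

```javascript
// Distribute `totalFrames` across path segments proportionally to
// segment length, so longer segments get more frames (constant speed).
function frameCountsForPath(points, totalFrames) {
  const lengths = [];
  for (let i = 1; i < points.length; i++) {
    const dx = points[i].x - points[i - 1].x;
    const dy = points[i].y - points[i - 1].y;
    const dz = points[i].z - points[i - 1].z;
    lengths.push(Math.sqrt(dx * dx + dy * dy + dz * dz));
  }
  const total = lengths.reduce((a, b) => a + b, 0);
  // frames per segment, proportional to its share of the total length
  return lengths.map((len) => Math.round((len / total) * totalFrames));
}
```

You would then feed each segment's frame count into the keyframes you build for `scene.beginAnimation`.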
  3. I've played around with the Gamepad API for GearVR and Daydream controllers, and after figuring out how to debug them the coding part was easy. With the Gamepad API you can check for yourself what is being returned. Based on what you are saying, though, it looks like your controller is not supported by the Gamepad API. Do you get any devices returned at all when it is connected? Check out this post for how to debug Android devices (over USB or WiFi) - very useful for figuring out what is happening with gamepads.
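To check what is being returned, a small sketch (mine, not from the post) that summarizes the array the Gamepad API gives you - in a browser you would call it as `summarizeGamepads(navigator.getGamepads())` and log the result:

```javascript
// `navigator.getGamepads()` returns an array-like with null slots for
// disconnected pads; keep only real entries and summarize them.
function summarizeGamepads(gamepads) {
  return Array.from(gamepads)
    .filter((gp) => gp !== null && gp !== undefined)
    .map((gp) => ({
      id: gp.id,
      buttons: gp.buttons.length,
      axes: gp.axes.length,
    }));
}
```

An empty result after pressing a button on the controller would suggest the browser does not expose it through the Gamepad API at all.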
  4. Look at the vertex position data to see why (ie: 852 is far from world center): `console.log('pp:', mesh.getPivotPoint()); console.log(mesh.getVerticesData(BABYLON.VertexBuffer.PositionKind));`
  5. The !X is a "communication administratively prohibited" ICMP response (I think IPv4) that Digital Ocean uses. Try `tcptraceroute` or `traceroute -T`.
  6. Could it be that the camera is further from the player when you have more players? When you attach a sound to a mesh (ie: `sound.attachToMesh(mesh)`), the sound comes from the location/distance of the mesh. I load quite a few sounds in my game attached to many meshes and have not found any issues. Any way you can reproduce it as a PG otherwise?
  7. I don't know how the 3D text library looks in Unity, but you can use vectorize-text with PolygonMeshBuilder (note the demo webpage is gone, but vectorize-text is still on GitHub). edit: I didn't know about MeshWriter - I would go with that for sure.
  8. From @mkaouri: "If iOS 11 is showing the front camera, you have to make sure that the facingMode constraint in artoolkit.min.js is set to 'environment' to use the back camera of your mobile."
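For reference, this is the standard shape of that getUserMedia constraint (a sketch of the constraints object only - in a browser you would pass it to `navigator.mediaDevices.getUserMedia(constraints)`):

```javascript
// Select the back ("environment") camera on mobile devices;
// 'user' would select the front camera instead.
const constraints = {
  video: {
    facingMode: { ideal: 'environment' },
  },
  audio: false,
};
```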
  9. hi @MarianG, I didn't really follow AR.js - from my recollection it was relying on another engine's renderer. If you are not specifically after AR.js, do the jsartoolkit examples no longer work? Also, BabylonJS 4.0 has AR via WebXR, if that meets your needs (it's a very early market for browser support). I haven't been following closely enough, but otherwise if you want a demo of that (like magic window), check here:
  10. Try like this:
  11. The link from my first response was to the exact line that I thought does that (`!mesh.isEnabled() ... continue;`), from my understanding. So, why does `isVisible == false` keep it in the rendering loop, while `setEnabled(false)` does not? I would think the difference would be hardly noticeable. Is there somewhere else I should have referenced, or a performance penalty I am not seeing? Thanks.
  12. Have you tried dialing back some settings, like disabling WebGL2 in the engine options, or `engine.getCaps().highPrecisionShaderSupported = false;`? It may be possible that the BabylonJS engine needs to account for a bug in the browser/driver, as @bghgary has suggested. When I look at your first screenshot it looks like z-fighting between 2 planes - those misplaced triangles look like a face that does not belong. I've tried mesh merging and was happier with the more fine-grained control of generating meshes manually (from data files). I was originally inspired by some code from this project (/scripts/PlanetChunckMeshBuilder), which is efficient and produces the minimum number of vertices (ie: skipping occluded faces):
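For the first suggestion, a sketch of the engine options object that forces the WebGL1 path - `disableWebGL2Support` is the relevant flag, passed as the third argument to `new BABYLON.Engine(canvas, true, engineOptions)`:

```javascript
// Engine options to fall back to WebGL1, useful for ruling out
// WebGL2-specific driver bugs.
const engineOptions = {
  disableWebGL2Support: true,
};
```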
  13. If you look in the Inspector (ie: open the PG and click the "Inspector" button), you can see the number of vertices/faces (ie: `mesh.getTotalVertices()`) - along with materials, these are in memory. To answer your question, though, I am not aware of a way to measure the memory usage of a single mesh - I just go by poly count as a basic measurement. Materials with large textures will also have an impact. Clones vs. instances will also reduce the memory footprint, if you have multiples. If you need to "unhide" meshes quickly then disposing them is not a good option! You may find this useful, but they will remain in memory:
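If you want a rough number rather than just poly count, a back-of-the-envelope sketch (mine, not a Babylon API) for geometry memory: the common vertex buffers are 32-bit floats (4 bytes each), positions (3) + normals (3) + uvs (2) = 8 floats per vertex, plus 32-bit indices. Texture memory would be on top of this.

```javascript
// Rough geometry-memory estimate from vertex/index counts.
// floatsPerVertex = 8 assumes position + normal + uv buffers.
function estimateMeshBytes(vertexCount, indexCount, floatsPerVertex = 8) {
  const vertexBytes = vertexCount * floatsPerVertex * 4; // 32-bit floats
  const indexBytes = indexCount * 4; // 32-bit indices
  return vertexBytes + indexBytes;
}
```

You could feed in `mesh.getTotalVertices()` and `mesh.getTotalIndices()` from the Inspector numbers.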
  14. This is what you are after: `mesh2.setParent(null);` That will unparent the mesh, while maintaining rotation and position in world coordinates (instead of local to the parent). If it doesn't work, just post a PG - I've got lots of similar versions of it in my game. If you have physics on your crane cargo, this is also where you would call `.wakeUp()` (ie: to get the crane cargo to fall). Cheers.
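A toy illustration (mine, not Babylon's implementation) of what `setParent(null)` preserves: with a translation-only parent, the child's world position is parent + local, and un-parenting rewrites the local position so the world position is unchanged.

```javascript
// Compute the child's preserved world position after un-parenting.
// Translation-only; real setParent(null) also preserves rotation/scaling.
function unparent(child, parent) {
  const world = {
    x: parent.x + child.x,
    y: parent.y + child.y,
    z: parent.z + child.z,
  };
  // the former local position is replaced by the world position
  return { x: world.x, y: world.y, z: world.z, parent: null };
}
```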
  15. Yes. The "active" meshes are the ones that slow down your scene. Notice here where invisible meshes (`isVisible == false` and `visibility == 0`) are not included: If you `dispose()` a mesh instead of hiding it, I would think it would give you a very marginal speed increase, but at the cost of slow re-importing. I would suggest keeping the meshes, if memory usage is not a concern. The best thing is to do a speed comparison, as empirical evidence (ie: FPS/`engine.getDeltaTime()`) does not always show what would seem intuitive.
  16. Are you calling `enableInteractions()` after setting the `raySelectionPredicate`? That would overwrite your own predicate - otherwise your predicate should take precedence. Also, are your distances large - like over 100?
  17. I don't know if exporting from 3DS to glTF instead will help (yet - maybe it's too early). I have not looked into that exporter, but I did read an interesting article Adobe posted 2 weeks ago that may be of interest to you. Check out the section on lighting (KHR_lights_punctual, EXT_lights_image_based):
  18. @Sebavan looks like a custom C++ implementation of OpenGL ES 2.0. For Android they use a JSBridge (based on Cordova) - the iOS bridge is based on WebViewJavascriptBridge. I looked into their project a bit - no commits since April and issues are not getting replies. Going to give it a miss for now, but I will be checking in to see if they progress, since it's a great idea - maintaining their own engine is, I think, a lot of effort even to get to ES 3.0. It ties in so nicely with Weex - I hope they invest in this project further.
  19. You may not be interested in the implementation details, but I'm going to show you why they don't have a rotation property and how to get around that. The 3D controls only have position, which delegates to the 'node' member variable: You can use `linkToTransformNode()` as ssaket explains. Here is the code for the linking - as you can see it just adds a parent (which is best for positioning multiple controls in 3D together): You can also directly rotate 3D controls by accessing their 'node' property! Note that 'node' is only available after they are added to the manager. Here I am rotating the button 45 degrees.
  20. Has anybody written anything for GCanvas with BabylonJS on a mobile device? Looks promising... "At this moment, we have already supported 90% of 2D APIs and 99% of WebGL APIs." If nobody has tried it out - I'll post back here when I get a chance to, which will take a while... Cheers.
  21. Here is an open source game running on Heroku and mLab (JavaScript server with MongoDB; websockets for 4-player peer communications):
  22. My initial goal was to go cheap and serverless, with something I could run locally for development. I did not end up with a free solution - I am running Azure Functions for my API and a relational DB (it was much cheaper than DocumentDB). I may switch my website (App Service) to be serverless at some point - Azure now allows websites to run from their CDN (like AWS S3 has for ages). I am not currently doing proper analytics, but I do track things like levels attempted/completed/etc. In retrospect it may have been cheaper and easier to just host a website somewhere. Hope that helps a bit. I think the best option would be an affordable, usage-based, turnkey solution that can be configured to store and query different data structures, with user management/authentication.
  23. I would guess something with pivots, if there are multiple meshes at position {0,0,0} - in which case the real location would be the opposite of the pivot? Pivots depend on how your OBJ file was generated. An easy way to test is to rotate one of the meshes and see how it rotates, or have a look in the Inspector.
  24. In addition to what Sebavan has said, have a look at assigning an AdvancedDynamicTexture to a mesh (4th parameter): You will see from the comment on the variable 'onlyAlphaTesting' that alpha testing will be used, but not alpha blending. Also note how different textures are assigned to the material (ie: `hasAlpha = true` doesn't assign the opacityTexture, but does assign the diffuseTexture). HTH.
  25. There's a (public, but hidden) rebuild method on textures as well - I use it for HMR. There is also a `rebuild()` on the pipeline processing, among others.