About Christoph
  1. Check this one if all you need is static tiles. I benchmarked the "lookup" on very old devices (iPad 2) some time ago, and it was faster than batching tiles the way you do.
  2. Mea culpa. My head hurts and I guess I messed up the question, so let me try again. I want to use a WebSocket server so users can exchange messages in a simple chat; that is the untrusted-user-input part. Because those messages are sent by users, could contain malicious input, and are broadcast to other users, filtering on the server side is a must; otherwise it will lead to XSS. However, I was not able to find anything relating XSS to WebGL or the HTML5 canvas, which Babylon.js uses for text rendering; that is where the user input ends up and where XSS could happen.
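A minimal sketch of the server-side filtering described above; the function name and the limits are made up for illustration. It caps the length, strips control characters, and HTML-escapes as defense in depth, even if the client only draws the text onto a canvas rather than into the DOM:

```javascript
// Hypothetical server-side filter applied to a chat message before
// broadcasting it to the other clients.
function sanitizeChatMessage(raw) {
  const trimmed = String(raw).slice(0, 500);                      // cap message length
  const noControl = trimmed.replace(/[\u0000-\u001f\u007f]/g, ""); // strip control chars
  return noControl
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

console.log(sanitizeChatMessage('<img src=x onerror=alert(1)>'));
// &lt;img src=x onerror=alert(1)&gt;
```

Escaping at the broadcast boundary keeps the stored and forwarded payload inert no matter where a client later renders it.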
  3. Hi, I've been wondering how to deal with user input while using Babylon.js and where I need to pay attention. My idea is to use WebSockets plus Babylon.js to render some text so users can talk to each other. While I assume the WebGL part is rather safe (because no user input enters the DOM), I wonder how safe the HTML5 canvas and its fillText() and strokeText() methods are.
  4. Check 2.1.c. You are allowed to use them in 3D models; however, you need to put a proper license on the model. I think the main reason is to ensure you do not redistribute their assets under some copyleft license.
  5. #2 Their license is among the more permissive ones.
  6. In order to force your application into your defined AA, try changing "enhance the application setting" to the last option, which is called something like "disable application settings". My Nvidia settings are in German, so I don't know the exact English wording. That should override the application's settings and use your own. Flickering, however, is usually caused by your depth buffer. Note: the size of your room does not matter for the buffer. You can test whether that is the cause by moving the "flickering" objects slightly apart so they no longer overlap or sit too close together.
  7. How about vertex colors or UVs and one triangle? If you set all the vertex colors of your triangle to the same value, it is filled with that color. If you place its vertices correctly, you can have a line with increasing thickness. A possible triangle would be: (x = 0, y = 2, z = 0), (x = 100, y = 3, z = 0), (x = 100, y = 1, z = 0). This draws a "line" that is 2 units "thick" at its end and 100 units "long". From such building blocks you can construct all kinds of lines, for example by adding a "rounded" head, though this really depends on your needs. By supplying UVs you can map anything you want onto your triangle.
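The triangle above can be written out as the flat arrays that Babylon.js-style vertex data expects (x, y, z per vertex for positions; r, g, b, a per vertex for colors). This is plain JavaScript, just to make the layout and the resulting dimensions concrete:

```javascript
// One triangle acting as a "line" of increasing thickness.
const positions = [
  0,   2, 0,  // tip of the "line"
  100, 3, 0,  // upper far corner
  100, 1, 0,  // lower far corner
];
// Identical color on all three vertices => triangle filled uniformly.
const colors = [
  1, 0, 0, 1,
  1, 0, 0, 1,
  1, 0, 0, 1,
];
const indices = [0, 1, 2];

// Length along x and thickness at the far end, from the coordinates above.
const length = positions[3] - positions[0];    // 100 - 0
const thickness = positions[4] - positions[7]; // 3 - 1
console.log(length, thickness); // 100 2
```

In Babylon.js you would assign these arrays to a `VertexData` object and apply it to a mesh; the flat layout stays the same.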
  8. I'm not sure exactly what you are looking for; however, as far as reducing vertices is concerned, you can either go for a bump map / normal map, or go in the opposite direction and look into displacement mapping (displacing vertices with a texture).
  9. Is your data sorted, i.e. you get a stream of 20k points per second and all you do is throw away the oldest 20k points?
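If that is the access pattern, a fixed-size ring buffer is the usual fit: the newest point overwrites the oldest slot in place, with no per-second reallocation. A hypothetical sketch (class name and capacity chosen for illustration):

```javascript
// Fixed-size ring buffer keeping only the newest `capacity` points.
class PointWindow {
  constructor(capacity) {
    this.capacity = capacity;
    this.buffer = new Float32Array(capacity);
    this.head = 0;   // next write position
    this.size = 0;   // how many slots are filled
  }
  push(value) {
    this.buffer[this.head] = value;              // overwrite the oldest slot
    this.head = (this.head + 1) % this.capacity; // wrap around
    if (this.size < this.capacity) this.size++;
  }
}

const win = new PointWindow(3);
[1, 2, 3, 4, 5].forEach(v => win.push(v));
// The oldest points (1, 2) were overwritten in place.
console.log(Array.from(win.buffer)); // [4, 5, 3]
```

Note the stored order is rotated; for rendering you would read `size` values starting at `head`, or simply upload the whole buffer if order does not matter.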
  10. for (var i = 0; i < mscene.meshes.length; i++) {
          if (mscene.meshes[i].isVisible == false) continue;
          var bounds = mscene.meshes[i].getBoundingInfo();
      That's 99% your culprit: you derive your min/max x, y, z values from different meshes in your scene. In other words, you are distorting your camera by using something like: Left = -5, Right = +10, Top = +6, Bottom = -9. Instead you want to use: scale = 5, Left = -scale, Right = scale, Top = scale, Bottom = -scale. This captures your "world" from -5 to +5 around the camera's position and "squeezes" it into its output. If you use different scales for x and y, however, you distort the output. Edit: Here is a quick "fix" - Lines 171-173.
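The fix described above can be sketched in plain JavaScript, with mock objects standing in for Babylon.js meshes and their bounding info: take one uniform half-extent over all visible meshes and use it for all four orthographic bounds so nothing gets stretched.

```javascript
// Mock meshes; in Babylon.js these values would come from getBoundingInfo().
const meshes = [
  { isVisible: true,  min: { x: -5,  y: -9  }, max: { x: 10, y: 6  } },
  { isVisible: false, min: { x: -99, y: -99 }, max: { x: 99, y: 99 } },
];

// One uniform half-extent covering every visible mesh.
let scale = 0;
for (const mesh of meshes) {
  if (!mesh.isVisible) continue; // skip hidden meshes, as in the loop above
  scale = Math.max(scale,
    Math.abs(mesh.min.x), Math.abs(mesh.max.x),
    Math.abs(mesh.min.y), Math.abs(mesh.max.y));
}

// Symmetric bounds: same magnitude on every side, so no distortion.
const ortho = { left: -scale, right: scale, top: scale, bottom: -scale };
console.log(ortho); // { left: -10, right: 10, top: 10, bottom: -10 }
```

With a real camera you would then assign these four values to its `orthoLeft` / `orthoRight` / `orthoTop` / `orthoBottom` properties.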
  11. If your sphere consists of some kind of geometry with UV-mapped textures, distorting the geometry should distort the displayed texture as well. According to the PIXI API, you should (for example) specify both the vertices and their UV coordinates; if you want this to be performant, you need to distort the vertices in your vertex shader as well. Please note that I have not tried this.
  12. You could use a sprite sheet instead, so you can switch between many different "sprites": spriteA.png would be cell 1, spriteB.png would be cell 2.
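The cell lookup amounts to mapping a cell index to a UV offset and scale on the sheet texture. A small sketch (function name and the 4x2 layout are made up for illustration; cells are numbered left to right, top to bottom):

```javascript
// Map a sprite-sheet cell index to the UV offset/scale for that cell.
function cellToUV(cellIndex, cols, rows) {
  const col = cellIndex % cols;               // column within the row
  const row = Math.floor(cellIndex / cols);   // which row the cell is in
  return {
    uScale: 1 / cols,
    vScale: 1 / rows,
    uOffset: col / cols,
    vOffset: row / rows,
  };
}

// Cell 1 of a 4x2 sheet: second cell in the first row.
console.log(cellToUV(1, 4, 2));
// { uScale: 0.25, vScale: 0.5, uOffset: 0.25, vOffset: 0 }
```

Switching the displayed "sprite" is then just writing a new offset to the texture instead of swapping texture files.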
  13. Ah, I see. So the solution is to add the texture as an additional opacity texture.
  14. After experimenting with u/v scales and offsets (I want to use planes for sprites in my project), I noticed that you cannot use any alpha on your emissive textures; the background will be black. Quick demo: swap the emissiveTexture to diffuseTexture and the alpha works. According to the documentation this should work, though?
  15. I don't think so. I confused active and total meshes in that post, so the edit I made is not correct.