
Exca

Everything posted by Exca

  1. You can check the network tab in browser debugger (f12) to see if the textures are visible there and with a result of 200 (http ok). Also using pixi loader is a good idea, as then the assets are already loaded when you are going to play the animation. Otherwise it might happen that images are still loading when the animation is about to play.
  2. Oops, just noticed that I forgot to write Math.max and Math.min. I edited the above response to correct it. But basically this is how it works: let's say you have a value of 10 and your allowed range is 2-5. First you take the maximum of the range start (2) and your value (10). This returns 10, and if you had for example -1 as the value, then the result would be 2. So basically it raises the return value to the range start if the value is below it. Then do the same with the end of the range using Math.min to cap the value at 5, and you end up with a value that's inside the wanted range. You could also implement this as a single clamp helper.
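The max-then-min chain described above can be sketched as a small helper (plain JS, the function name is just illustrative):

```javascript
// Clamp a value into [min, max].
// Math.max lifts values below the range start up to min,
// then Math.min pulls values above the range end down to max.
function clamp(min, max, value) {
  return Math.min(max, Math.max(min, value));
}

console.log(clamp(2, 5, 10)); // above the range -> 5
console.log(clamp(2, 5, -1)); // below the range -> 2
console.log(clamp(2, 5, 3));  // already in range -> 3
```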
  3. In javascript the scope of a function depends on how it's executed. For example if you have a class and you call it with myClass.foobar() then it is executed in the scope of the class. But if you would do something like var func = myClass.foobar; func(); it would execute outside the class's scope. Basically the same thing happens when the event handler is run, as you pass only the function and not its scope. The old way of handling this was something like: var that = this; function handler() { that.foobar(); } window.addEventListener("something", handler);
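The detached-function problem and the two modern fixes (bind and arrow function) can be sketched in plain JS; the class and method names below are made up for illustration:

```javascript
class Counter {
  constructor() {
    this.count = 0;
    // Fix 1: bind once, so the detached function keeps this instance as `this`.
    this.boundIncrement = this.increment.bind(this);
    // Fix 2: an arrow function captures `this` lexically from the constructor.
    this.arrowIncrement = () => { this.count++; };
  }
  increment() { this.count++; }
}

const counter = new Counter();

// Passing the bare method detaches it from its scope, exactly like
// handing a callback to addEventListener does.
const detached = counter.increment;
// detached(); // would throw: `this` is undefined (class bodies are strict mode)

counter.boundIncrement(); // works: count -> 1
counter.arrowIncrement(); // works: count -> 2
console.log(counter.count); // 2
```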
  4. If you can't edit the source material to be POT, you could render the original texture into a rendertexture that is POT, then create a texture that uses that POT rendertexture as its basetexture. That way you'd have a POT image even if the source isn't, and mipmapping would be possible.
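A sketch of the idea. The `nextPow2` helper is plain JS; the PIXI calls in the comments assume a v5-style API (`PIXI.RenderTexture.create`, `renderer.render`) and are untested here:

```javascript
// Round a dimension up to the next power of two so mipmapping becomes possible.
function nextPow2(n) {
  let p = 1;
  while (p < n) p *= 2;
  return p;
}

// Hypothetical PIXI usage (not executed here):
//   const w = nextPow2(sprite.width), h = nextPow2(sprite.height);
//   const rt = PIXI.RenderTexture.create({ width: w, height: h });
//   renderer.render(sprite, rt); // draw the NPOT source into the POT target
//   const potTexture = new PIXI.Texture(rt.baseTexture); // POT, mipmaps allowed

console.log(nextPow2(300)); // 512
console.log(nextPow2(512)); // already POT -> 512
```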
  5. Can you check from your browser debugger's network tab that the frames are loaded when the playback fails?
  6. The scope of the event handler function is incorrect. You can either bind the function to the scope of the class or use an arrow function to get the same effect. So writing the callback like: private onButtonClick = (): void => { console.log("clicked on button"); let btnTest = this.createMenuButton(); // works now, `this` is the class instance console.log(this.uiContainer); //Launch an event to show a sprite... } would fix it.
  7. When you handle the movement just add clamping to the values so they can't go any further. So in your case, if I assume that at 0,0 the container is at the top left of the canvas, then you could do something like this: function clamp( min, max, value ) { return Math.min(max, Math.max(min, value)); } const maxBorder = 30; container.x = clamp( maxBorder, container.width - canvas.width - maxBorder, container.x ); container.y = clamp( maxBorder, container.height - canvas.height - maxBorder, container.y ); Might be an error in the logic somewhere, don't have a proper testbed currently. In that code you would apply the clamp every time the position changes.
  8. Calculate the angle between the mouse and the object and then set that as your object's rotation: const mouse = renderer.plugins.interaction.mouse.global; const dx = mouse.x - object.x; const dy = mouse.y - object.y; object.rotation = Math.atan2( dy, dx ); Written from memory without testing so not 100% sure everything is correct, but that's the basic idea. Also depending on what your object's "forward" direction is, you might need to add an offset to the atan2 result.
  9. Progressive web apps (PWA) are one way to create apps with web technologies without using wrapping. If on the other hand wrapping is needed, then cordova based ones are still the most common ones.
  10. I was thinking of doing that in 2d, but just doing it so that it looks like 3D. For example if you animate 5 ropes with varying points you can get a semi3d looking set. Or if you create a mesh that you rotate and squash a bit and then move the vertices up/down you can get a mesh that looks like it was orthogonal 3d without doing any real 3D stuff.
  11. Depending on the style that could be done pretty easily with either deforming a plane mesh (with gpu or cpu, first option being much faster) or using ropes and then just moving points along the x-axis.
  12. You could extract the frames from the canvas with toDataURL or use the extract plugin to get the actual pixel data of a single frame. Then gather all of the frames and encode them into a video with some encoder; dunno if those exist in the browser, most likely someone has made one. You can do stacked rendering in multiple different ways:
      - Have 2 containers that move with different speeds.
      - Use overlapping canvases, each with their own renderer, and move them.
      - Have each object in the scene carry a depth value and move everything based on that.
      - Use rendertextures as layers and compose them.
  13. The currentTime is a value that starts increasing when you start the context. That's why the timing is done with the current-start calculation. https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/currentTime 80ms sounds like it is not calculating correctly. Haven't used pixi sound myself so not really sure what might be wrong with the progress event. If the audio is played without webaudio (which shouldn't happen on a modern browser unless explicitly told to do so) then those delays sound possible.
  14. Usually with webaudio you would use the audio context's currentTime and store the starting time when the song is started. From that you can calculate position = context.currentTime - soundStartedAt. This should have no delay at all. Are you using pixi sound? Checking that source code, the progress event looks like it should have correct timing. Can you check whether it is using webaudio internally or htmlaudio? The latter one does not have exact timing available.
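The timing calculation above can be sketched like this; the `context` object here is a mock standing in for a real Web Audio `AudioContext` so the math is visible:

```javascript
// Compute playback position from an AudioContext-style clock.
// currentTime keeps increasing from the moment the context starts,
// so the position is simply "now" minus "when the sound started".
function playbackPosition(context, soundStartedAt) {
  return context.currentTime - soundStartedAt;
}

// Mock context for illustration only; in the browser this would be
// an AudioContext instance.
const context = { currentTime: 12.5 };
const soundStartedAt = 10.0; // context.currentTime captured when play() was called

console.log(playbackPosition(context, soundStartedAt)); // 2.5 seconds into the sound
```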
  15. Easy way on how to get better looking downscaling is to find out the points where your game starts looking bad and then instead of scaling down the game containers / elements, you keep that good looking resolution and scale down the canvas element. This way the downscaling algorithm is not impeded by webgl limitations but can use the one that browser uses natively. Little bit hacky way, but it's a pretty well working workaround that can be achieved with small effort.
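The workaround above can be sketched as a pure sizing function; the 1280x720 "good looking" threshold is a made-up example value, and the canvas wiring in the comments is the assumed usage:

```javascript
// Keep rendering at a fixed good-looking resolution and let CSS scale the
// canvas element down, so the browser's native filtering does the downscale.
function displaySizes(targetWidth, targetHeight, minGoodWidth, minGoodHeight) {
  // The internal drawing buffer never drops below the good-looking resolution.
  const internalWidth = Math.max(targetWidth, minGoodWidth);
  const internalHeight = Math.max(targetHeight, minGoodHeight);
  return {
    internal: { width: internalWidth, height: internalHeight }, // canvas.width / height
    css: { width: targetWidth, height: targetHeight },          // canvas.style.width / height
  };
}

// Window shrank to 640x360 but the game looks bad below 1280x720 (example threshold):
const sizes = displaySizes(640, 360, 1280, 720);
console.log(sizes.internal); // { width: 1280, height: 720 }
console.log(sizes.css);      // { width: 640, height: 360 }

// Assumed browser wiring (not executed here):
//   canvas.width = sizes.internal.width;
//   canvas.style.width = sizes.css.width + "px";
```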
  16. You could make a filter that takes the world texture and the light mask as inputs, then draws the world only where the light has a value at that position and keeps everything else hidden. Something like this: vec4 world = texture2D(worldTex, uv); float light = texture2D(lightTex, uv).a; //Using only one channel for light, this could also be light color + alpha for intensity gl_FragColor = mix( vec4(0.0, 0.0, 0.0, 1.0), world, light );
  17. Do you have only a canvas on the page? Then using Lighthouse won't give very much detail, as it has no components to analyze canvases, only the dom-side of things, load speeds and stuff like that. Most likely the LCP is for the canvas element and for some reason it fails to be analyzed. It should work similarly to images. Do you have an example of the site which fails?
  18. How do you use it and what sizes are your textures? How complex is the scene you render to RT? There's no additional hidden costs in rendering to rendertexture vs. rendering to screen, other than the additional memory usage.
  19. Just listen for wheel event on the window/document/element https://developer.mozilla.org/en-US/docs/Web/API/Element/wheel_event
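A minimal handler sketch. Only the `wheel` event name and its `deltaY` field come from the spec; the zoom factors and bounds below are arbitrary illustration values:

```javascript
// Turn wheel events into a zoom factor. deltaY is positive when scrolling
// down/away from you, so scrolling up (negative deltaY) zooms in.
function applyWheel(zoom, event) {
  const next = zoom * (event.deltaY < 0 ? 1.1 : 0.9);
  // Keep the zoom within sane bounds (bounds chosen arbitrarily here).
  return Math.min(4, Math.max(0.25, next));
}

// Assumed browser wiring (not executed here):
//   window.addEventListener("wheel", (e) => { zoom = applyWheel(zoom, e); });

// Simulated events for illustration:
let zoom = 1;
zoom = applyWheel(zoom, { deltaY: -100 }); // zoom in  -> 1.1
zoom = applyWheel(zoom, { deltaY: 100 });  // zoom out -> ~0.99
console.log(zoom);
```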
  20. You have basically two options. 1. Render the expensive stuff into a separate rendertexture and use that as you would any other sprite. Rerender the rt when things change. 2. Use two canvases. Update the expensive canvas only when needed. [Edit] For 1 you can use cacheAsBitmap = true to create rt of a container and use that instead of the whole render list. Though I'd suggest using a custom handling with own RT to handle this as debugging cacheAsBitmap can be a nightmare if there's some errors.
  21. I don't think the regular tilingsprite supports mirrored rendering out of the box. You could create a custom shader for it that detects whether it's on an odd or even tile and flips the uv-coordinates accordingly.
  22. Check the network tab to see if you are still getting the cors error. To fix that you would need to either run the game on the same domain as the images so cors-rules don't apply, or have the server send an Access-Control-Allow-Origin header that allows your domain or just a wildcard. The renderer's null error might just be due to the asset not being loaded because cors blocks it.
  23. How many sprites do you have? There's a batch size limit. When that is reached, the current batch is rendered and a new one is started. You can change that size with PIXI.settings.SPRITE_BATCH_SIZE. Default is 4096.
  24. 2048x2048 is always a safe bet. It's supported by virtually all devices that can run webgl. There used to be a site that collected statistics on different device data but it seems to be gone from the internet. You could use gl.getParameter( gl.MAX_TEXTURE_SIZE ) to get largest dimension the device supports. Maximum I would go is 4096x4096 and for cases where device says lower use either a downscaled version or have multiple textures.
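The query plus the 4096 cap described above can be sketched like this; `pickTextureSize` is a made-up helper name, and the `gl.getParameter` call in the comments is the real WebGL API but is not executed here:

```javascript
// Choose an atlas size: never above 4096, never above what the device
// reports, snapped down to a power of two (atlases are usually POT).
function pickTextureSize(maxDeviceSize) {
  const preferred = 4096;
  const size = Math.min(preferred, maxDeviceSize);
  let pot = 1;
  while (pot * 2 <= size) pot *= 2;
  return pot;
}

// Assumed browser usage (not executed here):
//   const gl = canvas.getContext("webgl");
//   const maxDeviceSize = gl.getParameter(gl.MAX_TEXTURE_SIZE);

console.log(pickTextureSize(16384)); // high-end device -> 4096
console.log(pickTextureSize(2048));  // conservative device -> 2048
```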
  25. The bottleneck when rendering squares is just the amount of squares you would need to render if you used basic sprites. Rendering the squares to a rendertexture and then rendering that texture to the screen would make the frames where nothing changes faster. But when the rendertexture needs to be rerendered it would still take some time, and in webgl the whole frame gets repainted every time. If it's sufficient to have single pixels as the squares and you don't need anything fancy like borders / textures on the squares, then you could use an additional 2d canvas and do the game of life rendering there directly on the pixel data.