themoonrat

Members
  • Content Count: 274
  • Joined
  • Last visited
  • Days Won: 2

Everything posted by themoonrat

  1. No. You'll have to implement those yourself via tweening the alpha property of containers.
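      A bare-bones illustration of fading a container in by tweening its alpha (a sketch using PIXI v4's shared ticker; the 0.02 step and the name `container` are arbitrary):

          container.alpha = 0;
          PIXI.ticker.shared.add( function fadeIn( delta ) {
              container.alpha = Math.min( container.alpha + 0.02 * delta, 1 );
              if ( container.alpha === 1 ) {
                  PIXI.ticker.shared.remove( fadeIn );   // stop updating once fully visible
              }
          } );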
  2. Ok, there are a couple of crazy things there. In input.js, you are adding the pointerdown and pointerup event listeners EVERY FRAME. You only need to add an event listener once. In initialize.js, the function you are passing does not exist in that file's scope: Button[index].on( 'pointerdown', onClickButton(index) ); The way you use the function callback also won't work. You need to supply a reference to a function, or an anonymous function, as that parameter. As written, you are calling that function there and then and passing its return value instead.
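      A corrected sketch, assuming Button is your array of interactive sprites and onClickButton is the handler from your own code:

          // attach the listener once, at setup time, not inside the frame loop
          Button[ index ].interactive = true;
          Button[ index ].on( 'pointerdown', () => onClickButton( index ) );
          // or, if the handler only needs the event object, pass the reference directly:
          // Button[ index ].on( 'pointerdown', onClickButton );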
  3. Use a breakpoint and look within the event object after adding a pointermove event: renderer.plugins.interaction.on( 'pointermove', ( event ) => { console.log( event ) } ); All the info you need is in there
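      For instance, to pull out the useful bits (a sketch assuming PIXI v4's interaction plugin; data.global and target are standard properties of the interaction event):

          renderer.plugins.interaction.on( 'pointermove', ( event ) => {
              const pos = event.data.global;                 // pointer position in stage coordinates
              console.log( pos.x, pos.y, event.target );     // event.target is the display object under the pointer, if any
          } );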
  4. renderer.plugins.interaction.on( 'pointerdown', ( event ) => { console.log( event ) } );
  5. Take a look at the examples page which demonstrates usage https://pixijs.github.io/examples/#/demos/interactivity.js
  6. There is a big performance advantage to having as few images loaded and used as possible. Therefore games tend to use 'sprite/texture atlases', in which lots of individual images are combined into one big one. The libs can then use the json data to work out which part of that big image contains the smaller image needed to be displayed. Don't worry, it's not done by hand; there are lots of tools to generate all this for you, like TexturePacker. Pixi can natively use the json format exported by that tool.
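      A rough sketch of using such an atlas in PIXI v4 (sheet.json, enemy.png and stage are placeholders for your own exported atlas, a frame inside it, and your root container):

          PIXI.loader
              .add( 'sheet', 'assets/sheet.json' )   // pixi loads the matching atlas image automatically
              .load( ( loader, resources ) => {
                  // each frame in the atlas becomes a named texture cut from the one big image
                  const enemy = new PIXI.Sprite( resources.sheet.textures[ 'enemy.png' ] );
                  stage.addChild( enemy );
              } );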
  7. Check the official pixi wiki page on github: https://github.com/pixijs/pixi.js/wiki Most specifically: https://github.com/pixijs/pixi.js/wiki/v4-Performance-Tips The main thing though: profile, profile, profile! Use the Google Chrome profiler to see where performance is being lost. It's very very hard to answer questions of performance in an online forum because every game and every scene is different. Only you can find out where things are slow by using the tools available.
  8. @JeZxLee you are using an old version of PIXI, v4.0.0. This issue does not exist in the latest version, 4.5.3
  9. The refresh rate of the monitor will affect fps... the browser may decide not to send a new requestAnimationFrame if the monitor won't be refreshing to display any changes.
  10. The documented setting for forcing the canvas renderer works when creating a pixi application or auto detect renderer. If it's not working for you, you're not using the setting correctly!
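      For reference, a sketch of both documented routes in PIXI v4 (the width/height values here are arbitrary):

          // via the Application helper
          const app = new PIXI.Application( 800, 600, { forceCanvas: true } );

          // or via autoDetectRenderer, where the 4th argument is 'noWebGL'
          const renderer = PIXI.autoDetectRenderer( 800, 600, {}, true );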
  11. Just google "Why is WebGL faster than Canvas". WebGL gives PIXI direct access to the GPU, and as such it can choose how to batch up certain commands for the GPU to run in an efficient manner. A simple example being: if you are drawing the same sprite over and over, you don't need to keep telling the GPU the details of the sprite every time, you just say here's your current texture, now draw here, here, here, here and here. Therefore WebGL is very very quick at drawing the same texture multiple times at once. Other tricks like masking and filters can be done totally on the GPU too.

      The Canvas API does not give you direct access to the GPU. You are using a general purpose, simple API provided by the browser; every command you make will use some CPU to use this API, and it cannot optimize how to send the draw commands to the GPU. In the above example, each draw call is its own thing; the batching up of super fast drawing of the same texture cannot occur. As a large generalization, the way to get performance up in the Canvas renderer is to make fewer draw calls.

      So, your game WILL ALWAYS be slower in the Canvas renderer than the WebGL renderer, and there's nothing you can do about that. It's not unreasonable to have 2 code paths on occasion: one for clients using WebGL, and one for clients using Canvas. The WebGL version has all the bells and whistles and is the actual true vision of the game. Canvas mode (for me at least) is 'let the player play and experience the game', which comes at the cost of lower resolution textures, a lower resolution game, and fewer (or no) fancy effects.

      Other io games are doing this because they've written their games to perform well on the Canvas renderer. It's like making a game for a console. If you write a game targeted at a PS4, then run it on a PS3, it's not going to perform well. You often have to tweak the game vision to match the hardware you are targeting.
  12. Use the prepare plugin to upload the textures to the gpu before you wish to show them (ideally include this within your loading process)
  13. I find that if I open 2 games side by side, requestAnimationFrame itself only fires 30 times a second on each tab rather than the usual 60, even though my PC could easily handle both at 60. The browser itself throttles how often it sends requestAnimationFrame and there's nothing you can do to stop that. I would record a time before and after you call renderer.render( stage ), and see how much time in ms each render is taking. Also measure how long it is between each requestAnimationFrame call coming in. Then you can compare the difference. If the render time is only 2ms, but there's 33ms between rAF calls, then it's not your game that is causing the lower frame rate. If your game render time takes 20ms, then you are over budget for a 60fps game and thus you have optimisations to do.
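      A minimal sketch of that measurement (plain requestAnimationFrame loop; renderer and stage are assumed to already exist):

          let lastRafTime = performance.now();

          function mainLoop( now ) {
              const rafGap = now - lastRafTime;              // time between rAF calls, in ms
              lastRafTime = now;

              const renderStart = performance.now();
              renderer.render( stage );
              const renderTime = performance.now() - renderStart;   // how long the render itself took, in ms

              console.log( 'rAF gap: ' + rafGap.toFixed( 1 ) + 'ms, render: ' + renderTime.toFixed( 1 ) + 'ms' );
              requestAnimationFrame( mainLoop );
          }
          requestAnimationFrame( mainLoop );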
  14. Indeed. I personally keep a count of how many textures I've passed to prepare, then count how many callbacks I've received as they complete; voila, a percentage.
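      Something along these lines (a sketch, assuming PIXI v4's prepare plugin and an array of textures you've gathered yourself):

          const texturesToUpload = [ /* textures collected from your loaded resources */ ];
          let uploaded = 0;

          texturesToUpload.forEach( ( texture ) => {
              renderer.plugins.prepare.upload( texture, () => {
                  ++uploaded;
                  const percent = Math.round( ( uploaded / texturesToUpload.length ) * 100 );
                  console.log( 'GPU upload: ' + percent + '%' );
              } );
          } );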
  15. Pretty much any slot game on sky vegas or any other website, played on an ios device, will have this 'swipe up to play' mechanism. You detect when to 'stop' by comparing window.innerHeight to window.screen.height.
  16. Indeed, the only way is to simulate scrolling down the webpage, as ios safari shrinks the top bar in that scenario. To simulate scrolling down the webpage, you need to put an overlay div on top of your game that is taller than your game, to give enough height to scroll down.
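      A rough sketch of both halves of that mechanism, the overlay and the 'stop' check from the previous reply (plain DOM; the 120vh height and the 0.99 tolerance are arbitrary choices):

          // overlay taller than the viewport so the user has something to swipe/scroll on
          const overlay = document.createElement( 'div' );
          overlay.style.cssText = 'position:absolute; top:0; left:0; width:100%; height:120vh; z-index:10;';
          document.body.appendChild( overlay );

          // once the inner height is (almost) the full screen height, the browser UI has minimised
          window.addEventListener( 'resize', () => {
              if ( window.innerHeight >= window.screen.height * 0.99 ) {
                  document.body.removeChild( overlay );   // remove the overlay and let the game start
              }
          } );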
  17. For tweening: https://github.com/Nazariglez/pixi-tween (I have my own fork on github which I use, adding promises and various bug fixes) For particles: https://github.com/pixijs/pixi-particles with its amazing editor https://github.com/pixijs/pixi-particles-editor Extra WebGL filters: https://github.com/pixijs/pixi-filters And, easy enough to integrate into a pixi workflow, the best lib for sound imo: https://github.com/goldfire/howler.js
  18. You want to use the prepare plugin in pixi: renderer.plugins.prepare.upload(). You can pass through a sprite, texture, base texture etc. and it will upload any base textures to the gpu, so when it first comes to be used there is no decoding lag at that moment.
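      For instance (a sketch assuming PIXI v4; myContainer is a placeholder for whatever you are about to show):

          // upload every base texture used under myContainer before it first appears on screen
          renderer.plugins.prepare.upload( myContainer, () => {
              stage.addChild( myContainer );   // the first render of it won't stall on texture uploads
          } );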
  19. You need to pass in the context to use in the third parameter for event listeners. this.sprite.on('mousedown', this.onClick, this);
  20. The ticker uses requestAnimationFrame in the background. Internally, it makes a record of the last time requestAnimationFrame fired, and compares it with the latest time requestAnimationFrame fired. It compares those 2 times, looks at your target frame rate (60fps by default), and therefore knows what the delta is. When you add your function to the ticker, then when requestAnimationFrame comes through, the ticker does its calculations, then calls your function, passing through the delta as its first parameter.
  21. So, I'd _really_ recommend using the built in pixi stuff for this, as Jinz suggested, as it'll give you all the information you need, and again I'll link to the example page; please read the code, it has comments explaining exactly what you want to know: https://pixijs.github.io/examples/#/basics/basic.js Note how it's added a function to the ticker... this could be your function to do all your game update stuff. Also note how it multiplies the rotation speed by the delta. The delta here is your critical friend in helping you deal with different framerates on different devices. If your game is running at 60fps, and your target frame rate is 60fps, then the delta will be 1. So, in this case, multiplying your position change by 1 does nothing. But imagine the game is only running at half speed, 30fps. Well, the delta sent through is now 2. This is just what you need! Your function is being called half the number of times, so it needs to move your sprite double the amount each time.
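      To make that concrete (a sketch assuming a PIXI v4 Application named app; speed is in pixels-per-frame at the 60fps target):

          const speed = 5;   // how far the sprite should move per frame at the target 60fps

          app.ticker.add( ( delta ) => {
              // delta is ~1 at 60fps, ~2 at 30fps, so movement stays consistent across framerates
              sprite.x += speed * delta;
          } );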
  22. [Re: Get all sprites?] Yeah... think of rendertextures like editing an image in photoshop: you have all these layers, and when you come to save a png or jpg it asks if you're ok flattening everything into 1 layer. That's your rendertexture, so all of the things that went into making it up are not going to show up in the scene graph. And there is a chrome plugin inspector https://github.com/bfanger/pixi-inspector & https://chrome.google.com/webstore/detail/pixi-inspector/aamddddknhcagpehecnhphigffljadon?hl=en I found it didn't play nicely and got confused by renderTextures, for which I have my own hacked version to work around that, but the above might work out of the box for you (the github version seems newer than the chrome store version, but didn't work for me at all)
  23. [Re: Get all sprites?]

          function recursiveFindSprites( spriteArray, displayObject ) {
              if ( displayObject.texture ) {
                  spriteArray.push( displayObject );
              }
              if ( Array.isArray( displayObject.children ) ) {
                  for ( let i = 0; i < displayObject.children.length; ++i ) {
                      recursiveFindSprites( spriteArray, displayObject.children[ i ] );
                  }
              }
          }

          let spriteArray = [];
          recursiveFindSprites( spriteArray, renderer._lastObjectRendered );

      Slightly hacky in that it's looking for a .texture property, which will also include Sprite-like items such as Text, but that's the if statement to fine-tune for your needs
  24. So, how I have this set up is I call:

          PIXI.loaders.Resource.setExtensionLoadType( 'mp3', PIXI.loaders.Resource.LOAD_TYPE.XHR );
          PIXI.loaders.Resource.setExtensionXhrType( 'mp3', PIXI.loaders.Resource.XHR_RESPONSE_TYPE.BLOB );

      for each audio format I have. I prefer to load data via XHR and as BLOBs because alongside a returned data blob is the size of it; useful info! Bonus tip 1: Prefer m4a over mp3. Better compression, better sound, better sound loops, same compatibility. Bonus tip 2: Also use ogg, and check browser compatibility for which one to load. You need both to cover all browsers.

      You'll need to create your own bit of 'middleware' to deal with audio. If you see https://github.com/englercj/resource-loader/blob/master/src/middlewares/parsing/blob.js - this comes included by default to handle blobs in certain formats, but it doesn't handle audio. It's very simple to create a new version of that file that works in the same way but handles audio instead. You don't need to create a new Audio tag in the way it creates a new Image tag... all you need to do is create an object url using the data blob. You can plug this new parsing functionality into PIXI's loader by doing something like PIXI.loader.use( handleAudioBlob );

      Now, put a callback on the loader for when an asset has loaded, and if it's an audio file, it'll already have the blob as a src from your custom middleware... you can use this as the src for a new Howl. After doing this, don't assume that it's immediately loaded; there's an audio decoding process that goes on once a new Howl is created, but you know when this is done via the 'onload' callback.
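      A sketch of what that middleware and Howl hookup could look like (assuming PIXI v4's bundled resource-loader and howler.js; handleAudioBlob, blobUrl and the 'music' key are placeholder names, not part of either library):

          // custom middleware: turn audio blobs into object urls, like the built-in blob middleware does for images
          function handleAudioBlob( resource, next ) {
              if ( resource.data instanceof Blob && resource.data.type.indexOf( 'audio' ) === 0 ) {
                  resource.blobUrl = URL.createObjectURL( resource.data );   // keep the url on the resource for later
              }
              next();
          }

          PIXI.loader.use( handleAudioBlob );
          PIXI.loader.add( 'music', 'assets/music.mp3' );
          PIXI.loader.load( ( loader, resources ) => {
              const sound = new Howl( {
                  src: [ resources.music.blobUrl ],
                  format: [ 'mp3' ],            // needed because an object url has no file extension to sniff
                  onload: () => sound.play()    // decoding is async; wait for onload before assuming it's ready
              } );
          } );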