themoonrat

Members
  • Content count

    136
  • Joined

  • Last visited

About themoonrat

  • Rank
    Advanced Member
  • Birthday 11/25/1982

Contact Methods

  • Website URL
    http://www.moonrat.co.uk
  • Twitter
    themoonrat

Profile Information

  • Gender
    Male
  • Location
    UK

  1. I find that if I open 2 games side by side, requestAnimationFrame itself only fires 30 times a second on each tab rather than the usual 60, even though my PC could easily handle both at 60. The browser itself throttles how often it fires requestAnimationFrame and there's nothing you can do to stop that. I would record a time before and after you call renderer.render( stage ) to see how many ms each render is taking, and also measure how long passes between each requestAnimationFrame callback; then you can compare the two (see the sketch below). If the render time is only 2ms, but there's 33ms between rAF calls, then it's not your game that is causing the lower frame rate. If your game render time takes 20ms, then you are over budget for a 60fps game and thus you have optimisations to do.
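
     A minimal sketch of that measurement (assuming a standard PIXI setup where renderer and stage already exist; the names here are just illustrative):

     let lastRafTime = performance.now();

     function gameLoop( now ) {
         // Gap between this rAF callback and the previous one (~33ms means the browser is only giving you ~30fps)
         const rafGap = now - lastRafTime;
         lastRafTime = now;

         // Time spent purely inside pixi's render call
         const renderStart = performance.now();
         renderer.render( stage );
         const renderTime = performance.now() - renderStart;

         console.log( 'rAF gap: ' + rafGap.toFixed( 1 ) + 'ms, render: ' + renderTime.toFixed( 1 ) + 'ms' );
         requestAnimationFrame( gameLoop );
     }

     requestAnimationFrame( gameLoop );
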
  2. Indeed. I personally keep a count of how many textures I've passed to prepare, then count how many callbacks I've received as they're done; voila, a percentage.
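
     A rough sketch of that counting approach (assuming PIXI v4's prepare plugin; updateLoadingBar is a hypothetical callback of yours):

     const itemsToUpload = [ texture1, texture2, sprite3 ]; // whatever you want pre-uploaded
     let uploaded = 0;

     itemsToUpload.forEach( function ( item ) {
         renderer.plugins.prepare.upload( item, function () {
             ++uploaded;
             updateLoadingBar( uploaded / itemsToUpload.length ); // 0..1 progress
         } );
     } );
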
  3. Pretty much any slot game on Sky Vegas or any other website, played on an iOS device, will have this 'swipe up to play' mechanism. You detect when to 'stop' by looking at window.innerHeight compared to window.screen.height.
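
     A hedged sketch of that check (the 0.9 threshold and the resize event are assumptions, and iOS reports screen dimensions in portrait terms, so orientation handling is simplified here):

     function browserBarsAreMinimised() {
         // Once the url/tool bars have shrunk, innerHeight gets close to the full screen height
         return window.innerHeight >= window.screen.height * 0.9;
     }

     window.addEventListener( 'resize', function () {
         if ( browserBarsAreMinimised() ) {
             hideSwipeUpOverlay(); // hypothetical: remove the 'swipe up to play' prompt
         } else {
             showSwipeUpOverlay();
         }
     } );
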
  4. Indeed, the only way is to simulate scrolling down the webpage, as iOS Safari shrinks the top bar in that scenario. To simulate scrolling down the webpage, you need to put an overlay div on top of your game that is taller than your game, to give enough height to scroll down.
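
     A minimal sketch of such an overlay (the sizes, id and threshold are assumptions, not a fixed recipe):

     const overlay = document.createElement( 'div' );
     overlay.id = 'swipe-up-overlay';
     // Taller than the viewport so the page itself becomes scrollable
     overlay.style.cssText = 'position:absolute; top:0; left:0; width:100%; height:150vh; z-index:10;';
     document.body.appendChild( overlay );

     // Once the user has swiped up and the bars have shrunk, remove the overlay again
     window.addEventListener( 'resize', function () {
         if ( window.innerHeight >= window.screen.height * 0.9 ) {
             overlay.style.display = 'none';
             window.scrollTo( 0, 0 );
         }
     } );
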
  5. For tweening: https://github.com/Nazariglez/pixi-tween (I have my own fork on github I use which adds promises and various bug fixes)
     For particles: https://github.com/pixijs/pixi-particles with its amazing editor https://github.com/pixijs/pixi-particles-editor
     Extra WebGL filters: https://github.com/pixijs/pixi-filters
     And, easy enough to integrate into a pixi workflow, the best lib for sound imo: https://github.com/goldfire/howler.js
  6. You want to use the prepare plugin in pixi: renderer.plugins.prepare.upload(). You can pass through a sprite, texture, base texture etc. and it will upload any base textures to the GPU, so when it comes to be used there is no minor decoding lag.
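
     For example (a hedged sketch assuming a PIXI v4 WebGL renderer; the asset path is made up):

     const texture = PIXI.Texture.fromImage( 'assets/bigBackground.png' );

     // Push the underlying base texture to the GPU now, rather than on first use
     renderer.plugins.prepare.upload( texture, function () {
         // Safe to show the sprite without a first-frame hitch
         stage.addChild( new PIXI.Sprite( texture ) );
     } );
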
  7. You need to pass in the context to use as the third parameter for event listeners: this.sprite.on('mousedown', this.onClick, this);
  8. The ticker uses requestAnimationFrame in the background. Internally, it records the last time requestAnimationFrame fired and compares it with the latest time it fired; from those 2 times and your target frame rate (60fps by default) it knows what the delta is. When you add your function to the ticker, then each time requestAnimationFrame comes through, the ticker does its calculations and calls your function, passing the delta through as its first parameter.
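
     Roughly what the ticker is doing internally (a simplified sketch, not pixi's actual source):

     const targetFPS = 60;
     const tickerListeners = []; // functions you've registered via ticker.add(fn)
     let lastTime = performance.now();

     function onRaf( now ) {
         const elapsedMS = now - lastTime;
         lastTime = now;

         // 1 when running at the target frame rate, 2 at half the frame rate, etc.
         const delta = elapsedMS / ( 1000 / targetFPS );

         tickerListeners.forEach( function ( fn ) {
             fn( delta );
         } );

         requestAnimationFrame( onRaf );
     }

     requestAnimationFrame( onRaf );
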
  9. So, I'd _really_ recommend using the built-in pixi stuff for this as Jinz suggested, as it'll give you all the information you need, and again I'll link to the example page; please read the code, it has comments explaining exactly what you want to know: https://pixijs.github.io/examples/#/basics/basic.js Note how it adds a function to the ticker... this could be your function to do all your game update stuff. Also note how it multiplies the rotation speed by the delta. The delta here is your critical friend in helping you deal with different framerates on different devices. If your game is running at 60fps, and your target frame rate is 60fps, then the delta will be 1. So, in this case, multiplying your position change by 1 does nothing. But imagine the game is only running at half speed, 30fps. Well, the delta sent through now is 2. This is just what you need! Your function is being hit half the amount of times, so it needs to move your sprite double the amount each time (see the snippet below).
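
     For example (a sketch of the same idea, assuming a PIXI.Application called app and a sprite already on stage; the speed value is made up):

     const speedPerFrame = 5; // pixels per frame at the target 60fps

     app.ticker.add( function ( delta ) {
         // At 60fps delta is ~1; at 30fps delta is ~2, so the sprite still covers the same distance per second
         sprite.x += speedPerFrame * delta;
     } );
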
  10. Yeah... think of renderTextures like editing an image in photoshop: you have all these layers, and when you come to save a png or jpg it asks if you're ok flattening everything into 1 layer. That's your renderTexture, so the individual things that went into making it up aren't going to show up in the scene graph. There is also a chrome inspector plugin: https://github.com/bfanger/pixi-inspector & https://chrome.google.com/webstore/detail/pixi-inspector/aamddddknhcagpehecnhphigffljadon?hl=en I found it didn't play nicely and got confused by renderTextures (I have my own hacked version to work around that), but the above might play nicely out of the box for you (the github version seems newer than the chrome store version, but didn't work for me at all).
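
     To illustrate the flattening (a hedged sketch using PIXI v4's generateTexture; the child sprites are arbitrary placeholders):

     const layers = new PIXI.Container();
     layers.addChild( backgroundSprite, decalSprite, textLabel ); // several 'layers'

     // Bakes everything currently in the container down into a single texture
     const flattened = renderer.generateTexture( layers );

     // From here on the scene graph only sees one sprite, not the pieces that made it up
     stage.addChild( new PIXI.Sprite( flattened ) );
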
  11. function recursiveFindSprites( spriteArray, displayObject ) {
          // Anything with a texture property gets collected (Sprites, but also Text etc.)
          if ( displayObject.texture ) {
              spriteArray.push( displayObject );
          }
          // Recurse into any children
          if ( Array.isArray( displayObject.children ) ) {
              for ( let i = 0; i < displayObject.children.length; ++i ) {
                  recursiveFindSprites( spriteArray, displayObject.children[i] );
              }
          }
      }

      let spriteArray = [];
      recursiveFindSprites( spriteArray, renderer._lastObjectRendered );

      Slightly hacky in that it's looking for a .texture property, which will also include Sprite-like items such as Text, but that's the if statement to fine-tune for your needs.
  12. So, how I have this set up is I call
      PIXI.loaders.Resource.setExtensionLoadType( 'mp3', PIXI.loaders.Resource.LOAD_TYPE.XHR );
      PIXI.loaders.Resource.setExtensionXhrType( 'mp3', PIXI.loaders.Resource.XHR_RESPONSE_TYPE.BLOB );
      for each audio format I have. I prefer to load data via XHR and as BLOBs because alongside a returned data blob is its size; useful info!
      Bonus tip 1: Prefer m4a over mp3. Better compression, better sound, better sound loops, same compatibility.
      Bonus tip 2: Also use ogg, and check browser compatibility for which one to load. You need both to cover all browsers.
      You'll need to create your own bit of 'middleware' to deal with audio. See https://github.com/englercj/resource-loader/blob/master/src/middlewares/parsing/blob.js - this comes included by default to handle blobs in certain formats, but it doesn't handle audio. It's very simple to create a new version of that file that works in the same way but just handles audio. You don't need to create a new Audio tag in the way that it creates a new Image tag... all you need to do is create an object URL from the data blob. You can plug this new parsing functionality into PIXI's loader by doing something like PIXI.loader.use( handleAudioBlob );
      Now, put a callback on the loader for when an asset has loaded, and if it's an audio file it'll already have the blob from your custom middleware... you can use this as the src for a new Howl. After doing this, don't assume that it's immediately ready; there's an audio decoding process that goes on once a new Howl is created, but you know when that is done via its 'onload' callback.
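
     A rough sketch of that middleware (assuming PIXI v4's bundled resource-loader and howler; handleAudioBlob, blobUrl, the extension list and asset paths are all just names made up for the example):

     // Tell the loader to fetch audio via XHR as blobs
     [ 'mp3', 'm4a', 'ogg' ].forEach( function ( ext ) {
         PIXI.loaders.Resource.setExtensionLoadType( ext, PIXI.loaders.Resource.LOAD_TYPE.XHR );
         PIXI.loaders.Resource.setExtensionXhrType( ext, PIXI.loaders.Resource.XHR_RESPONSE_TYPE.BLOB );
     } );

     // Custom middleware: turn an audio blob into an object url the resource can carry around
     function handleAudioBlob( resource, next ) {
         // Only touch resources that loaded as blobs and look like audio files (a simple extension check)
         if ( resource.data instanceof Blob && /\.(mp3|m4a|ogg)(\?.*)?$/.test( resource.url ) ) {
             resource.blobUrl = URL.createObjectURL( resource.data ); // blobUrl is a made-up property name
         }
         next();
     }
     PIXI.loader.use( handleAudioBlob );

     PIXI.loader.add( 'click', 'audio/click.mp3' ).load( function ( loader, resources ) {
         const sound = new Howl( {
             src: [ resources.click.blobUrl ],
             format: [ 'mp3' ],            // needed because a blob url carries no file extension
             onload: function () { sound.play(); }
         } );
     } );
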
  13. pixi.text

    This is cool. Very useful to give to the artists so they can then give us the text styles they'd want us to use. The same request was also posted to github, and someone there also went away and did a similar thing: https://github.com/pixijs/pixi-text-style Maybe the 2 of you could work together?
  14. pixi.text

    I'd just play around with the style using the pixi examples page https://pixijs.github.io/examples/#/basics/text.js
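
     For instance (a small sketch; the style values are just something to start tweaking from):

     const style = new PIXI.TextStyle( {
         fontFamily: 'Arial',
         fontSize: 36,
         fill: '#ffd700',
         stroke: '#000000',
         strokeThickness: 4,
         dropShadow: true
     } );

     const label = new PIXI.Text( 'Hello world', style );
     stage.addChild( label );
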
  15. We use SoundJS on old libs, Howler on new libs. I'd recommend Howler. It's worked more consistently across a range of devices and browsers for me, especially with sound sprites and especially with IE. The fact that it can also support extra effects is the icing on the cake.
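
     A quick sound sprite sketch with howler (the file names, offsets and durations are made up):

     const sfx = new Howl( {
         src: [ 'audio/sfx.m4a', 'audio/sfx.ogg', 'audio/sfx.mp3' ],
         sprite: {
             spin: [ 0, 800 ],      // start ms, duration ms
             win:  [ 1000, 2500 ]
         }
     } );

     sfx.play( 'win' );
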