themoonrat

Everything posted by themoonrat

  1. For mobile devices, some of the events above are not classed as valid user gestures to unlock audio or the Fullscreen API. touchstart or touchend is usually what you need
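A minimal sketch of that one-shot unlock pattern (makeUnlocker is an illustrative helper, not a PIXI or browser API):

```javascript
// One-shot unlock handler: runs the unlock work on the first gesture
// only. In the browser you'd attach the returned function via
// document.addEventListener('touchend', handler, { once: true })
// and resume an AudioContext or request fullscreen inside unlockFn.
function makeUnlocker(unlockFn) {
  let unlocked = false;
  return function handler() {
    if (unlocked) return false; // ignore repeat gestures
    unlocked = true;
    unlockFn();
    return true;
  };
}
```

The key point is that the unlock work happens synchronously inside the gesture's event callback; deferring it to a later tick loses the "user gesture" status.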
  2. WebP? It's all about what the browser supports
  3. Call .update() on the texture (the one created with the canvas as its source)?
  4. Less telepathic, more your friendly neighbourhood @ping
  5. So, what I do for my button class is something like this:

        this.on( 'pointerdown', this._onPointerDown, this );
        PIXI.renderer.plugins.interaction.on( 'pointerup', this._onPointerUp, this );

    The down part looks for just the button being pressed down, but the up part listens to the interaction manager's global 'up' event. This way, if someone presses down on a button, then moves the pointer away from the button and releases, the 'up' still gets fired for this button. A quick overview of the _onPointerDown and _onPointerUp functions:

        _onPointerDown( event ) {
            if ( !this._isDown && event.data.isPrimary ) {
                this._isDown = true;
                // do down stuff
            }
        }

        _onPointerUp( event ) {
            if ( this._isDown && event.data.isPrimary ) {
                this._isDown = false;
                // do up stuff
            }
        }

    The first thing we do is track for ourselves whether the button is down or up; that way, on a global up, nothing happens if the button isn't down. The other thing is we check 'isPrimary', so we only react to the primary mouse button or the first touch. You don't have to do that, and it stops a second touch affecting a button press... but restricting it can make your life easier!
  6. When I saw the title I thought I'd done a pr for it at some stage! It wasn't accepted into pixi official because there were worries about the text api becoming a bit crazy. The hope in the future is that you could pass perhaps a PIXI.Gradient into the fill properties, and thus no need for separate stop point and gradient type settings every time it's an option. But I use the contents of that PR in my games on my own pixi fork
  7. Just to second using Howler... it's the best audio lib out there IMO. Also second using m4a and ogg. MP3s have some issues in browsers with seeking and looping nicely, in my experience. I'd also recommend having different compression settings, selected based on the device. Mono 96kbps might be good enough for mobile devices, but maybe supply Stereo 128kbps (or above) for PC players, where memory is unlikely to be an issue and better speakers are almost certainly going to be used.
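For what it's worth, the multi-format idea boils down to "first candidate the browser can decode wins", which is what Howler's src array gives you. A toy sketch of that selection logic (pickSource and the canPlay map are made up for illustration, not Howler's API):

```javascript
// Pick the first candidate file whose extension the browser claims
// to support. `canPlay` stands in for audio.canPlayType() results,
// keyed by extension, e.g. { m4a: true, ogg: true }.
function pickSource(candidates, canPlay) {
  for (const url of candidates) {
    const ext = url.split('.').pop().toLowerCase();
    if (canPlay[ext]) return url;
  }
  return null; // no playable format found
}
```

So listing m4a before ogg means Safari-style browsers get the m4a while Firefox-style browsers fall through to the ogg.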
  8. http://pixijs.io/examples/#/demos/texture-swap.js
  9. You shouldn't be using multiple renderers. Just use multiple containers, with the parent-most container having the mask on it, seeing through to the containers below it
  10. Nope, you've done the right thing with fromUrls. You don't want to cast it... they're different things. 'BaseTexture' and 'VideoBaseTexture' contain data about the actual image/video. A 'Texture' is some information about what part of that base image/video to use. For example, if you have a spritesheet, that whole image is the BaseTexture, but you just want to draw from certain sections of that image, and those would be the Textures pointing to different parts of the BaseTexture. So Sprites use a Texture, which contains information on how to use its BaseTexture
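To make the spritesheet analogy concrete, a Texture's frame is just a rectangle into the BaseTexture. Here's the arithmetic for a uniform grid sheet (frameRect is an illustrative helper; in PIXI you'd pass a PIXI.Rectangle as the frame when constructing a Texture):

```javascript
// Compute the frame rectangle for cell `index` in a spritesheet laid
// out as a uniform grid of `cols` columns, each cell `w` x `h` pixels.
function frameRect(index, cols, w, h) {
  return {
    x: (index % cols) * w,
    y: Math.floor(index / cols) * h,
    width: w,
    height: h,
  };
}
```

Every Texture built from one of these frames shares the same BaseTexture; only the rectangle differs.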
  11. Could you supply an example or fiddle with which we could take a look?
  12. And make sure to request it within a callback for a valid mouse or touch event https://github.com/sindresorhus/screenfull.js/ Helps cover the various browser prefixes
  13. There's PIXI Tween Original: https://github.com/Nazariglez/pixi-tween My version: https://github.com/themoonrat/pixi-tween Typescript version: https://github.com/LOTUM/pixi-tween
  14. Ok, it might be because you're not adjusting the movement of the sprite by the delta. Take a look at the example http://pixijs.io/examples/#/basics/basic.js and notice how it multiplies movement by delta, to account for slight inaccuracies in the timing of requestAnimationFrame. Lastly, after that, if there are still issues... just host the page somewhere and access it via your browser. I make games to be accessed via the browser and the S7 runs like a dream. Maybe it's something to do with PhoneGap?
  15. It's most likely because of your resolution property. As an example, if the width and height of the window are 1280x720, but devicePixelRatio returns 3, internally PIXI is now trying to render the screen at 4K resolution. Ouch! My advice for games: don't use devicePixelRatio
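The arithmetic behind that "Ouch": the number of pixels the renderer has to fill grows with the square of the resolution property. A quick sketch:

```javascript
// Pixels the renderer actually fills each frame, given a CSS size
// and the `resolution` (e.g. window.devicePixelRatio).
function internalPixels(cssWidth, cssHeight, resolution) {
  return (cssWidth * resolution) * (cssHeight * resolution);
}
```

At 1280x720 with resolution 3, that's 3840x2160 = 8,294,400 pixels per frame, nine times the 921,600 you'd fill at resolution 1.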
  16. You could also consider adding your events at the renderer level, and do some maths to translate into your container's coordinate space. i.e.

        let location = new PIXI.Point();
        renderer.plugins.interaction.on( 'pointerdown', ( event ) => {
            container.worldTransform.applyInverse( event.data.global, location );
        } );
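For the curious, applyInverse is just inverting a 2D affine transform. A standalone sketch of the maths (assuming PIXI's usual matrix fields a, b, c, d, tx, ty; PIXI's own method writes into the point you pass in rather than returning a new one):

```javascript
// Map a global point back into local space by inverting a 2D affine
// transform { a, b, c, d, tx, ty }, which maps local -> global as:
//   x' = a*x + c*y + tx,   y' = b*x + d*y + ty
function applyInverse(m, global) {
  const det = m.a * m.d - m.b * m.c;
  const x = global.x - m.tx;
  const y = global.y - m.ty;
  return {
    x: (m.d * x - m.c * y) / det,
    y: (m.a * y - m.b * x) / det,
  };
}
```

So a container translated by (10, 20) sees the global point (15, 25) as local (5, 5), and a container scaled by 2 sees global (10, 10) as local (5, 5).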
  17. @Gerente You may be interested in https://github.com/pixijs/pixi-ui - which I think achieves some of what you are trying to achieve (it has the concept of being able to anchor a container to the edge of another container)
  18. PIXI supports 9-sliced sprites too (http://pixijs.download/dev/docs/PIXI.mesh.NineSlicePlane.html) and you might have more success with filters on sprites than on Graphics. As always with WebGL, performance is best when you are making consecutive draw calls using the same base texture. Filters will always break that batching, so if you could put one filter over all of your sprite draw calls, rather than one filter per line, that'd help
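To illustrate the batching point, here's a toy draw-call counter (not PIXI internals, just the simplified rule that consecutive objects sharing a base texture batch into one call and a filtered object always breaks the batch):

```javascript
// Count draw calls for a render list, where each item is
// { baseTexture: <id>, filter: <bool> }. Consecutive items with the
// same baseTexture share one call; any filtered item forces its own.
function countDrawCalls(items) {
  let calls = 0;
  let current = null; // baseTexture id of the currently open batch
  for (const item of items) {
    if (item.filter || item.baseTexture !== current) {
      calls++;
      current = item.filter ? null : item.baseTexture;
    }
  }
  return calls;
}
```

Three sprites from one atlas cost one call; put a filter on the middle sprite and the same three sprites cost three calls, which is why one filter over the whole group beats one filter per line.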
  19. A Container is nothing but a collection of other display objects, so it cannot be visually represented itself. You need a Graphics or a Sprite (both of which inherit from Container) to give the container dimensions in the first place
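Put another way, a Container's dimensions are derived from the union of its children's bounds; with no children there's nothing to measure. A sketch of that derivation (unionBounds is an illustrative helper, not PIXI's getBounds):

```javascript
// A container's bounds are the union of its children's rectangles;
// an empty container has no size at all.
function unionBounds(children) {
  if (children.length === 0) return null; // nothing to measure
  let minX = Infinity, minY = Infinity, maxX = -Infinity, maxY = -Infinity;
  for (const c of children) {
    minX = Math.min(minX, c.x);
    minY = Math.min(minY, c.y);
    maxX = Math.max(maxX, c.x + c.width);
    maxY = Math.max(maxY, c.y + c.height);
  }
  return { x: minX, y: minY, width: maxX - minX, height: maxY - minY };
}
```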
  20. It's not what you want in terms of an automation tool that works with pixi, but there is a chrome plugin that does detect internally rendered display objects that may be of use https://github.com/bfanger/pixi-inspector
  21. I'd take a guess that PIXI, via the WebGL renderer, is uploading that texture to the GPU once (it's something the lib has control over). I'd then take a guess that the canvas renderer is uploading that texture to the GPU every frame. But just because WebGL _could_ do it, doesn't mean you should. Take a look at https://webglstats.com/webgl/parameter/MAX_TEXTURE_SIZE - 99.9% of GPUs out there support a 4096x4096 texture size, yet only 29.9% support the next size up, 8192x8192. It probably works on your PC because it has a discrete GPU.
  22. It ground to a crawl for me too, and I was able to get a profile on it; the delay is all in the GPU. At the bottom you have a canvas that is 10000x10000. This is way too big. The maximum texture size I'd recommend is 4096x4096... but in my games I often play it safe with 2048x2048 for the canvas renderer, because of the time it can take to upload large textures to the GPU
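A hedged sketch of "playing it safe": clamp the requested size to both your own budget and the device limit (which a real WebGL context reports via gl.getParameter(gl.MAX_TEXTURE_SIZE); it's passed in here so the logic stays testable):

```javascript
// Clamp a requested canvas/texture dimension to a safe budget and
// the device's reported maximum texture size.
function safeTextureSize(requested, deviceMax, budget = 4096) {
  return Math.min(requested, deviceMax, budget);
}
```

With this rule, a 10000x10000 canvas on a GPU reporting 8192 still gets clamped down to the 4096 budget.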
  23. Can something like that be popular? Sure! A Dark Room was a hit, and a modern-ish take on that... buttons instead of typing commands, but it was still a 'Text Adventure' imo. But the best advice I can give is: would you enjoy making it? If you're a hobbyist and your target is to make a popular game, I think there's more chance of ending up disappointed (or not finishing it) than if you set out to make a game you want to make and have passion for.
  24. I'd recommend loading the game up in Chrome, with dev tools open, and looking at the network tab. After the game has loaded, you'll be able to sort by type, and see what textures the game has loaded, which then gives you a fair idea of how they are used
  25. Again, for the example you showed, those 'particles' are just part of the sprite animation of the winning symbol.