dmko

Members
  • Content count: 68
  • Joined

  • Last visited

  1. Hrm, I'm a little worried about using a different fork... I kinda just want to `npm install pixi` and be off to the races... I hope your features make it in! Pixi v5 is a ways away though, no? It's still in alpha?
  2. Depends on what we mean by "removed"... conceptually it could mean "removed from parent" OR "removed from scene". I agree that there should be a different event for each of those.
  3. Pixi v5 has everything I need!!!
  4. Yeah - that technically works in this small example, but in general, is there no way for a DisplayObject to detect that it's been removed from the scene graph unless it was removed explicitly via removeChild()?
  5. Is there a way to get children to do something when they've been removed indirectly? Here's a jsbin to show what I mean - only the parent notifies when it's been removed, but I'd like the child to notify as well: https://jsbin.com/lolicamohi/edit?js,console,output
  6. I forgot to follow up on your erase-mode update... thanks for doing that, btw! Is this the same answer (i.e. it will be native in v5 only)?
  7. and that looking at my old code I already knew that?
  8. oh, it seems touch events can be intercepted globally - https://github.com/pixijs/pixi.js/pull/2658 ?
  9. v5?! Woo! OK... so to make these changes now, it needs to be done manually by changing the prototype somehow?
  10. @ivan.popelyshev - is this something that's been added? i.e. to automatically wire certain PIXI primitives (e.g. Sprite) to call a top-level function (such as is done at the top of this thread, by changing the prototype)?
  11. scratch that.... I need to do a lot more learnin! Just bought this book: https://www.manning.com/books/functional-reactive-programming highly recommended!
  12. Just to add - instead of like a virtual-pixi-graph thing, which is kinda crazy, maybe just pushing the commands down, so like "addSprite(texture, parent)" or a serialized version of that... So basically it'd be like this, where "viewInfo" is probably little more than the stage and some utility helper functions to test touch inputs etc.:

      inputStream
        .map(input => getCommands(viewInfo, input))
        .map(cmd => sendToPixi(viewInfo, cmd))
        .observe(newViewInfo => updateViewInfo(newViewInfo));
  13. This is kind of a continuation of the discussion in this thread, though it could branch off elsewhere (like not using streams, using Sodium, whatever) - so I figured I'd open a new one. Basically - if one wanted to take the functional programming or functional reactive programming approach, how does PIXI fit in the picture? It seems to me like there are basically 2 options:

      1. Something like Cycle.js - where we hook the "output" of PIXI back into the inputs (also via observables/streams).
      2. Have a strict unidirectional flow from input -> logic/data -> view.

      The first one is sort of dealt with in that linked thread above - and I'm not so sure I like it... here I'm asking more about the second. How could the second approach be done cleanly? More specifically:

      1. Is it crazy to keep everything in some sort of structure that simply gets re-rendered on every frame? I mean, I guess that's what PIXI's doing under the hood... but in this case it wouldn't be to render - it would be to drive a single custom "drawGameObjects()", which would effectively do something like removeChildren/addChildren/renderTextures(). It could be a little smarter of course and maybe just update the diffs, which wouldn't be too difficult if the data structure is strict... but still, curious if that's just a really bad idea or not so bad at all.
      2. Similarly - with this idea we'd lose touch events... we'd need to sift the global touch events in the logic/data part of the code and wouldn't benefit from PIXI's automated on() event detection stuff. At a glance that's awful - but if sticking to bounding boxes and things, it's actually possible to be much cleaner that way... think for example where clicking one object should change the state of some other ones. All the logic/data needs to have is the layering, position, size, and rotation of each object - which it would need to have anyway to pass to the "renderer" - so, not so awful... Is there some other approach I'm missing?

      If this approach is taken, is it better just to drive WebGL directly? (My guess is no - PIXI is way more than just a pixel-pusher / touch listener! Handling context, an easy-to-use API for textures, batching stuff, sprite sheets, etc...) FWIW, I found this code sample for RxJS to be super clean and informative... much easier to understand than the elm / haskell / youtube videos / etc.: https://github.com/Lorti/rxjs-breakout/blob/master/app.js (the author claims they're a beginner with this approach, but it looks fantastic to me!)
  14. Oh wow... I'm just coming off of watching the Dr Boolean and funfunfunction stuff... and scratching my head on how to apply this to PIXI. The examples here are great! Any more like this? Next up I gotta learn about Observables/streams... it seems "most" is a popular framework for that?
  15. n/m... it seems InteractionManager itself will dispatch the events more reliably for some reason... I dunno why, but this approach seems to work better than just listening on the sprite itself - and it's cross-platform. Maybe this base class will help others too (note: the event for releasing outside the canvas is 'pointerupoutside'):

      export class DragContainer extends PIXI.Container {
          private touchPoint: PIXI.Point = new PIXI.Point();
          private dragOffset: PIXI.Point = new PIXI.Point();

          // If listenerTarget is provided, then dragging only starts when that is
          // touched (though this container itself is what moves).
          // screenLimit sets the boundaries for the draggable area.
          constructor(private renderer: (PIXI.WebGLRenderer | PIXI.CanvasRenderer),
                      listenerTarget?: PIXI.DisplayObject,
                      public screenLimit?: PIXI.Rectangle) {
              super();
              if (listenerTarget === undefined) {
                  listenerTarget = this;
              }
              listenerTarget.on('pointerdown', (iEvent: PIXI.interaction.InteractionEvent) => {
                  this.clearListeners();
                  this.updateTouchPoint(iEvent.data);
                  this.dragOffset.x = this.x - this.touchPoint.x;
                  this.dragOffset.y = this.y - this.touchPoint.y;
                  renderer.plugins.interaction.on('pointermove', this.onDragMove, this);
                  renderer.plugins.interaction.on('pointerup', this.onDragEnd, this);
                  renderer.plugins.interaction.on('pointerupoutside', this.onDragEnd, this);
                  this.emit('dragStart');
              });
          }

          onDragMove(iEvent: PIXI.interaction.InteractionEvent) {
              this.updateTouchPoint(iEvent.data);
              let targetX: number = this.touchPoint.x + this.dragOffset.x;
              let targetY: number = this.touchPoint.y + this.dragOffset.y;
              let allowMove: boolean = (this.screenLimit === undefined)
                  ? true
                  : this.screenLimit.contains(targetX, targetY);
              if (allowMove) {
                  this.position.set(targetX, targetY);
                  this.emit('dragMove');
              }
          }

          onDragEnd(iEvent: PIXI.interaction.InteractionEvent) {
              this.clearListeners();
              this.emit('dragEnd');
          }

          clearListeners() {
              this.renderer.plugins.interaction.off('pointermove', this.onDragMove, this);
              this.renderer.plugins.interaction.off('pointerup', this.onDragEnd, this);
              this.renderer.plugins.interaction.off('pointerupoutside', this.onDragEnd, this);
          }

          updateTouchPoint(iData:PIXI.interaction.InteractionData) {
              iData.getLocalPosition(this.parent, this.touchPoint, iData.global);
          }
      }
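Regarding the "removed"-event question above (only the direct target of removeChild() gets notified, not its descendants): until distinct removed-from-parent / removed-from-scene events exist, one workaround is to forward the notification down the subtree yourself. A minimal sketch, using a stand-in Node class rather than the real PIXI.Container - the class and method names here are illustrative assumptions, not the PIXI API:

```typescript
// Sketch of recursively notifying descendants on removal.
// "Node" is a stand-in for a scene-graph container; emitRemoved is a
// hypothetical helper, not part of PIXI.
class Node {
    children: Node[] = [];
    private handlers: Array<() => void> = [];

    on(event: "removed", fn: () => void): void {
        if (event === "removed") this.handlers.push(fn);
    }

    emitRemoved(): void {
        this.handlers.forEach((fn) => fn());
        // Forward the notification down the subtree so indirectly
        // removed descendants hear about it too.
        this.children.forEach((child) => child.emitRemoved());
    }

    addChild(child: Node): void {
        this.children.push(child);
    }

    removeChild(child: Node): void {
        const i = this.children.indexOf(child);
        if (i !== -1) {
            this.children.splice(i, 1);
            child.emitRemoved();
        }
    }
}

// Usage: removing the parent notifies the grandchild as well.
const root = new Node();
const parent = new Node();
const child = new Node();
const grandchild = new Node();
root.addChild(parent);
parent.addChild(child);
child.addChild(grandchild);

let notified = 0;
grandchild.on("removed", () => { notified++; });

root.removeChild(parent); // indirect removal of grandchild
console.log(notified); // 1
```

In real PIXI code, the same idea amounts to listening for 'removed' on a container and re-emitting it to the container's children.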
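On the prototype-patching question above (automatically wiring PIXI primitives like Sprite to call a top-level function): the general pattern looks roughly like this. Widget is a stand-in class and globalHook is a hypothetical function - in real code you would patch e.g. PIXI.Sprite.prototype the same way:

```typescript
// Sketch of wiring every instance of a class to a top-level hook by
// patching its prototype. "Widget" stands in for a PIXI primitive;
// "globalHook" is a hypothetical top-level function.
class Widget {
    render(): string {
        return "rendered";
    }
}

const calls: string[] = [];
function globalHook(tag: string): void {
    calls.push(tag);
}

// Keep a reference to the original method, then replace it with a
// wrapper that calls the hook before delegating to the original.
const originalRender = Widget.prototype.render;
Widget.prototype.render = function (this: Widget): string {
    globalHook("render");
    return originalRender.call(this);
};

// Every instance - including ones created before the patch - now goes
// through the hook.
const w = new Widget();
console.log(w.render()); // "rendered"
console.log(calls.length); // 1
```

The main caveat of this approach is that it mutates the library globally, so every consumer of the patched class is affected, which is presumably why a built-in option was being asked about.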
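The strict unidirectional input -> logic/data -> view flow asked about above can be sketched without any stream library as a plain reduce-and-redraw loop. The state shape and function names below (update, drawGameObjects) are illustrative assumptions; in a real app drawGameObjects would add/remove/position PIXI objects from the state rather than serializing it:

```typescript
// Minimal unidirectional-flow sketch: inputs fold into a plain state
// object, and the view layer redraws from that state. Names and state
// shape are illustrative, not PIXI API.
interface GameObject {
    id: number;
    x: number;
    y: number;
}
interface State {
    objects: GameObject[];
}

type Input = { kind: "move"; id: number; dx: number; dy: number };

// Pure logic step: state + input -> new state (no mutation).
function update(state: State, input: Input): State {
    return {
        objects: state.objects.map((o) =>
            o.id === input.id ? { ...o, x: o.x + input.dx, y: o.y + input.dy } : o
        ),
    };
}

// View step: here it just serializes the state; a real version would
// drive PIXI (or diff against the previous state first).
function drawGameObjects(state: State): string {
    return state.objects.map((o) => `#${o.id}@(${o.x},${o.y})`).join(" ");
}

// Drive it: an array of inputs stands in for an input stream.
const inputs: Input[] = [
    { kind: "move", id: 1, dx: 5, dy: 0 },
    { kind: "move", id: 1, dx: 0, dy: 3 },
];
const initial: State = { objects: [{ id: 1, x: 0, y: 0 }] };
const finalState = inputs.reduce(update, initial);
console.log(drawGameObjects(finalState)); // "#1@(5,3)"
```

With a stream library, the reduce becomes a scan over the input stream, and the diff-based rendering mentioned above slots in naturally between update and drawGameObjects.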