About elsemieni


  1. Hello! I have an issue that I don't fully understand. I'm trying to build a Tiled isometric map loader for Phaser and I'm running into performance problems, so I'm investigating techniques to make the maps lighter on the CPU/GPU, such as rendering the map layers into cropped RenderTextures (drawing only the visible area of the map at once), using sprite-populated SpriteBatches/Groups as the source (I tested both). In simple words, the logic behind this is the following:

     • Populate the SpriteBatch/Group with tile sprites taken from a cache array (to avoid creating/destroying them each time).
     • Render that SpriteBatch/Group into the RenderTexture, clearing it first.
     • Clean up the SpriteBatch/Group, putting the tile sprites back into the array and removing them from the SpriteBatch/Group (without destroying them, of course).
     • Create a Phaser.Image that shows the RenderTexture on screen (or create it once; the texture is updated anyway).

     I tested this with one RenderTexture and it works fine. The issue appears when I use several of them (assuming 1 RenderTexture is equivalent to 1 layer of the scenario): the screen starts showing y-inverted versions of the RenderTextures at random times. An important fact: this only happens in WebGL mode (in Canvas the behaviour is correct). I'm using Phaser CE 2.7.3. I coded a short example of the issue (each column is a different RenderTexture with its corresponding sprites); feel free to look and debug it:

     Canvas: http://elsemieni.net/inni/testCanvas/
     WebGL: http://elsemieni.net/inni/testWebGL/

     Any idea what's going on there? Thanks in advance
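The caching step described above can be sketched framework-agnostically. This is a minimal sketch, not code from the post; the factory callback and the Phaser calls mentioned in the comments (game.make.sprite, group.remove, renderTexture.renderXY) are assumptions based on the Phaser CE 2.x API.

```javascript
// Minimal sprite pool: reuse tile sprites instead of creating/destroying
// them every frame, as the post describes.
function SpritePool(factory) {
  this._factory = factory; // e.g. function () { return game.make.sprite(0, 0, 'tiles'); }
  this._free = [];
}

// Take a sprite from the cache; build a new one only when the cache is empty.
SpritePool.prototype.acquire = function () {
  return this._free.length > 0 ? this._free.pop() : this._factory();
};

// Park the sprite for reuse. In Phaser you would first call
// group.remove(sprite) WITHOUT destroying it, exactly as described above.
SpritePool.prototype.release = function (sprite) {
  this._free.push(sprite);
};
```

Each frame the cycle would then be: acquire the visible tiles, add them to the Group, render the Group into the RenderTexture with clearing enabled, then release the tiles back into the pool.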
  2. Iorana! I have a little question about onInputOut behaviour on mobile/touch devices. I have this scenario in Phaser 2.6.2: a small lovely Sprite with inputEnabled set, wired to events.onInputOver and events.onInputOut, just for hover actions. No problems there.

     If I test it on desktop, the signals are dispatched correctly when I move the mouse over the sprite and then away from it. But on mobile/touch the behaviour is different. Obviously you can't "move" a touch Pointer like a mouse, but if you tap the sprite, events.onInputOver is dispatched anyway. BUT after that, if you tap somewhere else on the screen, no events.onInputOut signal is fired, so the game keeps acting as if the sprite were still hovered. Keeping the touchscreen pressed and moving the Pointer does dispatch the desired signal, though.

     I know I can work around this (on mobile, check on the next tap whether the pointer is inside the sprite's bounds; if not, dispatch events.onInputOut manually), but I'm wondering whether this is the intended behaviour of the signal or not (probably it is, but there's always the possibility of a bug). I checked the docs and there's not much said about the event; as you can see, they don't cover this particular case. If someone could explain whether this is how it's supposed to work... Thanks in advance
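The workaround mentioned above can be sketched like this. The bounds test is plain math; the commented Phaser wiring (game.input.onDown, sprite.getBounds(), events.onInputOut.dispatch) is an assumption based on the CE 2.6 docs, not code from the post.

```javascript
// Axis-aligned bounds test: does the tap land inside the sprite's rectangle?
function pointInRect(x, y, rect) {
  return x >= rect.x && x < rect.x + rect.width &&
         y >= rect.y && y < rect.y + rect.height;
}

// In Phaser, something along these lines:
// var hovered = false;
// sprite.events.onInputOver.add(function () { hovered = true; });
// game.input.onDown.add(function (pointer) {
//   if (hovered && !pointInRect(pointer.x, pointer.y, sprite.getBounds())) {
//     sprite.events.onInputOut.dispatch(sprite, pointer); // fire "out" manually
//     hovered = false;
//   }
// });
```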
  3. elsemieni

    Multiple visual sub-outputs?

    Hi, I have a strange question. It's experimental anyway, so it doesn't matter too much if there's no reasonable solution. Let me explain it in a simple way: I'm trying to make Phaser interact with other image-processing libraries, so that it can "feed" multiple visual outputs.

    My first idea was to use a BitmapData, render the necessary stuff into it, and take its canvas context. The result? Slow performance. After that I switched to RenderTexture, which is much faster at rendering things, but then you have to deal with matrix transformations to do basic effects: transforms, any-anchor rotation, etc. I concluded that I'm reinventing the wheel, implementing basic features that Phaser already has but that aren't available because of the restrictions BitmapData/RenderTexture have. An effect on a sprite that takes 3-4 lines with Phaser's normal output takes 30+ lines when rendering into one of those "alternative outputs".

    So I came to this idea: practically, I need "a Phaser that renders into something other than the main output". From a quick search there seems to be no direct way to do this (maybe I'm wrong?). I also looked into running multiple Phaser instances at once, but is that really convenient performance-wise? Maybe one instance can handle all of this and I just haven't noticed?

    Yes, it's quite specific. Maybe the solution still lies in BitmapData/RenderTextures, but I'm curious about this. Thanks in advance

    PS: The motivation is to build a mid-complexity scene in Phaser that renders into a "sub-output" canvas, pass that to ThreeJS to put on a shape, and render it back into Phaser's main output. Maybe there are more straightforward ways to do that.
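For what it's worth, the "any-anchor rotation" matrix complained about above is only a few lines of plain 2D math. A hedged sketch, not Phaser code: the {a, b, c, d, tx, ty} layout is assumed to match PIXI.Matrix, where x' = a*x + c*y + tx and y' = b*x + d*y + ty.

```javascript
// Rotation by theta radians about an arbitrary anchor (ax, ay):
// translate(-ax, -ay), rotate(theta), translate(ax, ay), folded into
// a single matrix. Field layout assumed to match PIXI.Matrix.
function rotationAboutAnchor(theta, ax, ay) {
  var cos = Math.cos(theta), sin = Math.sin(theta);
  return {
    a: cos, b: sin, c: -sin, d: cos,
    tx: ax - ax * cos + ay * sin,
    ty: ay - ax * sin - ay * cos
  };
}

// Apply the matrix to a point, Pixi-style.
function applyMatrix(m, x, y) {
  return { x: m.a * x + m.c * y + m.tx, y: m.b * x + m.d * y + m.ty };
}
```

If I recall the CE signature correctly, a matrix like this can be passed as the second argument of RenderTexture.render, which would cut those 30+ lines down considerably.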
  4. Hi! I'm trying to create a Phaser.Sprite using a canvas-based texture (another attempt to connect ThreeJS with Phaser, if you're interested), but I have some issues. My first attempt was forcing ThreeJS to render into a Phaser.BitmapData, but as you're probably already thinking, that's a non-optimal solution (it works fine with small canvases, but as the resolution grows performance slows down, aaaaand you're limited to grabbing frames from a 2d, non-WebGL context only).

     Faster options? Phaser.RenderTexture seemed like a good idea, but I never found a way to render a canvas into it (since a canvas is not a PIXI.DisplayObject). So, after researching for a while, I came up with the idea of attaching the canvas to the sprite's texture directly:

     sprite = new Phaser.Sprite(game, x, y, null);
     // There's a canvas named "renderHere" where stuff is rendered.
     sprite.setTexture(PIXI.Texture.fromCanvas(document.getElementById("renderHere"), PIXI.scaleModes.DEFAULT));

     It works, BUT it only renders the first frame. So I need to update it. In some GitHub issues, people mentioned refreshing textures in PIXI by doing this:

     PIXI.texturesToUpdate.push(sprite.texture);

     But surprise: PIXI.texturesToUpdate is undefined! (I'm using Phaser v2.4.6 with Pixi.js v2.2.9, if that's a clue.) I suspect it's a matter of PIXI versions and such, but I don't really know. So... any idea where texturesToUpdate went? Or is there another workaround to refresh the texture? Or a direct way to render canvases into RenderTextures? Or maybe I need to quit programming and become an exotic night dancer? Thanks in advance
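On the texture-refresh question: as far as I know, PIXI.texturesToUpdate was a Pixi 1.x mechanism that no longer exists in Pixi 2.x, where the replacement is flagging the BaseTexture as dirty so the WebGL renderer re-uploads the canvas on the next draw. Whether that is a dirty() method or a plain flag seems to vary between Pixi builds, so this defensive sketch (hypothetical helper name, assumptions noted in comments) covers both:

```javascript
// Hypothetical helper: mark a canvas-backed texture for re-upload on the
// next render. Assumption: depending on the bundled Pixi version,
// BaseTexture exposes either a dirty() method or a plain dirty flag.
function markCanvasTextureDirty(sprite) {
  var base = sprite.texture.baseTexture;
  if (typeof base.dirty === 'function') {
    base.dirty();        // method form
  } else {
    base.dirty = true;   // flag form
  }
}

// Call it every frame after ThreeJS has drawn into the canvas:
// markCanvasTextureDirty(sprite);
```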