Search the Community

Showing results for tags 'rendertexture'.

Found 41 results

  1. Hi, I need to create an image at run time and then convert it to a base64 string. I can create the image with Phaser.GameObjects.RenderTexture, but does anyone know how to convert it to a base64 string? Thanks a lot. Tomas
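[Editor's sketch] In Phaser 3, a RenderTexture can be read back with its snapshot() / snapshotArea() / snapshotPixel() methods; drawing the result to a canvas and calling canvas.toDataURL('image/png') yields the base64 data-URL. Those steps are browser-only, so the sketch below shows only the final encoding stage on raw PNG bytes, written so it runs in either environment:

```javascript
// Sketch only: assumes you already hold the image's raw PNG bytes
// (e.g. obtained browser-side from a snapshot callback).
function bytesToBase64(bytes) {
  if (typeof Buffer !== 'undefined') {
    return Buffer.from(bytes).toString('base64'); // Node path
  }
  let binary = '';
  for (const b of bytes) binary += String.fromCharCode(b); // browser path
  return btoa(binary);
}

function toDataUrl(pngBytes) {
  // A data-URL is just a media-type prefix plus the base64 payload.
  return 'data:image/png;base64,' + bytesToBase64(pngBytes);
}
```

In practice you rarely need the manual step: toDataURL on the snapshot canvas already returns the full data-URL string.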
  2. Hello, The answers to my questions may seem evident; please consider that I'm a complete beginner to HTML and JS Phaser coding. I am having two problems with my code. The first is that renderTexture doesn't seem to work. I am trying to make a trail of my player, like the "trail" example. I actually copied the code needed, but it does not work. Here is the line causing the problem, in the create function:

rt = this.make.renderTexture({ x: 0, y: 0, width: 800, height: 800 }).setOrigin(0, 0);

And here is what the console says:

Uncaught TypeError: this.make.renderTexture is not a function
    at Scene.create ((index):301)
    at SceneManager.create (phaser.js:51412)
    at SceneManager.loadComplete (phaser.js:51329)
    at LoaderPlugin.emit (phaser.js:2622)
    at LoaderPlugin.processComplete (phaser.js:111210)
    at LoaderPlugin.removeFromQueue (phaser.js:111190)
    at LoaderPlugin.processUpdate (phaser.js:111171)
    at (phaser.js:8947)

The second issue affects the collision boxes of my obstacles. They are static sprites, and the physics engine is MatterJS. I defined the shape of the hitboxes in PhysicsEditor and did exactly as in their tutorial to link the JSON to the sprites, but it doesn't work: the collision is triggered not only when the player enters the parts in the shape, but also in the rest of the rectangle that should not be considered. Furthermore, none of the physical properties I've set in the editor work. Then I tried to do it all without PhysicsEditor: I added the required setters to do the work. While the physical properties are now correct, the hitboxes are still the whole rectangles instead of the chosen parts.
Here is the code:

// example
var ob12 = this.matter.add.sprite(800 - 211, -6353, 'ob12')
    .setStatic(true)
    .setSensor(true)
    .setFriction(0, 0, 0)
    .setInteractive(new Phaser.Geom.Polygon([
        [ { "x":132, "y":291 },  { "x":337, "y":498 },  { "x":423, "y":2 } ],
        [ { "x":50,  "y":789 },  { "x":239, "y":979 },  { "x":337, "y":498 } ],
        [ { "x":0,   "y":1220 }, { "x":423, "y":1639 }, { "x":239, "y":979 } ],
        [ { "x":239, "y":979 },  { "x":423, "y":1639 }, { "x":337, "y":498 } ],
        [ { "x":337, "y":498 },  { "x":423, "y":1639 }, { "x":423, "y":2 } ]
    ]), Phaser.Geom.Polygon.Contains);

Thank you, Vainly
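[Editor's note] One likely source of confusion in the snippet above: setInteractive() only defines the *input* hit area (the region Phaser.Geom.Polygon.Contains tests for pointer events); it does not change the Matter physics body, which stays a rectangle unless a shape is supplied when the body is created. As a sketch of what Polygon.Contains itself does, here is the standard even-odd ray-casting test:

```javascript
// Minimal point-in-polygon test (even-odd ray casting), the same idea
// Phaser.Geom.Polygon.Contains implements for input hit areas.
// `points` is an array of { x, y } vertices.
function polygonContains(points, x, y) {
  let inside = false;
  for (let i = 0, j = points.length - 1; i < points.length; j = i++) {
    const a = points[i], b = points[j];
    // Does the edge a-b cross a horizontal ray cast rightwards from (x, y)?
    const crosses = (a.y > y) !== (b.y > y) &&
      x < ((b.x - a.x) * (y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}
```

This governs clicking only; for the physics side, the non-rectangular shape has to reach the Matter body itself.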
  3. Hey, I am building quite a huge tile map from hexagons. For performance reasons I create a single render texture and add a lot of sprites and texts to it. It takes quite a lot of time to do so, but I get higher FPS, and that texture is pretty much static. However, there is another layer on top of this which is dynamic; if I add another render texture to the game, FPS drops. If I render everything to a single texture FPS is still OK, but when the dynamic part changes I need to re-render the whole texture, and that takes too much time. So I was thinking of generating the static texture, caching/cloning it in memory, and composing it with the dynamic texture. The composition would take a bit of time, but the end result, a single texture, would let me keep a stable FPS. So when the dynamic texture changes I would only need to render the dynamic texture and compose it with the already cached static texture. I'm just not sure whether that is possible at all?
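[Editor's sketch] The compose step described above is, at the pixel level, an "A over B" blend: keep the static layer untouched, redraw only the dynamic layer, and blend it over a copy each frame. A simplified sketch on raw RGBA buffers (straight alpha; the output-alpha rule here is a deliberate simplification of the full over operator):

```javascript
// Blend a dynamic RGBA layer over a cached static RGBA layer.
// Both buffers are same-sized Uint8ClampedArrays in RGBA order.
function composite(staticPx, dynamicPx) {
  const out = Uint8ClampedArray.from(staticPx); // the cached layer is never mutated
  for (let i = 0; i < dynamicPx.length; i += 4) {
    const a = dynamicPx[i + 3] / 255;
    for (let c = 0; c < 3; c++) {
      out[i + c] = Math.round(dynamicPx[i + c] * a + out[i + c] * (1 - a));
    }
    out[i + 3] = Math.max(out[i + 3], dynamicPx[i + 3]); // simplified alpha
  }
  return out;
}
```

In Pixi terms you would not do this on the CPU: render the static container into its own RenderTexture once, keep it, and each frame render only the dynamic container into a second texture layered above it; the GPU performs exactly this blend when the two sprites are drawn.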
  4. Hi, first post and very new to PIXI! I'm trying to figure out how positioning works and how to do it properly. I have a few follow-up questions which I'll save for later. Goal: having a centered "mask area" where I can count the "unmasking progress". But first off, here is a fiddle. As you can see, I have a centerContainer to which I add all my sprites and bind all interactivity. I center all the sprites relative to the app width and height values. I create the renderTexture with the same width and height as onTopOfMask and imageToReveal (400x400). The renderTextureSprite is not positioned, and that results in only a corner being "unmasked" (clearly visible if I comment out the masking; it's positioned top-left). If I position renderTextureSprite (fiddle) the same way as I've done with the sprites, it works, but then the brush/unmasking has a weird offset. 1. Is that the way to properly center the sprites, or is it better to center the container instead (like so: fiddle), or any other way? 2. When positioning, why does the mask have a weird offset? I've been fiddling around but not getting any wiser, so help is greatly appreciated!
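[Editor's sketch] The "weird offset" described here is usually a coordinate-space mix-up: pointer coordinates are global, but drawing the brush into a RenderTexture happens in the texture's own local space, so once the texture's sprite is moved, the brush position must be translated back by that offset. Both pieces of arithmetic, as a sketch (names are illustrative, not PIXI API):

```javascript
// Centering a w×h box inside the app view is plain arithmetic.
function centerIn(appW, appH, w, h) {
  return { x: (appW - w) / 2, y: (appH - h) / 2 };
}

// Convert a global pointer position into the render texture's local space
// by subtracting the position of the sprite that displays the texture.
function globalToTextureLocal(pointer, textureSpritePos) {
  return { x: pointer.x - textureSpritePos.x, y: pointer.y - textureSpritePos.y };
}
```

So if the 400x400 texture sprite is centered in an 800x600 view at (200, 100), a brush stroke at global (250, 150) must be drawn at (50, 50) inside the texture. Centering the container instead sidesteps this, because children then stay in one consistent local space.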
  5. Hebbe

    Clear Rect in Render Texture

    Hi! I'm using Pixi 4.5.3 and can't find a way to clear a rectangle out of my transparent render texture. It seems that all blend modes have gl.ONE_MINUS_SRC_ALPHA in them. Is there any hack around this? I have a large renderTexture with mostly static bitmap text, and I want to update only the parts that have changed, for better performance. Is there any other way to update only part of a renderTexture while retaining transparency? Thanks!
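[Editor's sketch] In raw WebGL the usual way to punch a transparent hole is not a blend mode at all but a scissored clear: gl.enable(gl.SCISSOR_TEST), gl.scissor(x, y, w, h), gl.clearColor(0, 0, 0, 0), gl.clear(gl.COLOR_BUFFER_BIT). The buffer-level equivalent of that operation, as a plain sketch:

```javascript
// Zero out RGBA inside a rectangle of a texW-wide RGBA buffer, leaving the
// rest untouched; this is what a scissored gl.clear with clearColor(0,0,0,0)
// achieves on the GPU without touching blend state.
function clearRect(pixels, texW, x, y, w, h) {
  for (let row = y; row < y + h; row++) {
    const start = (row * texW + x) * 4;
    pixels.fill(0, start, start + w * 4);
  }
  return pixels;
}
```

Wiring a scissored clear into a Pixi 4 render-texture pass means reaching into the renderer's gl context between binds, which is version-specific; treat the above as the target semantics rather than a drop-in call.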
  6. I have this simple jsfiddle that displays the problem I'm trying to solve. It looks like the render texture gets cropped based on the position of the object passed to the renderTexture.render function. Is there a way around this?
  7. Hi, I'm trying to work out how to create a shape with graphics, cache it and add it to a sprite to use repeatedly. I was trying to recreate the intro background in this video (see below; not the video thumbnail, but the first thing you see when you actually play it). I realised drawing all the star graphics and fills directly over time would probably be too slow, so it would be better to cache the star and its trail as a bitmap and then create multiple sprites from that. I want to create the graphic early on and pull it out of the cache later. I'm getting an error though. The key is there, but it can't find the texture. Does it need longer to store before trying to pull it out again? Obviously I could use a PNG, but I'd prefer to be able to generate the graphics at runtime. Thanks for any advice. J

// draw graphics to renderTexture
var renderTexture = this.add.renderTexture(graphics.width, graphics.height);
renderTexture.renderXY(graphics, 0, 0, true);
// add renderTexture to'starRT', renderTexture );
// check starRT is in cache
console.log('starRT')) // true
// Error: Phaser.Cache.getImage: Key "[object Object]" not found in Cache.
var sprite =,0,'starRT'));
sprite.x = 128;

This works fine though, but obviously it's not being pulled from the cache:

var sprite2 =,0,renderTexture);
  8. Apologies if this has either been asked before, or I'm misunderstanding how to use RenderTextures. I have a renderTexture that's outputting the contents of a Phaser.Image(), which in turn contains an instance of Phaser.BitmapData(). In my update() loop, I want to update the position of some geometry drawn in the BitmapData, but those changes are never seen in the RenderTexture. What would be the reason for this? Like I said, I could be misunderstanding how and why you might want to use a RenderTexture, but this portion of my code will be quite expensive and updated regularly, so it seemed like a good idea to pass this logic out to the GPU if possible, which is what a RenderTexture allows, right? I am open to better ideas if I do indeed have the wrong idea! Below is the code: it's written in TypeScript and the class extends Phaser.Sprite(), so hopefully it makes sense even to those only familiar with JS. As you can see, in my update() I'm redrawing a circle with a new x position, then rendering back to the RenderTexture again. However, the circle never moves from its original position of 0. If I console.log out the x value, it clearly updates per tick.

constructor(game: Phaser.Game, map: any) {
    const texture = new Phaser.RenderTexture(game, map.widthInPixels, map.heightInPixels, key);
    const bitmap = new Phaser.BitmapData(game, 'BITMAP', map.widthInPixels, map.heightInPixels);
    super(game, 0, 0, texture);
    this.bitmap = bitmap;
    this.image = new Phaser.Image(game, 0, 0, this.bitmap);
    this.texture = texture;
    // Other code to add this Sprite() to the stage.
}

private update() {
    const bitmap = this.bitmap;
    bitmap.clear();, 100, 50, 'rgb(255, 255, 255)');
    this.texture.render(this.image);
    x += 10;
}
  9. timetocode

    Programmatic creation of a spritesheet

    I'm cutting up some large images into puzzle pieces, and I'd like to create spritesheets out of them directly from JavaScript (as opposed to cutting them up and saving them as images which are then loaded as a spritesheet). So far I've made a puzzle cutter in regular canvas. I did these operations in canvas instead of pixi because they are pretty easy in canvas. I can then draw all of these puzzle pieces to a new canvas, and via PIXI.RenderTexture I can begin to create a spritesheet. Ideally I'd like to gain access to a syntax like new PIXI.Sprite.fromFrame(puzzleId + '-' + puzzlePieceIndex), but I'm not sure how to achieve this. I also generate some other important images, like a stroked outline of the puzzle piece which I will later use in-game to create a highlight effect. I've been inspecting the objects created when importing a spritesheet via TexturePacker. All I've figured out so far is that pixi creates a hash of image names matched to frame data... basically the name of an image as the key, and a rectangle, within the global pixi object. Is there some trick to this? Can I just create the strings and frame data and shove it all in there and expect it to work? E.g. 'nebula-2830' : { x: 500, y: 500, width: 48, height: 48 } (or whatever). How do I then actually wire it to the texture that I created via the render texture? The metadata created by TexturePacker seems overly complicated for what I need here. Also, perhaps this doesn't change anything, but I might try for some really giant puzzles.. puzzles whose images are 4096 px squared or bigger -- for these I would cut them into multiple textures, as only certain hardware would cope with those dimensions as is. Here are some images of the puzzle cutter's output in regular canvas: Thanks!
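[Editor's sketch] The name-to-rectangle hash really is the whole trick: the frame math is plain arithmetic, and Pixi then needs one Texture per frame (same baseTexture, different frame rectangle) registered under the chosen key so that the fromFrame lookup finds it. The frame-generation half is pure JS; the naming below follows the poster's 'nebula-…' example:

```javascript
// Build a name → frame-rectangle hash for a grid of equally sized pieces
// on a sheet of sheetW × sheetH pixels. In Pixi you would then create one
// texture per entry from the RenderTexture's baseTexture, using each
// rectangle as that texture's frame, registered under the same key.
function buildFrames(prefix, sheetW, sheetH, frameW, frameH) {
  const frames = {};
  let index = 0;
  for (let y = 0; y + frameH <= sheetH; y += frameH) {
    for (let x = 0; x + frameW <= sheetW; x += frameW) {
      frames[prefix + '-' + index] = { x, y, width: frameW, height: frameH };
      index++;
    }
  }
  return frames;
}
```

For the registration half, later Pixi 4 releases expose PIXI.Texture.addToCache(texture, key); older versions write into PIXI.utils.TextureCache directly. The exact call depends on the Pixi version in use.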
  10. Hello! I have an issue that I don't understand very well. I'm trying to build a tiled isometric map loader for Phaser, where I have many performance issues, so I'm investigating different techniques to make the maps lighter on the CPU/GPU, such as rendering the map layers into cropped RenderTextures (drawing just the visible area of the map at once), using sprite-populated SpriteBatches/Groups as the source (I tested with both). In simple words, the logic behind this is the following: Populate the SpriteBatch/Group with tile-based sprites from a cache array (to prevent creating/destroying them each time). Render that SpriteBatch/Group into the RenderTexture, clearing it beforehand. Clean the SpriteBatch/Group, putting the tile-based sprites back into the array and removing them from the SpriteBatch/Group (without destroying them, of course). Create a Phaser.Image that shows the RenderTexture on the screen (or create it once; the texture is updated anyway). I tested this with one RenderTexture and it works fine. The issue comes if I use more than one of them (assuming 1 RenderTexture is equivalent to 1 layer of the scenario): the screen starts showing y-inverted versions of the RenderTextures at random times. An important fact is that this only occurs in WebGL mode (in Canvas the behaviour is correct). I'm using Phaser CE 2.7.3. I coded a short example of the issue (each column is a different RenderTexture with its corresponding sprites); you're free to see and debug them: Canvas: WebGL: Any idea what's going on there? Thanks in advance
  11. Covacs

    GetPixel trouble.

    Hi there, I'm developing a game with ink. There are some planets; you shoot ink on them and then you conquer them. I use a temporary bitmapData to draw the render-texture planets onto, so I can use bitmapData's getPixel method to find out whether the planet is completely (or almost completely) one colour. I'm wondering if there's a better method, because this is so laggy. Is there a way to get a pixel from a renderTexture?

conquer: function(p) {
    // I draw the planet renderTexture on the bmd bitmapData to allow the getPixel function
    bmd.draw(p.capaPintura, 0, 0, (*2), (*2));
    // Points of the planet to get pixels.
    col = [];
    col[0] = bmd.getPixelRGB(10,;                 // izquierda - centro (left - centre)
    col[1] = bmd.getPixelRGB(((*2) - 10),;
    col[2] = bmd.getPixelRGB(, 10);               // centro - arriba (centre - top)
    col[3] = bmd.getPixelRGB(, ((*2) - 10));
    if (col[0].rgba === col[1].rgba && col[2].rgba === col[0].rgba && col[0].rgba === col[3].rgba) {
        var c = 0;
        if (col[0].rgba === 'rgba(0,0,255,1)') c = 1;
        var colorPlanet = [];
        switch (c) {
            case 0:
                colorPlanet[0] = 0x550000; // Shadow color
                colorPlanet[1] = 0xAA0000; // Shadow color 2
                colorPlanet[2] = 0xFF0000;
                break;
            case 1:
                colorPlanet[0] = 0x000055;
                colorPlanet[1] = 0x0000AA;
                colorPlanet[2] = 0x0000FF;
                break;
        }
        // Shadow color
        pintData.beginFill(colorPlanet[0]);
        pintData.drawCircle(0, 0, *2);
        p.capaPintura.renderXY(pintData, ,);
        // Shadow color 2
        pintData.beginFill(colorPlanet[1]);
        pintData.drawCircle(0, 0, *2);
        p.capaPintura.renderXY(pintData, *1.1, *.9);
        // Planet color
        pintData.beginFill(colorPlanet[2]);
        pintData.drawCircle(0, 0, *2);
        pintData.endFill();
        p.capaPintura.renderXY(pintData, *1.25, *0.75);
    }
}

That's the code. I use it to check whether the planet points (col[0], col[1], col[2] and col[3]) are the same colour; if they are, I draw the conquering ink's colours onto the planet renderTexture (p.capaPintura). This is the planet after being conquered by the blue team. The code works, but I have to run the conquer function every time a planet is hit by ink or it won't work, and it's so laggy. Is there a method to do it just with render textures? I have to draw the planet renderTexture onto bmd just to use getPixel, so the bigger the planet, the bigger the lag I get because of that draw.
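[Editor's sketch] Whatever the readback path (a WebGL readPixels, or the bitmapData draw already in use), sampling four points never requires re-reading the whole planet: once the raw RGBA bytes are in hand, reading one pixel is index arithmetic. A sketch of that sampling step:

```javascript
// Read one pixel out of a width-wide RGBA byte buffer.
function getPixelRGBA(pixels, width, x, y) {
  const i = (y * width + x) * 4;
  return { r: pixels[i], g: pixels[i + 1], b: pixels[i + 2], a: pixels[i + 3] };
}

// Channel-by-channel comparison, replacing the rgba-string comparison above.
function samePixel(a, b) {
  return a.r === b.r && a.g === b.g && a.b === b.b && a.a === b.a;
}
```

A cheaper strategy than pixel-testing at all: since the game already knows where ink was applied, tracking conquered coverage in game state (e.g. counting filled regions) avoids any GPU-to-CPU readback, which is the expensive part here.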
  12. whizzkid

    get pixeldata in a webgl canvas

    Hi people, I'm porting something from Flash where I have a drawn map with roads, and I want to be able to select each road individually with the mouse. In Flash I could simply use hitTest on the shapes of the roads, but in PIXI I can't. My solution was to draw all the roads, each with a unique RGB colour, and do a getPixel on the mouse coords to see which road you've selected. Now my problem is: how do I get pixel data from my sprite when I'm using a WebGL renderer? What I've come up with so far is creating a separate canvasRenderer and a separate stage to render my map to, and from that canvas (renderer.view) getting the 2d context and reading the pixel data:

// create separate canvas and canvasRenderer for this pixel map
var canvasStage = new PIXI.Stage(0x000000);
var canvasRenderer = new PIXI.CanvasRenderer(1024, 598, null, true);
// create a texture that holds the map
var texture = new PIXI.RenderTexture(1024, 598, this.canvasRenderer);
texture.render(map);
// create a sprite on our separate stage, otherwise the texture won't be rendered
var textureSprite = new PIXI.Sprite(texture);
canvasStage.addChild(textureSprite);
// render the stage
canvasRenderer.render(canvasStage);

After that is done, I can use this to get my pixels:

var pixelData = canvasRenderer.view.getContext("2d").getImageData(posX, posY, 1, 1).data;

Is this the correct way to do it? Or is there a better way? regards, Martijn
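[Editor's sketch] The colour-picking half of this scheme is pure arithmetic and worth pinning down: give every road an id, encode the id into a unique RGB colour when drawing the picking map, and decode the id from the pixel under the mouse. 24 bits of colour cover about 16.7 million distinct ids:

```javascript
// Encode a road id into an RGB triple for the off-screen picking map.
function idToColor(id) {
  return { r: (id >> 16) & 0xff, g: (id >> 8) & 0xff, b: id & 0xff };
}

// Recover the road id from the sampled pixel.
function colorToId(r, g, b) {
  return (r << 16) | (g << 8) | b;
}
```

On the readback side, an alternative to maintaining a second canvas renderer is reading straight from the WebGL context with gl.readPixels after rendering the picking map, which avoids drawing the scene twice through two renderers. Note that anti-aliasing must be off for the picking map, or edge pixels decode to ids that don't exist.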
  13. Yo! I've started developing a game and I've been learning a hell of a lot about WebGL for lighting and shadows, and I'm hoping to implement some sort of lighting pipeline, following a technique like this: This means I'll need to process textures in a fragment shader, and then process the scene using the resulting bitmap in another shader. Just to test, I tried rendering part of a game world to a render texture. This worked fine without any shaders (Phaser/PIXI filters) applied. The render texture is displayed using a sprite on the right-hand side, just so I could see the result. Cool. However, if I add filters to what I'm rendering, the perspective gets all messed up, for example with my lighting fragment shader. So instead of scratching my head over this, having done quite a bit of Googling over the past couple of days, I thought I'd ask for help here. Is there a way to render to a render texture with shaders applied correctly? Is there a typical approach to setting up a pipeline like this? Essentially I'm looking to perform the transformations described in the above article so I can start casting some sweet shadows, but I can't quite see how yet. The perspective is fine if I apply shaders to the stage instead, but of course then I have no shader processing for the resulting render texture.
  14. Hi, I am enthralled by the pixi render texture tutorial, but I'm getting an error from the update() function. I've changed the variable names, but I've triple-checked that they are consistent. Here's some of my create() code to give an idea what I'm doing. I was hoping to use the render-texture tutorial on top of my working code. Is there a way to get the renderTexture working within an existing P2 game? Thank you.

//game.stage.backgroundColor = "#f2f2f2";, 0, 450, 800);
game.physics.startSystem(Phaser.Physics.P2JS);

// create two render textures..
// these dynamic textures will be used to draw the scene into itself
render_texture1 = game.add.renderTexture(450, 800, 'texture1');
render_texture2 = game.add.renderTexture(450, 800, 'texture2');
current_texture = render_texture;

// create a new sprite that uses the render texture we created above
output_sprite = game.add.sprite(225, 400, current_texture);
// align the sprite
output_sprite.anchor.x = 0.5;
output_sprite.anchor.y = 0.5;

stuff_container =;
stuff_container.x = 450 / 2;
stuff_container.y = 800 / 2;

// now create some items and randomly position them in the stuff container
for (var i = 0; i < 4; i++) {
    var item = stuff_container.create(
        Math.random() * 400 - 200,
        Math.random() * 400 - 200,
        game.rnd.pick(game.cache.getKeys(Phaser.Cache.IMAGE))
    item.anchor.setTo(0.5, 0.5);

// used for spinning!
count = 0;

// Turn on impact events for the world,
// without this we get no collision callbacks

game.physics.p2.gravity.y = 0;
game.physics.p2.restitution = 0.7;
  15. hilary_craven

    Pixi.js Dev-v4.0.0 Branch

    Hello, I've been playing around with what's on the pixi.js dev-v4.0.0 branch and have found everything to be working except RenderTexture and BaseRenderTexture. I just want to ask if these are still in development, or if I'm possibly using them wrong? RenderTexture.create = function(width, height, scaleMode, resolution) Overall, awesome work on pixi v4 so far! We're ready to upgrade our game to v4.
  16. We recently moved to Pixi.js v3 and discovered that all the render textures were being rendered without anti-aliasing. All other Sprite and BitmapText objects were just fine and looked like they still had anti-aliasing. Has Pixi v3 explicitly disabled anti-aliasing for RenderTextures, or is this a bug? See my JSFiddle test case. Note: the red text is the RenderTexture.
  17. I try to render a container over a renderable texture, but the container transformations simply do not apply to the sprites inside. You can see it happen with the following code:

var renderer = PIXI.autoDetectRenderer(800, 600, { backgroundColor: 0x1099bb });
document.body.appendChild(renderer.view);

// create the root of the scene graph
var stage = new PIXI.Container();

// create a new Sprite using the texture
var texture = PIXI.Texture.fromImage('_assets/basics/bunny.png');
var bunny = new PIXI.Sprite(texture);

// move the sprite to the center of the screen
bunny.position.x = 200;
bunny.position.y = 150;

// create render texture and sprite
var render_texture = new PIXI.RenderTexture(renderer, 800, 800);
var render_texture_sprite = new PIXI.Sprite(render_texture);
stage.addChild(render_texture_sprite);

// create a container and add sprite to it
var container = new PIXI.Container();
container.addChild(bunny);

// these transformations will NOT apply, for some reason..
container.scale.x = 100;
container.position.y = 100; // ????

// start animating
animate();

function animate() {
    renderer.render(stage);

Is this a bug or expected behavior? How do I make the container apply its transformations to the sprite? Thanks!
  18. Hi! I have been trying to get a second camera into my scene, so that this camera's view will overlay my main camera's view, but just in the bottom-left corner. Think of a mini-map: to achieve this, all I was doing was putting a camera way above the land, looking down. Maybe the best way to explain is to look at my example. This is what I am trying to achieve: a second camera is looking at the same object but from a different angle. However, when you move the main camera with the mouse so that the objects in it are positioned over the bottom-left overlay, the objects go on top of my camera2 viewport, since the objects are closer to camera1 than they are to camera2 (z-buffering). How can I make my bottom-left overlay always render on top? Is there a better way than viewports, like rendering camera2's view to a texture that just overlays the canvas? Also, I had to create a third camera to force a white background on the viewport. Maybe there is a better way to do this? Thanks for any advice on how to proceed!
  19. In one state I'm using Phaser.RenderTexture and game.make.bitmapData, and they reduce the framerate in the next state. Is there a way to get rid of them completely? 1) I'm switching the state. 2) I'm calling .destroy() on the bitmapData, the RenderTexture and the sprite which uses the RenderTexture. What else can be done? Remove them from the cache? Thanks in advance. P.S. Using CANVAS mode
  20. I'm struggling with how to manipulate a renderTexture while blitting. As of now, I have a burnout in a tilesheet, and the burnout animates when the arrow keys are pressed. My logic is to create a renderTexture of the burnout and have the renderTexture leave a copy of the burnout sprite behind the original burnout. This would mimic a car burnout. The problem I'm having is that the renderTexture sprite is moving with the original burnout and not staying behind; I have a burnout circle running around the screen instead of staying stuck on the stage where it was burnt out. My problem is trying to figure out how to make the renderTexture burnout stay in one position on the stage. Also, I posted this in Coding and Game Design and realized it is more fitting in this forum. How can I delete the post in Coding and Game Design so there are not duplicates?

<script src="pixi.min.js"></script>
<script src="spriteUtilities.js"></script>
<script src="keyboard.js"></script>
<script>
// Test that Pixi is working
// console.log(PIXI);
var renderer = PIXI.autoDetectRenderer(970, 418, { antialias: false, transparent: false, resolution: 1 });
var stage = new PIXI.Container();
renderer.backgroundColor = 0xeeeeee;

loader = PIXI.loader

var framedSkid, renderTexture, sprite, state;
var renderPosition = new PIXI.Point(100, 100);

function onAssetsLoaded() {
    // Burnout sprite
    var skid = new PIXI.Sprite(PIXI.loader.resources["nasSkidtileSheetv2.png"].texture);
    var skid = PIXI.utils.TextureCache["nasSkidtileSheetv2.png"];
    var su = new SpriteUtilities(PIXI);
    var skidFrame = su.filmstrip("nasSkidtileSheetv2.png", 271, 269);
    framedSkid = new PIXI.extras.MovieClip(skidFrame);
    framedSkid.animationSpeed = 0.5;
    framedSkid.x = 0;
    framedSkid.y = 0;
    framedSkid.vx = 0;
    framedSkid.vy = 0;

    // Burnout copy
    renderTexture = new PIXI.RenderTexture(renderer, renderer.width, renderer.height);
    sprite = new PIXI.Sprite(renderTexture);
    sprite.x = 0;
    sprite.y = 0;
    sprite.vx = 0;
    sprite.vy = 0;

    // Capture the keyboard arrow keys
    var left = keyboard(37),
        up = keyboard(38),
        right = keyboard(39),
        down = keyboard(40);

    // Left arrow key `press` method = function() {
        // Change the cat's velocity when the key is pressed
        framedSkid.vx = -5;
        framedSkid.vy = 0;;

    // Left arrow key `release` method
    left.release = function() {
        // If the left arrow has been released, and the right arrow isn't down,
        // and the cat isn't moving vertically: stop the cat
        if (!right.isDown && framedSkid.vy === 0) {
            framedSkid.vx = 0;

    // Up = function() {
        framedSkid.vy = -5;
        framedSkid.vx = 0;;
    up.release = function() {
        if (!down.isDown && framedSkid.vx === 0) {
            framedSkid.vy = 0;

    // Right = function() {
        framedSkid.vx = 5;
        framedSkid.vy = 0;;
    right.release = function() {
        if (!left.isDown && framedSkid.vy === 0) {
            framedSkid.vx = 0;

    // Down = function() {
        framedSkid.vy = 5;
        framedSkid.vx = 0;;
    down.release = function() {
        if (!up.isDown && framedSkid.vx === 0) {
            framedSkid.vy = 0;

    state = play;

function gameLoop() {
    renderTexture.render(framedSkid, renderPosition, false);
    sprite.texture = renderTexture;
    // render the stage container

function play() {
    // Apply the velocity values to the sprite's position to make it move
    framedSkid.x += framedSkid.vx;
    framedSkid.y += framedSkid.vy;
    sprite.x += framedSkid.vx;
    sprite.y += framedSkid.vy;
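[Editor's sketch] The trail layer moves because the code above advances sprite.x/sprite.y alongside the car. The usual shape of the fix: keep the RenderTexture sprite pinned at (0, 0) over the stage, and each frame stamp the car into the texture at the car's *current* position, so only the stamp coordinate moves. As plain state, hypothetical names throughout:

```javascript
// Model of the trail logic: the layer never moves; stamps accumulate at
// the car's successive positions (each stamp corresponds to a
// renderTexture.render(car, stampPoint, false) call with clear=false).
function makeTrail() {
  return { layer: { x: 0, y: 0 }, stamps: [] };
}

function update(trail, car) {
  car.x += car.vx;
  car.y += car.vy;
  trail.stamps.push({ x: car.x, y: car.y });
  // Deliberately NOT: trail.layer.x += car.vx
  // (moving the layer is exactly what makes the copy follow the car)
  return trail;
}
```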
  21. Sharpleaf

    Pixi V3 RenderTexture

    Can I make a RenderTexture from a PIXI.Container with several PIXI.Containers in it? So, a texture from a container of containers? I'm trying it, but I can't get any results to show up. I've got:

islandStage.addChild(cloudsBack);
islandStage.addChild(map1);
islandStage.addChild(container);
islandStage.addChild(guest_island);
islandStage.addChild(cloudsFront);
stage.addChild(islandStage);

In my animation loop I have:

islandShape = new PIXI.RenderTexture(islandStage, islandStage.width, islandStage.height);
cloudShadowSprite.texture = islandShape;

But when I throw cloudShadowSprite into the outer container (which we call "stage", not to be confused with older PIXI versions' stage container), there's nothing that I can see... The resulting render texture is supposed to be quite large (over 1000 in both width and height). There are no errors in the code. Eventually, I want to turn cloudShadowSprite into a mask of another container altogether. Then, large "cloud shadows" will roll through this new container. Since the new container will have a mask made from the shape of whatever is going on in islandStage's sub-containers, the "cloud shadows" should only appear on the appropriate images (and not the background, which would be too "far away", off on the horizon, to get a shadow). Can anyone tell me what I'm doing wrong? I haven't seen a PIXI v3 RenderTexture example. Thanks!
  22. Hi, I have 2 render textures swapping to create a trail, a bit similar to this concept. Does anybody know why my trail never fades out? I guess it's because the value never actually reaches 0? You can see there is grey residue of the sprite everywhere. I've adjusted the second image in Photoshop to show the trails, as they're not always easy to see on a monitor (though that can depend on the colour used). Thanks, J
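[Editor's sketch] The poster's guess is right, and it is an 8-bit quantisation effect: each swap multiplies the previous frame by a fade factor, but the framebuffer stores integer channels, so once the rounded result equals the input the value is stuck above zero forever (assuming round-to-nearest blending). A sketch that finds the plateau:

```javascript
// Repeatedly fade an 8-bit channel value by `factor`, rounding to the
// nearest integer as an 8-bit framebuffer would, until it stops changing.
// Returns the residue level the trail settles at (0 only if it truly fades).
function fadeSteps(start, factor) {
  let v = start, steps = 0;
  while (steps < 1000) {
    const next = Math.round(v * factor);
    if (next === v) break; // quantisation floor reached
    v = next;
    steps++;
  }
  return v;
}
```

With a 0.95 fade, a channel starting at 255 settles at 10, never 0, which is exactly the faint grey residue in the screenshots. Common workarounds are subtracting a small constant instead of (or in addition to) multiplying, or periodically clearing the texture.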
  23. Hi, As per my example here, I'm trying to draw multiple sprites to a render texture to animate a background. I need to be able to scale and rotate the sprite image but nothing else (possibly Sprite has overheads I don't need?). I currently use 26 sprites in a group and renderXY the group to a texture in my update. I could use filled polygons, track the points over time and redraw, but I assume Graphics is slower. My PNG for the sprite is 128x512 and I scale them from 0.1 to 1.7. In Chrome it hovers around 59-60, but mobile performance is around 30. I'm just wondering if there's a technique that performs better? From what I understand, Phaser is batching my sprites anyway. If I use BitmapData draws, my graphic doesn't come out with antialiasing, which is no good here... essentially I'm using the PNG to simulate a vector animation, so hard pixels don't work. Thanks for any advice. J
  24. RenderTexture has an awesome function called getPixels, which returns a Uint8Array of rgba pixel data. Is there a way to create a pixi object from a Uint8Array of pixels? I'm writing an image processor which works in a webworker. The Uint8Array is already perfect for passing to/from the webworker, I'm just not sure how to use that pixel data again when it comes back from the worker. On a related note, the main reason I'm looking into using the webworker is because I can't quite get webgl filters (fragment shaders) to perform fast enough for me. I am using a ColorReplaceFilter and applying it to a sprite approximately 12-15 times to create my desired artwork. This creates a lag of 40-80 ms on my system. I need to get this number down to less than 16 ms. It is only the step of rendering via rendertexture with filters that causes this lag. Fifteen might sound like a lot of filters, and I could certainly engineer a single filter that does the work of 15, but the majority of the performance hit comes from applying 1 filter, adding subsequent filters does not slow things down very much. I don't really know how webgl works, but seems to me like something is blocking the main thread and creating this lag. What is the source of this? I feel like there might be an easy solution somewhere. If the problem is that the filters need sent from the cpu to the gpu, perhaps I could send them one at a time (not sure how). If the problem is that PIXI.RenderTexture.render blocks the main thread until webgl has processed all the filters and rendering is complete, perhaps there could be a PIXI.RenderTexture.asyncRender that uses a callback and allows for some image processing magic. Any ideas?
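[Editor's sketch] The per-pixel work a ColorReplaceFilter does is exactly the shape of job that moves cleanly into a webworker, since the Uint8Array from getPixels is transferable. A plain-function sketch of that worker-side kernel (exact-match replacement; a real filter would also use a tolerance):

```javascript
// Replace every exactly-matching RGB colour in an RGBA byte buffer.
// `from` and `to` are { r, g, b } objects; alpha is left untouched.
// This is the kind of function you would run inside the worker on the
// transferred buffer, then transfer the result back.
function replaceColor(pixels, from, to) {
  for (let i = 0; i < pixels.length; i += 4) {
    if (pixels[i] === from.r && pixels[i + 1] === from.g && pixels[i + 2] === from.b) {
      pixels[i] = to.r;
      pixels[i + 1] = to.g;
      pixels[i + 2] = to.b;
    }
  }
  return pixels;
}
```

For the return trip into Pixi, one version-agnostic route is painting the bytes into a canvas via ImageData and building the texture from that canvas (Texture.fromCanvas in the Pixi versions of this era); fifteen sequential replacements can also collapse into one pass over the buffer with a colour lookup table, which addresses the 40-80 ms figure more directly than async rendering would.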
  25. tomph

    Thumbnail screenshot issues

    Hi guys, sorry if this has been covered, but I've been going around in circles on Google today. I'm trying to render a selection of objects (a screenshot) and resize the resulting render texture so that I can create a thumbnail. I have successfully been using renderTexture.render(), but I run into problems when calling resize: basically, my texture is cleared. Excuse me if I'm being dumb. See the code below.

This works:

var texture: PIXI.RenderTexture =, 768, "key", true);
texture.render(container, new PIXI.Point(0, 0), true);
//text.resize(width, height, true);, 0, texture);

This does not work:

var texture: PIXI.RenderTexture =, 768, "key", true);
texture.render(container, new PIXI.Point(0, 0), true);
text.resize(width, height, true);, 0, texture);