Showing results for tags 'renderTexture'.

  1. Hey there, I'm working on a site that requires displaying a lot of small textures (12x140). I'm trying to display about 1000 of these... I recently decided to switch to PixiJS to deal with performance issues I had with a regular HTML/CSS approach, so this is all a bit new to me. Everything works pretty smoothly on desktop, but I'm still getting terrible performance on mobile (less than 10 fps on an iOS 14 device). I've tried making the textures smaller (1x15), but no luck... I am loading the textures as follows:

        this.thumbs.forEach((el, i) => {
            let texture = PIXI.Texture.from(this.loader.resources[el.finger.path].url);
            const finger = new PIXI.Graphics();
            finger.beginTextureFill({ texture: texture });
            finger.endFill();
            let fingerContainer = new PIXI.Container();
            fingerContainer.addChild(finger);
            ...

     Here is a link to see where I'm at (you'll have to wait a bit before it displays anything...): https://mire.studio/mire-pixi/ I feel a bit stuck here... I don't know if what I am doing is even possible. Any advice on how I could make this work smoothly on mobile too would be appreciated. Many thanks
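     A minimal sketch of one alternative, assuming Pixi v5+ and that each thumbnail can be shown as a plain sprite: using one PIXI.Sprite per thumbnail instead of a Graphics + Container pair lets Pixi batch the draw calls, especially if the source images are packed into a single atlas. The grid layout below is only an illustration.

        // Sketch only: one Sprite per thumbnail instead of Graphics.beginTextureFill.
        // app, this.thumbs, this.loader and el.finger.path are taken from the post.
        const thumbLayer = new PIXI.Container();
        this.thumbs.forEach((el, i) => {
            const texture = PIXI.Texture.from(this.loader.resources[el.finger.path].url);
            const finger = new PIXI.Sprite(texture);
            finger.x = (i % 50) * 14;                // hypothetical layout: 50 per row
            finger.y = Math.floor(i / 50) * 150;
            thumbLayer.addChild(finger);
        });
        app.stage.addChild(thumbLayer);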
  2. Hi, I am trying to take a snapshot of the main container (stage) of my application, which I render on every frame (customRenderer.render(stage)), and paste that snapshot onto the topmost child container of the stage. The code looks like:

        const snapshot = this._customRenderer.generateTexture(this._stage);
        const sprite = new Sprite(snapshot);
        this._stage.getChildByName("snapshotHolder").addChild(sprite);

     It takes the snapshot alright, but if the stage is scaled down, then even though the sprite has the actual size of the stage (let's say 1000x1000), the area covered by the snapshot is much smaller and the rest of the sprite is transparent. I'm not able to understand the logic behind this. I want to take the snapshot of the stage as it is visible (scaled or otherwise). Thanks for your help. -Arin
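     A minimal sketch of one way to capture the view as displayed, assuming Pixi v5/v6: render the stage into an explicitly sized RenderTexture. Unlike generateTexture(), which works from the object's local (unscaled) bounds, a normal render pass applies the stage's own scale and position.

        // Sketch only; this._customRenderer and this._stage are taken from the post.
        const rt = PIXI.RenderTexture.create({
            width: this._customRenderer.screen.width,
            height: this._customRenderer.screen.height,
        });
        this._customRenderer.render(this._stage, rt);  // v5 signature; v6: render(stage, { renderTexture: rt })
        const sprite = new PIXI.Sprite(rt);
        this._stage.getChildByName("snapshotHolder").addChild(sprite);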
  3. Hi, I don't like to bother, but I'm not sure what I'm doing and I need some guidance if possible... I have two containers, one for map sprites (320x180 px) and the other for UI graphics (which is resized to the native resolution, in my case 1920x1080). My intention is to rescale map_sprites_containter and all its sprites to 1920x1080 in the most efficient way, every frame. So I came up with the idea of rendering map_sprites_containter to a texture, converting it to a sprite and scaling that. Is this the best way? Some pseudo code:

        var bitmap = new Bitmap(320, 180);
        var renderTexture = PIXI.RenderTexture.create(320, 180);
        renderer.render(map_sprites_containter, renderTexture);
        var canvas = renderer.extract.canvas(renderTexture);
        bitmap.context.drawImage(canvas, 0, 0);
        canvas.width = 0;
        canvas.height = 0;
        renderTexture.destroy({ destroyBase: true });
        bitmap.baseTexture.update();
        var render_map_sprites = new Sprite();
        render_map_sprites.bitmap = bitmap;
        render_map_sprites.scale.x = 6;
        render_map_sprites.scale.y = 6;
        map_sprites_containter.addChild(render_map_sprites);
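     A minimal sketch of a lighter variant in plain Pixi, skipping the Bitmap/extract.canvas round trip (which forces a GPU-to-CPU readback every frame): keep one persistent RenderTexture, re-render the container into it each frame, and display it through a scaled sprite. NEAREST scaling keeps the 6x pixel-art look crisp. The ui_container parent is hypothetical; adding the sprite back into map_sprites_containter itself would feed the texture into its own source.

        // Sketch only; renderer and map_sprites_containter are taken from the post.
        var mapRT = PIXI.RenderTexture.create(320, 180);
        mapRT.baseTexture.scaleMode = PIXI.SCALE_MODES.NEAREST;  // crisp integer scaling
        var render_map_sprites = new PIXI.Sprite(mapRT);
        render_map_sprites.scale.set(6);                          // 320x180 -> 1920x1080
        ui_container.addChild(render_map_sprites);                // hypothetical UI-layer parent

        // every frame:
        renderer.render(map_sprites_containter, mapRT);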
  4. Hi, I need to create an image at run-time and then convert it to a base64 string. I can create the image with Phaser.GameObjects.RenderTexture, but does anyone know how to convert it to a base64 string? Thanks a lot. Tomas
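     A minimal sketch of one option, assuming a Phaser 3.19+ build where RenderTexture.snapshot() is available; the callback receives an HTMLImageElement whose src is a base64 data URL. The texture key and size below are hypothetical.

        // Sketch only: draw something into the render texture, then snapshot it.
        const rt = this.add.renderTexture(0, 0, 256, 256);
        rt.draw('someTextureKey', 0, 0);             // hypothetical key
        rt.snapshot((image) => {
            const base64 = image.src;                // "data:image/png;base64,..."
            console.log(base64);
        }, 'image/png');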
  5. Hello, The answers to my questions may seem evident; please consider that I'm a complete beginner in HTML and JS Phaser coding. I am having two problems with my code. The first is that renderTexture doesn't seem to work. I am trying to make a trail of my player, like the "trail" example on labs.phaser.io. I actually copied the code needed, but it does not work. Here is the line causing the problem, in the create function:

        rt = this.make.renderTexture({ x: 0, y: 0, width: 800, height: 800 }).setOrigin(0, 0);

     And here is what the console says:

        Uncaught TypeError: this.make.renderTexture is not a function
            at Scene.create ((index):301)
            at SceneManager.create (phaser.js:51412)
            at SceneManager.loadComplete (phaser.js:51329)
            at LoaderPlugin.emit (phaser.js:2622)
            at LoaderPlugin.processComplete (phaser.js:111210)
            at LoaderPlugin.removeFromQueue (phaser.js:111190)
            at LoaderPlugin.processUpdate (phaser.js:111171)
            at Image.data.onload (phaser.js:8947)

     _____________

     The second issue affects the collision boxes of my obstacles. Those are static sprites, and the physics engine is MatterJS. I have defined the shape of the hitboxes in PhysicsEditor and have done exactly as in their tutorial to link the JSON to the sprites, but it doesn't work: the collision is triggered not only when the player enters the parts of the shape, but also in the rest of the rectangle that should not be considered. Furthermore, none of the physical properties I've set in the editor work. Then I tried to do it all without PhysicsEditor: I added the required setters to do the work. While the physical properties are now correct, the hitboxes are still the whole rectangles instead of the chosen parts. Here is the code:

        // example
        var ob12 = this.matter.add.sprite(800 - 211, -6353, 'ob12')
            .setStatic(true)
            .setSensor(true)
            .setFriction(0, 0, 0)
            .setInteractive(new Phaser.Geom.Polygon([
                [ { "x":132, "y":291 }, { "x":337, "y":498 }, { "x":423, "y":2 } ],
                [ { "x":50, "y":789 }, { "x":239, "y":979 }, { "x":337, "y":498 } ],
                [ { "x":0, "y":1220 }, { "x":423, "y":1639 }, { "x":239, "y":979 } ],
                [ { "x":239, "y":979 }, { "x":423, "y":1639 }, { "x":337, "y":498 } ],
                [ { "x":337, "y":498 }, { "x":423, "y":1639 }, { "x":423, "y":2 } ]
            ]), Phaser.Geom.Polygon.Contains);

     Thank you, Vainly
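     On the first error, "this.make.renderTexture is not a function" usually suggests the loaded phaser.js build is older than the one the labs example targets. On the second, here is a minimal sketch of the usual Matter + PhysicsEditor wiring, with hypothetical asset keys: the exported fixtures go in through the shape option of matter.add.sprite, while setInteractive() with a Geom.Polygon only affects pointer hit-testing, not physics collisions.

        // Sketch only: 'obstacle-shapes' and its 'ob12' entry are hypothetical keys.
        function preload() {
            this.load.image('ob12', 'assets/ob12.png');
            this.load.json('obstacle-shapes', 'assets/obstacle-shapes.json');  // PhysicsEditor export
        }

        function create() {
            const shapes = this.cache.json.get('obstacle-shapes');
            const ob12 = this.matter.add.sprite(800 - 211, -6353, 'ob12', null, { shape: shapes.ob12 });
            ob12.setStatic(true);
        }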
  6. Hey, I am building a pretty huge tile map out of hexagons. For performance reasons I create a single render texture and add a lot of sprites and texts to it. It takes quite a lot of time to do so, but I get higher FPS, and that texture is pretty much static. However, there is another layer on top of this which is dynamic. If I add another render texture to the game, FPS drops; if I render everything into a single texture, FPS is still OK, but when the dynamic part changes I need to re-render the whole texture and that takes too much time. So I was thinking of generating the static texture once and caching/cloning it in memory, then composing it with the dynamic texture. The composition would take a bit of time, but the end result, a single texture, would let me keep a stable FPS. When the dynamic layer changes I would then only need to render the dynamic texture and compose it with the already cached static texture. I'm not sure whether that is possible at all?
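     A minimal sketch of that composition idea, using the Pixi v4/v5-style renderer.render(displayObject, renderTexture, clear) call: keep the expensive static layer in its own RenderTexture and re-compose only when the dynamic layer changes. The names mapWidth, mapHeight, staticLayer and dynamicLayer are assumptions standing in for the hex-map containers.

        // Sketch only: compose a cached static texture with a dynamic layer.
        const staticRT = PIXI.RenderTexture.create(mapWidth, mapHeight);
        const compositeRT = PIXI.RenderTexture.create(mapWidth, mapHeight);
        const staticSprite = new PIXI.Sprite(staticRT);

        renderer.render(staticLayer, staticRT);               // expensive, done once

        function recompose() {                                 // call only when the dynamic layer changes
            renderer.render(staticSprite, compositeRT, true);  // clear, then stamp the cached static layer
            renderer.render(dynamicLayer, compositeRT, false); // draw the dynamic layer on top, no clear
        }

        app.stage.addChild(new PIXI.Sprite(compositeRT));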
  7. Hi, first post and very new to PIXI! I'm trying to figure out how positioning works and how to do it properly. I have a few follow-up questions which I'll get to later. Goal: having a centered "mask area" where I can count the "unmasking progress". But first off, here is a fiddle. As you can see, I have a centerContainer to which I add all my sprites and bind all interactivity. I center all the sprites relative to the app width and height values. I create the renderTexture with the same width and height as onTopOfMask and imageToReveal (400x400). The renderTextureSprite is not positioned, and that results in only a corner being "unmasked" (clearly visible if I comment out the masking; it's positioned top left). If I position renderTextureSprite (fiddle) the same way as I've done with the sprites, it works, but then the brush/unmasking has a weird offset. 1. Is that the proper way to center the sprites, or is it better to center the container instead? (like so: fiddle) Or any other way? 2. When positioning, why does the mask have a weird offset? Fiddling around but not getting any wiser, so help is greatly appreciated!
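     A minimal sketch of one likely fix for the offset, assuming a Pixi v5 setup like the official mask/reveal demo: once renderTextureSprite is positioned, the pointer coordinates have to be converted into that sprite's local space before the brush is drawn into the render texture, otherwise the brush lands at stage coordinates inside a texture whose origin sits at the sprite's corner. The names app, brush, renderTexture and renderTextureSprite are assumed from the fiddle setup.

        // Sketch only: convert pointer coords to RT-local coords before stamping the brush.
        app.stage.interactive = true;
        app.stage.on('pointermove', (event) => {
            const local = event.data.getLocalPosition(renderTextureSprite);
            brush.position.copyFrom(local);
            app.renderer.render(brush, renderTexture, false, null, false);   // stamp without clearing
        });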
  8. Hi! I'm using pixi 4.5.3 and can't find a way to clear out a rectangle from my transparent render texture. It seems that all blend modes have gl.ONE_MINUS_SRC_ALPHA in them. Is there any hack around this? I have a large renderTexture with mostly static bitmap text and I want to update only the parts that have changed, for better performance. Is there any other way to update only part of a renderTexture while retaining transparency? Thanks!
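     A minimal sketch of how this looks on newer Pixi versions (v5+), where PIXI.BLEND_MODES.ERASE was added precisely for punching transparent holes into a render target; 4.5.3 itself has no built-in equivalent. The rectangle size and position below are placeholders for the dirty region.

        // Sketch only (Pixi v5+): erase a region of the render texture to transparent.
        const eraser = new PIXI.Graphics();
        eraser.beginFill(0xffffff);
        eraser.drawRect(0, 0, 200, 50);                   // hypothetical dirty-region size
        eraser.endFill();
        eraser.blendMode = PIXI.BLEND_MODES.ERASE;
        eraser.position.set(100, 100);                    // hypothetical dirty-region position
        renderer.render(eraser, renderTexture, false);    // no clear, just erase that patch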
  9. I have this simple jsfiddle that displays the problem I'm trying to solve: https://jsfiddle.net/4ke8k7hu/2/ It looks like the render texture gets cropped based on the position of the object passed to the renderTexture.render function. Is there a way around this?
  10. Hi, I'm trying to work out how to create a shape with graphics, cache it and add it to a sprite to use repeatedly. I was trying to recreate the intro background of this video (see below; not the video thumbnail, but the first thing you see when you actually play it). I realised drawing all the star graphics and fills directly over time would probably be too slow, so it would be better to cache the star and its trail as a bitmap and then create multiple sprites from that. I want to create the graphic early on and pull it out of the cache later, but I'm getting an error. The key is there, but it can't find the texture. Does it need longer to store before I try to pull it out again? Obviously I could use a PNG, but I'd prefer to be able to generate the graphics at runtime. Thanks for any advice. J

        // draw graphics to renderTexture
        var renderTexture = this.add.renderTexture(graphics.width, graphics.height);
        renderTexture.renderXY(graphics, 0, 0, true);

        // add renderTexture to cache
        this.game.cache.addRenderTexture('starRT', renderTexture);

        // check starRT is in cache
        console.log(this.game.cache.checkRenderTextureKey('starRT'));  // true

        // Error: Phaser.Cache.getImage: Key "[object Object]" not found in Cache.
        var sprite = this.game.add.sprite(0, 0, this.game.cache.getRenderTexture('starRT'));
        sprite.x = 128;

     This works fine though, but obviously it's not being pulled from the cache:

        var sprite2 = this.game.add.sprite(0, 0, renderTexture);
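     A minimal sketch of what likely resolves the error, assuming Phaser CE, where Cache.getRenderTexture() returns a small wrapper object ({ texture, frame }) rather than the RenderTexture itself; passing the whole wrapper to add.sprite makes Phaser treat it as an image key, hence the "[object Object]" message.

        // Sketch only: pull the RenderTexture back out of the cache wrapper.
        var cached = this.game.cache.getRenderTexture('starRT');
        var sprite = this.game.add.sprite(0, 0, cached.texture);
        sprite.x = 128;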
  11. Apologies if this has either been asked before, or if I'm misunderstanding how to use RenderTextures. I have a renderTexture that's outputting the contents of a Phaser.Image(), which in turn contains an instance of Phaser.BitmapData(). In my update() loop, I want to update the position of some geometry drawn in the BitmapData, but those changes are never seen in the RenderTexture. What would be the reason for this? Like I said, I could be misunderstanding how and why you might want to use a RenderTexture, but this portion of my code will be quite expensive and updated regularly, so it seemed like a good idea to pass this logic out to the GPU if possible, which is what a RenderTexture allows, right? I'm open to better ideas if I do indeed have the wrong idea! Below is the code: it's written in TypeScript and the class extends Phaser.Sprite(), so hopefully it makes sense even to those only familiar with JS. As you can see, in my update() I'm redrawing a bitmapData.circle() with a new x position, then rendering back to the RenderTexture again. However, the circle never moves from its original position of 0. If I console.log the x value, it's clearly updated per tick.

        constructor(game: Phaser.Game, map: any) {
            const texture = new Phaser.RenderTexture(game, map.widthInPixels, map.heightInPixels, key);
            const bitmap = new Phaser.BitmapData(game, 'BITMAP', map.widthInPixels, map.heightInPixels);

            super(game, 0, 0, texture);

            this.bitmap = bitmap;
            this.image = new Phaser.Image(game, 0, 0, this.bitmap);
            this.texture = texture;

            // Other code to add this Sprite() to the stage.
        }

        private update() {
            const bitmap = this.bitmap;

            bitmap.clear();
            bitmap.circle(x, 100, 50, 'rgb(255, 255, 255)');

            this.texture.render(this.image);

            x += 10;
        }
  12. I'm cutting up some large images into puzzle pieces, and I'd like to create spritesheets out of them directly from JavaScript (as opposed to cutting them up and saving them as images which are then loaded as a spritesheet). So far I've made a puzzle cutter in regular canvas. I did these operations in canvas instead of pixi because they are pretty easy in canvas. I can then draw all of these puzzle pieces to a new canvas, and via PIXI.RenderTexture I can begin to create a spritesheet. Ideally I'd like to gain access to a syntax like new PIXI.Sprite.fromFrame(puzzleId + '-' + puzzlePieceIndex), but I'm not sure how to achieve this. I also generate some other important images, like a stroked outline of the puzzle piece which I will later use in-game to create a highlight effect. I've been inspecting the objects created when importing a spritesheet via TexturePacker. All I've figured out so far is that pixi creates a hash of image names matched to frame data... basically the name of an image as the key, and a rectangle, within the global pixi object. Is there some trick to this? Can I just create the strings and frame data and shove it all in there and expect it to work? E.g. 'nebula-2830' : { x: 500, y: 500, width: 48, height: 48 } (or whatever). How do I then actually wire it to the texture that I created via the render texture? The metadata created by TexturePacker seems overly complicated for what I need here. Also, perhaps this doesn't change anything, but I might try for some really giant puzzles.. puzzles whose images are 4096 px squared or bigger -- for these I would cut them into multiple textures, as only certain hardware would cope with those dimensions as is. Here are some images of the puzzle cutter's output in regular canvas: http://timetocode.tumblr.com/post/156841273346/javascript-jigsaw-puzzle-wip Thanks!
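     A minimal sketch of the frame-registration idea, assuming Pixi v4: carve sub-textures out of one RenderTexture "sheet" with a Rectangle frame each, and register them in the texture cache so Sprite.fromFrame() lookups work. The names renderer, sheetContainer, pieces and puzzleId are assumptions standing in for the puzzle-cutter output.

        // Sketch only: build a runtime "spritesheet" from a RenderTexture.
        const sheet = PIXI.RenderTexture.create(2048, 2048);
        renderer.render(sheetContainer, sheet);                  // draw all pieces into one texture

        pieces.forEach((piece, i) => {                           // pieces holds each piece's rect
            const frame = new PIXI.Rectangle(piece.x, piece.y, piece.width, piece.height);
            const texture = new PIXI.Texture(sheet.baseTexture, frame);
            PIXI.Texture.addToCache(texture, puzzleId + '-' + i);   // older builds: Texture.addTextureToCache
        });

        const firstPiece = PIXI.Sprite.fromFrame(puzzleId + '-0');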
  13. Hello! I have an issue that I don't understand very well. I'm trying to build a tiled-isometric-map loader for Phaser, where I have many performance problems, so I'm investigating different techniques to make the maps lighter on the CPU/GPU, such as rendering the map layers into cropped RenderTextures (drawing only the visible area of the map at once), using sprite-populated SpriteBatches/Groups as the source (I tested with both). In simple words, the logic behind this is the following:
     1. Populate the SpriteBatch/Group with tile-based sprites from a cache array (to prevent creating/destroying them each time).
     2. Render that SpriteBatch/Group into the RenderTexture, clearing it beforehand.
     3. Clean the SpriteBatch/Group, putting the tile-based sprites back into the array and removing them from the SpriteBatch/Group (without destroying them, of course).
     4. Create a Phaser.Image that shows the RenderTexture on the screen (or create it once; the texture is updated anyway).
     I tested this with one RenderTexture and it works fine. The issue comes if I use several of them (assuming that 1 RenderTexture is equivalent to 1 layer of the scenario): the screen starts showing y-inverted versions of the RenderTextures at random times. An important fact is that this only occurs in WebGL mode (in Canvas the behaviour is correct). I'm using Phaser CE 2.7.3. I coded a short example of the issue (each column is a different RenderTexture with its corresponding sprites); you're free to look and debug: Canvas: http://elsemieni.net/inni/testCanvas/ WebGL: http://elsemieni.net/inni/testWebGL/ Any idea of what's going on there? Thanks in advance
  14. Hi there, I'm developing a game with ink. There are some planets; you shoot ink at them and then you conquer them. I use a temporary BitmapData to draw the planets' render textures onto, so I can use BitmapData's getPixel method to check whether a planet is completely (or almost completely) one colour. I'm wondering if there's a better method, because this is very laggy. Is there a way to get a pixel from a renderTexture?

        conquer: function(p) {
            // Draw the planet's renderTexture onto the bmd BitmapData to allow the getPixel call
            bmd.draw(p.capaPintura, 0, 0, (p.radio * 2), (p.radio * 2));

            // Sample points of the planet
            col = [];
            col[0] = bmd.getPixelRGB(10, p.radio);                       // left - centre
            col[1] = bmd.getPixelRGB(((p.radio * 2) - 10), p.radio);
            col[2] = bmd.getPixelRGB(p.radio, 10);                       // centre - top
            col[3] = bmd.getPixelRGB(p.radio, ((p.radio * 2) - 10));

            if (col[0].rgba === col[1].rgba && col[2].rgba === col[0].rgba && col[0].rgba === col[3].rgba) {
                var c = 0;
                if (col[0].rgba === 'rgba(0,0,255,1)') c = 1;

                var colorPlanet = [];
                switch (c) {
                    case 0:
                        colorPlanet[0] = 0x550000;  // Shadow color
                        colorPlanet[1] = 0xAA0000;  // Shadow color 2
                        colorPlanet[2] = 0xFF0000;
                        break;
                    case 1:
                        colorPlanet[0] = 0x000055;
                        colorPlanet[1] = 0x0000AA;
                        colorPlanet[2] = 0x0000FF;
                        break;
                }

                // Shadow color
                pintData.beginFill(colorPlanet[0]);
                pintData.drawCircle(0, 0, p.radio * 2);
                p.capaPintura.renderXY(pintData, p.radio, p.radio);

                // Shadow color 2
                pintData.beginFill(colorPlanet[1]);
                pintData.drawCircle(0, 0, p.radio * 2);
                p.capaPintura.renderXY(pintData, p.radio * 1.1, p.radio * 0.9);

                // Planet color
                pintData.beginFill(colorPlanet[2]);
                pintData.drawCircle(0, 0, p.radio * 2);
                pintData.endFill();
                p.capaPintura.renderXY(pintData, p.radio * 1.25, p.radio * 0.75);
            }
        }

     That's the code. I use it to check whether the planet's sample points (col[0] to col[3]) are all the same colour; if they are, I draw the conquering team's colours onto the planet's renderTexture (p.capaPintura). This is the planet after being conquered by the blue team. The code works, but I have to call the conquer function every time a planet is hit by ink or it won't work, and it's very laggy. Is there a method to do this with render textures only? I have to draw the planet's renderTexture onto bmd just to use getPixel, so the bigger the planet, the bigger the lag caused by that draw.
  15. Hi people, I'm porting something from Flash where I have a drawn map with roads, and I want to be able to select each road individually with the mouse. In Flash I could simply use hitTest on the road shapes, but in PIXI I can't. My solution was to draw all the roads, each with a unique RGB colour, and do a getPixel at the mouse coordinates to see which road is selected. Now my problem is: how do I get pixel data from my sprite when I'm using a WebGL renderer? What I've come up with so far is creating a separate canvas renderer and a separate stage to render my map to, and from that canvas (renderer.view) getting the 2D context and reading the pixel data:

        // create separate canvas and canvasRenderer for this pixel map
        var canvasStage = new PIXI.Stage(0x000000);
        var canvasRenderer = new PIXI.CanvasRenderer(1024, 598, null, true);

        // create a texture that holds the map
        var texture = new PIXI.RenderTexture(1024, 598, this.canvasRenderer);
        texture.render(map);

        // create a sprite on our separate stage, otherwise the texture won't be rendered
        var textureSprite = new PIXI.Sprite(texture);
        canvasStage.addChild(textureSprite);

        // render the stage
        canvasRenderer.render(canvasStage);

     After that is done, I can use this to get my pixels:

        var pixelData = canvasRenderer.view.getContext("2d").getImageData(posX, posY, 1, 1).data;

     Is this the correct way to do it, or is there a better way? Regards, Martijn
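     A minimal sketch of the same picking idea on a newer Pixi (v4+), where the extract plugin can read pixels straight back from the WebGL renderer, so no second canvas renderer is needed. The map and renderer names are taken from the post; the 1024-wide stride matches the texture size used there.

        // Sketch only: read back the colour-coded road map and pick a pixel.
        const rt = PIXI.RenderTexture.create(1024, 598);
        renderer.render(map, rt);

        // v4: renderer.plugins.extract, v5+: renderer.extract
        const pixels = renderer.plugins.extract.pixels(rt);   // flat array of RGBA values
        const idx = (posY * 1024 + posX) * 4;
        const picked = [pixels[idx], pixels[idx + 1], pixels[idx + 2]];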
  16. Yo! I've started developing a game and I've been learning a hell of a lot about WebGL for lighting and shadows, and I'm hoping to implement some sort of lighting pipeline, following a technique like this: http://ahamnett.blogspot.co.uk/2013/05/2d-shadows-shader.html This means I'll need to process textures in a fragment shader, and then process the scene using the resulting bitmap in another shader. Just to test, I tried rendering a part of the game world to a render texture. This worked fine without any shaders (Phaser/PIXI filters) applied. The render texture is displayed using a Sprite on the right-hand side, just so I could see the result. Cool. However, if I add filters to what I'm rendering, the perspective gets all messed up. For example, with my lighting fragment shader: So instead of scratching my head over this, having done quite a bit of Googling over the past couple of days, I thought I'd ask for help here. Is there a way to render to a render texture with shaders applied correctly? Is there a typical approach to setting up a pipeline like this? Essentially I'm looking to perform the transformations described in the article above so I can start casting some sweet shadows, but I can't quite see how yet. The perspective is fine if I apply the shaders to the stage instead, but of course then I have no shader processing for the resulting render texture:
  17. Hi, I am enthralled by the pixi render texture tutorial, but I'm getting an error from the update() function. I've changed the variable names, but I've triple checked that they are consistent. Here's some of my create() code to give an idea of what I'm doing. I was hoping to use the render-texture tutorial on top of my working code. Is there a way to get the renderTexture working within an existing P2 game? Thank you.

        //game.stage.backgroundColor = "#f2f2f2";
        game.world.setBounds(0, 0, 450, 800);
        game.physics.startSystem(Phaser.Physics.P2JS);

        // create two render textures..
        // these dynamic textures will be used to draw the scene into itself
        render_texture1 = game.add.renderTexture(450, 800, 'texture1');
        render_texture2 = game.add.renderTexture(450, 800, 'texture2');
        current_texture = render_texture;   // note: render_texture is never defined above; probably meant render_texture1

        // create a new sprite that uses the render texture we created above
        output_sprite = game.add.sprite(225, 400, current_texture);

        // align the sprite
        output_sprite.anchor.x = 0.5;
        output_sprite.anchor.y = 0.5;

        stuff_container = game.add.group();
        stuff_container.x = 450 / 2;
        stuff_container.y = 800 / 2;

        // now create some items and randomly position them in the stuff container
        for (var i = 0; i < 4; i++) {
            var item = stuff_container.create(
                Math.random() * 400 - 200,
                Math.random() * 400 - 200,
                game.rnd.pick(game.cache.getKeys(Phaser.Cache.IMAGE))
            );
            item.anchor.setTo(0.5, 0.5);
        }

        // used for spinning!
        count = 0;

        // Turn on impact events for the world,
        // without this we get no collision callbacks
        game.physics.p2.setImpactEvents(true);
        game.physics.p2.updateBoundsCollisionGroup();
        game.physics.p2.gravity.y = 0;
        game.physics.p2.restitution = 0.7;
  18. Hello, I've been playing around with what is on the pixi.js dev-4.0.0 branch and have found everything to be working except RenderTexture and BaseRenderTexture. I just want to ask if these are still in development, or if I'm possibly using them wrong? The signature I'm calling is:

        RenderTexture.create = function(width, height, scaleMode, resolution)

     Overall, awesome work on pixi-4 so far! We're ready to upgrade our game to v4.
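     A minimal sketch of the intended v4 usage of that factory, in case it helps others landing here: the texture is created with the static method and then filled by passing it as the second argument to renderer.render(). The renderer, someContainer and stage names are assumptions.

        // Sketch only (Pixi v4):
        const rt = PIXI.RenderTexture.create(256, 256);
        renderer.render(someContainer, rt);     // draw the container into the texture
        const sprite = new PIXI.Sprite(rt);
        stage.addChild(sprite);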
  19. We recently moved to Pixi.js v3 and discovered that all the render textures were being rendered without anti-aliasing. All other Sprite and BitmapText objects were just fine and looked like they still had anti-aliasing. Has Pixi v3 explicitly disabled anti-aliasing for RenderTextures, or is this a bug? See my JSFiddle test case. Note: the red text is the RenderTexture.
  20. I'm trying to render a container into a renderable texture, but the container's transformations simply do not apply to the sprites inside. You can see it happen by taking the following code and pasting it at http://pixijs.github.io/examples/ :

        var renderer = PIXI.autoDetectRenderer(800, 600, { backgroundColor: 0x1099bb });
        document.body.appendChild(renderer.view);

        // create the root of the scene graph
        var stage = new PIXI.Container();

        // create a new Sprite using the texture
        var texture = PIXI.Texture.fromImage('_assets/basics/bunny.png');
        var bunny = new PIXI.Sprite(texture);

        // move the sprite to the center of the screen
        bunny.position.x = 200;
        bunny.position.y = 150;

        // create render texture and sprite
        var render_texture = new PIXI.RenderTexture(renderer, 800, 800);
        var render_texture_sprite = new PIXI.Sprite(render_texture);
        stage.addChild(render_texture_sprite);

        // create a container and add sprite to it
        var container = new PIXI.Container();
        container.addChild(bunny);

        // these transformations will NOT apply, for some reason..
        container.scale.x = 100;
        container.position.y = 100;  // ????

        // start animating
        animate();

        function animate() {
            requestAnimationFrame(animate);
            render_texture.render(container);
            renderer.render(stage);
        }

     Is this a bug or expected behaviour? How do I make the container apply its transformations to the sprite? Thanks!
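     A minimal sketch of one workaround, assuming Pixi v3 as in the snippet: RenderTexture.render() treats the object you pass as the root and overrides its own transform, so wrapping the container in a plain parent and rendering the parent makes the inner container a child again, and its scale/position are applied. Passing a transform matrix as the second argument to render() is another option.

        // Sketch only: render a wrapper so the container's own transform is honoured.
        var wrapper = new PIXI.Container();
        wrapper.addChild(container);

        function animate() {
            requestAnimationFrame(animate);
            render_texture.render(wrapper);
            renderer.render(stage);
        }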
  21. Hi! I have been trying to get a second camera into my scene, so that this camera's view overlays my main camera's view, but just in the bottom-left corner. Think of a mini-map: to achieve this, all I was doing was putting a camera way above the land looking down. Maybe the best way to explain is to look at my example: http://www.babylonjs-playground.com/#1QXZAA#1 This is what I am trying to achieve. A second camera is looking at the same object but from a different angle. However, when you move the main camera with the mouse so that its objects are positioned over the bottom-left overlay, those objects are drawn on top of my camera2 viewport, since they are closer to camera1 than they are to camera2 (z-buffering). How can I make sure my bottom-left overlay is always on top? Is there a better way than viewports, like rendering camera2's view to a texture that just overlays the canvas? Also, I had to create a third camera to force a white background on the viewport; maybe there is a better way to do that too? Thanks for any advice on how to proceed!
  22. In one state I'm using Phaser.RenderTexture and game.make.bitmapData, and they reduce the framerate in the next state. Is there a way to get rid of them completely? 1) I'm switching the state. 2) I'm calling .destroy() on the bitmapData, the RenderTexture and the sprite which uses the RenderTexture. What else can be done? Remove them from the cache? Thanks in advance. P.S. using CANVAS mode
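     A minimal sketch of a fuller cleanup, assuming Phaser CE and that the objects were created with cache keys (the key names and member names below are hypothetical); if they were never added to the cache, the destroy() calls already listed should be all there is to release.

        // Sketch only: explicit cleanup in the state's shutdown().
        shutdown: function () {
            this.trailSprite.destroy(true);                    // sprite using the render texture
            this.renderTexture.destroy(true);
            this.bmd.destroy();
            this.game.cache.removeRenderTexture('trailRT');    // only if added under this key
            this.game.cache.removeBitmapData('trailBMD');      // only if added under this key
        }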
  23. I'm struggling with how to manipulate a renderTexture while blitting. As of now, I have a burnout in a tilesheet, and the burnout animates when the arrow keys are pressed. My logic is to create a renderTexture of the burnout and have the renderTexture leave a copy of the burnout sprite behind the original burnout; this would mimic a car burnout. The problem I'm having is that the renderTexture sprite moves with the original burnout instead of staying behind, so I have a burnout circle running around the screen instead of staying stuck on the stage where it was burnt out. My problem is figuring out how to make the renderTexture burnout stay in one position on the stage. Also, I posted this in Coding and Game Design and realized it is more fitting in this forum; how can I delete the post in Coding and Game Design so there are no duplicates?

        <script src="pixi.min.js"></script>
        <script src="spriteUtilities.js"></script>
        <script src="keyboard.js"></script>
        <script>
        //Test that Pixi is working
        //console.log(PIXI);
        var renderer = PIXI.autoDetectRenderer(970, 418, {antialias: false, transparent: false, resolution: 1});
        document.body.appendChild(renderer.view);

        var stage = new PIXI.Container();
        renderer.backgroundColor = 0xeeeeee;
        renderer.render(stage);

        loader = PIXI.loader
            .add("nasSkidtileSheetv2.png")
            .load(onAssetsLoaded);

        var framedSkid, renderTexture, sprite, state;
        var renderPosition = new PIXI.Point(100, 100);

        function onAssetsLoaded() {
            // Burnout sprite
            var skid = new PIXI.Sprite(PIXI.loader.resources["nasSkidtileSheetv2.png"].texture);
            PIXI.utils.TextureCache["nasSkidtileSheetv2.png"];
            var skid = PIXI.utils.TextureCache["nasSkidtileSheetv2.png"];
            var su = new SpriteUtilities(PIXI);
            var skidFrame = su.filmstrip("nasSkidtileSheetv2.png", 271, 269);
            framedSkid = new PIXI.extras.MovieClip(skidFrame);
            stage.addChild(framedSkid);
            framedSkid.animationSpeed = 0.5;
            framedSkid.x = 0;
            framedSkid.y = 0;
            framedSkid.vx = 0;
            framedSkid.vy = 0;

            // Burnout copy
            renderTexture = new PIXI.RenderTexture(renderer, renderer.width, renderer.height);
            sprite = new PIXI.Sprite(renderTexture);
            stage.addChild(sprite);
            sprite.x = 0;
            sprite.y = 0;
            sprite.vx = 0;
            sprite.vy = 0;

            //Capture the keyboard arrow keys
            var left = keyboard(37),
                up = keyboard(38),
                right = keyboard(39),
                down = keyboard(40);

            //Left arrow key `press` method
            left.press = function() {
                //Change the cat's velocity when the key is pressed
                framedSkid.vx = -5;
                framedSkid.vy = 0;
                framedSkid.play();
            };

            //Left arrow key `release` method
            left.release = function() {
                //If the left arrow has been released, and the right arrow isn't down,
                //and the cat isn't moving vertically:
                //Stop the cat
                if (!right.isDown && framedSkid.vy === 0) {
                    framedSkid.vx = 0;
                    framedSkid.stop();
                }
            };

            //Up
            up.press = function() {
                framedSkid.vy = -5;
                framedSkid.vx = 0;
                framedSkid.play();
            };
            up.release = function() {
                if (!down.isDown && framedSkid.vx === 0) {
                    framedSkid.vy = 0;
                    framedSkid.stop();
                }
            };

            //Right
            right.press = function() {
                framedSkid.vx = 5;
                framedSkid.vy = 0;
                framedSkid.play();
            };
            right.release = function() {
                if (!left.isDown && framedSkid.vy === 0) {
                    framedSkid.vx = 0;
                    framedSkid.stop();
                }
            };

            //Down
            down.press = function() {
                framedSkid.vy = 5;
                framedSkid.vx = 0;
                framedSkid.play();
            };
            down.release = function() {
                if (!up.isDown && framedSkid.vx === 0) {
                    framedSkid.vy = 0;
                    framedSkid.stop();
                }
            };

            state = play;
            gameLoop();
        }

        function gameLoop() {
            renderTexture.render(framedSkid);
            renderTexture.render(framedSkid, renderPosition, false);
            sprite.texture = renderTexture;

            // render the stage container
            renderer.render(stage);

            requestAnimationFrame(gameLoop);
            state();
        }

        function play() {
            //Apply the velocity values to the sprite's position to make it move
            framedSkid.x += framedSkid.vx;
            framedSkid.y += framedSkid.vy;
            sprite.x += framedSkid.vx;
            sprite.y += framedSkid.vy;
        }
        </script>
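     A sketch of one possible fix, assuming the Pixi v3-style RenderTexture.render(displayObject, matrix, clear) used above, where the matrix argument places the object inside the texture and the object's own position is ignored: leave the trail sprite parked at (0, 0), stop moving it in play(), and stamp the moving burnout into the render texture at its current stage position without clearing, so earlier stamps stay put.

        // Sketch only: accumulate skid marks in the render texture.
        function gameLoop() {
            var stamp = new PIXI.Matrix(1, 0, 0, 1, framedSkid.x, framedSkid.y);
            renderTexture.render(framedSkid, stamp, false);   // no clear: keep previous stamps
            renderer.render(stage);
            requestAnimationFrame(gameLoop);
            state();
        }

        function play() {
            framedSkid.x += framedSkid.vx;
            framedSkid.y += framedSkid.vy;
            // note: the trail sprite is intentionally not moved here
        }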
  24. Can I make a RenderTexture from a PIXI.Container with several PIXI.Containers in it? So, a texture from a container of containers? I'm trying it, but I can't get any results to show up. I've got:

        islandStage.addChild(cloudsBack);
        islandStage.addChild(map1);
        islandStage.addChild(container);
        islandStage.addChild(guest_island);
        islandStage.addChild(cloudsFront);
        stage.addChild(islandStage);

     In my animation loop I have:

        islandShape = new PIXI.RenderTexture(islandStage, islandStage.width, islandStage.height);
        cloudShadowSprite.texture = islandShape;

     But when I throw cloudShadowSprite into the outer container (which we call "stage", not to be confused with the older versions of PIXI's stage container), there's nothing that I can see... The resulting render texture is supposed to be quite large (over 1000 in both width and height). There are no errors in the code. Eventually, I want to turn cloudShadowSprite into a mask of another container altogether. Then large "cloud shadows" will roll over through this new container. Since the new container will have a mask made from the shape of whatever is going on in islandStage's sub-containers, the "cloud shadows" should only appear on the appropriate images (and not the background, which would be too "far away" or off in the horizon to get a shadow). Can anyone tell me what I'm doing wrong? I haven't seen a PIXI v3 RenderTexture example. Thanks!
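     A minimal sketch of the Pixi v3 usage (yes, a container of containers is fine): the RenderTexture constructor takes the renderer plus a size, not a display object, and the container is then drawn into it with render(). Creating the texture once and re-rendering it each frame also avoids allocating a new texture per tick. The renderer variable and the 1024x1024 size are assumptions; islandStage and cloudShadowSprite are from the post.

        // Sketch only (Pixi v3):
        var islandShape = new PIXI.RenderTexture(renderer, 1024, 1024);
        cloudShadowSprite.texture = islandShape;

        function animate() {
            requestAnimationFrame(animate);
            islandShape.render(islandStage, null, true);   // clear and redraw the container of containers
            renderer.render(stage);
        }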
  25. Hi, I have 2 render textures swapping to create a trail, a bit similar to the concept in http://phaser.io/examples/v2/display/pixi-render-texture Does anybody know why my trail never fades out? I guess it's because the value never actually reaches 0? You can see there is grey residue of the sprite everywhere: http://phaser.io/sandbox/VWOOYqpU/play I've adjusted the second image in Photoshop to show the trails, as they're not always easy to see on a monitor, though that can depend on the colour used. Thanks, J