Everything posted by hexus

  1. Finally sorted the issue that was driving me mad. That feel when one kind of tile has its vertices defined counter-clockwise instead of clockwise... Version 0.2.0-alpha has been released with initial support for circular physics bodies. It still freaks out on some of the smallest tiles, and sometimes on the corners of tiles, but it's a good start. Also, many thanks to IkonOne for the new TypeScript definitions.
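For anyone curious about the winding issue: SAT-style collision code generally expects every polygon to wind the same way. A minimal, self-contained sketch of how winding can be detected and normalised with the shoelace formula (the function names are mine, not the plugin's; note that in y-down screen coordinates the visual sense of "clockwise" is the opposite of the y-up maths convention used here):

```javascript
// Signed area of a polygon via the shoelace formula.
// Positive means counter-clockwise in a y-up coordinate system.
function signedArea(vertices) {
    var area = 0;

    for (var i = 0; i < vertices.length; i++) {
        var a = vertices[i];
        var b = vertices[(i + 1) % vertices.length];

        area += (a.x * b.y) - (b.x * a.y);
    }

    return area / 2;
}

// Return the vertices wound clockwise (y-up convention),
// reversing them only if they wind the other way.
function ensureClockwise(vertices) {
    return signedArea(vertices) > 0 ? vertices.slice().reverse() : vertices;
}
```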
  2. Not sure if it's right to post issues on GitHub for the dev branch, so I'll mention some issues I'm having here. Applying a filter to game.world complains that a TilemapLayerGL instance doesn't have a getBounds method. It should have one, right? But I've noticed in the source it doesn't. Also, losing DOM focus on the game breaks camera follow for me, resetting camera position to the top left of the map. It seems like it causes the game world to resize too, as my player sprite can suddenly collide with all edges of the screen. Edit: This was caused by some bugs that I've fixed as part of a pull request. Is there any way to switch back to the canvas renderer in WebGL mode? Edit: It does work well as a drop-in replacement for the Arcade Slopes demo, just not my more complex, full screen project. I'm actually going to tinker with the Phaser source and see if I can fix anything.
  3. I've already started work on it, but it's proving trickier than expected. I'll definitely update everyone here when it's sorted.
  4. It's possible with the regular particle system for sure. I would assume that particle storm works similarly, but maybe not. http://phaser.io/examples/v2/particles/when-particles-collide
  5. Use this. I've started a project with it and it's a fantastic starting point. Starting to heavily modify the gulpfile though, might have to start modularising it!
  6. I'd like to expand on this for speed hacking, too. You could use a lenient range check on the player to determine, according to their allowed speed, whether they're moving too quickly. You won't want to be too strict with this, though, because latency will almost certainly cause some inconsistencies.
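As a rough, framework-agnostic sketch of that idea (all names here are hypothetical), a server could compare the distance between two position samples against the player's allowed speed, padded by a tolerance factor to absorb latency jitter:

```javascript
// Flag a player as moving suspiciously fast, given two position samples,
// the time between them, their allowed speed, and a lenient tolerance
// multiplier (e.g. 1.5) so that latency spikes don't trigger false positives.
function isSpeedSuspicious(prev, curr, dtSeconds, maxSpeed, tolerance) {
    var dx = curr.x - prev.x;
    var dy = curr.y - prev.y;
    var distance = Math.sqrt(dx * dx + dy * dy);
    var speed = distance / dtSeconds;

    // Only flag movement well beyond the allowed speed
    return speed > maxSpeed * tolerance;
}
```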
  7. Yep, this bothers me too. It's always worked this way apparently, and wasn't built with proper zoom functionality intended, but in my mind if you scale a camera it shouldn't behave like this. A camera should define a consistent perspective into the world.
  8. Will do! If I do ever make it a plugin it might be difficult to tailor such a thing to different games; there are so many ways of doing things like this. But if I get comfortable with a solution, I'll be sure to! Only problem I'm facing with this approach right now is performance at large resolutions with many lights, this "problem", and colour banding.
  9. Yeah, if you truly only want to use one object on the map then you can index into it directly to get the data you're interested in:

```javascript
var object = map.objects[0][0];

game.add.sprite(object.x, object.y, 'player');
```

(Note that `game.add.sprite` takes the coordinates first and the sprite key last.) It's probably best to make sure that you get the layer and object indexes correct, though - they're prone to change depending on how many object layers your map has and how many objects are in the desired layer. There are other ways in the API to search for layers and objects with specific names, but I can't remember the methods off the top of my head. Check out the API docs for tilemaps.
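If the exact API method escapes you too, a small hypothetical helper can do the search by hand (assuming the Phaser 2 shape where `map.objects` maps layer keys to arrays of objects, each with a `name` property; the function name is mine):

```javascript
// Find the first object with the given name across all object layers.
// Works whether the layers are keyed by name or by numeric index.
function findObjectByName(objects, name) {
    for (var layer in objects) {
        var layerObjects = objects[layer];

        for (var i = 0; i < layerObjects.length; i++) {
            if (layerObjects[i].name === name) {
                return layerObjects[i];
            }
        }
    }

    return null;
}

// Usage: var spawn = findObjectByName(map.objects, 'player');
```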
  10. Phaser doesn't implement this because there are so many different ways of interpreting the data - it becomes game-specific very quickly. Try starting from something like this:

```javascript
// Create the Phaser tilemap
var map = game.make.tilemap('map');

// Loop over each object layer
for (var ol in map.objects) {
    // Loop over each object in the layer
    for (var o in map.objects[ol]) {
        var object = map.objects[ol][o];

        console.log(object);

        // Do something with the object data here, e.g.
        // game.add.sprite(object.x, object.y, object.name);
        // or even game.add[object.type](object.x, object.y, object.name);
    }
}
```
  11. The plugin has been updated to support Phaser 2.4.9/2.5.0! Hurrah. Grab v0.1.1! Offset tilemap layers aren't properly supported yet, but the plugin now works with these versions. There was a signature change to an internal Phaser method that the plugin overrides.
  12. I've been working recently on a lighting system for my game, and I've noticed something awkward when passing render textures to fragment shaders. TL;DR: Why does uSampler act normally when any other RenderTexture passed into a shader needs vTextureCoord.y *= -1.0 to prevent it from being inverted on the Y axis? I'll try to explain with the code as concisely as I can, but I don't have a demo I can show for this issue. I'm drawing the game world to a render texture a little like this:

```javascript
// When booting the renderer:
var diffuse = new Phaser.Sprite(game, 0, 0, new Phaser.RenderTexture(
    game, game.width, game.height, null, Phaser.scaleModes.NEAREST
));

var cameraMatrix = new Phaser.Matrix(
    1, 0, 0, 1, -game.camera.x, -game.camera.y
);

// Outside the renderer, after booting it:
var filter = new Phaser.Filter(game, {
    diffuse: { type: 'sampler2D', value: diffuse.texture }
}, game.cache.getShader('diffuse'));

game.world.filters = [filter];

// Later on, in the rendering pass:

// Retain the existing filters
filters = game.world.filters;

// Render the world to the texture without filters
game.world.filters = null;
diffuse.texture.render(game.world, cameraMatrix, true);

// Set the filters back
game.world.filters = filters;
```

It's part of a sprite because that makes it easy to output on the stage for debugging (to view it, really). If I do this, it renders fine. Now, my lighting system acts on this diffuse texture to apply some shaders. I've finally got my system working, but unfortunately with a lot of uv.y *= -1.0; hacks to get around this Y inversion issue I've been finding when running a render texture through a shader sampler. I would rather remove these hacks and have the shaders read more cleanly. When I use a Filter to draw the resulting lit texture over the game world, it is inverted on the Y axis.
This can be demonstrated with the following fragment shader and Phaser Filter:

```glsl
precision highp float;

varying vec2 vTextureCoord;

uniform sampler2D uSampler;
uniform sampler2D diffuse;

void main() {
    gl_FragColor = texture2D(diffuse, vTextureCoord);
}
```

```javascript
filter = new Phaser.Filter(game, {
    diffuse: { type: 'sampler2D', value: diffuse.texture }
}, game.cache.getShader('diffuse'));

world.filters = [filter];
```

Why is it flipped here when uSampler isn't? For example, if I change the fragment shader to use uSampler:

```glsl
void main() {
    gl_FragColor = texture2D(uSampler, vTextureCoord);
}
```

It doesn't render upside down. I've tried looking for the reasoning behind this but can't find much. A Pixi bug, perhaps? Or should I just pass all textures to my shaders with flipY set to true?

```javascript
new Phaser.Filter(game, {
    diffuse: {
        type: 'sampler2D',
        value: diffuse.texture,
        textureData: { flipY: true }
    }
}, game.cache.getShader('diffuse'));
```

Is uSampler flipped automatically? That might explain it. Cheers!
  13. Illuminated.js does support colours - play with the demo some more. Maybe the Phaser plugin doesn't, though, I'm not sure. I considered using it for a long time, but I'd personally rather harness the power of WebGL. I've been working on a lighting system that does exactly this recently; it's totally possible to use shaders for this. It's just a bit awkward working with Phaser filters and old Pixi v2 filters to get things to render. This is the way I'm intending it to work:

- Currently visible light objects in the game world are given to a Phaser filter (it's actually not a filter in my case currently, but will be eventually)
- The filter uses a shader to render each light to a render texture, using diffuse/normal textures rendered from the currently visible game world
- Awkwardly use another filter with a basic diffuse shader to render this resulting texture in front of the game world

There is more to it than that, because I'm also doing shadow mapping, but that's the gist of it. Ultimately I've found it's pretty effective to attach a filter to the game world and adjust the shader uniforms based on object and camera positions to get the 0-1 values used in fragment shaders. The real pain for me right now is not having access to a filter's resulting texture. I want to reuse a single light filter/shader for every light instead of creating a filter for every single light, and there's a wasteful render of the scene before everything I mentioned above takes place. Here's a little 2D light shader I made on Shadertoy that I've been adapting heavily for my current project: https://www.shadertoy.com/view/MsyXz3
  14. I'm in the same boat really; I want to build a game that allows for free-roaming an enormous map. I'm planning on building a gulp task that breaks down one huge Tiled map into chunks that can be loaded dynamically as the world is traversed. When it comes to game entities, though, you don't want to be cycling through everything in the map - it's just not feasible to have every entity loaded and updated. So you have to decide how you want to break it down:

- Visiting an area could make its entities spawn, and if, for example, you destroy them, they could just respawn the same as before once you return to that chunk
- You could store some kind of state for each entity (with some global ID) to record a player's progress, making sure entities don't respawn once destroyed
- Or a mix of the two - some entities you might want to always respawn, and some you might never want to respawn

It would be nice if there was a plugin that did some of the upfront work for this without being game-specific (maybe that chunkmap plugin does), but I think solutions need to be tailored to each game for the most part.
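The mixed approach could be sketched as a small registry of destroyed entities, keyed by a global ID, that each chunk consults before spawning (everything here is hypothetical and illustrative, not from any plugin):

```javascript
// Records which entities the player has destroyed, by global ID,
// so a chunk can skip respawning them when it loads again.
function EntityRegistry() {
    this.destroyed = {};
}

EntityRegistry.prototype.markDestroyed = function (id) {
    this.destroyed[id] = true;
};

// Entities flagged alwaysRespawn ignore the registry entirely.
EntityRegistry.prototype.shouldSpawn = function (entity) {
    return !!entity.alwaysRespawn || !this.destroyed[entity.id];
};
```

The `destroyed` map is also the natural thing to serialise into a save file, since it's just a flat object of IDs.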
  15. Seems like it was! I played around, modified the filter/shader resolution to that of the render texture (and catered for positioning too), then rendered the render texture. Then, the uniforms get set back to screen space for Phaser's own rendering, and voila, job done. Makes a lot of sense now, seems like it should have been obvious. Thanks for the help! Now I can start properly messing around with a lighting pipeline.
  16. Ah, actually I'm passing the screen resolution as the resolution uniform, so that the result indeed fits the screen. Maybe that's where this is going wrong: the screen resolution works fine for the game canvas, but when rendering to a render texture the resolution is going to be different - perhaps the transform matrix doesn't work the way I'd expect it to here. Later on I'll tinker with setting the resolution to the render texture size before calling the above .render().
  17. I apply the shader to the game world at the moment, using world.filters. The variables I use in my setup differ from those on Shadertoy, but it's the resolution uniform passed in by Phaser/Pixi that I'm using. Oh right. For the soft edges I was planning on a gradual Gaussian blur based on distance, though I haven't thought about how I'll do that in practice yet.
  18. Nice! That's exactly the sort of effect I'm going for, though I've noticed the shadows flicker around surfaces when the light moves. I'm actually drawing part of the entire game world, and no sprites have been scaled. What doesn't make sense to me is why the perspective only gets thrown off when rendering through a filter to a render texture, when the filter renders fine regularly (not to a render texture). This is how I'm rendering to the texture:

```javascript
let matrix = new PIXI.Matrix(
    1, 0,
    0, 1,
    -object.x - object.width / 2 + this.distanceSprite.width / 2,
    -object.y - object.height / 2 + this.distanceSprite.height / 2
);

this.distanceTexture.render(this.world, matrix, true);
```

Perhaps this is just a bug in Pixi v2, or I'm doing something wrong in my shader, which can be found on Shadertoy: https://www.shadertoy.com/view/MsyXz3
  19. Yo! I've started developing a game and I've been learning a hell of a lot about WebGL for lighting and shadows, and I'm hoping to implement some sort of lighting pipeline, following a technique like this: http://ahamnett.blogspot.co.uk/2013/05/2d-shadows-shader.html This means I'll need to process textures in a fragment shader, and then process the scene using the resulting bitmap in another shader. Just to test, I tried rendering part of a game world to a render texture. This worked fine without any shaders (Phaser/Pixi filters) applied. The render texture is displayed using a Sprite on the right-hand side, just so I could see the result. Cool. However, if I add filters to what I'm rendering - my lighting fragment shader, for example - the perspective gets all messed up. So instead of scratching my head over this, having done quite a bit of Googling over the past couple of days, I thought I'd ask for help here. Is there a way to render to a render texture with shaders applied correctly? Is there a typical approach to setting up a pipeline like this? Essentially, I'm looking to perform transformations as described in the above article so I can start casting some sweet shadows, but I can't quite see how yet. The perspective is fine if I apply shaders to the stage instead, but of course then I have no shader processing for the resulting render texture.
  20. My plugin does the job for sloped tiles without curves, but that's it. It doesn't go further than tilemap collisions. Interestingly, I haven't tested it at all with graphics larger/smaller than the bodies, but I'm guessing it would work fine. Feel free to post an issue on GitHub if you get round to tinkering with it and find any problems. I'll be resuming its development soon (and make it compatible with 2.4.8+).
  21. Gah, that separateTile API change means I need to update the slopes plugin, but that's fine really. Good idea to support offsets like that.
  22. Pixi v4 sounds awesome. Is Phaser currently using a custom Pixi v2?
  23. You might find these examples quite useful: http://phaser.io/examples/v2/category/tilemaps
  24. Sweet plugin! As far as I'm aware, circle body support is already being worked on for newer versions. Check the branches of Phaser's GitHub repo.
  25. Just pushed an update to the demo with a tonne of controls! http://hexus.github.io/phaser-arcade-slopes/ The "snap" feature is my experimentation with sticking bodies to tiles. It works by moving them in a direction by the specified amount if they didn't successfully collide with any tiles (if there are any near enough to collide with), testing for a collision again after the move, and moving them back to their original position if it fails. If the collision succeeds then, hey presto, they've snapped to some tile in that direction. This messes things up more than I'd like when distances are set too high, though, so I'm not sure if it's a decent permanent solution. In the near future, I intend to expand these controls a little further (velocity limits are still missing for example) and make them a little prettier.
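The snap behaviour described above can be sketched abstractly like this (a hypothetical, simplified version; `collides` stands in for the plugin's actual tile collision test, and the names are mine):

```javascript
// Nudge a body in a direction, retest for tile collision, and revert
// the move if nothing was hit. Returns true if the body ended up
// touching a tile, false if it was moved back to where it started.
function snap(body, dx, dy, collides) {
    if (collides(body)) {
        return true; // Already touching a tile; nothing to do
    }

    var originalX = body.x;
    var originalY = body.y;

    body.x += dx;
    body.y += dy;

    if (collides(body)) {
        return true; // Snapped onto a tile in that direction
    }

    // No tile within range; restore the original position
    body.x = originalX;
    body.y = originalY;

    return false;
}
```

As the post says, large snap distances make this misbehave, because the body can teleport visibly far in a single step before the collision test confirms the snap.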