About gordyr
  1. The shader could easily be turned into a PIXI.AbstractFilter, which could be applied to whole Containers/DisplayObjects in the standard .filters = [maskFilter]; manner. You would simply send the filter the mask coordinates; internally it would calculate the barycentric coordinates and send them to the filter's fragment shader, masking anything outside of the coordinates in an anti-aliased manner. I could of course be greatly misunderstanding how PIXI works, but while working on my PIXI.Photo class I was able to mask whole containers full of objects fine. (I developed the sh
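A minimal sketch of the CPU-side barycentric calculation such a filter would do before uploading its uniforms. This is not PIXI's API; the function name and the plain-array point format are illustrative assumptions.

```javascript
// Barycentric coordinates (u, v, w) of point p with respect to triangle
// (a, b, c). Each point is a plain [x, y] array. Inside the triangle all
// three coordinates are in [0, 1]; a negative coordinate means "outside",
// which is what the fragment shader keys off to discard/feather pixels.
function barycentric(p, a, b, c) {
  const v0 = [b[0] - a[0], b[1] - a[1]];
  const v1 = [c[0] - a[0], c[1] - a[1]];
  const v2 = [p[0] - a[0], p[1] - a[1]];
  const d00 = v0[0] * v0[0] + v0[1] * v0[1];
  const d01 = v0[0] * v1[0] + v0[1] * v1[1];
  const d11 = v1[0] * v1[0] + v1[1] * v1[1];
  const d20 = v2[0] * v0[0] + v2[1] * v0[1];
  const d21 = v2[0] * v1[0] + v2[1] * v1[1];
  const denom = d00 * d11 - d01 * d01;
  const v = (d11 * d20 - d01 * d21) / denom;
  const w = (d00 * d21 - d01 * d20) / denom;
  return [1 - v - w, v, w];
}
```

A quad mask would run this once per triangle of the quad and pass the results down as varyings or uniforms.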
  2. Just in case anyone is interested, I have a first draft version of my PIXI.Photo and PIXI.PhotoRenderer classes completed and they're working wonderfully. As you can see from the example image, when rotated the edges are beautifully anti-aliased even when WebGL anti-aliasing and FXAA are turned off. When the image is not rotated the edges remain perfectly sharp at any zoom level/scale. Furthermore, I can now apply perfectly anti-aliased rectangular masks to the sprite simply by changing the _frame attribute. On mobile devices we were getting slow-downs using the standard PIXI.Mask syste
  3. Excellent, thanks Ivan. That's exactly what I'll do. I think I will first investigate calculating the expanded quad and barycentric coords in the vertex shader, however, in case we can preserve batching using that method. It would still be broken in our case because of the other things we are doing to these sprites, but there are plenty of other uses I can think of where this would be preferable.
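For reference, the "expanded quad" step can be sketched on the CPU as pushing each corner outward from the quad's centroid by a small margin, leaving room for the feathered edge. This is an assumption about how the expansion would be done (a per-edge normal offset is another option); the function name is hypothetical.

```javascript
// Expand a convex quad outward by `margin` pixels. `quad` is four [x, y]
// corners in order. Each corner is moved away from the centroid along the
// centroid-to-corner direction, so the feathered mask edge has pixels to
// fade across without clipping against the original geometry.
function expandQuad(quad, margin) {
  const cx = quad.reduce((s, p) => s + p[0], 0) / 4;
  const cy = quad.reduce((s, p) => s + p[1], 0) / 4;
  return quad.map(([x, y]) => {
    const dx = x - cx, dy = y - cy;
    const len = Math.hypot(dx, dy) || 1; // guard degenerate corner-at-centroid
    return [x + margin * dx / len, y + margin * dy / len];
  });
}
```

In a vertex-shader version, the same offset would be computed per vertex from attributes rather than on the CPU, which is what would keep batching intact.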
  4. Yes, we previously used the transparent border technique, but given that we have to constantly zoom in and out and resize sprites, it is not ideal, leaving the edges too blurry when sprites are not rotated. This method appears preferable for our use case. We are not worried about it breaking batching as there will never be more than 20 or so of these sprites on screen at any given time, as you can probably see from the type of app in the screenshot above. Also, it is already broken in our case as we are applying individual masks (using this new method will be far faster) and individua
  5. Yes it could, as can be seen in the example image above. It is essentially applying a mask right now, albeit transformed incorrectly and rotated in the opposite direction. You would ignore the current stencil buffer mask class in pixi.js altogether and simply pass the coordinates of your mask frame into the custom shader instead. All the shader is doing is making pixels outside of the given frame transparent, and smoothing the edges based on a given texture coordinate's distance to the edge of the frame uniform I am passing to the custom shader. It would be far faster than the cur
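The fragment shader's edge-smoothing logic can be expressed as a small CPU-side reference, assuming a GLSL-style smoothstep over the distance to the nearest frame edge. Function names and the frame-object shape are illustrative, not PIXI's API.

```javascript
// GLSL-style smoothstep: 0 below e0, 1 above e1, smooth Hermite ramp between.
function smoothstep(e0, e1, x) {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Alpha for a texture coordinate `uv` masked by a rectangular `frame`
// ({x, y, width, height} in the same units as uv). `feather` is the width
// of the anti-aliased ramp. Pixels outside the frame get alpha 0; pixels
// deeper than `feather` inside get alpha 1.
function maskAlpha(uv, frame, feather) {
  const dx = Math.min(uv[0] - frame.x, frame.x + frame.width - uv[0]);
  const dy = Math.min(uv[1] - frame.y, frame.y + frame.height - uv[1]);
  return smoothstep(0, feather, Math.min(dx, dy));
}
```

In the shader, `feather` would typically be derived from the on-screen size of a texel so the ramp stays about one pixel wide at any zoom.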
  6. Well, I've actually managed to nearly get there... Hopefully someone can give me the final push. What I've done is hooked into the SpriteRenderer class and exposed the four points of the quads, attaching them to the sprite object itself. Then in my application code I calculate the quad and line/edge coefficients as shown in the demo's source code above. I build the EdgeArray and then send them into a custom shader which is almost exactly the same as the one in the demo. This works and produces beautifully anti-aliased edges on my rotated sprites. However, currently the transformation a
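The line/edge coefficients mentioned here are, presumably, the implicit form a·x + b·y + c = 0 for each quad edge, normalized so that evaluating it gives a signed pixel distance. A sketch under that assumption (the function name is mine, not the demo's):

```javascript
// Coefficients [a, b, c] of the line through points p and q ([x, y] arrays),
// normalized so that a*x + b*y + c is the signed distance from (x, y) to the
// line. One such triple per quad edge is what gets packed into the EdgeArray
// uniform for the fragment shader.
function lineCoeffs(p, q) {
  const a = q[1] - p[1];
  const b = p[0] - q[0];
  const c = -(a * p[0] + b * p[1]);
  const len = Math.hypot(a, b);
  return [a / len, b / len, c / len];
}
```

The shader then takes the minimum of the four signed distances and feathers alpha near zero, exactly as with the rectangular frame case.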
  7. Sorry guys, I should have been more clear. The custom vertex/fragment shaders I am fine with. It's more about where and how I can hook into PIXI's internals in order to get the vertices of the quad, manipulate them, and send them on to the shaders. I am quite happy to be hacking around inside PIXI. This is to be expected.
  8. Within our app we are dealing with sprites that use non-transparent, rectangular power-of-2 textures only. Each texture is essentially a photograph (stretched to ensure it is pow2 in order to enable mipmapping). Using WebGL anti-aliasing or FXAA is not an option for us for various reasons. Therefore, in order to ensure that these sprites look anti-aliased when rotated, we render the texture to a canvas first, leaving a few-px-wide transparent border around the edge so that the texture's bilinear filtering takes care of smoothing out the edges of these sprites. It works okay, but
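The sizing math for the transparent-border canvas can be sketched as follows; the helper names are illustrative, and the choice to round up after adding the border is an assumption about the workflow described.

```javascript
// Smallest power of two >= n (for n >= 1), as required for mipmapping
// in WebGL 1.
function nextPow2(n) {
  let p = 1;
  while (p < n) p <<= 1;
  return p;
}

// Canvas size needed to hold a w*h photo plus a transparent border of
// `border` px on every side, rounded up to power-of-2 dimensions. The photo
// is then stretched to fill (size - 2*border) in each axis, and the sprite's
// width/height are set afterwards to restore the true aspect ratio.
function paddedTextureSize(w, h, border) {
  return {
    width: nextPow2(w + 2 * border),
    height: nextPow2(h + 2 * border),
  };
}
```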
  9. I'm afraid I'm not allowed to show you the app itself at the moment. Not until it's released, anyway. I've literally just had an idea. When the user begins text editing we could blit the stage to a 2d canvas once. Keep this in memory, then as the caret moves, continually sample the pixel data of the area behind the caret on the 2d canvas with getImageData(). This should still be fast as we would only be sampling a few pixels each time. Then draw the caret itself pixel by pixel based on the inverse of the sampled pixel data. Either in a custom fragment shader or by simply replacing the
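The inversion step on the sampled pixels is straightforward; a sketch operating on the flat RGBA array that getImageData() returns (the function name is mine):

```javascript
// Invert the RGB channels of a flat RGBA buffer (as returned by
// ctx.getImageData(x, y, w, h).data), leaving alpha untouched. The result
// is what the caret would be painted with so it stays visible over any
// background color.
function invertRGBA(data) {
  const out = new Uint8ClampedArray(data.length);
  for (let i = 0; i < data.length; i += 4) {
    out[i]     = 255 - data[i];     // R
    out[i + 1] = 255 - data[i + 1]; // G
    out[i + 2] = 255 - data[i + 2]; // B
    out[i + 3] = data[i + 3];       // A unchanged
  }
  return out;
}
```

One caveat with a pure inversion: mid-grey backgrounds (around 128,128,128) invert to nearly the same grey, so a real caret might want to snap to black/white below some contrast threshold.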
  10. This is a strange title, I know. I will try to explain. We have built a complete page layout/drawing/rich text editing application powered by PIXI. Think of a web-based Microsoft Publisher. Obviously within this we have the need to render a flashing caret/cursor when the user is editing text. Currently this caret is a simple sprite with a tiny single-color texture rather than PIXI.Graphics (so that it gets naturally anti-aliased when the textbox is rotated). This sprite gets scaled vertically to match the size of a given font. Right now the texture is simply black. However we would li
  11. First, it's best if I give some background... I have a single rectangular sprite. This sprite is a photograph; when loading this photograph as a texture we first draw it to a canvas, stretching it to become the nearest power-of-2 texture. Then, in Pixi, we load the texture from the canvas and set the width/height properties of the sprite in order to return the sprite to the correct aspect ratio for the photograph. This photograph needs to be rotated around a dynamic origin point which is controlled by the user. In order to achieve this we first began manipulating the sprite.anchor proper
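When the anchor moves on an already-rotated sprite, the position has to be compensated so the sprite does not visually jump; the anchor delta must be scaled by the sprite's displayed size and rotated into world space. A sketch of that compensation, using plain objects rather than real PIXI sprites (the function name is hypothetical):

```javascript
// New world position for a sprite whose anchor is about to change from
// sprite.anchor to newAnchor, such that the sprite's on-screen pixels stay
// put. `sprite` is a plain object with anchor {x, y} in [0, 1], displayed
// width/height in pixels, rotation in radians, and position {x, y}.
function anchorCompensation(sprite, newAnchor) {
  const dx = (newAnchor.x - sprite.anchor.x) * sprite.width;
  const dy = (newAnchor.y - sprite.anchor.y) * sprite.height;
  const cos = Math.cos(sprite.rotation);
  const sin = Math.sin(sprite.rotation);
  return {
    x: sprite.position.x + dx * cos - dy * sin,
    y: sprite.position.y + dx * sin + dy * cos,
  };
}
```

After computing this, you would assign the new anchor and the returned position in the same frame, then rotate freely about the new origin.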
  12. I've just read through the changelog for the recently released version 2.0 of Pixi.js and read with interest about the new ability to use custom shaders on sprites. Since it states that performance is faster than filters, I would like to convert some of my custom filters into this format (if it makes sense to do so). I have a few questions regarding this, though: How do I use this feature / how do I attach my custom shaders to the sprites? Can I chain multiple shaders on the same sprite? Why is performance better than filters? Are they not simply different ways of attaching a fragmen
  13. It's a very welcome addition in my opinion. Although it's not difficult to roll your own, it's nice to have this sort of stuff built into Pixi.
  14. We are using pixi.js for several aspects of our app, one of which is as a base framework for a WebGL photo editing application. Although we are aware pixi was never intended for this, it has proven to be an excellent fit. We have written lots of fragment shaders offering all kinds of interesting photo manipulation effects and have built them as extensions to pixi's excellent filter class. (I have already contributed a convolution filter and intend to contribute the rest of our filters soon.) Anyway, on to my question. One of the benefits of using Pixi and harnessing its simple access to
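For readers unfamiliar with what a convolution filter's fragment shader computes, here is a CPU reference of the same operation on a single-channel image; this is an illustrative sketch, not the contributed filter's actual code.

```javascript
// Reference 3x3 convolution over a single-channel w*h image stored row-major
// in `src`. Out-of-bounds samples are clamped to the nearest edge pixel,
// mirroring CLAMP_TO_EDGE texture sampling. `kernel` is 9 weights, row-major.
function convolve3x3(src, w, h, kernel) {
  const out = new Float32Array(w * h);
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      let sum = 0;
      for (let ky = -1; ky <= 1; ky++) {
        for (let kx = -1; kx <= 1; kx++) {
          const sx = Math.min(Math.max(x + kx, 0), w - 1);
          const sy = Math.min(Math.max(y + ky, 0), h - 1);
          sum += src[sy * w + sx] * kernel[(ky + 1) * 3 + (kx + 1)];
        }
      }
      out[y * w + x] = sum;
    }
  }
  return out;
}
```

The GPU version does the identical nine texture fetches per fragment, with the kernel weights passed in as a uniform array.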