Iavra

Members
  • Content Count: 17
  • Joined
  • Last visited
  • Days Won: 1
  1. It's actually an RPG Maker plugin (though I'm using the latest PIXI version instead of the bundled one). The whole plugin can be found here: http://pastebin.com/qrFV6ThW It shouldn't be that hard to translate to a non-RM environment, though. Basically, you add an instance of IAVRA.LIGHTING.Layer to a scene and call its "update" function on each frame. "Graphics._renderer" would need to be substituted with however the PIXI.WebGLRenderer instance is called. Aside from that, I'm not using any RM-specific classes. /edit: For now I'm working on per-vertex lighting instead (example: http:/
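
A minimal sketch of the non-RM setup described above, assuming a plain PIXI v4 page; the renderer, stage and loop names are placeholders, and whether update() takes arguments depends on the plugin:

// Hypothetical setup outside RPG Maker: create a renderer, add the lighting
// layer to a stage and call its update() once per frame; the renderer created
// here stands in for wherever Graphics._renderer came from in RM.
var renderer = new PIXI.WebGLRenderer(800, 600);
document.body.appendChild(renderer.view);

var stage = new PIXI.Container();
var lightingLayer = new IAVRA.LIGHTING.Layer();
stage.addChild(lightingLayer);

function loop() {
    requestAnimationFrame(loop);
    lightingLayer.update();       // per-frame update, as in the RM plugin
    renderer.render(stage);
}
loop();
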
  2. It seems I get a weird bug. Here's my test layer that gets added to the scene (initialize gets called once, update gets called each update):

$.LIGHTING.Layer.prototype = $._extend(Object.create(PIXI.Container.prototype), {
    initialize: function() {
        PIXI.Container.call(this);
        //this.addChild(this._demo = PIXI.Sprite.fromImage('img/occlusion.png'));
        this._occlusion = PIXI.Sprite.fromImage('img/occlusion.png');
        this._test = new PIXI.Sprite(PIXI.RenderTexture.create(500, 500));
        this.addChild(this._test);
        this._test.filters = [new $.LIGHTIN

  3. My only example image so far is this one: (part of) the occlusion map, which contains all shadow-casting objects. The whole process should work like this: for each light, pick the part of the occlusion map that contains the outer circle of the light (the filter only works on the inner circle of the shadow texture, so we need to make sure to grab a bigger part). Assuming that the center of the light is the center of the occlusion map, render it using both filters (for now onto a RenderTexture). The last 2-3 lines in pass 1 need to be reversed, so we are drawing light,
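
A rough sketch of the per-light pass this post describes, under PIXI v4; PolarWrapFilter, PolarUnwrapFilter and lightRadius are placeholder names, not part of the plugin:

// For each light: grab a square region of the occlusion map centered on the
// light and big enough to contain its outer circle, run both filter passes and
// bake the result into a RenderTexture that can later mask the light.
function renderShadowMask(renderer, occlusionTexture, light, lightRadius) {
    var frame = new PIXI.Rectangle(
        light.x - lightRadius, light.y - lightRadius,
        lightRadius * 2, lightRadius * 2
    );
    var region = new PIXI.Texture(occlusionTexture.baseTexture, frame);
    var sprite = new PIXI.Sprite(region);

    sprite.filters = [new PolarWrapFilter(), new PolarUnwrapFilter()];
    var mask = PIXI.RenderTexture.create(frame.width, frame.height);
    renderer.render(sprite, mask);
    return mask;
}
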
  4. Indeed, if I take the unmapped coords in filter 2 and clamp them, it fixes the gap. So I should be able to take the resulting texture, invert it (I can do that in pass 1 by drawing light instead of shadow) and use it to mask the light texture. I should probably blur the mask and use alpha masking instead, to hide some of the lost precision. Though I still don't get why the filtered texture is 260x260 while the original is 256x256. Somewhere in the code I'm scaling the coordinates slightly. Maybe I should add a clamp to pass 1, too.
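
The clamp described here, written out as the tail of filter 2; coord, unmappedMatrix and uSampler are the names used in the shaders elsewhere in this thread:

// Clamp the unmapped lookup coordinate before sampling, so the reconstruction
// never reads outside the occlusion texture (the gap fix mentioned above).
vec2 sampleCoord = (vec3(coord, 1.0) * unmappedMatrix).xy;
sampleCoord = clamp(sampleCoord, 0.0, 1.0);
gl_FragColor = texture2D(uSampler, sampleCoord);
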
  5. It kinda "works" if I change the last line of filter 2 like this:

gl_FragColor = texture2D(uSampler, (vec3(coord, 1.0) * unmappedMatrix).xy * 0.99);

Though that also means I would need to add a 3rd render pass reverting that, losing further precision. It also feels really, really dirty. /edit: I'm pondering whether I should drop per-pixel shading and use raycasting instead. Though that would either limit me to basic shapes, or I'll need to cast a whole bunch of rays and still have jittery results.
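
For the raycasting alternative mentioned in the edit, a minimal sketch of the core operation: cast one ray from the light against one occluder edge. Repeating this for many rays and keeping the closest hit per ray gives the visible polygon, at the cost of the jitter noted above:

// Intersect a ray (origin ox/oy, direction dx/dy) with a segment (x1/y1-x2/y2);
// returns the hit point and the distance t along the ray, or null if they miss.
function raySegment(ox, oy, dx, dy, x1, y1, x2, y2) {
    var sx = x2 - x1, sy = y2 - y1;
    var denom = dx * sy - dy * sx;
    if (Math.abs(denom) < 1e-8) return null;            // parallel, no hit
    var t = ((x1 - ox) * sy - (y1 - oy) * sx) / denom;  // distance along the ray
    var u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom;  // position on the segment
    if (t < 0.0 || u < 0.0 || u > 1.0) return null;
    return { x: ox + t * dx, y: oy + t * dy, t: t };
}
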
  6. I already tried that and sadly it doesn't do anything.
  7. I guess I have a small problem regarding precision when unwrapping. I'm rendering the occlusion map with these 2 filters, assuming the light is at the center of the texture:

#define PI 3.1415926535897932384626433832795
varying vec2 vTextureCoord;
uniform sampler2D uSampler;
uniform mat3 mappedMatrix;
uniform mat3 unmappedMatrix;
const float h = 256.0; // height of the texture, currently hardcoded

void main(void) {
    vec3 mappedCoord = vec3(vTextureCoord, 1.0) * mappedMatrix;
    for(float y = 0.0; y < h; y += 1.0) {
        if(y / h > mappedCoord.y) {
            break;
        }
        vec2 norm = v
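
The post doesn't show how the two mat3 uniforms get filled; a guess at the JavaScript side under v4, with UnwrapFilter as a placeholder name and the actual matrix source left open (calculateNormalizedScreenSpaceMatrix is what post 12 below experiments with; calculateScreenSpaceMatrix and calculateSpriteMatrix are the other candidates the filter manager exposes):

// Hypothetical wrapper around the fragment source above (fragmentSrc being the
// shader joined into one string). apply() is overridden so the matrices can be
// recomputed from the filter manager on every pass; the inverse of the mapped
// matrix is passed as the "unmapped" one.
function UnwrapFilter(fragmentSrc) {
    PIXI.Filter.call(this, null, fragmentSrc);
}
UnwrapFilter.prototype = Object.create(PIXI.Filter.prototype);
UnwrapFilter.prototype.constructor = UnwrapFilter;

UnwrapFilter.prototype.apply = function(filterManager, input, output, clear) {
    var mapped = filterManager.calculateNormalizedScreenSpaceMatrix(new PIXI.Matrix());
    this.uniforms.mappedMatrix = mapped.toArray(true);
    this.uniforms.unmappedMatrix = mapped.clone().invert().toArray(true);
    filterManager.applyFilter(this, input, output, clear);
};
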
  8. I see, this works for sprite.x = sprite.y = 0, but not for negative coordinates. It shouldn't matter if I render the sprite onto a RenderTexture, but I guess I would need to either modify the filterArea or part of the projectionMatrix to apply the filter to the whole sprite, not only the visible part?
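
filterArea is an existing PIXI property that replaces the automatically fitted filter bounds with a fixed rectangle; whether it fully covers the negative-coordinate case asked about here is an assumption:

// Force the filter to run over a fixed rectangle instead of only the visible
// part of the sprite.
sprite.filterArea = new PIXI.Rectangle(sprite.x, sprite.y, sprite.width, sprite.height);
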
  9. This is my filter so far:

var Filter = function() {
    PIXI.Filter.call(this, null, [
        '#define PI 3.14159265358979323846264',
        'varying vec2 vTextureCoord;',
        'uniform sampler2D uSampler;',
        'uniform mat3 mappedMatrix;',
        'uniform mat3 unmappedMatrix;',
        'void main(void) {',
        '    vec2 mappedCoord = (vec3(vTextureCoord, 1.0) * mappedMatrix).xy;',
        '    vec2 norm = mappedCoord * 2.0 - 1.0;',
        '    float theta = PI + norm.x * PI;',
        '    float r = (1.0 + norm.y) * 0.5;',
        '    vec2 coord = vec2(-r * sin(theta), -

  10. Hmm, since I need to access the texture for the wrapping, I guess I would need to apply the mappedMatrix, calculate polar coordinates and transform them back into non-mapped ones to grab the correct pixels on the texture. How would I do this? Basically, I would:
Apply the mappedMatrix to vTextureCoord to get coordinates in range 0.0-1.0.
Calculate the polar coordinates.
Divide out the matrix again to get the correct coordinates I need to access uSampler.
Does this sound correct?
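
The three steps from this post, written out as a fragment-shader sketch; the polar math matches the filters shown elsewhere in this thread, and unmappedMatrix is assumed to be the inverse of mappedMatrix:

#define PI 3.141592653589793
varying vec2 vTextureCoord;
uniform sampler2D uSampler;
uniform mat3 mappedMatrix;
uniform mat3 unmappedMatrix;

void main(void) {
    // 1. apply mappedMatrix to get coordinates in range 0.0-1.0
    vec2 mapped = (vec3(vTextureCoord, 1.0) * mappedMatrix).xy;
    // 2. interpret the output pixel as (angle, radius) and compute the
    //    corresponding cartesian position
    vec2 norm = mapped * 2.0 - 1.0;
    float theta = PI + norm.x * PI;
    float r = (1.0 + norm.y) * 0.5;
    vec2 coord = vec2(-r * sin(theta), -r * cos(theta)) / 2.0 + 0.5;
    // 3. divide the matrix back out before sampling uSampler
    gl_FragColor = texture2D(uSampler, (vec3(coord, 1.0) * unmappedMatrix).xy);
}
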
  11. I think it would be enough if there were some way to port the filter listed in the OP over to v4, either as a filter or as a shader (I think I actually used it as a shader back then). vTextureCoord used to be in the range 0.0-1.0 over the whole sprite area, independent of the sprite's position on the screen. As of now, vTextureCoord seems to change depending on the sprite's position, and even more so if part of the sprite is outside the rendered part of the scene.
  12. Using calculateNormalizedScreenSpaceMatrix doesn't seem to work; the filter still behaves differently depending on the sprite's position. I tried using filters as shaders, but that doesn't seem to work either? Using the BlurFilter, I tried the following approaches:

sprite.filters = [new PIXI.filters.BlurFilter()];
sprite.shader = new PIXI.filters.BlurFilter();

The second one doesn't seem to do anything, and I couldn't find an example of how to use shaders in v4.
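
For what it's worth, the built-in v4 filters that need position-independent coordinates (SpriteMaskFilter, DisplacementFilter) use calculateSpriteMatrix, which maps filter space into a target sprite's texture space; a sketch of that pattern, with PolarFilter and targetSprite as placeholder names:

// Map filter coordinates into the target sprite's local texture space the way
// the sprite-mask and displacement filters do it; v4 converts a PIXI.Matrix
// assigned to a mat3 uniform automatically.
PolarFilter.prototype.apply = function(filterManager, input, output, clear) {
    this.uniforms.mappedMatrix =
        filterManager.calculateSpriteMatrix(new PIXI.Matrix(), this.targetSprite);
    filterManager.applyFilter(this, input, output, clear);
};
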
  13. Hmm, since my test image is 256x256, it should work fine with power-of-two, but I'll give it a try. I haven't worked with shaders in v4 so far, but I'll take a look at that, too. I want to use a RenderTexture in the end, so I can use the wrapped-modified-unwrapped occlusion map as a mask for the light. It will probably contain artifacts, since I lose precision during the process, but if I draw soft shadows I should be able to smooth them out.
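
A rough sketch of the masking step described here, under PIXI v4; processedSprite, lightSprite and stage are placeholders for objects from the plugin:

// Bake the wrapped-modified-unwrapped occlusion map into a RenderTexture and
// use it as a sprite mask on the light.
var maskTexture = PIXI.RenderTexture.create(256, 256);
renderer.render(processedSprite, maskTexture);

var maskSprite = new PIXI.Sprite(maskTexture);
maskSprite.position.copy(lightSprite.position); // keep the mask aligned with the light
stage.addChild(maskSprite);                     // its transform has to update like any sprite

lightSprite.mask = maskSprite;
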
  14. In PIXI v2, I was using the following fragment filter to convert a texture from rectangular to polar coordinates, as part of a process to create a shadowmap:

PIXI.AbstractFilter.call(this, [
    '#define PI 3.14',
    'precision mediump float;',
    'varying vec2 vTextureCoord;',
    'uniform sampler2D uSampler;',
    'void main(void) {',
    '    vec2 norm = vTextureCoord * 2.0 - 1.0;',
    '    float theta = PI + norm.x * PI;',
    '    float r = (1.0 + norm.y) * 0.5;',
    '    vec2 coord = vec2(-r * sin(theta), -r * cos(theta)) / 2.0 + 0.5;',
    '    gl_Fra
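
The same shader as standalone GLSL; everything except the final line is taken from the post above, and that last line (a plain lookup at the computed coordinate) is an assumption, since the post is cut off:

#define PI 3.14
precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D uSampler;

void main(void) {
    // interpret the output pixel as (angle, radius)...
    vec2 norm = vTextureCoord * 2.0 - 1.0;
    float theta = PI + norm.x * PI;
    float r = (1.0 + norm.y) * 0.5;
    // ...and fetch the corresponding cartesian pixel from the input texture
    vec2 coord = vec2(-r * sin(theta), -r * cos(theta)) / 2.0 + 0.5;
    gl_FragColor = texture2D(uSampler, coord);
}
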
  15. I have currently given up on the project, since PIXI 2.2.9 filters seem to be bugged when receiving additional textures as uniforms. Both textures are the same size, yet I can't seem to select the exact same pixel in both. The predefined AlphaMaskFilter also doesn't work. Also, in this version I can't use sprites as masks, so the last step (creating the light) is currently impossible for me.