
Aliasing artifacts using DisplacementFilter


nicknack23

I've written a small parser in JavaScript to read Photoshop Liquify meshes so I can apply liquify warps created in Photoshop to images dynamically in a browser. I first wrote a crappy implementation using manual pixel displacement calculations in canvas. That worked in principle and produced images visually identical to what I got in Photoshop, but it was horribly slow. Then, while googling displacement maps to figure out a way to speed up my code, I stumbled across the PIXI demos and realized I could potentially get super fast performance using PIXI & WebGL.

The good news: the PIXI version is working and is several orders of magnitude faster than my canvas version. Astonishing!

The bad news: I'm getting pretty nasty aliasing artifacts in regions where the displacement is large. My canvas implementation also looked rough until I added bilinear interpolation, but I suspect the problem here is that the displacement vectors, which are originally represented by 32-bit floats, have to be squeezed into 8-bit integers in the PIXI displacement maps. I was hoping that interpolation could fix that in PIXI too, but maybe not.

Here's an imgur album that demonstrates the issue:  https://imgur.com/a/jiKo3bP

The test image is 800x800 pixels, and the maximum displacement of the warp is 371 pixels. A displacement of ±371 has to be represented by an integer between 0 and 255, so every integer step of the displacement map corresponds to a jump of about 371*2/255 ≈ 3 pixels. To me the aliasing artifacts look about 3 pixels wide, so maybe my suspicion is correct. (Another indication of this: the entire displaced image shifts up and left by about 1.5 pixels even where there should be no displacement. 1.5 happens to be half of that 3-pixel step, so maybe the zero-displacement level is "half a step off center".)
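To make the back-of-the-envelope numbers concrete, here is the same arithmetic as a tiny snippet (371 is just this particular image's maximum displacement):

const absMax = 371;               // largest displacement in pixels for this warp
const step = (2 * absMax) / 255;  // ≈ 2.9 px per 8-bit increment of the map
const halfStep = step / 2;        // ≈ 1.5 px, the suspected "off center" shift
console.log(step.toFixed(2), halfStep.toFixed(2));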

So, my question: is there a way in PIXI to represent displacements more accurately? Maybe another method that doesn't rely on 8 bit color channels? If not, can these aliasing artifacts be reduced by adding or improving the interpolation?

Here are the relevant functions of the code I'm using:

// Translate a mesh object containing x & y coords of Photoshop's displacement vectors
// (displacement measured in pixels) to an 8 bit RGBA image with x & y displacements
// stored in red and green channels. Zero displacement has value 128. Scale so the
// 8 bits we have are used efficiently, i.e. the maximum displacement goes to 0 or 255.
function mesh2meshimage(mesh) {
    let absMax = 0;
    for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
        absMax = Math.max(absMax, Math.abs(mesh.x[pixelindex]), Math.abs(mesh.y[pixelindex]));
    }

    const meshimage = new Uint8ClampedArray(mesh.width * mesh.height * 4);

    for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
        const channelindex = pixelindex * 4;
        meshimage[channelindex + 0] = 128 + mesh.x[pixelindex]/absMax*127;     // R
        meshimage[channelindex + 1] = 128 + mesh.y[pixelindex]/absMax*127;     // G
        meshimage[channelindex + 2] = 128;      // B
        meshimage[channelindex + 3] = 255;      // A
    }

    const imagedata = new ImageData(meshimage, mesh.width, mesh.height);

    return {imagedata:imagedata, origwidth:mesh.origwidth, origheight:mesh.origheight, max:absMax};
}

// Now use PIXI to render the displaced image.
function liquify(meshimage) {
    const canvas = document.createElement('canvas');
    const context = canvas.getContext('2d');

    canvas.width = meshimage.imagedata.width;
    canvas.height = meshimage.imagedata.height;
    context.putImageData(meshimage.imagedata, 0, 0);

    const displacementSprite = PIXI.Sprite.from(canvas);

    const scalex = meshimage.origwidth/meshimage.imagedata.width;
    const scaley = meshimage.origheight/meshimage.imagedata.height;
    displacementSprite.scale.set(scalex, scaley);
    PX.app.stage.addChild(displacementSprite);

    const displacementFilter = new PIXI.filters.DisplacementFilter(displacementSprite);
    PX.sprite.filters = [displacementFilter];

    displacementFilter.scale.x = meshimage.max * scalex / 2;
    displacementFilter.scale.y = meshimage.max * scaley / 2;
}

 


What version of pixi are you using, and do you require WebGL1 support?

WebGL2 has support for much larger texture formats. RGBA32UI or RGBA32F would be much more precise.

WebGL1 only supports RGB, RGBA and RGBA4; there might be others, but those are what I found with a quick check.

You could also write your own shader that uses 3 images (or 4 if alpha is needed as well) and combines the RGBA values from those to get 4 times more precision.
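(Not part of the original reply, just to illustrate the packing idea: one displacement component spread over two 8-bit channels gives 16 bits of precision, and a custom shader then reconstructs the value from the two channels.)

// Illustration only: pack a normalized value in [0, 1] into a high and a low byte,
// e.g. high -> R and low -> B; a custom shader reconstructs it as
// (R*255.0*256.0 + B*255.0) / 65535.0.
function packComponent(normalized) {
    const v = Math.round(normalized * 65535);
    return { high: (v >> 8) & 0xff, low: v & 0xff };
}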


WebGL2, interesting thought. If I'm reading the browser support listings correctly, then I would only have to sacrifice iOS users (assuming I direct MacOS people to Chrome or Firefox). They would have to fall back to WebGL1 and live with the aliasing. I guess that's acceptable. I'm currently using v5 but I'll use anything I have to if I can make this work. :)

But does Pixi support creating textures with larger color depth like RGBA32UI or RGBA32F? I'm completely new to Pixi and didn't see any mention about this in the demos or the documentation, but I may have missed it. Can someone point me to an example? And if I get that far, would PIXI.filters.DisplacementFilter work on deeper textures out-of-the-box?

Since I'm still a noob with even basic Pixi & WebGL stuff, I'm pretty sure shader programming is out of my league for now. But out of curiosity, how much work would it be for an experienced shader programmer to modify the displacement code to take several images as input? Come to think of it, since DisplacementFilter only requires the R+G channels, wouldn't it be possible to use the B+A channels to double the precision? That sounds much easier to implement to me (but again, I'm a noob).


That would be so cool if Float textures worked without rescaling! But it's not quite working yet for me, so let me ask for some clarifications. Are the following assumptions correct?

  • The array myfloat32array contains normalized displacements in order xyxyxy.... etc, so the total length is width*height*2.
  • I can change the shader so it calculates in pixels, but displacementfilter should work without change if I normalize myfloat32array values to max 1.0.
  • The sprite created by PIXI.Sprite.from(myfloat32array ... is like any other sprite and can be displayed using app.stage.addChild().
  • Scaling of the displacementfilter is as before (the max displacement in pixels).

I wanted to be sure about these steps because I get a WebGL warning and the end result looks weird. The sprite gets created but when I add it to the stage or call PIXI.filters.DisplacementFilter with it I get the warning "Error: WebGL warning: texImage2D: ArrayBufferView type not compatible with `type`." (at BufferResource.js:84:15), and it doesn't appear on screen. The warped image looks like this:  https://imgur.com/a/farMrr0.

Below is my new code, grateful for any suggestions!

EDIT (sorry, but syntax highlighting disappears when I hit the post button, even though I set it to javascript). Also fixed a bug I saw just as I posted. Same result as above though.

function mesh2meshimage_float(mesh) {
    let absMax = 0;
    for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
        absMax = Math.max(absMax, Math.abs(mesh.x[pixelindex]), Math.abs(mesh.y[pixelindex]));
    }

    const arr = new Float32Array(mesh.width * mesh.height * 2);

    for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
        const channelindex = pixelindex * 2;
        arr[channelindex + 0] = (mesh.x[pixelindex]/absMax + 1)/2;
        arr[channelindex + 1] = (mesh.y[pixelindex]/absMax + 1)/2;
    }

    return {array:arr, width:mesh.width, height:mesh.height,
                origwidth:mesh.origwidth, origheight:mesh.origheight, max:absMax};
}

function liquify(meshimage) {
    const canvas = document.createElement('canvas');
    const context = canvas.getContext('2d');

    const displacementSprite = PIXI.Sprite.from(meshimage.array,
                    { resourceOptions: {width:meshimage.width, height:meshimage.height}} );

    const scalex = meshimage.origwidth/meshimage.width;
    const scaley = meshimage.origheight/meshimage.height;

    // displacementSprite.scale.set(scalex, scaley);
    PX.app.stage.addChild(displacementSprite);

    const displacementFilter = new PIXI.filters.DisplacementFilter(displacementSprite);
    PX.sprite.filters = [displacementFilter];

    displacementFilter.scale.x = meshimage.max; // * scalex / 2;
    displacementFilter.scale.y = meshimage.max; // * scaley / 2;
}

 


Nope, it's supposed to be 4*width*height; we don't use the B and A channels.

OK, I forgot that we have to specify the type. It doesn't detect it automatically. :(

Here, one violet pixel (R=1.0, G=0.0, B=1.0, A=1.0), shown in a 100x100 sprite. Yes, 4 floats means 16 bytes per pixel, so an 800x600 map takes 16*800*600 ≈ 7.32 MB of memory. And yes, we could actually use the two spare channels, but we'd need a different shader for it.

const arr = new Float32Array([1, 0, 1, 1]);
const spr =  PIXI.Sprite.from(arr,
                    { type: PIXI.TYPES.FLOAT, resourceOptions: {width:1, height:1}} );
spr.width = 100;
spr.height = 100;
app.stage.addChild(spr);

Just to remind: "Sprite.from" actually calls "Texture.from", which calls "BaseTexture.from". I hope you know the difference between Sprite, Texture and BaseTexture :)


Sorry I misunderstood. I got confused and thought the Float32Array would somehow replace the color channels. This makes much more sense.

I didn't manage to get the violet single-pixel Float32Array sprite working with any variation of PIXI.Sprite.from(), but taking a detour via PIXI.BaseTexture() works fine:

https://www.pixiplayground.com/#/edit/oo5hdwmRyB7W8bpT3PFvp
https://www.pixiplayground.com/#/edit/vRktRf1v5CsF3yon0p6Ua
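For reference, the detour looks roughly like this (a sketch of the idea, not the exact playground code; the buffer is now 4 floats per pixel, RGBA, as pointed out above):

// Sketch: build the float texture explicitly via BufferResource + BaseTexture.
const resource = new PIXI.resources.BufferResource(meshimage.array, {
    width: meshimage.width,
    height: meshimage.height
});
const baseTexture = new PIXI.BaseTexture(resource, {
    type: PIXI.TYPES.FLOAT,
    format: PIXI.FORMATS.RGBA
});
const displacementSprite = new PIXI.Sprite(new PIXI.Texture(baseTexture));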

And now my warp is running again, but sadly the aliasing actually got worse! But at least it's "regularly irregular" now - there are no random wavy lines anymore, and no single-pixel shifts in zero displacement areas. Here it is:

https://imgur.com/a/gIg4zV6

The only thing I can think of is maybe the texture2D() function in the shader isn't interpolating displacement vectors. But it's supposed to, right? Any other ideas?


Alright, here is my code. Everything is there except for the .msh-file parsing. Instead I include the displacement map as two *long* Base64 strings at the bottom, along with the two images. I couldn't find any way to include HTML + CSS in a pixiplayground so I dumped it all at Codepen instead. I suggest you click "Change view" and put the code along the side to see the whole image.

There are three demos that all do (or should do) the same thing. Example 1 uses Pixi's DisplacementFilter with standard 8-bit color channels for the displacement map. Example 2 does the same thing with float32 channels. And example 3 uses manual CPU-calculated displacements in vanilla JavaScript, with bilinear interpolation of both displacement vectors and color samples. In all three examples you can click the toggle button to compare with Photoshop Liquify's rendering of the same displacement map.

https://codepen.io/anon/pen/PrgBoN
https://codepen.io/anon/pen/JQVZQw
https://codepen.io/anon/pen/orOMjr

I really hope you can figure out what causes the aliasing issue, because Pixi renders the warp 100x faster than the canvas version! And that might just make it possible to *animate* the warps, even with fairly large displacement maps.



I finally managed to learn (just barely) enough shader programming to fix this myself. The problem was, as I suspected, that the displacement map wasn't getting interpolated properly just by calling texture2D. So I added bilinear interpolation manually in the fragment shader, like this:

varying vec2 vFilterCoord;
varying vec2 vTextureCoord;

uniform vec2 scale;
uniform mat2 rotation;
uniform sampler2D uSampler;
uniform sampler2D mapSampler;

uniform highp vec4 inputSize;
uniform vec4 inputClamp;
uniform vec4 mapSize;

// Manual bilinear filtering; samplerSize is (width, height, 1/width, 1/height),
// mirroring the layout of Pixi's inputSize uniform.
vec4 texture2D_bilinear(sampler2D sampler, vec2 coord, vec4 samplerSize) {
    vec2 uv = coord*samplerSize.xy - 0.5;

    vec2 icoord = floor(uv);    // texel coordinate of the top-left sample
    vec2 f = fract(uv);         // fractional position between the four texels

    // Sample the four surrounding texels at their centers.
    vec4 topleft = texture2D(sampler, (icoord + vec2(0.5,0.5)) * samplerSize.zw);
    vec4 topright = texture2D(sampler, (icoord + vec2(1.5,0.5)) * samplerSize.zw);
    vec4 bottomleft = texture2D(sampler, (icoord + vec2(0.5,1.5)) * samplerSize.zw);
    vec4 bottomright = texture2D(sampler, (icoord + vec2(1.5,1.5)) * samplerSize.zw);

    // Interpolate horizontally, then vertically.
    vec4 top = mix(topleft, topright, f.x);
    vec4 bottom = mix(bottomleft, bottomright, f.x);

    return mix(top, bottom, f.y);
}

void main(void)
{
    vec4 map = texture2D_bilinear(mapSampler, vFilterCoord, mapSize);

    map -= 0.5;
    map.xy = scale * inputSize.zw * (rotation * map.xy);

    gl_FragColor = texture2D(uSampler, clamp(vTextureCoord.xy + map.xy, inputClamp.xy, inputClamp.zw));
}

The vertex shader is the default DisplacementFilter shader. I also had to pass in a new uniform ("mapSize") to tell the fragment shader the dimensions of the displacement map. But it works 100% - the warped result is just as smooth and pixel perfect as my manual canvas version linked above. So now I can warp large images at 60 fps instead of taking roughly one second per frame. :)
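In case it helps someone, the JavaScript side looked roughly like this (a sketch; exactly how you swap in the custom fragment shader may depend on your Pixi version, and bilinearFragmentSrc is just my name for the GLSL above):

// Sketch: reuse DisplacementFilter, swap in the bilinear fragment shader and
// pass the displacement map dimensions as an extra uniform.
const filter = new PIXI.filters.DisplacementFilter(displacementSprite);
filter.program = PIXI.Program.from(filter.program.vertexSrc, bilinearFragmentSrc);

// mapSize mirrors inputSize's layout: (width, height, 1/width, 1/height).
const w = displacementSprite.texture.width;
const h = displacementSprite.texture.height;
filter.uniforms.mapSize = new Float32Array([w, h, 1 / w, 1 / h]);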

But I still don't understand why this was needed. I was under the impression that Pixi used bilinear interpolation by default in texture2D calls (by passing GL_LINEAR). Or am I wrong about this?

To Pixi devs: feel free to add this as a boolean option to DisplacementFilter. Call it accurateDisplacement or smoothDisplacement or something. But it probably shouldn't replace the default shader, since the bilinear interpolation makes it run slower than the default (4x the number of texture2D calls).


I completely forgot about your issue. Again.

Yes, enabling "baseTex.scaleMode = PIXI.SCALE_MODES.LINEAR" only makes things worse in 5.0.4.

However, it works in 5.1.1 because we automatically enable the FLOAT_LINEAR extension; see lines 172 and 183 of https://github.com/pixijs/pixi.js/blob/dev/packages/core/src/context/ContextSystem.js#L183

It doesn't work on all devices; you can see where it's actually available at webglstats.com. It's different extensions for WebGL1 and WebGL2. Your solution will work everywhere.
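For reference, a quick runtime check for the linear-filtering part might look like this (a sketch; as far as I know OES_texture_float_linear is the extension that gates linear filtering of float textures):

// Sketch: if this extension is missing, linear filtering of float textures won't
// happen, and the manual bilinear shader above is the fallback.
const gl = app.renderer.gl;
const canFilterFloatLinearly = !!gl.getExtension('OES_texture_float_linear');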

Congratulations on the achievement!


I wasn't aware that you guys had done this in 5.1.1. So I enabled PIXI.SCALE_MODES.LINEAR and went back to the default DisplacementFilter and ... it worked perfectly. The new shader code I was so happy about is completely unnecessary (on desktops at least - maybe I'll still use it if I want to be ambitious about mobile support). Oh well, at least I learned a lot the past few days. :(
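For anyone who lands on this thread later, the whole fix then boils down to something like this (a sketch; the variable names are placeholders for the objects in my earlier snippets):

// Ask for linear filtering on the float displacement texture (Pixi >= 5.1.1, on a
// device where float-linear filtering is available) and use the stock filter.
baseTexture.scaleMode = PIXI.SCALE_MODES.LINEAR;

const displacementFilter = new PIXI.filters.DisplacementFilter(displacementSprite);
imageSprite.filters = [displacementFilter];    // imageSprite = the sprite being warped
displacementFilter.scale.set(displacementScaleX, displacementScaleY);   // as before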

Don't worry about missing my issue, I'm sure I'll manage to wear you down with a million other questions in the next few months. In fact, I already have a tricky one in mind - I'll start a new thread right away. :)

 

