nicknack23

  1. Alright, here is my code. Everything is there except the .msh-file parsing; instead I include the displacement map as two *long* Base64 strings at the bottom, along with the two images. I couldn't find any way to include HTML + CSS in a pixiplayground, so I dumped it all at CodePen instead. I suggest you click "Change view" and put the code along the side to see the whole image.

     There are three demos that all do (or should do) the same thing. Example 1 uses Pixi's DisplacementFilter with a standard 8-bit-per-channel displacement map. Example 2 does the same thing with float32 channels. Example 3 uses manual CPU-calculated displacements in vanilla JavaScript, with bilinear interpolation of both displacement vectors and color samples (a sketch of that interpolation follows this post). In all three examples you can click the toggle button to compare with Photoshop Liquify's rendering of the same displacement map.

     https://codepen.io/anon/pen/PrgBoN
     https://codepen.io/anon/pen/JQVZQw
     https://codepen.io/anon/pen/orOMjr

     I really hope you can figure out what causes the aliasing issue, because Pixi renders the warp 100x faster than the canvas version! That might just make it possible to *animate* the warps, even with fairly large displacement maps.
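     A minimal sketch of the bilinear displacement lookup the CPU demo performs (the function name and array layout here are illustrative assumptions, not the exact demo code; the CodePens above have the real thing):

     // Bilinearly sample a two-channel displacement field at a fractional
     // position (x, y). `field` is a Float32Array laid out xyxyxy..., row-major.
     function sampleDisplacement(field, width, height, x, y) {
       const x0 = Math.max(0, Math.min(Math.floor(x), width - 2));
       const y0 = Math.max(0, Math.min(Math.floor(y), height - 2));
       const fx = x - x0, fy = y - y0; // fractional offsets inside the cell
       const idx = (xx, yy) => (yy * width + xx) * 2;
       const lerp = (a, b, t) => a + (b - a) * t;
       const out = [0, 0];
       for (let c = 0; c < 2; c++) { // c = 0 for dx, c = 1 for dy
         const top = lerp(field[idx(x0, y0) + c], field[idx(x0 + 1, y0) + c], fx);
         const bottom = lerp(field[idx(x0, y0 + 1) + c], field[idx(x0 + 1, y0 + 1) + c], fx);
         out[c] = lerp(top, bottom, fy);
       }
       return out; // [dx, dy] in the same units as the field
     }

     The demo applies the same kind of lookup to the color samples at the displaced position as well, which is what smoothed out the canvas version.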
  2. Sorry, I misunderstood. I got confused and thought the Float32Array would somehow replace the color channels. This makes much more sense. I didn't manage to get the violet single-pixel Float32Array sprite working with any variation of PIXI.Sprite.from(), but taking a detour through PIXI.BaseTexture() works fine (sketched below):

     https://www.pixiplayground.com/#/edit/oo5hdwmRyB7W8bpT3PFvp
     https://www.pixiplayground.com/#/edit/vRktRf1v5CsF3yon0p6Ua

     And now my warp is running again, but sadly the aliasing actually got worse! At least it's "regularly irregular" now: there are no random wavy lines anymore, and no single-pixel shifts in zero-displacement areas. Here it is: https://imgur.com/a/gIg4zV6

     The only thing I can think of is that maybe the texture2D() function in the shader isn't interpolating the displacement vectors. But it's supposed to, right? Any other ideas?
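     The detour, as a minimal sketch (assuming Pixi v5; `app` is an existing PIXI.Application, and float textures need WebGL2 or a float-texture extension):

     // A 1x1 violet sprite backed by a float texture, via the BaseTexture detour.
     // The buffer must hold full RGBA: 4 floats per pixel, values in 0..1.
     const data = new Float32Array([0.5, 0.0, 1.0, 1.0]); // one violet pixel
     const resource = new PIXI.resources.BufferResource(data, { width: 1, height: 1 });
     const baseTexture = new PIXI.BaseTexture(resource, {
       format: PIXI.FORMATS.RGBA,
       type: PIXI.TYPES.FLOAT, // tell WebGL the buffer holds floats
     });
     const sprite = new PIXI.Sprite(new PIXI.Texture(baseTexture));
     app.stage.addChild(sprite);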
  3. That would be so cool if float textures worked without rescaling! But it's not quite working for me yet, so let me ask for some clarifications. Are the following assumptions correct?

     - The array myfloat32array contains normalized displacements in the order xyxyxy..., so the total length is width*height*2.
     - I can change the shader so it calculates in pixels, but DisplacementFilter should work unchanged if I normalize the myfloat32array values to a maximum of 1.0.
     - The sprite created by PIXI.Sprite.from(myfloat32array, ...) is like any other sprite and can be displayed using app.stage.addChild().
     - Scaling of the DisplacementFilter works as before (the max displacement in pixels).

     I wanted to be sure about these steps because I get a WebGL warning and the end result looks weird. The sprite gets created, but when I add it to the stage or pass it to PIXI.filters.DisplacementFilter I get the warning "Error: WebGL warning: texImage2D: ArrayBufferView type not compatible with `type`." (at BufferResource.js:84:15), and the sprite doesn't appear on screen. The warped image looks like this: https://imgur.com/a/farMrr0

     Below is my new code, grateful for any suggestions! (One guess at the cause of the warning is sketched after this post.)

     EDIT: sorry, the syntax highlighting disappears when I hit the post button, even though I set it to javascript. Also fixed a bug I saw just as I posted. Same result as above, though.

     function mesh2meshimage_float(mesh) {
       let absMax = 0;
       for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
         absMax = Math.max(absMax, Math.abs(mesh.x[pixelindex]), Math.abs(mesh.y[pixelindex]));
       }
       const arr = new Float32Array(mesh.width * mesh.height * 2);
       for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
         const channelindex = pixelindex * 2;
         arr[channelindex + 0] = (mesh.x[pixelindex]/absMax + 1)/2;
         arr[channelindex + 1] = (mesh.y[pixelindex]/absMax + 1)/2;
       }
       return {array:arr, width:mesh.width, height:mesh.height,
               origwidth:mesh.origwidth, origheight:mesh.origheight, max:absMax};
     }

     function liquify(meshimage) {
       const canvas = document.createElement('canvas');
       const context = canvas.getContext('2d');
       const displacementSprite = PIXI.Sprite.from(meshimage.array,
         { resourceOptions: {width:meshimage.width, height:meshimage.height} });
       const scalex = meshimage.origwidth/meshimage.width;
       const scaley = meshimage.origheight/meshimage.height;
       // displacementSprite.scale.set(scalex, scaley);
       PX.app.stage.addChild(displacementSprite);
       const displacementFilter = new PIXI.filters.DisplacementFilter(displacementSprite);
       PX.sprite.filters = [displacementFilter];
       displacementFilter.scale.x = meshimage.max; // * scalex / 2;
       displacementFilter.scale.y = meshimage.max; // * scaley / 2;
     }
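     That warning is usually the literal mismatch it describes: the texture was created expecting 8-bit pixels (UNSIGNED_BYTE), so uploading a Float32Array fails. A possible way around it in Pixi v5 (an untested sketch with a hypothetical helper name; note the buffer is padded to 4 floats per pixel, since the texture upload expects full RGBA):

     // Hypothetical helper: pad the xyxy... field to RGBA and build a float
     // texture with Texture.fromBuffer, which tags Float32Array data as FLOAT.
     function floatSpriteFromXY(arr2, width, height) {
       const arr4 = new Float32Array(width * height * 4);
       for (let i = 0; i < width * height; i++) {
         arr4[i * 4 + 0] = arr2[i * 2 + 0]; // R = x displacement
         arr4[i * 4 + 1] = arr2[i * 2 + 1]; // G = y displacement
         arr4[i * 4 + 2] = 0.5;             // B unused (zero level)
         arr4[i * 4 + 3] = 1.0;             // A opaque
       }
       const texture = PIXI.Texture.fromBuffer(arr4, width, height);
       return new PIXI.Sprite(texture);
     }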
  4. WebGL2, interesting thought. If I'm reading the browser support listings correctly, I would only have to sacrifice iOS users (assuming I direct macOS people to Chrome or Firefox). They would have to fall back to WebGL1 and live with the aliasing. I guess that's acceptable. I'm currently using v5, but I'll use anything I have to if I can make this work.

     But does Pixi support creating textures with larger color depth, like RGBA32UI or RGBA32F? I'm completely new to Pixi and didn't see any mention of this in the demos or the documentation, but I may have missed it. Can someone point me to an example? And if I get that far, would PIXI.filters.DisplacementFilter work on deeper textures out of the box?

     Since I'm still a noob with even basic Pixi & WebGL stuff, I'm pretty sure shader programming is out of my league for now. But out of curiosity, how much work would it be for an experienced shader programmer to modify the displacement code to take several images as input? Come to think of it, since DisplacementFilter only requires the R and G channels, wouldn't it be possible to use the B and A channels to double the precision? That sounds much easier to implement to me (but again, I'm a noob). (A sketch of that packing idea follows this post.)
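     The packing idea, sketched with illustrative names. The encoding side is plain JavaScript; reading it back would still need a custom shader that reconstructs each axis as (high*256 + low)/65535 from the raw bytes, which the stock DisplacementFilter does not do, and Pixi's default alpha premultiplication would likely have to be disabled so the data in the A channel survives upload intact:

     // Pack 16-bit displacements into RGBA: high bytes in R/G, low bytes in B/A.
     // dxNorm and dyNorm are displacements normalized to 0..1 (0.5 = zero).
     function pack16(dxNorm, dyNorm, out, channelindex) {
       const x16 = Math.round(dxNorm * 65535);
       const y16 = Math.round(dyNorm * 65535);
       out[channelindex + 0] = x16 >> 8;  // R: x high byte
       out[channelindex + 1] = y16 >> 8;  // G: y high byte
       out[channelindex + 2] = x16 & 255; // B: x low byte
       out[channelindex + 3] = y16 & 255; // A: y low byte
     }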
  5. I've written a small parser in JavaScript that reads Photoshop Liquify meshes, so I can apply liquify warps created in Photoshop to images dynamically in a browser. I first wrote a crappy implementation using manual pixel-displacement calculations in canvas. That worked in principle and produced images visually identical to what I got in Photoshop, but it was horribly slow. Then, while googling displacement maps to figure out a way to speed up my code, I stumbled across the PIXI demos and realized I could potentially get super fast performance using PIXI & WebGL.

     The good news: the PIXI version is working and is several orders of magnitude faster than my canvas version. Astonishing!

     The bad news: I'm getting pretty nasty aliasing artifacts in regions where the displacement is large. My canvas implementation also looked rough until I added bilinear interpolation, but I suspect the problem here is that the displacement vectors, originally represented as 32-bit floats, have to be squeezed into 8-bit integers in the PIXI displacement map. I was hoping interpolation could fix that in PIXI too, but maybe not. Here's an imgur album that demonstrates the issue: https://imgur.com/a/jiKo3bP

     The test image is 800x800 pixels, and the maximum displacement of the warp is 371 pixels. A displacement of +/-371 has to be represented by an integer between 0 and 255, so every integer step of the displacement map corresponds to a jump of about 371*2/255 ≈ 3 pixels. The aliasing artifacts look about 3 pixels wide to me, so maybe my suspicion is correct. (Another indication: the entire displaced image shifts up and left by about 1.5 pixels even where there should be no displacement. 1.5 happens to be half of 3, so maybe the zero-displacement level is "halfway off center".)

     So, my question: is there a way in PIXI to represent displacements more accurately, maybe a method that doesn't rely on 8-bit color channels? If not, can these aliasing artifacts be reduced by adding or improving the interpolation? (The quantization arithmetic is spelled out after the code below.)

     Here are the relevant functions of the code I'm using:

     // Translate a mesh object containing the x & y coords of Photoshop's displacement
     // vectors (measured in pixels) into an 8-bit RGBA image with the x & y displacements
     // stored in the red and green channels. Zero displacement has value 128. Scale so the
     // 8 bits we have are used efficiently, i.e. the maximum displacement maps to 0 or 255.
     function mesh2meshimage(mesh) {
       let absMax = 0;
       for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
         absMax = Math.max(absMax, Math.abs(mesh.x[pixelindex]), Math.abs(mesh.y[pixelindex]));
       }
       const meshimage = new Uint8ClampedArray(mesh.width * mesh.height * 4);
       for (let pixelindex = 0; pixelindex < mesh.x.length; pixelindex++) {
         const channelindex = pixelindex * 4;
         meshimage[channelindex + 0] = 128 + mesh.x[pixelindex]/absMax*127; // R
         meshimage[channelindex + 1] = 128 + mesh.y[pixelindex]/absMax*127; // G
         meshimage[channelindex + 2] = 128;                                 // B
         meshimage[channelindex + 3] = 255;                                 // A
       }
       const imagedata = new ImageData(meshimage, mesh.width, mesh.height);
       return {imagedata:imagedata, origwidth:mesh.origwidth, origheight:mesh.origheight, max:absMax};
     }

     // Now use PIXI to render the displaced image.
     function liquify(meshimage) {
       const canvas = document.createElement('canvas');
       const context = canvas.getContext('2d');
       canvas.width = meshimage.imagedata.width;
       canvas.height = meshimage.imagedata.height;
       context.putImageData(meshimage.imagedata, 0, 0);
       const displacementSprite = PIXI.Sprite.from(canvas);
       const scalex = meshimage.origwidth/meshimage.imagedata.width;
       const scaley = meshimage.origheight/meshimage.imagedata.height;
       displacementSprite.scale.set(scalex, scaley);
       PX.app.stage.addChild(displacementSprite);
       const displacementFilter = new PIXI.filters.DisplacementFilter(displacementSprite);
       PX.sprite.filters = [displacementFilter];
       displacementFilter.scale.x = meshimage.max * scalex / 2;
       displacementFilter.scale.y = meshimage.max * scaley / 2;
     }
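     For concreteness, the quantization arithmetic from the post above as a snippet:

     // Quantization error of an 8-bit displacement map (numbers from the test image).
     const maxDisp = 371;              // maximum displacement in pixels
     const step = (maxDisp * 2) / 255; // pixels per 8-bit step, ~2.91
     // Zero displacement is encoded as 128, but the midpoint of 0..255 is 127.5,
     // so "zero" decodes to half a step, ~1.46 px: the ~1.5 px global shift.
     const zeroOffset = (128 - 127.5) * step;
     console.log(step.toFixed(2), zeroOffset.toFixed(2)); // "2.91" "1.46"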