About louislourson


  1. All right, solved it thanks to your suggestions: new BaseTexture(canvas) was indeed the solution. However, I can't pass the images or canvases from the node-canvas library directly, because they are not, strictly speaking, HTMLCanvasElement or HTMLImageElement instances, so the automatic resource detector will not recognize them (I went into the source code; it does an "instanceof" test). What worked was creating a custom resource class, NodeCanvasResource, that inherits from BaseImageResource. The class itself is very simple and just serves to bypass the auto-detection:

```typescript
class NodeCanvasResource extends resources.BaseImageResource {
  constructor(source: any) {
    super(source);
  }
}
```

Then, to actually load the image:

```typescript
async getImageFromCanvas(path: string) {
  console.log('loading image from file: ' + path);
  let image = await loadImage(path);
  let resource = new NodeCanvasResource(image);
  let bt = new PIXI.BaseTexture(resource);
  return bt;
}
```

As a note, this will not work without the polyfill from pixi-shim: without it, you'll get a "Canvas or Image expected" error. Thank you all!
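For context on why the bypass is needed: the auto-detector checks the concrete class of the source, not its shape, so a node-canvas object that duck-types like a browser canvas is still rejected. A minimal sketch of that failure mode, using two hypothetical stand-in classes (neither is the real browser or node-canvas type):

```typescript
// Stand-in for the browser's HTMLCanvasElement class (hypothetical).
class BrowserCanvas {
  width = 0;
  height = 0;
}

// Stand-in for node-canvas's Canvas class: same shape, different class (hypothetical).
class NodeCanvas {
  width = 0;
  height = 0;
}

const canvas = new NodeCanvas();
// Duck-typing says it looks like a canvas, but `instanceof` says no,
// which is why a detector based on `instanceof` rejects node-canvas objects.
console.log('width' in canvas);               // true
console.log(canvas instanceof BrowserCanvas); // false
```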
  2. Hi, thank you for your input. I'm assuming you are talking about PIXI.Texture.fromBuffer. Unfortunately, this does not seem to work with the canvas renderer yet (this was one of the things I tested; fromBuffer works well on my frontend with WebGL, but not on the backend with canvas). I found a thread on this forum saying it was not implemented yet. I'll look into the texture resource in the source code, thanks.
  3. Hi! Let me start by explaining my use case: I'm the developer of a website that lets you build and share blueprints for a game. The front end of the website uses pixi.js, and it works really well! (website: blueprintnotincluded.com). One of the features of the website is that users can upload their blueprints for others to see, and as part of that, a thumbnail for each blueprint is generated and stored on the backend.

My problem is this: right now, the thumbnail is generated on the frontend and uploaded to my backend. This is not ideal for several reasons. One, I have to trust the front end about what the thumbnail looks like; although there are some controls in place, there is little stopping users from sending bad thumbnails to my backend for others to see. Two, if I ever need to regenerate the thumbnails for all existing blueprints for some reason, there is currently no simple way of doing that.

Now that my use case is out of the way, the goal: I want to be able to generate thumbnails on my backend (a simple node/express server) using pixi.js. Here is what I have been able to achieve so far: using the fantastic pixi-shim library, I have been able to move all my drawing code to a shared library and run it on the backend without runtime errors. The problems started when I tried to export the generated drawing. I'm using something like this to extract my thumbnails from pixi/canvas:

```typescript
pixiApp.renderer.render(container, rt, true);
let base64: string = PixiPolyfill.pixiPolyfill.pixiApp.renderer.plugins.extract.canvas(rt).toDataURL();
```

However, as noted in this issue, extracting data from pixi is not easy. All I was getting was a transparent image. After investigation, the problem seems to come from the texture loading on my backend.
This code, which loads the image on my front end, does not work on the backend:

```typescript
let baseTexture = new PIXI.BaseTexture(imageUrl);
```

I tested several solutions: I tried using an absolute path as the URL, I tried adding the image to my virtual DOM using the jsdom-global and canvas npm libraries, and I also tried a base64 data URL. In the end, the baseTexture I generate always has its "isValid" flag set to false. I should add that I'm using pixi.js-legacy (5.3.0), and all my drawing on the backend uses the canvas API, not WebGL. However, everything else in pixi seems to be working just fine. For example, I can create a graphics object, draw some rectangles on it, and the generated thumbnail will show those rectangles.

So, here is the workaround I have found so far: for every texture I need to render in my thumbnails, when my server starts, I preload it: I read every pixel of the texture one by one and, using a graphics object and small 1x1 rectangles of the correct color, I draw the texture to a render target. This render target can then be used in sprites, and the sprites will be rendered correctly in my thumbnails.
This works well enough, and the only thing I had to change in my rendering code is the way my textures are loaded:

```typescript
async getImage(path: string) {
  let data: any = {};
  let width = 0;
  let height = 0;
  console.log('reading ' + path);
  data = await Jimp.read(path);
  width = data.getWidth();
  height = data.getHeight();
  let brt = PixiPolyfill.pixiPolyfill.getNewBaseRenderTexture({ width: width, height: height });
  let rt = PixiPolyfill.pixiPolyfill.getNewRenderTexture(brt);
  let graphics = PixiPolyfill.pixiPolyfill.getNewGraphics();
  let container = PixiPolyfill.pixiPolyfill.getNewContainer();
  container.addChild(graphics);
  for (let x = 0; x < width; x++)
    for (let y = 0; y < height; y++) {
      let color = data.getPixelColor(x, y);
      let colorObject = Jimp.intToRGBA(color);
      let alpha = colorObject.a / 255;
      color = color >>> 8; // unsigned shift: a signed >> would go negative when red >= 0x80
      graphics.beginFill(color, alpha);
      graphics.drawRect(x, y, 1, 1);
      graphics.endFill();
    }
  PixiPolyfill.pixiPolyfill.pixiApp.renderer.render(container, rt, false);
  // Release memory
  container.destroy({ children: true });
  container = null;
  rt.destroy();
  rt = null;
  data = null;
  global.gc();
  //console.log('render done for ' + path);
  return brt;
}
```

Using this method, I am able to generate my thumbnails on the backend. However, and this is where I need help, this method is very inefficient. I have about a hundred 1024x1024 spritesheets for my app. Preloading them at web server start is very slow (about 10 minutes). I could live with the delay, since I only need to preload them once; however, it is also very memory intensive. When all the textures are preloaded in this way, my server sits at about 2.6 GB of RAM. (The server takes about 200 MB when not loading any textures at all, and all my PNG textures together weigh about 40 MB, so 2.6 GB seems excessive.) I'm pretty sure the memory issue is not coming from the library I'm using to read the PNG files (Jimp); I tested this by converting my PNG files into JSON bitmap data and loading that without calling Jimp at all.
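As a side note on the color handling in the loop above: Jimp's getPixelColor returns a 32-bit 0xRRGGBBAA integer, so the RGB part and the alpha have to be split before calling beginFill. A minimal sketch of that unpacking (plain arithmetic, no pixi or Jimp needed); the unsigned shift >>> matters, because a signed >> on a pixel with red >= 0x80 would sign-extend and yield a negative color value:

```typescript
// Unpack a 32-bit RGBA integer (0xRRGGBBAA) into the (color, alpha)
// pair that Graphics.beginFill expects.
function unpackRGBA(rgba: number): { color: number; alpha: number } {
  const color = rgba >>> 8;          // drop the alpha byte; >>> keeps the result unsigned
  const alpha = (rgba & 0xff) / 255; // low byte is alpha, normalized to 0..1
  return { color, alpha };
}

const opaqueRed = unpackRGBA(0xff0000ff);
console.log(opaqueRed.color.toString(16)); // "ff0000"
console.log(opaqueRed.alpha);              // 1
```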
So, here it is: I need help loading my PNG files on the backend. My current solution is slow (not a huge deal) but also very memory intensive (and this is a bit more problematic, as the RAM I need directly impacts the price of my web server...). Thank you very much for any help or advice you can provide.
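For a rough sense of scale on the memory figure above: a decoded 1024x1024 RGBA texture occupies 4 MB uncompressed, so about a hundred of them account for roughly 400 MB of raw pixel data. A back-of-the-envelope sketch (the sheet count is the approximate figure from the post), which shows the observed 2.6 GB is well beyond the pixels themselves:

```typescript
// Estimate the uncompressed pixel memory for the preloaded spritesheets.
const sheets = 100;                    // approximate count from the post
const bytesPerSheet = 1024 * 1024 * 4; // 1024x1024 pixels, 4 bytes (RGBA) each
const totalMB = (sheets * bytesPerSheet) / (1024 * 1024);
console.log(totalMB); // 400
```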
  4. Hi, I would love for fromBuffer() to be implemented for the canvas renderer. I tested it in 5.3.0, and it was working with WebGL but not with canvas. Is there an issue number I can follow? Thank you