Too GPU-Intensive for Pixi?


DungeonDame


I'm making a 2D platformer in Pixi that uses a lot of image files, similar to Cuphead (which is made in Unity). The demo I linked below easily runs at 60 fps on my desktop computer, but when I tried it on a few laptops of varying quality, it lags to about 10-20 fps on all of them when played in fullscreen. The lag disappears if I exit fullscreen and resize the window smaller. This leads me to believe it's lagging because it's too GPU-intensive. The current demo needs ~200 PNG-32 textures at ~300x300 size during gameplay, but the final game would need more like 500. It seems to lag about the same even if I load an extra 300 textures. My math says 500 textures of those specs is only 0.36 GB of GPU memory.
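For reference, the arithmetic behind that estimate is easy to sanity-check; `estimateTextureMB` below is just an illustrative helper, assuming uncompressed RGBA8 (4 bytes per pixel, which is what PNG-32 decodes to on the GPU) plus roughly one third extra if mipmaps are generated:

```javascript
// Rough GPU memory estimate for uncompressed RGBA8 textures.
// Each pixel takes 4 bytes; mipmaps (if generated) add roughly 1/3 more.
function estimateTextureMB(count, width, height, { mipmaps = false } = {}) {
  const bytesPerPixel = 4; // RGBA8
  let bytes = count * width * height * bytesPerPixel;
  if (mipmaps) bytes *= 4 / 3;
  return bytes / 1e6; // megabytes
}

console.log(estimateTextureMB(500, 300, 300));                   // 180
console.log(estimateTextureMB(500, 300, 300, { mipmaps: true })); // 240
```

Whichever assumptions you plug in, the total stays well within a typical GPU's memory budget, so raw texture memory alone is unlikely to be the limit.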

I thought I was already taking adequate precautions to optimize performance...
- Spritesheets are scaled down as much as practically possible with as little wasted space as possible. I don't compress more than PNG-32 because I need a full transparency channel.
- If a spritesheet is larger than ~300x300, I split it into multiple files. For example, when you kill a slime there are over 100 frames of death animation, cut up into these spritesheets: [1, 2, 3, 4, 5, etc].
- I cycle through each texture and draw a single pixel of it on a hidden layer (3 textures per second) to keep them resident in GPU memory. That way automatic garbage collection doesn't remove them and force a re-upload during gameplay.
- I unload textures from GPU that are not needed for the current game state.
- Only what's on screen is drawn (using tile-based logic).
- Scripting logic is already quite optimized and doesn't seem to be the issue.
- The minimap uses a render texture.
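For what it's worth, the warming step in the list above can be decoupled from Pixi itself. Here's a minimal sketch where the actual GPU "touch" is injected as a callback (`createTextureWarmer` is a hypothetical helper, not a Pixi API; the touch could be the hidden single-pixel draw, or a direct `renderer.texture.bind(tex)`):

```javascript
// Round-robin "warmer": touches a few textures per tick so the renderer's
// garbage collector never sees them as stale. The touch itself (a hidden
// draw, or binding the texture) is passed in as `touch`.
function createTextureWarmer(textures, perTick, touch) {
  let cursor = 0;
  return function tick() {
    for (let i = 0; i < perTick && textures.length > 0; i++) {
      touch(textures[cursor]);
      cursor = (cursor + 1) % textures.length;
    }
  };
}

// Usage sketch (assumes a Pixi renderer is in scope):
// const warm = createTextureWarmer(allTextures, 3, t => renderer.texture.bind(t));
// setInterval(warm, 1000);
```

In Pixi v5 the blunter alternative is `PIXI.settings.GC_MODE = PIXI.GC_MODES.MANUAL`, which disables automatic texture unloading entirely, so no warming loop is needed at all.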

My game loop uses setInterval instead of requestAnimationFrame, because with requestAnimationFrame some devices seem to cap the framerate in fullscreen.
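If the worry is fullscreen throttling, one common hedge is to keep requestAnimationFrame for rendering but drive the simulation with a fixed-timestep accumulator, so a throttled display rate slows rendering without slowing game speed. A sketch (`createFixedStepper` is an illustrative helper, not part of Pixi):

```javascript
// Fixed-timestep accumulator: runs `update` at a constant rate regardless of
// how often the render callback fires. Returns a function you call with the
// current time in ms; it returns how many simulation steps it ran.
function createFixedStepper(stepMs, update) {
  let last = null;
  let acc = 0;
  return function advance(nowMs) {
    if (last === null) last = nowMs;
    acc += nowMs - last;
    last = nowMs;
    let steps = 0;
    while (acc >= stepMs) {
      update(stepMs);
      acc -= stepMs;
      steps++;
    }
    return steps;
  };
}

// Usage sketch in a rAF loop:
// const advance = createFixedStepper(1000 / 60, dt => game.update(dt));
// function frame(now) { advance(now); renderer.render(stage); requestAnimationFrame(frame); }
// requestAnimationFrame(frame);
```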

Anyway, I was advised that due to Pixi's performance limitations, I should ditch the Pixi version (that I've already spent over 1000 hours on), and remake it in Unity instead (like Cuphead). Apparently if I did so, it would perform considerably better on a variety of devices, however I cannot easily run any tests to confirm this.

My main questions are...
- Was it a mistake to make it in Pixi when it requires this many textures?
- Do you see any way I could improve performance for weaker computers beyond what I'm already doing?
- Does it sound accurate that remaking it in Unity would be a big performance boost?

Any help is very much appreciated.


Gameplay sample video

Demo

 

screen.png

Edited by DungeonDame

Did you look at it through SpectorJS?

Look at how the frame is formed, and how much texture memory you eat.

Also look at a usual Chrome devtools profile: what actually consumes more resources, CPU or GPU, how big the idle is, etc.

> I don't compress more than PNG-32 because I need a full transparency channel.

PNG only helps with network transfer. To actually use less video memory you have to use GPU compressed formats (DDS, BASIS, etc.), whether it's Pixi or Unity. DDS is usually gzipped on the server; there are many details, I can't really explain it in one sentence.
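Libraries such as pixi-compressed-textures handle the per-device format choice automatically; the underlying idea reduces to something like this (illustrative helper, not the library's API; the `supported` list would come from querying WebGL extensions such as `WEBGL_compressed_texture_s3tc`):

```javascript
// Picks a texture URL based on which compressed formats the device reports.
// `supported` is a plain list of format tags here; in a real app it would be
// derived from gl.getExtension(...) probes at startup.
function pickTextureUrl(baseName, supported) {
  if (supported.includes('s3tc')) return baseName + '.dds';   // desktop GPUs
  if (supported.includes('astc')) return baseName + '.astc';  // newer mobile
  return baseName + '.png'; // fallback: uncompressed, decoded to RGBA8
}

// pickTextureUrl('slime', ['s3tc']) → 'slime.dds'
// pickTextureUrl('slime', [])       → 'slime.png'
```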

> My math says 500 textures of those specs is only 0.36 GB of GPU memory.

Sounds about right.

Edited by ivan.popelyshev

Using 300x300 sheets means you have to bind/unbind a lot of textures - why don't you combine them into 2048x2048 sprite sheets? If you don't want to fetch large images or don't know which sprite sheets you'll need, you can use something like @pixi-essentials/texture-allocator to dynamically combine them on the client.

Also, using `setInterval` may be problematic because you might be overloading the browser by rendering back-to-back (i.e. your script takes up the full 16 ms frame or even more, overlapping frames).
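@pixi-essentials/texture-allocator does the real packing; purely to illustrate why 2048x2048 atlases help, a naive shelf packer shows that 36 of the 300x300 sheets fit in a single atlas, collapsing up to 36 texture binds into one (`shelfPack` is an illustrative helper, not the library's API):

```javascript
// Naive shelf packing: fills rows left-to-right, top-to-bottom. Returns the
// (x, y) slots for up to `count` w*h rectangles inside an atlasSize^2
// texture; fewer slots are returned if they don't all fit.
function shelfPack(count, w, h, atlasSize) {
  const slots = [];
  for (let y = 0; y + h <= atlasSize && slots.length < count; y += h) {
    for (let x = 0; x + w <= atlasSize && slots.length < count; x += w) {
      slots.push({ x, y });
    }
  }
  return slots;
}

// 6 columns * 6 rows of 300x300 fit in 2048x2048:
// shelfPack(100, 300, 300, 2048).length → 36
```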


5 hours ago, ivan.popelyshev said:

> Did you look at it through SpectorJS? [...] Look at a usual Chrome devtools profile: what actually consumes more resources, CPU or GPU, how big the idle is, etc. [...]

I tried profiling it in chrome devtools, attached are the results. At almost fullscreen, it was running at roughly 20 fps with random lag spikes, but it shows a large amount of idle and says activity is mostly scripting. When I resize the window smaller it immediately goes back to 60 fps with no lag spikes. I don't understand this because if it's scripting-related, I thought the window size wouldn't matter. And how could it be idling almost half the time and yet only running at 20 fps?

I'll look into SpectorJS and the other image formats you mentioned, thanks.

 

3 hours ago, Shukant Pal said:

> Using 300x300 sheets means you have to bind/unbind a lot of textures - why don't you combine them into 2048x2048 sprite sheets? [...]

I originally had larger textures around that size when I first discovered the lag, so I cut them into smaller pieces to rule out texture size as the cause. In theory it shouldn't matter anyway, since they never get unloaded from memory during gameplay.

I tried switching back to requestAnimationFrame, but instead of skipping frames the game just moves in slow motion, and the performance profile is about the same.

profile1.png

profile2.png

Edited by DungeonDame

> It was running at roughly 20 fps with random lag spikes, but it shows a large amount of idle and says activity is mostly scripting.

It means the problem is GPU-side. Look at the SpectorJS output; I don't remember if it's possible to attach it as a file to a forum post.

 

EDIT: do you use any filters?

Edited by ivan.popelyshev

34 minutes ago, ivan.popelyshev said:

EDIT: do you use any filters?

I use filters really sparingly: some AlphaFilter on the slime enemy and for fading out coins, and a ColorMatrixFilter when an enemy gets damaged, but that's about it. If I remove everything that uses a filter, performance is the same.

I do modify colors a lot using sprite.tint though.

Edited by DungeonDame

Hi Ivan,

I included pixi-compressed-textures.js and tried converting the textures to .dds format, with some weird results. My current version of Pixi is 5.2.4, and I chose the latest build of pixi-compressed-textures, whose readme says "Compressed textures and retina support for pixi v5."

Some of the textures work fine, and others are just a black square. I confirmed the ones that are working are in fact dds and not png.

When I view the files in explorer, I can see the dds files are not black squares, and look the same as the png versions.

There doesn't appear to be any noticeable reason why some work and some don't.

In this screenshot, both the stickman and its hair are animated spritesheets in dds format with dimensions of ~300x300, but the stickman shows up and the above/below hair textures are both black squares.

Any idea how I can fix the display of dds textures? I want to try converting all textures to dds and see if it helps performance.

I also tried adding the line `loader.use(PIXI.compressedTextures.ImageParser.use);` but this doesn't make a difference.
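One common cause of black compressed textures is dimensions the format can't encode: DXT/S3TC (the usual .dds payload) compresses in 4x4 pixel blocks, so width and height must be multiples of 4, and WebGL1 additionally restricts non-power-of-two textures (no mipmaps, clamp-only wrapping). A quick check worth running on the failing sheets (`checkDxtSize` is an illustrative helper, not part of any library):

```javascript
// DXT/S3TC encodes 4x4 pixel blocks, so dimensions must be multiples of 4;
// WebGL1 also limits NPOT textures. This flags sizes likely to fail or
// render as black squares when uploaded via compressedTexImage2D.
function checkDxtSize(width, height) {
  const isPow2 = n => n > 0 && (n & (n - 1)) === 0;
  return {
    blockAligned: width % 4 === 0 && height % 4 === 0,
    powerOfTwo: isPow2(width) && isPow2(height),
  };
}

console.log(checkDxtSize(300, 300)); // { blockAligned: true, powerOfTwo: false }
console.log(checkDxtSize(298, 300)); // { blockAligned: false, powerOfTwo: false }
```

If the working sheets turn out to be block-aligned (or power-of-two) and the black ones aren't, padding the failing sheets to compliant sizes would be the fix to try.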

 

Thanks

ddsprobs.png

hair.png

Edited by DungeonDame

4 hours ago, ivan.popelyshev said:

> Then it's not the texture problem. Need more information - look in the SpectorJS chrome extension at how many draw calls there are.

There are 33 instances of `drawElements`, all something like this: `drawElements: TRIANGLES, 660, UNSIGNED_SHORT, 0`
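If I'm decoding those numbers right (each Pixi sprite is two triangles, i.e. 6 indices), the batching already looks fairly healthy:

```javascript
// Each quad/sprite is 2 triangles = 6 indices in a drawElements call.
const indicesPerCall = 660;
const spritesPerCall = indicesPerCall / 6; // 110 sprites batched per call
const drawCalls = 33;
console.log(spritesPerCall, drawCalls * spritesPerCall); // 110 3630
```

So each call batches ~110 sprites, and 33 calls is a modest count for a full scene, which again points away from draw-call overhead as the bottleneck.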

