
How to use different Graphics quality for desktop and mobile devices?


caymanbruce

If you have ever played slither.io you know what I mean. When playing on desktop the game has better graphics quality, but in a mobile browser it falls back to coarser graphics and everything just looks much smaller. How can I do that in PIXI.js? It won't change automatically in PIXI.js, so I probably need a condition check. What is the best way to detect a mobile browser versus a desktop browser in PIXI.js?

I want to implement this in my game.


As a basic 'is this mobile or desktop' check, https://github.com/kaimallea/isMobile is a decent lib, which Pixi uses internally.
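For example, a rough sketch using that library directly (the `lowQuality` flag is just an assumed name that the later snippets reuse):

```js
// Rough sketch: use ismobilejs (the library Pixi bundles) to pick a default
// quality tier before creating the renderer.
// Note: depending on the ismobilejs version, the default export is either the
// result object itself or a function you call with the user agent string.
import isMobile from 'ismobilejs';

const mobile = typeof isMobile === 'function' ? isMobile(navigator.userAgent) : isMobile;

// Assumed flag reused by the later examples in this thread.
const lowQuality = mobile.any;
```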

But a better option than just 'mobile or desktop' is to detect some device capabilities. At a base level: 'do you support WebGL?' If not, set the lowest settings due to the canvas fallback! But if it does support WebGL, you can query the hardware as to what is supported. Based off the official 'is webgl supported' code from https://github.com/pixijs/pixi.js/blob/dev/src/core/utils/index.js#L295 .... imagine I'm inside that 'if ( gl ) {' statement

let maxSamples = 0;
if ( gl.getParameter( gl.SAMPLES ) ) {
	maxSamples = gl.getParameter( gl.SAMPLES );
}

Now, 'samples' equates to 'can I do multi-sample anti-aliasing, and if so, how many samples can I do?' Older device hardware won't support this, so will report a value of 0, whereas modern devices do.

https://webglstats.com/ is a great website that shows the available parameters and stats for what the typical results are. So in the above example, I could say "well, if you support fewer than 4 samples, that's the lowest-performing 13% of devices, so you're all low quality". You can do this for a number of parameters, like texture units available, max texture size supported, etc.
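As a rough sketch of that pattern (not Pixi's actual code; the thresholds here are made up purely for illustration):

```js
// Probe a throwaway WebGL context and bucket the device by a few
// getParameter() results. Thresholds are illustrative, not recommendations.
function detectQuality() {
    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');

    if (!gl) {
        return 'low'; // no WebGL at all -> canvas fallback, lowest settings
    }

    const samples = gl.getParameter(gl.SAMPLES) || 0;            // MSAA samples
    const maxTextureSize = gl.getParameter(gl.MAX_TEXTURE_SIZE); // e.g. 2048 / 4096 / 16384
    const textureUnits = gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS);

    if (samples < 4 || maxTextureSize < 4096 || textureUnits < 8) {
        return 'low';
    }
    return 'high';
}
```

A game could call something like that once at startup and branch on the result.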

http://webglreport.com/ is another useful website to get that kind of report on the hardware you are running. So if a user has a specific performance issue, send them there and see if you can lower quality according to something their hardware doesn't support.

As for how to degrade quality, the two things that really affect performance and are easy to change at build time are supplying lower-resolution assets to lesser-quality devices, and lowering the rendering resolution for lesser-quality devices.
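A minimal sketch of the second option, assuming a `lowQuality` flag from the detection above and Pixi v4's renderer options (the numbers are just examples):

```js
// Render at half resolution on lesser devices; with autoResize the canvas is
// still CSS-sized to the full game area, so everything just looks a bit softer.
const renderer = PIXI.autoDetectRenderer(800, 600, {
    resolution: lowQuality ? 0.5 : (window.devicePixelRatio || 1),
    autoResize: true
});
document.body.appendChild(renderer.view);
```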

Beyond that, you're looking at profiling your code on lesser-quality devices to see where the bottleneck is. WebGL filters, for example, are often good candidates to remove on lower-quality devices that still support WebGL.

 


@themoonrat Thanks, I will follow the list and try them out.

I am interested in the idea of supplying lower-resolution assets on lesser-quality devices. Thank you for answering the "secret question" that was hiding in my original question. But then I look at my game and I don't know where to start, because the graphics are all created dynamically at runtime. I can't think of any way of producing lower quality than simply drawing some shapes on the stage, which I am already doing. Unless I draw some colourful thick curvy lines on the screen like those in slither.io, but the curvy line needs to move like a snake, just as it does now. It seems there is a ton of work ahead.


Even if your textures are created dynamically at runtime, when you convert them to a texture, you can scale that generated texture down.

```const texture = renderer.generateTexture( displayObject, 0, 0.5 );```

for example, will create that texture at half the resolution.
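As a sketch, that resolution argument could be driven by the same quality flag as before (`lowQuality` and `snakeGraphics` are assumed names):

```js
// Bake the dynamically drawn graphics at half resolution on lesser devices.
const textureResolution = lowQuality ? 0.5 : 1;
const texture = renderer.generateTexture(snakeGraphics, PIXI.SCALE_MODES.LINEAR, textureResolution);
const sprite = new PIXI.Sprite(texture);
```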

Looking at your render tree (using pixi-inspector and PIXI.utils.BaseTextureCache in the console): because of all of these generated sprites, you're missing out on one of the tricks that makes WebGL so fast, rendering sprites from the same base texture. In WebGL, each time you change the baseTexture that the renderer has to render from, there is a slight penalty. If all images come from just a few base textures, then this penalty goes away. Each generated texture you're creating is from a different base canvas... so the optimisation that allows crazy levels of bunnies in the famous bunnymark can't occur in your game. Is there a reason you have to generate the assets in game? If you _have_ to, then generate them all into one display object, convert that to a base texture and manually create your own Textures from that.
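That last suggestion might look something like this rough sketch in Pixi v4 (the shapes, sizes and frame rectangles are made up just to show the pattern):

```js
// Rough sketch: draw every dynamic shape into one container, bake that once,
// then carve individual Textures out of the shared baseTexture.
const atlasSource = new PIXI.Container();

const bodyShape = new PIXI.Graphics();
bodyShape.beginFill(0xff3366).drawCircle(32, 32, 32).endFill();   // frame at x = 0
atlasSource.addChild(bodyShape);

const headShape = new PIXI.Graphics();
headShape.beginFill(0x33ccff).drawCircle(96, 32, 32).endFill();   // frame at x = 64
atlasSource.addChild(headShape);

// One baked texture means one baseTexture shared by every sprite below.
const atlasTexture = renderer.generateTexture(atlasSource, PIXI.SCALE_MODES.LINEAR, 1);
const atlasBase = atlasTexture.baseTexture;

// Manually define the frames within that shared baseTexture.
const bodyTexture = new PIXI.Texture(atlasBase, new PIXI.Rectangle(0, 0, 64, 64));
const headTexture = new PIXI.Texture(atlasBase, new PIXI.Rectangle(64, 0, 64, 64));

// Sprites made from these can now be batched together by the renderer.
const bodySprite = new PIXI.Sprite(bodyTexture);
const headSprite = new PIXI.Sprite(headTexture);
```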

 

 


@themoonrat Thanks. I originally designed it in Canvas mode; that's why it's full of canvas tricks. But then I realised using Canvas gives very bad performance in my game, so I lifted the restriction and let it use WebGL. I asked about this a few months ago on this forum too.

I have tried pixi-inspector before; I remember it didn't work for Pixi v4? Maybe I used the wrong version. Now I can see some useful stuff with the tool. Thanks for pointing it out.

Good question about why I have to generate the assets in game. I have looked through the slither.io source files but I can't find any assets that produce the snakes, except for the hats on their heads and the background image. So I think I can also do that in my game. It turns out the performance on my local computer is OK, but I haven't tried a very crowded scene over the Internet. For every player I only use two textures; then every time I create a new sprite I attach one of those textures to it.

I also want players to have different patterns on their bodies, so I think it's better to keep a texture for every player. But maybe this is inefficient.

How do I "generate them all into one display object, convert that to a base texture and manually create your own Textures from that"?

Is there any example? I just can't get my head around this idea.


Personally I put together an app plus a corresponding framework for asset management (very similar to what you have in Xcode, which is where I got the idea). Basically all my assets are added in the app and managed at runtime by the framework. For bitmaps I can set 5 different resolutions: 0.25, 0.5, 1, 2, 3, and either let the framework decide which one to use (on iOS it will use 2 or 3) or force lower resolutions for better performance.

For a bitmap, the code is simply this:

var image = new ContextImage("myimage")

Where "myimage" is a bitmap in the managed assets and the ContextImage object picks the correct resolution defined at run time or forced by code. Resolution of any contextimage can be changed on the fly for example if I detect a drop in fps I might set some contextimage to a lower resolution.

I can't share any of this, but it should give you an idea of how to handle different resolutions in your project.
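For a very rough sketch of the same idea without a dedicated framework (everything here, including the folder layout and names, is an assumption about how the assets could be organised, not part of the framework described above):

```js
// Pick an asset tier once, then resolve every image URL through one helper.
// Assumes the same images are exported into 'images/low/' (half size) and
// 'images/high/' folders - a hypothetical layout, not something Pixi requires.
const assetFolder = lowQuality ? 'images/low/' : 'images/high/';

function assetUrl(name) {
    return assetFolder + name + '.png';
}

// Usage with the Pixi v4 loader:
PIXI.loader.add('snakeSkin', assetUrl('snakeSkin')).load(function (loader, resources) {
    const sprite = new PIXI.Sprite(resources.snakeSkin.texture);
    if (lowQuality) {
        // The low tier is exported at half size, so scale the sprite back up.
        sprite.scale.set(2);
    }
});
```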


The app allows the creation of groups. Within each group all names (they can be different from the asset names) have to be unique for each type of asset (sound, bitmap, text, spritesheet, etc.); if another group uses a similar name, they can be differentiated at runtime by using "groupname.name", but I try to keep all names unique across all groups, which makes it easier to remember what they are for anyway. The system supports all text files, bitmaps, spritesheet/JSON/XML and sound (MP3 only). Another cool thing: the resolution system works just fine with spritesheets, so I can load a low-resolution version of a spritesheet first (if not already cached) and then load the normal-resolution one.


Are you interested in runtime atlases? https://github.com/gameofbombs/pixi-super-atlas , example: https://github.com/gameofbombs/pixi-super-atlas/blob/master/test/checkpack.ts , work in progress. Right now it allows you to automatically merge all resources into one big atlas, or use a single texture for elements that don't fit; they can fit after a repack.

Handy if you want to test things without calling TexturePacker. Supports mipmaps.


Sorry, I misread your previous question; yeah, that could be interesting for some use cases. I tried to include an SWF animation auto-conversion in the app but had to give up, since at least some animation features couldn't be correctly parsed (shape morphing) and just making a bitmap of it defeated the purpose. So I create my spritesheets using 'normal' tools, then drop them in my app and click 'distribute', which gives me additional 0.5 and 0.25 resolutions (a sort of mipmap) for progressive download at runtime.

