autoDetectRenderer doesn't pick the correct renderer on some machines.


pitforest_travis

It appears that the function chooses the WebGL renderer on some hardware/browser combinations which do not actually support it, and then fails. I've tested our current game (it's up at www.pitforest.de/games/9blocks) on various machines and devices (several desktop PCs, as well as multiple generations of iPhone, iPad and Android devices) and it works very well. But I just got a bug report from a machine that apparently only has a simple CPU-integrated GPU, and the browser in question (the latest version of Firefox) tries to create a WebGL renderer, which results in the pixi error (paraphrasing): "This browser does not support WebGL, try using the canvas renderer instead".

I assume I have to create a canvas renderer manually in this case. How can I detect in my code whether the client machine supports WebGL, and pick the canvas renderer accordingly?
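To make the question concrete, the kind of check I imagine looks something like this (`supportsWebGL` is just a name I'm making up; probing for a `webgl` / `experimental-webgl` context is a common browser technique, not a PIXI API):

```javascript
// Sketch: ask the browser for a WebGL context directly; if it refuses
// (or throws), assume WebGL is unusable and fall back to canvas.
function supportsWebGL() {
  try {
    var canvas = document.createElement('canvas');
    return !!(window.WebGLRenderingContext &&
              (canvas.getContext('webgl') ||
               canvas.getContext('experimental-webgl')));
  } catch (e) {
    return false;
  }
}

// Usage (in the browser):
//   var renderer = supportsWebGL()
//     ? new PIXI.WebGLRenderer(800, 600)
//     : new PIXI.CanvasRenderer(800, 600);
```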


Note: I worked around this by simply trying to create a WebGLRenderer manually; if an exception is thrown, I catch it and create a canvas renderer instead. The root of the error is probably that the browser reports WebGL as available, but the driver says "uh, no" when you actually try to use it. Perhaps PIXI should include my approach?
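In case it helps anyone, the fallback I described boils down to something like this (a sketch, not PIXI internals; `createRendererWithFallback` and its parameter names are made up for illustration, and the usage comment assumes the older PIXI API where the renderer constructors take width and height):

```javascript
// Try the preferred renderer factory first; if its constructor throws,
// build the fallback renderer instead.
function createRendererWithFallback(createWebGL, createCanvas) {
  try {
    return createWebGL();
  } catch (e) {
    // The browser advertised WebGL, but the driver refused when we
    // actually tried to create a context -- fall back to canvas.
    return createCanvas();
  }
}

// Usage with PIXI (in the browser):
//   var renderer = createRendererWithFallback(
//     function () { return new PIXI.WebGLRenderer(800, 600); },
//     function () { return new PIXI.CanvasRenderer(800, 600); }
//   );
```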

